Learning HCI and InfoVis in the Open
In this paper, we report on our experiences with novel learning strategies for HCI and Information Visualisation, in the context of a computer science master's curriculum. What sets our experiences somewhat apart is the focus on openness. This includes the use of Open Educational Resources (OER), as well as open communication between the students, the professor and the general public, through Web 2.0 tools like blogs, wikis, Facebook and Twitter.
- open standards that realise interoperability between the different components of a learning infrastructure at the technical level;
- open source implementations of such components that enable developers to inspect and modify the source code;
- open content that can be shared, repurposed and reused freely, often called ‘open educational resources’ or OER;
- open learning, where the learning activities themselves are shared more openly.
Moreover, we position our work in the context of 21st century competencies (information literacy, creativity & innovation, collaboration, problem solving, communication and responsible citizenship), rather than the drill-and-practice memorisation of facts or the very closed problem solving that is more typical of many university courses. These competencies comprise both skills and dispositions to learn. Basically, the idea is that we should teach students to solve problems we don’t know yet, using technologies we don’t know yet; this is especially relevant in technology-oriented domains like computer science, where the domain continues to evolve at increasing speed under the influence of Moore’s law.
A currently very active instantiation of the open learning idea is the MOOC or ‘Massively Open Online Course’, which enables anyone to take courses anywhere. As well-known, prestigious universities have organised themselves into networks of MOOC providers, and as their offerings regularly attract many tens of thousands of learners, the concept of MOOCs has received substantial media coverage. Our course is similar, but it is not massive by any measure of the word: we typically have about 30 students. It is open, though in a different way than MOOCs: whereas their openness mainly relates to the low barrier to registration, we focus more on making students work together and communicate with each other and the team in an open way, through public blogs and Twitter. Our courses are also only partly on-line: we do use blogs, wikis and Twitter to communicate, but we also hold face-to-face studio sessions throughout the semester.
2 The Courses
As such, the main goal of the courses is to make the students change their perspective on how software is evaluated: not from a technical point of view, but focused on how it impacts the user experience. Indeed, evaluation is the main topic of the course: we start with the evaluation of an existing software product (over the years, we have evaluated Mendeley, Google Plus, Pinterest, and others). After this first activity, we start cycles of iterative development, moving from brainstorming sessions through user scenario development and paper prototyping to increasingly functional digital prototypes and, finally, a release ‘in the wild’, i.e. to the general public. Throughout these activities, we continuously evaluate intermediate versions, and the students are expected to document the outcomes of their evaluations and how these outcomes dictate their iterative development cycles.
Rather than an institutional Learning Management System (LMS) like Blackboard or Moodle, we use a wiki as the main ‘landing page’ of the course (see Fig. 1). This is important, as an LMS environment is typically set up and kept under the control of the central institutional services. We replace this environment with a tool that is under complete control of the students: initially, the wiki only contains a title and empty placeholder text. The clear, though implicit, message is that students are expected to take responsibility for their own learning.
Students typically work in groups of three and communicate with the other groups and the teaching staff through public blogs. They are explicitly asked not only to read the blogs of the other groups, but also to comment on their posts. The intent is to create a community of practice, where learning takes place in an open spirit of collaboration and communication.
Whereas the blogs are intended for more substantial posts and comments on these posts, we rely on Twitter hashtags for more ephemeral messages that create a ‘pulse of the course’ and support continuous awareness of ongoing activities in the course (see Fig. 2).
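To illustrate what such a hashtag-based ‘pulse of the course’ amounts to computationally, the sketch below counts course-tagged tweets per day. This is a hypothetical example: the hashtag, the data layout and the field names are assumptions for illustration, not the actual course infrastructure or Twitter API.

```python
from collections import Counter
from datetime import datetime

def daily_pulse(rows, hashtag="#hci2013"):
    """Count tweets per day that mention the course hashtag.

    `rows` is an iterable of dicts with 'created_at' (ISO date-time string)
    and 'text' keys -- a hypothetical export format, not Twitter's own.
    """
    counts = Counter()
    for row in rows:
        if hashtag.lower() in row["text"].lower():
            day = datetime.fromisoformat(row["created_at"]).date()
            counts[day] += 1
    return counts

# Example with in-memory data standing in for a real export:
sample = [
    {"created_at": "2013-03-04T10:15:00", "text": "Paper prototype ready #hci2013"},
    {"created_at": "2013-03-04T11:02:00", "text": "First user test done #hci2013"},
    {"created_at": "2013-03-05T09:30:00", "text": "Unrelated message"},
]
pulse = daily_pulse(sample)  # two tagged tweets, both on 2013-03-04
```

Plotting such daily counts over the semester is one simple way to make the ongoing activity visible to students and staff.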
Table 1. Statistics on the 2013 HCI course:
- Number of students
- Number of blog posts
- Number of blog comments by students
- Number of blog comments by staff
- Number of blog comments by outsiders
The result is that the on-line communication happens in a ‘class without walls’: external people can follow and post to the Twitter stream, subscribe to the blog posts, and leave comments on the student blogs. Although this kind of external input is a small percentage of the overall activity (see also Table 1), it contributes significantly to the authentic nature of the course activities: in the words of the students, ‘real people’ (i.e. not the teaching staff or peer students) take an interest in what they post! One recent example is a reaction from a representative of a software tool, who commented on one of the student blogs to help them with problems they experienced with the tool.
A more inspiring example is that of Fig. 3: in reaction to evaluations that the students blogged about a bibliographic reference manager (Mendeley, http://mendeley.com), one of the founders of the company commented on these evaluations. For the students, it is quite motivating to discover that their comments are not only read by the course staff, and that these comments may actually serve another purpose than passing an exam or getting a degree.
Another variation on the same theme of openness is participation in ‘hackathons’, i.e. intensive events where teams design and develop software over a period of a few hours to a few days. In the Information Visualisation course, we participated in the ‘2012 Visualizing Global Marathon’ (http://visualizing.org/marathon2012/), an event that ran from Friday evening until Sunday midnight, with locations across the world. Students were given three data sets at the start and could participate in video conferences with visualisation experts. A Twitter hashtag organised the messages about the event. After the event, a jury evaluated the results. Some of the teaching staff also participated in the hackathon, and one of them, Till Nagel, was awarded a prize for his work (see Fig. 4). Again, this creates an open atmosphere in which students do not merely work under the supervision of the teaching staff on a project that they hand in, but rather work with staff in a global community of students, staff and experts, in an intensive setting where friendly competition motivates but doesn’t prevent collaboration!
The number and diversity of interactions (Twitter, blog posts and comments, wiki edits, ...) can be a bit overwhelming. This can be problematic for students and staff alike. However, we believe that coping with this kind of ‘information overload’ (or rather ‘filter failure’, when this becomes a problem) is an important 21st century meta-skill in itself, and that it is actually useful for students to acquire it. Moreover, we also research the use of learning analytics dashboard applications [14, 15] that enable staff and students to be aware of, reflect on, make sense of and act on what is going on.
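At its core, such a dashboard aggregates interaction traces per group and per medium. The sketch below shows that aggregation step under an assumed event model; the data format and names are hypothetical and do not describe the actual dashboard applications of [14, 15].

```python
from collections import defaultdict

def activity_summary(events):
    """Aggregate interaction events per group and per medium.

    `events` is a list of (group, medium) tuples, e.g.
    ('group-A', 'blog_post') -- an assumed, simplified event model.
    Returns {group: {medium: count}}.
    """
    summary = defaultdict(lambda: defaultdict(int))
    for group, medium in events:
        summary[group][medium] += 1
    # Convert the nested defaultdicts to plain dicts for display.
    return {group: dict(media) for group, media in summary.items()}

# Example with a few synthetic events:
events = [
    ("group-A", "blog_post"), ("group-A", "tweet"),
    ("group-B", "blog_post"), ("group-A", "blog_comment"),
]
summary = activity_summary(events)
# summary["group-A"] == {"blog_post": 1, "tweet": 1, "blog_comment": 1}
```

A dashboard would then visualise these counts over time, so that a group whose activity drops off becomes visible early enough to act on.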
There are obvious concerns around privacy and ‘trusted environments’ when learning takes place in this open way: students are sometimes concerned that potential future employers will be able to find out about the mistakes they made in class. On the other hand, our perceptions of what is appropriate to share and what should remain private are shifting as technology develops. More importantly, we believe that it is also valuable for students to learn to act in public and engage with an external audience, the more so as this is a way to realise a more authentic learning context, as we have argued above.
There is, of course, a certain overhead involved in the social interactions among students, and between students and the general public. Although we have argued that mastering social interaction is a useful skill in its own right, the question remains whether the added effort actually results in more learning - i.e. less time required overall, or better learning outcomes. In fact, grading for these courses is based on the project outcome: in that sense, students are not rewarded directly for social interaction and collaboration. On the other hand, by making use of the opportunities for interaction, students can receive more feedback on the progress of their projects as they proceed, which should result in increased motivation and a higher quality outcome, and thus a higher grade. However, carefully evaluating the impact of the open approach on the learning outcome remains tricky, as it is difficult to isolate the effect of our approach from all the other variables that affect student results.
It is unclear how well this approach would scale from our current group of around 30 students to hundreds, or to the many thousands of students in MOOCs. On the one hand, as mentioned above, students already report feeling a bit overwhelmed, and scaling up the courses risks making this problem much more serious. On the other hand, students could be made to interact intensively with just a small subset of a large group, and contributions from outsiders could be channelled so that they still reach all of the students; if the participation from outsiders also increases with the student numbers, their contributions too could be partitioned over subsets of students. Yet how to scale up without turning the experience into a set of isolated mini-courses for each subset of students remains unclear.
In this paper, we have presented our work on teaching Human-Computer Interaction and Information Visualisation in an open way. Our work is part of a broader evolution towards more open software, content and learning in general. Our basic premise is that a more open approach prepares students better for a 21st century professional career and life in general and that the more authentic context for learning improves motivation and thus increases the likelihood of positive learning outcomes. Although this approach can at times be challenging for students and staff alike, we believe that this openness is essential for deeper authentic learning.
- 1. de Waard, I., Abajian, S., Gallagher, M.S., Hogue, R., Keskin, N., Koutropoulos, A., Rodriguez, O.C.: Using mLearning and MOOCs to understand chaos, emergence, and complexity in education. Int. Rev. Res. Open Distance Learn. 12(7), 94–115 (2011)
- 2. Crick, R.D.: Assessment in schools - dispositions. In: McGaw, B., Peterson, P., Baker, E. (eds.) The International Encyclopedia of Education. Elsevier, Amsterdam (2009)
- 3. Dron, J.: Social software and the emergence of control. In: Sixth International Conference on Advanced Learning Technologies, pp. 904–908 (2006)
- 4. Duval, E., Verbert, K.: On the role of technical standards for learning technologies. IEEE Trans. Learn. Technol. 1(4), 229–234 (2008)
- 7. Iiyoshi, T., Kumar, M.S.V. (eds.): Opening Up Education. MIT Press, Cambridge (2008)
- 8. Jarvis, J.: Public Parts: How Sharing in the Digital Age Improves the Way We Work and Live. Simon and Schuster, New York (2011)
- 9. Kurzweil, R.: The Singularity Is Near: When Humans Transcend Biology. Penguin Books, New York (2005)
- 10. Pappano, L.: The Year of the MOOC. The New York Times, 4 (2012)
- 12. Rheingold, H.: The Virtual Community: Homesteading on the Electronic Frontier. MIT Press, Cambridge (2000)
- 13. Shirky, C.: Here Comes Everybody: The Power of Organizing Without Organizations. Penguin Press, New York (2008)
- 14. Suthers, D., Verbert, K., Duval, E., Ochoa, X. (eds.): LAK ’13: Proceedings of the Third International Conference on Learning Analytics and Knowledge. ACM, New York (2013)
- 15. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics dashboard applications. American Behavioral Scientist, 10 p. (2013, in press)