
1 Introduction and Background

A flurry of research over the last decade has been geared toward rethinking the way we approach education. The long-standing formal lecture teaching mode is less popular with students [1] and less effective than more interactive modes of learning [2]. Educator Prensky has argued that today’s generation of students are “Digital Natives” whose brains are wired fundamentally differently from those of the previous generation of “Digital Immigrants,” and that they should be taught in dramatically new ways [3]. Guzdial and Soloway similarly argued that a heavily text-based means of teaching programming is ill-suited to the “Nintendo Generation,” which thrives on sounds, graphics, animation, and speed [4].

Research on digital game-based learning (also called serious games for education) shows that students who learn through games report increased feelings of alertness, activity, and involvement, in contrast to the boredom reported during standard lectures [5]. Game-based learning allows educators to tap into the enthusiasm that students show for computer games and bring those attitudes into the classroom as “interested, competitive, cooperative, results-oriented, actively seeking information and solutions” [6].

A number of serious games have been created for specific topics in CS education; for example, loops and arrays [7,8,9], binary trees [10], and object-oriented programming [11], along with a number of games created for teaching introductory programming [12,13,14,15,16] and other CS topics [17,18,19]. Experimental evaluations of these games have indicated that they are effective and fun teaching tools. However, we do not know of any games developed for the advanced programming topics needed to prepare students for critical Data Structures courses. We have developed a game in Unity that teaches and assesses student knowledge of advanced programming topics; we focus on C++ pointers, which we find to be particularly confusing to students. In this work, we describe our game and report on feedback from students who piloted it.
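To give a concrete sense of why we consider pointers so confusing, the short C++ fragment below illustrates the kind of material the game’s lessons cover (declaration, address-of, dereferencing, and reassignment). It is our own illustrative sketch of the topic area, not code taken from the game.

#include <iostream>

int main() {
    int x = 5;
    int y = 10;

    int* p = &x;              // p holds the address of x
    std::cout << *p << '\n';  // dereferencing p prints 5

    *p = 7;                   // writing through p changes x to 7
    p = &y;                   // reassigning p does not change x; p now points to y
    std::cout << *p << '\n';  // prints 10

    return 0;
}

Distinguishing an assignment through the pointer (*p = 7, which changes the pointee) from an assignment to the pointer itself (p = &y, which changes where it points) is exactly the kind of detail that students tend to mix up.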

2 Game Design

The game that we created is a side-scrolling platformer. Its storyline features rogue robots that have dismantled a critical government computer. An intrepid computer mouse is on a mission to rebuild it and save society. The game is composed of three levels, and each level has two stages: a learning stage and an assessment stage. In the learning stage, as the mouse jumps from platform to platform, it encounters a variety of scientists. Each scientist, when approached, shares a particular lesson in the use of pointers. At the end of the learning stage, a summary of the lessons learned is shown on the screen. The player then progresses to an assessment stage, during which s/he must correctly answer multiple-choice questions about the material just taught in order to defeat the enemy. After the player correctly answers all the questions, a missing part of the computer is displayed on the screen and the player progresses to the next level, which is more challenging in terms of both the gameplay and the material being taught. After the player completes all three levels, a picture of the fully assembled computer is shown, along with the message that the mouse has saved the world. Screenshots of our game are shown in Fig. 1.

Fig. 1. Screenshots of the game
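As a rough illustration of the structure just described, each level pairs a learning stage (a sequence of pointer lessons delivered by scientists) with an assessment stage (a set of multiple-choice questions). The C++ types below are a hypothetical sketch of that organization only; the game itself is implemented in Unity, and none of these names come from its code.

#include <string>
#include <vector>

// Hypothetical sketch of the data behind one level; illustrative only.
struct Lesson {
    std::string scientist;   // the character who delivers the lesson
    std::string text;        // the pointer lesson shown when approached
};

struct Question {
    std::string prompt;
    std::vector<std::string> choices;
    int correctChoice;       // index into choices
};

struct Level {
    std::vector<Lesson> lessons;    // learning stage
    std::vector<Question> quiz;     // assessment stage
    std::string reward;             // missing computer part shown on completion
};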

Research has shown that there are fundamental differences in the ways that male and female players play computer games (see, e.g., [20,21,22]). For example, males tend to be more enthusiastic players than females [23]; males tend to play games more frequently and for longer durations than their female peers [24]; and male players have stronger desires for competition and tend to be more motivated by a “need to win,” while female players prefer the within-game social dynamics between game characters [25]. Hence, effort must be invested to ensure that educational games are appropriate for both genders [26]. Our game was designed by a team of women developers, and we tried to make it appealing to female players through a storyline with a meaningful goal, facial expressions and human-like animations on our sprites, positive feedback, and rewards at the end of each level to motivate persistence. This is a “by women, for women” game that we hope will help all students, especially female ones, learn difficult Advanced Programming concepts.

Our game conforms to the four principles of gamified learning given by Stott and Neustaedter [27]:

  • freedom to fail: Players are allowed unlimited attempts to answer the multiple-choice questions; answering them correctly increases their score, but incorrect answers do not end the game.

  • rapid feedback: Immediately after answering a question, students are told if their answer is correct without needing to complete an entire set of questions. This is essential to good learning; “the more frequent and targeted the feedback, the more effective the learning” [28].

  • progression: Student learning and assessment progress from basic pointer concepts to more advanced and complex topics. This helps students build on what they already know and extend it.

  • storytelling: Research such as [13] has shown that players are more engaged and persist longer when they are motivated by meaningful goals. Our storyline attempts to draw players in by asking for their help to save the world.

Our game also incorporates educator Gee’s principles of good learning [29], such as “just in time” directions that instruct players exactly when the information becomes relevant, and “pleasantly frustrating” gameplay.

3 Experimental Design

We piloted our game on undergraduate students at Brooklyn College and the College of Staten Island (both senior colleges of the City University of New York) to learn about its effectiveness. Each participant was asked to take a pre-test that measured their level of knowledge about pointers, play the game, and then take a post-test that measured their knowledge of pointers again. The pre-test and post-test questions differed only trivially; questions were repeated with different variable names or values. In addition, the pre-test asked for basic demographic information (college attended, programming course currently being taken, and gender). The post-test included a questionnaire measuring levels of intrinsic motivation, shown in Fig. 2, based on [30], except that we used a 5-point Likert scale instead of a 7-point one. We also asked three additional questions about the user’s thoughts on the game (Fig. 3) using the same 5-point scale, along with a request for general feedback on the game design.

Fig. 2. Measuring intrinsic motivation

Fig. 3. Measuring engagement attitude towards game

Besides the information mentioned here, no other personal information was collected, except for the participants’ email addresses, which we used as their unique user IDs for the game. (We suggested that participants create anonymous email addresses if they did not want to be identified, but very few did so.) We used Google Forms to conduct the pre- and post-evaluations.

4 Results

Twenty-eight students played the game and completed both pre- and post-tests. (An additional six students completed a pre-test but either did not play the game or did not complete the post-test; they are excluded from our analysis.) Twelve of the participants are Brooklyn College students and sixteen are College of Staten Island students. Twelve are female and sixteen are male. Twenty-six are Computer Science or related majors (e.g., multimedia); two are non-CS majors who were interested in playing the game anyway.

Among the students who completed both tests, the average pre-test score was 49, the average post-test score was 60, and the average improvement was about 12%.

We calculated levels of intrinsic motivation using the standard approach, giving one point to “strongly disagree,” two points to “disagree,” and so on up to five points for “strongly agree.” We computed the average across all four intrinsic motivation questions to get an “intrinsic motivation score” for each participant. A score of three indicates average levels of motivation (the equivalent of “neutral” responses to all questions). The average score for our participants was 3.5, indicating above-average levels of motivation. In total, 60% of participants reported above-average levels of motivation; 67% of female participants and 57% of male participants indicated high motivation.
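A minimal sketch of this scoring, assuming responses are already coded as integers 1–5 (the function and variable names below are ours, for illustration only):

#include <iostream>
#include <numeric>
#include <vector>

// Average the four intrinsic-motivation items for one participant;
// 1 = "strongly disagree", ..., 5 = "strongly agree", so 3.0 means all-neutral.
double intrinsicMotivationScore(const std::vector<int>& responses) {
    double sum = std::accumulate(responses.begin(), responses.end(), 0.0);
    return sum / responses.size();
}

int main() {
    // Hypothetical participant who answered 4 ("agree"), 3, 4, and 5.
    std::vector<int> participant = {4, 3, 4, 5};
    std::cout << intrinsicMotivationScore(participant) << '\n';  // prints 4
}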

We used a similar process to evaluate the questions about the game’s engagement. The average score for these questions was 4, indicating that students found the game a fun and effective learning tool. (The average score for “I would play for fun” was 3.6, while the average scores for “I would play to learn about pointers” and “I would play to assess my knowledge of pointers” were 4.29 and 4.25, respectively, indicating that students would be more likely to play our game as a learning tool than for fun, as we would have intuitively expected.) In this area, 83% of female participants and 94% of male participants gave the game an above-average score.

When asked for feedback, participants pointed out a number of technical issues, which we plan to fix before our next release. They also gave a great deal of positive feedback about the UI, graphics, and music, with responses such as:

  • “The graphic design is excellent. Very pleasant art style to look at.”

  • “Graphics are very pleasing and well thought out.”

  • “[I liked] the mouse, and layout of the game (how each stage gave you a puzzle piece).”

  • “The visuals and music were great.”

  • “User friendly appearance, easy to use.”

Participants also praised the educational aspect of the game:

  • “The questions were a great way to test your knowledge or refresh yourself on pointers. Great way to have fun and learn at the same time.”

  • “The premise of the game is neat; Platforming while simultaneously learning about pointers while the music is playing offers a unique experience for those wanting to learn more or just wanting to play the game.”

  • “The questions were brief enough to not be annoying but also long enough to have useful content.”

  • “Testing your knowledge while fighting bosses is a great feature and really shows that the user understands what they learned throughout the level.”

  • “Its super fun, and it teaches you a lot about pointers.”

  • “It was very informative and certainly refreshed my memory on pointers and probably I would like to play this game before I walk into an interview. Also, I think the difficulty made me want to play much more so that was really good motivation to keep playing the game.”

5 Discussion

Despite our small sample, our pilot study of this game shows a number of encouraging results. We attracted a large number of female testers: although female students represent well under a third of the CIS majors at these institutions, they constituted 43% of the participants who played the game. Our game received positive feedback, and it had a moderate effect on tested knowledge of pointers. Levels of motivation and interest in the game were high among our participants, and many of them reported enjoying the combined playing and learning experience.

We were puzzled by five students whose post-test scores actually decreased compared to their pre-test scores; we conjecture that these students did not think carefully when responding and may have guessed or made random choices. For our next round of testing, we are looking for a better way to encourage careful thought in answering the questions, perhaps by offering an incentive to students who do well.

We plan to test the game on a bigger pool of students from both institutions by the end of the academic year.