
Bridging Two Worlds: Principled Game-Based Assessment in Industry for Playful Learning at Scale

  • V. Elizabeth Owen
  • Diana Hughes
Chapter
Part of the Advances in Game-Based Learning book series (AGBL)

Abstract

In recent years, a large body of research in game-based assessment (GBA) has been rooted in applying methodologies such as evidence-centered design (ECD) to digital game-based learning. This approach affords principled alignment among the target competency, evidence of learning, and task design, enabling assessment to be embedded in the game experience, supporting student-responsive scaffolding, and sustaining engagement through formative feedback. Applying these principles to large-scale learning game production can be vital to extending the benefits of GBA to playful, engaged learning at scale. Practices that support this application in the game industry are therefore important; they must address the challenges of implementing principled design on short production timelines, sustaining the production values that drive user adoption and financial sustainability, and integrating with “big data” industry culture to generate learning insights (where disciplines such as educational data mining can be particularly relevant). In addressing these challenges, this chapter offers an example of a working GBA practice in an industry context: ECD-based learning design, integrated with principles of educational data mining that inform the corresponding event-stream data design, applied to the production of data-driven educational games that support student learning at scale. These games leverage this data-driven approach to support learning in classroom settings, offering teacher-centered dashboard tools that visualize student progress and yielding significant learning gains in a recent classroom pilot study with the target learner age group.
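To make the ECD-aligned, event-stream data design described above concrete, the sketch below shows what a single gameplay telemetry record might look like when each event names the ECD task it came from, the competency it targets, and the scored observable it carries, so that educational-data-mining pipelines can aggregate evidence per competency. This is a minimal illustrative sketch only; the GameplayEvent class, the emit helper, and all field names are assumptions, not the schema used in the chapter.

```python
# Minimal sketch (assumed schema, not the authors'): one ECD-aligned
# event-stream record for a learning game's telemetry.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import Optional
import json


@dataclass
class GameplayEvent:
    user_id: str                  # anonymized player identifier
    session_id: str               # play session the event belongs to
    task_id: str                  # ECD task model: which activity produced the event
    competency_id: str            # ECD competency model: targeted skill
    event_type: str               # e.g., "attempt", "hint_used", "level_complete"
    success: Optional[bool] = None        # ECD evidence model: scored observable, if any
    detail: dict = field(default_factory=dict)  # raw context (item, response, latency)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def emit(event: GameplayEvent) -> str:
    """Serialize one event as a JSON line for the telemetry stream."""
    return json.dumps(asdict(event))


# Example: a correct counting attempt logged as evidence for a counting competency.
print(emit(GameplayEvent(
    user_id="u-123", session_id="s-456",
    task_id="bubble_count_level_2", competency_id="counting_1_10",
    event_type="attempt", success=True,
    detail={"target": 7, "response": 7, "latency_ms": 2300},
)))
```

Under this kind of design, dashboard visualizations and data-mining models read the same stream: the competency and evidence fields make per-skill progress aggregation straightforward, while the detail payload preserves raw context for exploratory analysis.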

Keywords

Educational data mining · Evidence-centered design · Embedded assessment · Game production · Learning game design · Game-based assessment

Acknowledgements

We wish to thank the entire mastery team at Age of Learning, especially Doug Dohring, Sunil Gunderia, Daniel Jacobs, K.P. Thai, and Vesper Burnett.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • V. Elizabeth Owen (1)
  • Diana Hughes (1)

  1. Age of Learning, Los Angeles, USA
