Using Institutional Data to Evaluate Game-Based Instructional Designs: Challenges and Recommendations

Chapter in *Assessment in Game-Based Learning*
Abstract

Over the last 5 years, we have made an iterative attempt to develop a series of alternate reality learning games to support undergraduate students in an introductory computer literacy course. During that time, three separate narratives and 18 different iterations of the course have evolved in response to quantitative assessment and course evaluation data, as well as qualitative data captured in student web log (blog) reflections and interviews with instructors and learners. One major challenge to assessing each iteration was the limited availability of institutional data such as demographics, including student year classification and student major (e.g., freshman, business major). Because such data are necessary for conducting factor analysis and other forms of statistical analysis, we developed approaches to capture the needed information ourselves, in order to more accurately assess and evaluate the effectiveness of the learning game components as well as learner satisfaction. This chapter discusses our challenges in using institutional data for research on learning games at the postsecondary level and suggests, from our experience, technological and methodological approaches to overcome them.



Acknowledgments

We would like to thank the University of North Texas for the Quality Enhancement Plan grant that funded the development of The Door version of this course. We would also like to thank Mary Jo Dondlinger, Julie McLeod, Tip Robertson, and Cliff Whitworth who helped with the development and initial research on that course iteration.

Author information

Correspondence to Scott J. Warren.


Copyright information

© 2012 Springer Science+Business Media New York

About this chapter

Cite this chapter

Warren, S.J., Bigenho, C. (2012). Using Institutional Data to Evaluate Game-Based Instructional Designs: Challenges and Recommendations. In: Ifenthaler, D., Eseryel, D., Ge, X. (eds) Assessment in Game-Based Learning. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-3546-4_16
