
Assessing and Evaluating Virtual World Effectiveness

Chapter in *V-Learning*

Abstract

Of all the chapters in this book, this chapter may be the most read. Designing and developing quality instruction is not an easy task, especially when adding new technologies, and assessing and evaluating online courses is arguably more daunting still. This chapter focuses on assessment and evaluation in the virtual worlds, games, and simulations we have been working with so far.



Author information

Correspondence to Leonard A. Annetta.


Copyright information

© 2010 Springer Science+Business Media B.V.

About this chapter

Cite this chapter

Annetta, L.A., Folta, E., Klesath, M. (2010). Assessing and Evaluating Virtual World Effectiveness. In: V-Learning. Springer, Dordrecht. https://doi.org/10.1007/978-90-481-3627-8_10
