Abstract
Of all the chapters in this book, this chapter may be the most read. Designing and developing quality instruction is not an easy task, especially when new technologies are added, and assessing and evaluating online courses is arguably more daunting still. This chapter focuses on assessment and evaluation in the virtual worlds, games, and simulations discussed in the preceding chapters.
Copyright information
© 2010 Springer Science+Business Media B.V.
About this chapter
Cite this chapter
Annetta, L.A., Folta, E., Klesath, M. (2010). Assessing and Evaluating Virtual World Effectiveness. In: V-Learning. Springer, Dordrecht. https://doi.org/10.1007/978-90-481-3627-8_10
DOI: https://doi.org/10.1007/978-90-481-3627-8_10
Publisher Name: Springer, Dordrecht
Print ISBN: 978-90-481-3620-9
Online ISBN: 978-90-481-3627-8
eBook Packages: Humanities, Social Sciences and Law; Education (R0)