A Futures Perspective on Information Technology and Assessment

  • Living reference work entry
Second Handbook of Information Technology in Primary and Secondary Education

Part of the book series: Springer International Handbooks of Education (SIHE)

Abstract

Assessment is perhaps the area where the utility of information technology in education is tested like no other. The possibilities for assessment using these technologies are expanding rapidly. In particular, new technologies afford possibilities for focusing assessment on learning as an ongoing developmental process rather than on performance. Building on notions of assessment grounded in measurement theory, data and analytics create prospects for assessing students continuously, and developmentally, while they learn. The resulting picture of student development will then allow for a more holistic and systemic approach to assessment in the years ahead. While it is often problematic to make predictions about the future, in this chapter I draw on current developments to suggest where the intersections of assessment and information technology are likely headed. That future is likely to entail more continuous, personalized forms of assessment that focus heavily on helping students make better judgments about their own learning and development.
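
The kind of continuous, data-driven assessment sketched in the abstract is often operationalised with simple learner models. The following is a minimal illustrative sketch, not material from the chapter, assuming a hypothetical single-skill Bayesian Knowledge Tracing model with made-up parameter values; it shows how an estimate of mastery can be updated after every observed response rather than at a single test event.

    # Illustrative only: a minimal Bayesian Knowledge Tracing (BKT)-style update.
    # All parameter values below are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class BKTParams:
        p_init: float = 0.2    # prior probability the skill is already mastered
        p_learn: float = 0.15  # probability of learning the skill after each attempt
        p_slip: float = 0.10   # probability of answering incorrectly despite mastery
        p_guess: float = 0.20  # probability of answering correctly without mastery

    def update_mastery(p_known: float, correct: bool, p: BKTParams) -> float:
        """Update the estimated probability of mastery after one observed response."""
        if correct:
            evidence_known = p_known * (1 - p.p_slip)
            evidence_unknown = (1 - p_known) * p.p_guess
        else:
            evidence_known = p_known * p.p_slip
            evidence_unknown = (1 - p_known) * (1 - p.p_guess)
        posterior = evidence_known / (evidence_known + evidence_unknown)
        # Allow for the chance the student learned the skill during this step.
        return posterior + (1 - posterior) * p.p_learn

    if __name__ == "__main__":
        params = BKTParams()
        mastery = params.p_init
        for response in [True, False, True, True]:  # a hypothetical response stream
            mastery = update_mastery(mastery, response, params)
            print(f"estimated mastery: {mastery:.3f}")

Richer approaches, such as item response theory models or stealth assessment embedded in games, follow the same underlying pattern of updating a developmental estimate of the learner from a continuous stream of evidence.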



Author information

Correspondence to Jason M. Lodge.

Copyright information

© 2018 Springer International Publishing AG

About this entry

Cite this entry

Lodge, J.M. (2018). A Futures Perspective on Information Technology and Assessment. In: Voogt, J., Knezek, G., Christensen, R., Lai, KW. (eds) Second Handbook of Information Technology in Primary and Secondary Education. Springer International Handbooks of Education. Springer, Cham. https://doi.org/10.1007/978-3-319-53803-7_43-1

  • DOI: https://doi.org/10.1007/978-3-319-53803-7_43-1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-53803-7

  • Online ISBN: 978-3-319-53803-7

  • eBook Packages: Springer Reference Education, Reference Module Humanities and Social Sciences, Reference Module Education
