International Large-Scale Computer-Based Studies on Information Technology Literacy in Education

  • Julian Fraillon
Reference work entry
Part of the Springer International Handbooks of Education book series (SIHE)

Abstract

The first international large-scale study of Information Technology (IT) literacy was conducted in 1987, and a broad range of studies assessing IT literacy and related areas of digital learning has followed. This chapter discusses recent developments in international large-scale studies of IT literacy-related achievement, focusing on the attributes these studies necessarily share and the challenges associated with operationalizing those attributes in test instruments. Two key attributes are addressed: (i) that the test contents reflect real-world use of Information and Communications Technology (ICT), and (ii) that the tests make use of the dynamic functionality and multimodal opportunities afforded by the computer-based environment. Challenges associated with these attributes include ensuring that the individual tasks within each assessment are independent of one another, maintaining a standardized test-taker experience, providing test-takers with plausible feedback from the computer-based environment, and maintaining construct validity. Examples illustrate how these common challenges are addressed in instrument design, and the chapter closes with a discussion of possible future directions for large-scale international studies related to IT literacy.
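
One of the challenges named above, ensuring that individual tasks are locally independent, is in practice often screened with residual-based statistics such as Yen's Q3, the correlation of model residuals for each pair of items. The sketch below is illustrative only and is not drawn from the chapter or from any of the studies it discusses: it assumes a Rasch model and NumPy, and the array names, the synthetic data, and the |Q3| > 0.2 screening cutoff are this sketch's own conventions.

```python
import numpy as np

def q3_matrix(responses, theta, b):
    """Yen's Q3 screen for local dependence between test tasks.

    responses : (n_persons, n_items) array of 0/1 scored responses
    theta     : (n_persons,) person ability estimates (logits)
    b         : (n_items,) item difficulty estimates (logits)
    """
    # Expected probability of a correct response under the Rasch model.
    expected = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    residuals = responses - expected
    # Q3 correlates the residuals of every item pair; under local
    # independence these correlations should sit close to zero.
    return np.corrcoef(residuals, rowvar=False)

# Hypothetical usage with synthetic Rasch data: flag item pairs whose
# residual correlation exceeds a common screening cutoff of 0.2.
rng = np.random.default_rng(0)
theta = rng.normal(size=500)
b = rng.normal(size=10)
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
responses = (rng.random((500, 10)) < p).astype(float)
q3 = q3_matrix(responses, theta, b)
flagged = [(i, j) for i in range(10) for j in range(i + 1, 10)
           if abs(q3[i, j]) > 0.2]
```

In an operational study, a flagged pair would typically prompt content review of the two tasks (for example, tasks sharing a stimulus within a common scenario) rather than automatic deletion, since some dependence can be a deliberate feature of scenario-based assessment design.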

Keywords

Information technology · Information technology literacy · Information and communication technology literacy · ICT literacy · Large-scale assessment · Computer and information literacy · Digital literacy

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Australian Council for Educational Research, Camberwell, Australia

Section editors and affiliations

  • Margaret Cox, King's College London, London, UK
  • Joke Voogt, Department of Child Development and Education, University of Amsterdam / Windesheim University of Applied Sciences, Amsterdam, The Netherlands
