Analysing the (Mis)Use and Consequences of International Large-Scale Assessments

  • Stefan Johansson
Chapter
Part of the Globalisation, Comparative Education and Policy Research book series (GCEP, volume 20)

Abstract

When insights are shared across borders, similarities in structures, policies, pedagogies and curricula can emerge. One such global force is international large-scale assessments (ILSAs), which have been criticized for spreading isomorphic ideologies. At the same time, ILSA data, which now include long-term trend databases from many school systems, may have the potential to legitimize informed decisions. Further, IEA encyclopedias, papers presented at IEA and PISA research conferences, and a growing volume of academic publications all point to numerous studies that draw on international assessment datasets to explore issues of pedagogy and classroom practice. Given the rigorous test administration of ILSAs, the data generated have the potential to provide nuanced snapshots of the characteristics of different school systems, provided, that is, that the data are used with caution. But are they? This chapter discusses the uses and possibilities of ILSA data and how ILSA results impact education and policy reforms worldwide.

Keywords

Assessment · Comparative education · Education policy · Globalization · International large-scale assessments · PISA · Consequential validity · Policy impact

Copyright information

© Springer Nature B.V. 2020

Authors and Affiliations

  1. Department of Education and Special Education, University of Gothenburg, Gothenburg, Sweden