Concerns with Using Test Results for Political and Pedagogical Purposes: A Danish Perspective

  • Jens Dolin
Part of The Enabling Power of Assessment book series (EPAS, volume 2)

Abstract

Testing – classroom-based as well as large-scale testing for comparative purposes – is becoming an increasingly important factor in educational policy in Denmark, as in the other Nordic countries. Test results attract headlines in the media, often because they disturb the national self-image of being among the best in the world, and improvement in these tests has been established as a political goal. High-stakes and large-scale tests in particular affect both national educational policy and teaching in the classroom – not necessarily directly, but increasingly indirectly, through the values and the discourse they impose on school and society. These effects are illustrated through the results of two large research projects in which the author participated. The first is a Danish Clearinghouse study of the pedagogical consequences of high-stakes tests, showing the negative influence of these tests on teaching and on student behaviour. The second is a research project validating the PISA test in a Danish context, showing how the PISA tests, as an example of large-scale comparative tests, have become a lever for dramatic changes in Danish educational policy without resting on a valid justification. On a general level, these examples are seen as confirming an overall shift driven by global test systems, from a ‘bildung/didaktik’ approach (traditional in the Nordic and Central European countries) towards a curriculum/policy-driven approach (the Anglo-American tradition) within education. Finally, the chapter draws on these demonstrated tendencies to present some leadership and policy implications.

Keywords

Standardised tests · PISA · Bildung · Denmark · Nordic leadership · Validity · Assessment · Assessment paradigm · Didaktik · Curriculum · School leadership · Educational policy

References

  1. Aarsland, L., Danielsen, A. B., & Winther, A. (2004, December 7). PISA-rapporten: Ny nedtur for folkeskolen [English: PISA report says: Folkeskolen fails, once again]. Politiken (Newspaper), p. 2.
  2. Abd-El-Khalick, F. (2012). Nature of science in science education: Toward a coherent framework for synergetic research and development. In B. J. Fraser et al. (Eds.), Second international handbook of science education. Dordrecht, The Netherlands: Springer.
  3. Andersen, A. M., Egelund, N., Jensen, T. P., Krone, M., Lindenskov, L., & Mejding, J. (2001). Forventninger og færdigheder – danske unge i en international sammenligning [English: Expectations and skills – Danish youth in an international comparison]. København, Denmark: AKF, DPU og SFI-Survey. (The Danish PISA 2000 report; in Danish)
  4. Au, W. (2007). High stakes testing and curricular control: A qualitative metasynthesis. Educational Researcher, 36(5), 258–267.
  5. Bennett, J., Lubben, F., Hogarth, S., & Campbell, B. (2005). Systematic reviews of research in science education: Rigour or rigidity? International Journal of Science Education, 27(4), 387–406.
  6. Bindslev, M. W. (2004, December 7). Danske elever er middelmådige [English: The mediocre performance of Danish students]. Kristeligt Dagblad (Newspaper), p. 1.
  7. Buhagiar, M. A. (2007). Classroom assessment within the alternative assessment paradigm: Revisiting the territory. The Curriculum Journal, 18, 39–56.
  8. Bybee, R. (1997). Towards an understanding of scientific literacy. In W. Graeber & C. Bolte (Eds.), Scientific literacy (pp. 37–68). Kiel, Germany: Independent Publishers Network.
  9. Chia, R. (1995). From modern to postmodern organizational analysis. Organizational Studies, 16(4), 579–604.
  10. Committee on a Conceptual Framework for New K-12 Science Education Standards. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
  11. Dolin, J. (2012). Using large scale test results for pedagogical purposes. In C. Bruguière, A. Tiberghien, & P. Clément (Eds.), E-book proceedings of the ESERA 2011 conference: Science learning and citizenship. Part 10 (pp. 16–22). Lyon, France: European Science Education Research Association.
  12. Dolin, J. (2013). Dannelse, kompetence og kernefaglighed [English: Bildung, competence, and core subject-specific knowledge]. In E. Damberg, J. Dolin, G. H. Ingerslev, & P. Kaspersen (Eds.), Gymnasiepædagogik [English: Pedagogy for upper secondary] (pp. 67–86). København, Denmark: Hans Reitzel. (In Danish)
  13. Dolin, J., & Krogh, L. B. (2008). Den naturfaglige evalueringskultur i folkeskolen. Anden delrapport fra VAP-projektet [English: The assessment culture in the science subjects in the compulsory school. Second report from the VAP project] (INDs skriftserie nr. 17). København, Denmark: Institut for Naturfagenes Didaktik, Københavns Universitet. (In Danish)
  14. Dolin, J., & Krogh, L. B. (2010). The relevance and consequences of PISA science in a Danish context. International Journal of Science and Mathematics Education, 8, 565–592.
  15. Eijck, M. V. (2012). Capturing the dynamics of science in science education. In B. J. Fraser et al. (Eds.), Second international handbook of science education. Dordrecht, The Netherlands: Springer.
  16. EPPI-Centre. (2002). EPPI-Reviewer, version 2.0 (Web edition) [EPPI-Centre software]. London, UK: Social Science Research Unit, Institute of Education. Retrieved from http://eppi.ioe.ac.uk/cms/
  17. European Commission. (2007). Science education now: A renewed pedagogy for the future of Europe. Brussels, Belgium: European Commission.
  18. Gardner, J. (Ed.). (2006). Assessment and learning. London, UK: SAGE Publications.
  19. Gipps, C. (1999). Socio-cultural aspects of assessment. Review of Research in Education, 24, 355–392.
  20. Greve, C. (2002). New public management. København, Denmark: Nordisk Kultur Institut. (In Danish)
  21. Harlen, W. (2013). Assessment & inquiry-based science education: Issues in policy and practice. Global Network of Science Academies (IAP) Science Education Programme (SEP). Retrieved from http://www.interacademies.net/File.aspx?id=21245
  22. Hartig, J., Klieme, E., & Leutner, D. (2008). Assessment of competences in educational contexts. Göttingen, Germany: Hogrefe.
  23. Helliwell, J. F., Layard, R., & Sachs, J. (Eds.). (2013). World happiness report 2013. New York: UN Sustainable Development Solutions Network.
  24. Krogh, L. B., & Dolin, J. (2011). PISA 2006 Science testen og danske elevers naturfaglige formåen [English: The PISA 2006 Science test and Danish students’ scientific literacy]. Periodicals from Department of Science Education, 25, University of Copenhagen.
  25. Latour, B. (1987). Science in action. Cambridge, MA: Harvard University Press.
  26. Moos, L. (Ed.). (2013). Transnational influences on values and practices in Nordic educational leadership: Is there a Nordic model? Dordrecht, The Netherlands: Springer.
  27. Nordenbo, S. E., Allerup, P., Andersen, H., Dolin, J., Korp, H., Larsen, M. S., et al. (2009). Pædagogisk brug af test – Et systematisk review [English: Pedagogic use of tests – A systematic review]. København, Denmark: Danmarks Pædagogiske Universitets Forlag.
  28. OECD. (1999). Measuring student knowledge and skills: A new framework for assessment. Paris, France: OECD Publications.
  29. OECD. (2004). Learning for tomorrow’s world: First results from PISA 2003. Paris, France: OECD Publications.
  30. Raae, P. H. (2013). Skoleudvikling [English: School development]. In E. Damberg, J. Dolin, G. H. Ingerslev, & P. Kaspersen (Eds.), Gymnasiepædagogik [English: Pedagogy for upper secondary] (pp. 714–735). København, Denmark: Hans Reitzel. (In Danish)
  31. Schou, L. R. (2010). Test og evaluering: Løsningen eller problemet? [English: Tests and evaluation: The solution or the problem?]. Dansk Pædagogisk Tidsskrift, 1(10), 74–81. (In Danish)
  32. Schoultz, J., Saljo, R., & Wyndhamn, J. (2001a). Conceptual knowledge in talk and text: What does it take to understand a science question? Instructional Science, 29, 213–236.
  33. Schoultz, J., Saljo, R., & Wyndhamn, J. (2001b). Heavenly talk: Discourse, artifacts, and children's understanding of elementary astronomy. Human Development, 44, 103–118.
  34. Shavelson, R. (2011, February). An approach to testing and modeling competence. Presentation at the conference on Modeling and Measurement of Competencies in Higher Education, Berlin, Germany.
  35. Smith, P. S., Hounshell, P. B., Copolo, C., & Wilkerson, S. (1992). The impact of end-of-course testing on curriculum and instruction. Science Education, 76(5), 523–530.
  36. Sturman, L. (2003). Teaching to the test: Science or intuition? Educational Research, 45(3), 261–273.
  37. UNDP. (2013). Human development report 2013. New York: United Nations Development Programme (UNDP).
  38. Weinert, F. E. (2001). Concept of competence: A conceptual clarification. In D. S. Rychen & L. H. Salganik (Eds.), Defining and selecting key competencies. Göttingen, Germany: Hogrefe & Huber Publishers.
  39. Westbury, I. (2000). Teaching as a reflective practice: What might Didaktik teach curriculum? In I. Westbury, S. Hopmann, & K. Riquarts (Eds.), Teaching as a reflective practice: The German Didaktik tradition (pp. 15–39). Mahwah, NJ: Lawrence Erlbaum.
  40. World Economic Forum. (2008). The global competitiveness report 2008–2009. Geneva, Switzerland: World Economic Forum.
  41. Young, M. (2010). Alternative educational futures for a knowledge society. The European Educational Research Journal, 4(1), 1–12.
  42. Zhao, Y., & Meyer, H. D. (2013). High on PISA, low on entrepreneurship? What PISA does not measure. In H. D. Meyer & A. Benavot (Eds.), PISA, power, and policy: The emergence of global educational governance. Oxford, UK: Symposium Books.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Department of Science Education, University of Copenhagen, Copenhagen, Denmark
