
Mathematical Competency Demands of Assessment Items: a Search for Empirical Evidence

  • Andreas Pettersen
  • Johan Braeken

Abstract

The implementation of mathematical competencies in school curricula requires assessment instruments that are aligned with this view of mathematical mastery. However, there are concerns over whether existing assessments capture the wide variety of cognitive skills and abilities that constitute mathematical competence. The current study applied an explanatory item response modelling approach to investigate how teacher-rated mathematical competency demands account for the variation in item difficulty across mathematics items from the Programme for International Student Assessment (PISA) 2012 survey and a Norwegian national grade 10 exam. The results show that the rated competency demands explain slightly more than half of the variance in item difficulty for the PISA items and slightly less than half for the exam items, providing some empirical evidence for the relevance of the mathematical competencies to solving the assessment items. The results also show that for the Norwegian exam, only two of the competencies, Reasoning and argument and Symbols and formalism, appear to influence item difficulty, which calls into question the extent to which the exam items capture the variety of cognitive skills and abilities that constitute mathematical competence. We argue that this type of empirical evidence from psychometric modelling should be used to improve assessments and assessment items, as well as to inform and possibly further develop theoretical concepts of mathematical competence.
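The approach described in the abstract is an explanatory item response model in the tradition of the linear logistic test model (LLTM): the log-odds of person p answering item i correctly is modelled as logit P(X_pi = 1) = θ_p − β_i, and the item difficulty β_i is in turn regressed on rated item properties, β_i = Σ_k γ_k q_ik + ε_i, so that the share of difficulty variance accounted for by the properties can be estimated. As a minimal, purely illustrative sketch, the R code below shows how such a model is commonly fitted with the lme4 package; the data frame and all variable names (d, resp, person, item, comp_*) are hypothetical, and the six competency predictors merely stand in for the rated demand variables.

```r
# Illustrative LLTM-type explanatory item response model in R with lme4.
# Assumed data layout: 'd' is in long format, one row per person-item
# response, with 0/1 score 'resp', identifiers 'person' and 'item', and
# teacher-rated competency demand levels comp_comm, ..., comp_reas
# (all names hypothetical).
library(lme4)

# Baseline descriptive model: random person abilities and random item
# difficulties, with no item covariates
m_base <- glmer(resp ~ 1 + (1 | person) + (1 | item),
                data = d, family = binomial("logit"))

# Explanatory model: item difficulty decomposed into the rated competency
# demands, plus a residual random item term for unexplained difficulty
m_expl <- glmer(resp ~ 1 + comp_comm + comp_strat + comp_math + comp_repr +
                  comp_symb + comp_reas + (1 | person) + (1 | item),
                data = d, family = binomial("logit"))

# Model comparison, and the share of item-difficulty variance explained by
# the demands, analogous to the roughly-half figures reported in the abstract
AIC(m_base, m_expl)
1 - VarCorr(m_expl)$item[1] / VarCorr(m_base)$item[1]
```

Keeping the residual random item term makes this an error-in-LLTM variant rather than a strict LLTM, which is the usual choice when the item properties are not expected to explain difficulty perfectly.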

Keywords

Assessment items · Explanatory item response modelling · Mathematical competency demand · Sources of item difficulty


Acknowledgements

The authors would like to thank Ross Turner for his support, feedback and contribution of material during this study. We would also like to thank the teachers and prospective teachers for their contribution, the Norwegian PISA Group for allowing access to the PISA material, and the Norwegian Directorate for Education and Training for access to the national exam material.

Copyright information

© Ministry of Science and Technology, Taiwan 2017

Authors and Affiliations

  1. Department of Teacher Education and School Research, University of Oslo, Oslo, Norway
  2. Centre for Educational Measurement, Faculty of Educational Sciences, University of Oslo, Oslo, Norway
