
Item Response Models in Computerized Adaptive Testing: A Simulation Study

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8581)

Abstract

In the digital world, any conceptual assessment framework faces two main challenges: (a) the complexity of the knowledge, capacities and skills to be assessed; and (b) the increasing use of web-based assessments, which requires innovative approaches to the development, delivery and scoring of tests. Statistical methods play a central role in such a framework. Item response models have been the most common statistical methods used to address these measurement challenges, and they underpin computer-based adaptive tests, which select items adaptively from an item pool according to the person's ability during test administration, so that the test is tailored to each student. In this paper we conduct a simulation study based on the minimum error-variance criterion, varying the item exposure rate (0.1, 0.3, 0.5) and the maximum test length (18, 27, 36). The comparison is made by examining the absolute bias, the root mean square error, and the correlation. Hypothesis tests are applied to compare the true and estimated distributions. The results suggest a considerable reduction in bias as the number of items administered increases, the occurrence of a ceiling effect in very short tests, and full agreement between the true and empirical distributions for computerized tests shorter than the paper-and-pencil tests.
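For readers who want to experiment with the kind of simulation described above, the sketch below is a minimal, hypothetical illustration rather than the authors' implementation. It assumes a unidimensional 2PL item response model, maximum-information item selection (which, for a single latent trait, amounts to minimizing the error variance of the ability estimate), an EAP ability estimator, a simple exposure cap, and a fixed maximum test length; the pool size, number of simulees, and parameter distributions are invented for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical settings (not taken from the paper): 2PL pool of 300 items,
    # 500 simulees, maximum test length 27, exposure cap 0.3.
    N_ITEMS, N_PERSONS, MAX_LEN, MAX_EXPOSURE = 300, 500, 27, 0.3
    a = rng.lognormal(mean=0.0, sigma=0.3, size=N_ITEMS)   # discriminations
    b = rng.normal(0.0, 1.0, size=N_ITEMS)                 # difficulties
    theta_true = rng.normal(0.0, 1.0, size=N_PERSONS)      # true abilities
    exposure = np.zeros(N_ITEMS)                           # administration counts

    def p_correct(theta, i):
        # 2PL probability of a correct response to item i at ability theta.
        return 1.0 / (1.0 + np.exp(-a[i] * (theta - b[i])))

    def item_info(theta, i):
        # Fisher information of a 2PL item at ability theta.
        p = p_correct(theta, i)
        return a[i] ** 2 * p * (1.0 - p)

    def eap(items, responses, grid=np.linspace(-4.0, 4.0, 81)):
        # Expected a posteriori ability estimate under a N(0, 1) prior.
        post = np.exp(-0.5 * grid ** 2)
        for i, u in zip(items, responses):
            p = p_correct(grid, i)
            post *= p ** u * (1.0 - p) ** (1 - u)
        return float(np.sum(grid * post) / np.sum(post))

    theta_hat = np.zeros(N_PERSONS)
    for person in range(N_PERSONS):
        items, responses, est = [], [], 0.0
        for _ in range(MAX_LEN):
            # Eligible items: not yet administered and under the exposure cap.
            eligible = [i for i in range(N_ITEMS)
                        if i not in items
                        and exposure[i] / N_PERSONS < MAX_EXPOSURE]
            nxt = max(eligible, key=lambda i: item_info(est, i))
            u = int(rng.random() < p_correct(theta_true[person], nxt))
            items.append(nxt)
            responses.append(u)
            exposure[nxt] += 1
            est = eap(items, responses)
        theta_hat[person] = est

    # Recovery criteria mentioned in the abstract: absolute bias, RMSE, correlation.
    abs_bias = np.mean(np.abs(theta_hat - theta_true))
    rmse = np.sqrt(np.mean((theta_hat - theta_true) ** 2))
    corr = np.corrcoef(theta_hat, theta_true)[0, 1]
    print(f"absolute bias = {abs_bias:.3f}, RMSE = {rmse:.3f}, r = {corr:.3f}")

In the multidimensional setting studied in the paper, Van der Linden's minimum error-variance criterion selects the item that most reduces the error variance of a composite ability estimate; the unidimensional maximum-information rule above is only a stand-in for that idea.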





Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Ferrão, M.E., Prata, P. (2014). Item Response Models in Computerized Adaptive Testing: A Simulation Study. In: Murgante, B., et al. Computational Science and Its Applications – ICCSA 2014. ICCSA 2014. Lecture Notes in Computer Science, vol 8581. Springer, Cham. https://doi.org/10.1007/978-3-319-09150-1_40


  • DOI: https://doi.org/10.1007/978-3-319-09150-1_40

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-09149-5

  • Online ISBN: 978-3-319-09150-1

  • eBook Packages: Computer Science, Computer Science (R0)
