Validation of the Pre-licensure Examination for Pre-service Teachers in Professional Education Using Rasch Analysis

  • Conference paper
  • In: Pacific Rim Objective Measurement Symposium (PROMS) 2015 Conference Proceedings

Abstract

Teacher education plays a crucial role in preparing future teachers and in shaping quality education. One of the essential tasks of a teacher education institution is to prepare its students for the national licensure examination and to meet national standards. The primary purpose of this study was to provide preliminary validity evidence for a 200-item pre-licensure examination designed to measure the knowledge and skills of 152 pre-service teachers in professional education, in preparation for the Licensure Examination for Teachers. This paper is divided into two major sections: a literature review on the role of assessment in student learning, and a report of the findings from validating the test instrument. The first section discusses what the literature says about the importance of assessment in learning and the use of tests in the classroom, enumerates the psychometric qualities of a good test, and contrasts classical test theory with item response theory, specifically Rasch analysis. The second section presents the results of the validation of the mock test given to pre-service teachers before they take the Licensure Examination for Teachers. The analysis shows that each of the seven constructs measures what it is intended to measure and that each has good psychometric qualities. Three of the constructs (curriculum, educational technology, and the teaching profession) have all items within the required fit range; social dimensions has one underfitting item, assessment has two misfitting items, principles of teaching has one underfitting item, and theories of education has one overfitting item. The items of each dimension possess good discrimination power, separating high-ability students from less able ones. However, combining all the items to measure the general construct of understanding and applying professional education proved problematic; the competencies in each construct therefore need to be reviewed vis-à-vis the general goal of professional education as a whole.
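The sketch below illustrates the kind of item-fit and separation checks the abstract refers to. It is not the analysis reported in the paper (which was carried out with dedicated Rasch software); the simulated data, the crude joint maximum likelihood routine, and the 0.7–1.3 mean-square window are illustrative assumptions, not values taken from the study.

```python
# Illustrative sketch only: simulated data and a crude joint maximum
# likelihood (JML) fit, not the paper's analysis or software.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in dichotomous responses: 152 examinees x 20 items.
n_persons, n_items = 152, 20
true_theta = rng.normal(0, 1, n_persons)            # person abilities
true_beta = rng.normal(0, 1, n_items)               # item difficulties
p_true = 1 / (1 + np.exp(-(true_theta[:, None] - true_beta[None, :])))
X = (rng.random((n_persons, n_items)) < p_true).astype(float)

# Drop zero and perfect scorers: their JML ability estimates are unbounded.
keep = (X.sum(axis=1) > 0) & (X.sum(axis=1) < n_items)
X = X[keep]

def rasch_jml(X, n_iter=100, lr=0.5):
    """Damped Newton updates for the dichotomous Rasch model."""
    theta = np.zeros(X.shape[0])
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        P = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
        theta += lr * (X - P).sum(axis=1) / (P * (1 - P)).sum(axis=1)
        P = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
        beta -= lr * (X - P).sum(axis=0) / (P * (1 - P)).sum(axis=0)
        beta -= beta.mean()        # identification: mean item difficulty = 0
    return theta, beta

theta, beta = rasch_jml(X)
P = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
W = P * (1 - P)                    # model variance of each response
Z2 = (X - P) ** 2 / W              # squared standardized residuals

outfit = Z2.mean(axis=0)                          # unweighted mean square
infit = (Z2 * W).sum(axis=0) / W.sum(axis=0)      # information-weighted

# Flag items outside a commonly cited mean-square window (0.7-1.3 here;
# the acceptable range is a judgment call, not a value from the paper).
for j in range(n_items):
    ok = 0.7 <= infit[j] <= 1.3 and 0.7 <= outfit[j] <= 1.3
    print(f"item {j + 1:2d}: infit={infit[j]:.2f} "
          f"outfit={outfit[j]:.2f} {'fit' if ok else 'misfit'}")

# Person separation reliability: share of person variance beyond error,
# analogous to the discrimination/separation the abstract mentions.
err_var = (1 / W.sum(axis=1)).mean()
print(f"person separation reliability ~ {1 - err_var / theta.var():.2f}")
```

In the paper's setting, checks of this kind would be run per construct and then on the pooled 200 items; a marked drop in fit or reliability when pooling is one symptom of the multidimensionality problem the abstract reports.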


Author information

Corresponding author: Jovelyn Delosa.

Copyright information

© 2016 Springer Science+Business Media Singapore

Cite this paper

Delosa, J. (2016). Validation of the Pre-licensure Examination for Pre-service Teachers in Professional Education Using Rasch Analysis. In: Zhang, Q. (Ed.), Pacific Rim Objective Measurement Symposium (PROMS) 2015 Conference Proceedings. Springer, Singapore. https://doi.org/10.1007/978-981-10-1687-5_8
