Abstract
Teacher education plays a crucial role in preparing future teachers and in shaping quality education. One of the crucial tasks of a teacher education institution is preparing its pre-service teachers for the national licensure examination and for meeting national standards. The primary purpose of this study was to provide preliminary validity evidence for a 200-item Pre-Licensure Examination Test for Teachers, designed to measure the knowledge and skills of 152 pre-service teachers in professional education as preparation for taking the actual Licensure Examination for Teachers. This paper is divided into two major sections: a literature review on the role of assessment in student learning, and a report of the findings from validating a test instrument for pre-service teachers. The first section discusses what the literature says about the importance of assessment in learning and the use of tests in the classroom, enumerates the psychometric qualities of a good test, and highlights classical test theory and item response theory, specifically Rasch analysis. The second section presents the results of validating the mock test given to pre-service teachers before they take the Licensure Examination for Teachers. The analysis shows that each of the seven constructs measures what it is intended to measure and has good psychometric qualities. Three of the constructs (curriculum, educational technology, and teaching profession) have all items within the required fit; social dimensions has one underfitting item; assessment has two misfitting items; principles of teaching has one underfitting item; and theories of education has one overfitting item. The items of each dimension possess good discrimination power, separating high-ability students from less able students.
However, pooling all the items to measure the general construct of understanding and applying professional education created a problem; there is therefore a need to review the competencies in each construct vis-à-vis the general goal of professional education as a whole.
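The fit and discrimination findings above rest on standard dichotomous Rasch machinery. As a minimal sketch (not the authors' actual analysis, which used ConQuest), the snippet below shows how the Rasch success probability and the infit/outfit mean-square statistics behind "underfitting" and "overfitting" item labels are conventionally computed; the example abilities and responses are hypothetical.

```python
import math

def rasch_prob(theta, b):
    """Probability that a person of ability theta answers an item
    of difficulty b correctly under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def fit_mean_squares(responses, thetas, b):
    """Outfit (unweighted) and infit (information-weighted) mean-square
    statistics for one item, given 0/1 responses and person abilities.
    Values near 1 indicate good fit; well above 1, underfit (noise);
    well below 1, overfit (redundancy)."""
    z_squares, infos, weighted = [], [], []
    for x, theta in zip(responses, thetas):
        p = rasch_prob(theta, b)
        var = p * (1 - p)            # binomial variance = information
        z_sq = (x - p) ** 2 / var    # squared standardized residual
        z_squares.append(z_sq)
        infos.append(var)
        weighted.append(z_sq * var)  # = raw squared residual
    outfit = sum(z_squares) / len(z_squares)
    infit = sum(weighted) / sum(infos)
    return infit, outfit

# Hypothetical item of difficulty 0.0 answered by five persons
abilities = [-1.0, -0.5, 0.0, 0.5, 1.0]
answers = [0, 0, 1, 1, 1]
infit, outfit = fit_mean_squares(answers, abilities, 0.0)
```

In practice these statistics are reported per item by Rasch software such as ConQuest or Winsteps; the common rule of thumb flags items with mean squares outside roughly 0.7 to 1.3 as misfitting.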
© 2016 Springer Science+Business Media Singapore
Cite this paper
Delosa, J. (2016). Validation of the Pre-licensure Examination for Pre-service Teachers in Professional Education Using Rasch Analysis. In: Zhang, Q. (eds) Pacific Rim Objective Measurement Symposium (PROMS) 2015 Conference Proceedings. Springer, Singapore. https://doi.org/10.1007/978-981-10-1687-5_8
Print ISBN: 978-981-10-1686-8
Online ISBN: 978-981-10-1687-5
eBook Packages: Behavioral Science and Psychology (R0)