Abstract
Educators and policy makers now recognize the importance of assessing student learning outcomes (SLOs) in higher education. The question has shifted from whether such outcomes should be measured to how they should be measured. Today SLOs are typically assessed by student self-reports of learning or with multiple-choice and short-answer tests. Each of these methods has its strengths and limitations; each provides insights into the nature of teaching and learning. An alternative approach is the assessment of performance using “criterion” tasks drawn from the real-world situations for which students are being educated, both within and across academic or professional domains. The international Performance Assessment of Learning (iPAL) project, described herein, consolidates previous research and moves to the next generation of performance assessments for local, national, and international use. iPAL, a voluntary collaborative of scholars and practitioners, seeks to develop, research, and use performance assessments of college students’ twenty-first-century skills (e.g., critical thinking, written communication, quantitative literacy, civic competency and engagement, intercultural perspective-taking) for both instructional improvement and accountability purposes.
Notes
- 1.
Colombia, Egypt, Finland, Korea, Kuwait, Mexico, Norway, the Slovak Republic, and the USA (Connecticut, Missouri, Pennsylvania).
- 2.
For the assessment of discipline-specific skills, many different tests and assessments exist in various countries, for example, ETS’s Major Field Tests (MFTs) in the USA, the Exámenes Generales para el Egreso de Licenciatura (EGEL) by Ceneval in Mexico, and KoKoHs in Germany and Austria (see an overview in Zlatkin-Troitschanskaia et al. 2016).
- 3.
Note that government reports in the USA, such as the Federal Aviation Administration reports on aircraft accidents, are considered highly reliable. In other countries, however, government reports are treated with great suspicion and not considered reliable. Hence the challenge in developing tasks that cross boundaries.
- 4.
For more details of challenges of international assessment, see also Zlatkin-Troitschanskaia et al. (2015, 2017).
References
Achtenhagen, F., & Winther, E. (2014). Workplace-based competence measurement: Developing innovative assessment systems for tomorrow’s VET programmes. Journal of Vocational Education & Training, 66, 281–295. https://doi.org/10.1080/13636820.2014.916740.
American Educational Research Association, American Psychological Association, & National Council on Measurement in Education (AERA, APA, & NCME). (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
Corno, L., Cronbach, L. J., Kupermintz, H., Lohman, D. F., Mandinach, E. B., Porteus, A. W., & Talbert, J. E. (2002). Remaking the concept of aptitude: Extending the legacy of Richard E. Snow. New York: Routledge.
Council for Aid to Education. (2013). Introducing CLA+. Fostering great critical thinkers. New York: CAE. http://cae.org/images/uploads/pdf/Introduction_to_CLA_Plus.pdf
Educational Testing Service (ETS). (2017). Introducing the HEIghten: Outcomes assessment suite. https://www.ets.org/heighten
Fu, A. C., Kannan, A., Shavelson, R. J., Peterson, L., & Kurpius, A. (2016). Room for rigor: Designs and methods in informal science education evaluation. Visitor Studies, 19(1), 12–38. https://doi.org/10.1080/10645578.2016.1144025.
Hambleton, R. K., & Zenisky, L. (2010). Translating and adapting tests for cross-cultural assessments. https://doi.org/10.1017/CBO9780511779381.004
Holtsch, D., Rohr-Mentele, S., Wenger, E., Eberle, F., & Shavelson, R. J. (2016). Challenges of a cross-national computer-based test adaptation. Empirical Research in Vocational Education and Training, 8(18), 1–32.
International Test Commission. (2005). International Test Commission guidelines for translating and adapting tests. Retrieved from http://www.intestcom.org/files/guideline_test_adaptation.pdf
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Koretz, D. (2016, April 4). Measuring postsecondary competencies: Lessons from large-scale K-12 assessments. Presentation at the KoKoHs conference, Berlin.
Lai, E. R., & Viering, M. (2012). Assessing 21st century skills: Integrating research findings. Paper presented at the annual meeting of the National Council on Measurement in Education, Vancouver, B.C., Canada.
Leighton, J. P. (2017). Using think-aloud interviews and cognitive labs in educational research. Oxford: Oxford University Press.
Liu, O. L., Mao, L., Frankel, L., & Xu, J. (2016). Assessing critical thinking in higher education: The HEIghten™ approach and preliminary validity evidence. Assessment & Evaluation in Higher Education, 41(5), 677–694. https://doi.org/10.1080/02602938.2016.1168358.
Marion, S. F., & Pellegrino, J. (2007). A validity framework for evaluating the technical quality of alternate assessments. Educational Measurement: Issues and Practice, 25, 47–57.
McClelland, D. C. (1973). Testing for competence rather than intelligence. American Psychologist, 28, 1–14.
OECD. (2012). Assessment of higher education learning outcomes. Feasibility study report: Volume 1 – Design and implementation. Retrieved from http://www.oecd.org/edu/skills-beyond-school/AHELOFSReportVolume1.pdf
OECD. (2013a). Assessment of higher education learning outcomes. AHELO feasibility study report – Volume 2. Data analysis and national experiences. Paris: OECD.
OECD. (2013b). The survey of adult skills: Reader’s companion. OECD Publishing. https://doi.org/10.1787/9789264204027-en.
Pellegrino, J. W., & Hilton, M. L. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: National Academies Press.
Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: The National Academies Press.
Shavelson, R. J. (2008). Reflections on quantitative reasoning: An assessment perspective. In B. L. Madison & L. A. Steen (Eds.), Calculation vs. context: Quantitative literacy and its implications for teacher education. Washington, DC: Mathematical Association of America.
Shavelson, R. J. (2010). Measuring college learning responsibly: Accountability in a new era. Stanford: Stanford University Press.
Shavelson, R. J. (2012). Assessing business-planning competence using the collegiate learning assessment as a prototype. Empirical Research in Vocational Education and Training, 4, 77–90.
Shavelson, R. J. (2013a). On an approach to testing and modeling competence. Educational Psychologist, 48(2), 73–86.
Shavelson, R. J. (2013b). An approach to testing and modeling competencies. In S. Blömeke, O. Zlatkin-Troitschanskaia, C. Kuhn, & J. Fege (Eds.), Modeling and measuring competencies in higher education: Tasks and challenges. Boston: Sense.
Shavelson, R. J. (2017). Statistical significance and program effect: Rejoinder to “why assessment will never work in many business schools: A call for better utilization of pedagogical research”. Journal of Management Education, 41, 1–5.
Shavelson, R. J., Roeser, R. W., Kupermintz, H., Lau, S., Ayala, C., Haydel, A., Schultz, S., Quihuis, G., & Gallagher, L. (2002). Richard E. Snow’s remaking of the concept of aptitude and multidimensional test validity: Introduction to the special issue. Educational Assessment, 8(2), 77–100.
Shavelson, R. J., Davey, T., Ferrara, S., Holland, P., Webb, N., & Wise, L. (2015). Psychometric considerations for the next generation of performance assessment. Princeton: Educational Testing Service.
Shavelson, R. J., Domingue, B. W., Mariño, J. P., Molina-Mantilla, A., Morales, J. A., & Wiley, E. E. (2016). On the practices and challenges of measuring higher education value added: The case of Colombia. Assessment and Evaluation in Higher Education, 41(5), 695–720.
Shavelson, R. J., Marino, J., Zlatkin-Troitschanskaia, O., & Schmidt, S. (2017a). Reflections on the assessment of quantitative reasoning. In B. L. Madison & L. A. Steen (Eds.), Calculation vs. context: Quantitative literacy and its implications for teacher education. Washington, DC: Mathematical Association of America. (in press).
Shavelson, R. J., Zlatkin-Troitschanskaia, O., & Marino, J. (2017b). Performance indicators of learning in higher education institutions: Overview of the field. In E. Hazerkorn, H. Coates, & A. Cormick (Eds.), Research handbook on quality, performance and accountability in higher education. Edward Elgar. (in press).
Snow, R. E. (1996). Aptitude development and education. Psychology, Public Policy, and Law, 2(3/4), 536–560.
Solano-Flores, G., Shavelson, R. J., & Schneider, S. A. (2001). Expanding the notion of assessment shell: From task development tool to instrument for guiding the process of science assessment development. Revista Electrónica de Investigación Educativa, 3(1), 33–53.
Stanovich, K. E. (2009). What intelligence tests miss: The psychology of rational thought. New Haven: Yale University Press.
Stanovich, K. E. (2016). The comprehensive assessment of rational thinking. Educational Psychologist, 51, 1–12. https://doi.org/10.1080/00461520.2015.1125787.
Strijbos, J., Engels, N., & Struyven, K. (2015). Criteria and standards of generic competences at bachelor degree level: A review study. Educational Research Review, 14, 18–32. https://doi.org/10.1016/j.edurev.2015.01.001.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.
Wolf, R., Zahner, D., Kostoris, F., & Benjamin, R. (2014). A case study of an international performance-based assessment of critical thinking skills. New York: Council for Aid to Education.
Zahner, D. (2013). Reliability and validity of CLA+. http://cae.org/images/uploads/pdf/Reliability_and_Validity_of_CLA_Plus.pdf
Zlatkin-Troitschanskaia, O., Shavelson, R. J., & Kuhn, C. (2015). The international state of research on measurement of competency in higher education. Studies in Higher Education, 40(3), 393–411.
Zlatkin-Troitschanskaia, O., Pant, H. A., Kuhn, C., Lautenbach, C., & Toepper, M. (2016). Assessment practices in higher education and results of the German research program modeling and measuring competencies in higher education (KoKoHs). Research & Practice in Assessment, 11, 46–54.
Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., & Brückner, S. (2017a). Modeling and measuring competencies in higher education. Approaches to challenges in higher education policy and practice. Wiesbaden: Springer.
Zlatkin-Troitschanskaia, O., Shavelson, R. J., & Pant, H. A. (2017b). Assessment of learning outcomes in higher education – International comparisons and perspectives. In C. Secolsky & B. Denison (Eds.), Handbook on measurement, assessment and evaluation in higher education (2nd ed.). New York: Routledge.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this chapter
Cite this chapter
Shavelson, R.J., Zlatkin-Troitschanskaia, O., Mariño, J.P. (2018). International Performance Assessment of Learning in Higher Education (iPAL): Research and Development. In: Zlatkin-Troitschanskaia, O., Toepper, M., Pant, H., Lautenbach, C., Kuhn, C. (eds) Assessment of Learning Outcomes in Higher Education. Methodology of Educational Measurement and Assessment. Springer, Cham. https://doi.org/10.1007/978-3-319-74338-7_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-74337-0
Online ISBN: 978-3-319-74338-7
eBook Packages: Education (R0)