Abstract
Higher education domains such as Business and Economics (B&E) generally lack evidence that standardized tests of competency actually assess those aspects of knowledge and reasoning that constitute important targets for learning and instruction. Constructing such a validity argument benefits from a framework that guides the collection of relevant evidence regarding the interpretive meaning of test results, especially with respect to instruction and learning in B&E. Response processes analysis provides much-needed evidence within such a validity argument. The present paper illustrates the application of such a framework and the use of response processes analysis for assessment in the B&E higher education domain. It introduces three mental operations with regard to their construct relevance (elaboration of economic concepts, deductive inferences, and economic heuristics) and examines the outcomes in relation to claims about the cognitive, instructional, and inferential aspects of validity for the assessment tasks.
Notes
- 1.
Matching items require the respondent to pair economic facts, statements, and concepts with more or less accurate explanations. Sequencing items require the respondent to arrange economic statements, facts, or principles in a causal or chronological order, for example, the steps to optimize a company's production sequence as part of an economic analysis or optimization (for further information on these formats, see Parkes and Zimmaro 2016).
- 2.
According to Cohen (1988), ω = 0.1 corresponds to a small effect, ω = 0.3 to a medium effect, and ω = 0.5 to a large effect.
- 3.
Three of the items used were not included in the online rating, as they were newly developed in cooperation with experts so as to ensure that they were representative of the curriculum.
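The effect-size benchmarks in note 2 can be made concrete with a short computation. The following Python sketch computes Cohen's ω = √(χ²/N) for an r × c contingency table; the table values here are purely hypothetical and serve only to illustrate the formula.

```python
import math

def cohens_w(table):
    """Cohen's effect size w = sqrt(chi2 / N) for an r x c contingency table."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    # Pearson chi-square: sum over cells of (observed - expected)^2 / expected,
    # with expected frequency e_ij = (row total * column total) / N
    chi2 = sum(
        (table[i][j] - row_totals[i] * col_totals[j] / n) ** 2
        / (row_totals[i] * col_totals[j] / n)
        for i in range(len(table))
        for j in range(len(table[0]))
    )
    return math.sqrt(chi2 / n)

# Hypothetical 2 x 2 table of response frequencies
w = cohens_w([[30, 10], [20, 40]])
print(round(w, 2))  # prints 0.41, between Cohen's medium (0.3) and large (0.5) benchmarks
```

By Cohen's (1988) conventions, the resulting ω ≈ 0.41 would be interpreted as an effect between medium and large.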
References
Afflerbach, P., & Cho, B.-Y. (2009). Identifying and describing constructively responsive comprehension strategies in new and traditional forms of reading. In S. E. Israel & G. G. Duffy (Eds.), Handbook of research on reading comprehension (pp. 69–90). New York, NY: Routledge.
American Educational Research Association, American Psychological Association, & National Council on Measurement in Education [AERA, APA, & NCME]. (2014). Standards for educational and psychological testing. Washington, DC: American Psychological Association.
Angell, R. B. (1964). Reasoning and logic. New York, NY: Appleton-Century-Crofts.
Arts, J. (2007). Developing managerial expertise: Studies on managerial cognition and the implications for management education. Maastricht, Netherlands: University Library.
Baxter, G. P., & Glaser, R. (1998). Investigating the cognitive complexity of science assessments. Educational Measurement: Issues and Practice, 17, 37–45.
Beck, K. (1993). Dimensionen der ökonomischen Bildung. Meßinstrumente und Befunde. Abschlußbericht zum DFG-Projekt: Wirtschaftskundliche Bildung-Test (WBT). Normierung und internationaler Vergleich [Dimensions of economics literacy. Measuring instruments and findings. Closing report of the DFG project: Test of economic literacy (TEL). Standardization and international comparison]. Nürnberg, Germany: Universität Erlangen-Nürnberg.
Bielinska-Kwapisz, A., Brown, F. W., & Semenik, R. (2012). Interpreting standardized assessment test scores and setting performance goals in the context of student characteristics: The case of the major field test in business. Journal of Education for Business, 87, 7–13.
Borsboom, D., Mellenbergh, G. J., & van Heerden, J. (2004). The concept of validity. Psychological Review, 111, 1061–1071.
Brückner, S. (2013). Construct-irrelevant mental processes in university students’ responding to business and economic test items: Using symmetry based on verbal reports to establish the validity of test score interpretations. Brunswik Society Newsletter, 28, 16–20.
Brückner, S. (in press). Prozessbezogene Validierung anhand von mentalen Operationen bei der Bearbeitung wirtschaftswissenschaftlicher Testaufgaben [Process-related validation using mental operations during solving business and economics test items] (Doctoral dissertation). Johannes Gutenberg-Universität, Mainz. Landau, Germany: Verlag Empirische Pädagogik.
Brückner, S., & Kuhn, C. (2013). Die Methode des lauten Denkens und ihre Rolle für die Testentwicklung und Validierung [The think-aloud method and its significance in test development and validation]. In O. Zlatkin-Troitschanskaia, R. Nickolaus, & K. Beck (Eds.), Lehrerbildung auf dem Prüfstand (Sonderheft). Kompetenzmodellierung und Kompetenzmessung bei Studierenden der Wirtschaftswissenschaften und der Ingenieurwissenschaften [Teacher education under scrutiny (special issue). Modeling and measuring students’ competencies in business, economics and engineering] (pp. 26–48). Landau, Germany: Verlag Empirische Pädagogik.
Brückner, S., & Pellegrino, J. W. (2016). Integrating the analysis of mental operations into multilevel models to validate an assessment of higher education students’ competency in business and economics. Journal of Educational Measurement, 53, 293–312.
Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81–105.
Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6, 271–315.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: L. Erlbaum Associates.
Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Thousand Oaks, CA: SAGE Publications.
Davies, P. (2006). Threshold concepts. How can we recognise them? In J. Meyer & R. Land (Eds.), Overcoming barriers to student understanding. Threshold concepts and troublesome knowledge (pp. 70–84). London, UK: Routledge.
DiBello, L. V., Pellegrino, J. W., Gane, B. D., & Goldman, S. R. (2017). The contribution of student response processes to validity analyses for instructionally supportive assessments. In K. W. Ercikan & J. W. Pellegrino (Eds.), Validation of score meaning in the next generation of assessments. The use of response processes (pp. 85–99). London, UK: Routledge.
Ercikan, K., Arim, R., Law, D., Domene, J., Gagnon, F., & Lacroix, S. (2010). Application of think aloud protocols for examining and confirming sources of differential item functioning identified by expert reviews. Educational Measurement: Issues and Practice, 29, 24–35.
Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data (rev. ed.). A Bradford Book. Cambridge, MA: MIT Press.
Förster, M., Zlatkin-Troitschanskaia, O., Brückner, S., Happ, R., Hambleton, R. K., Walstad, W. B., et al. (2015). Validating test score interpretations by cross-national comparison: Comparing the results of students from Japan and Germany on an American test of economic knowledge in higher education. Zeitschrift für Psychologie, 223, 14–23.
Gorin, J. S. (2006). Test design with cognition in mind. Educational Measurement: Issues and Practice, 25, 21–35.
Größler, A., Wilhelm, O., Wittmann, W. W., & Milling, P. M. (2002). Measuring business knowledge for personnel selection in small and medium sized companies: Abschlussbericht zum Projekt: Die Erfassung von Wirtschaftswissen zur Personalauswahl in KMU [Final report on the project: Measuring business knowledge for personnel selection in SMEs] (No. 44). Mannheim, Germany: Institut für Mittelstandsforschung der Universität Mannheim.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77, 81–112.
Howell, H., Phelps, G., Croft, A. J., Kirui, D., & Gitomer, D. (2013). Cognitive interviews as a tool for investigating the validity of content knowledge for teaching assessments (ETS Research Report No. RR-13-19). Princeton, NJ: ETS.
Kane, M. T. (2004). Certification testing as an illustration of argument-based validation. Measurement, 2, 135–170.
Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50, 1–73.
Krugman, P. R., & Wells, R. (2015). Economics (4th ed.). New York, NY: W.H. Freeman & Co Ltd.
Leighton, J. P. (2004). Avoiding misconception, misuse, and missed opportunities: The collection of verbal reports in educational achievement testing. Educational Measurement: Issues and Practice, 23, 6–15.
Leighton, J. P. (2013). Item difficulty and interviewer knowledge effects on the accuracy and consistency of examinee response processes in verbal reports. Applied Measurement in Education, 26, 136–157.
Leiser, D., & Aroch, R. (2009). Lay understanding of macroeconomic causation: The good-begets-good heuristic. Applied Psychology, 58, 370–384.
Lissitz, R. W., & Samuelsen, K. (2007). A suggested change in terminology and emphasis regarding validity and education. Educational Researcher, 36, 437–448.
Loevinger, J. (1957). Objective tests as instruments of psychological theory. Psychological Reports, 3, 635–694.
Luecht, R. M. (2007). Using information from multiple-choice distractors to enhance cognitive-diagnostic score reporting. In J. P. Leighton & M. J. Gierl (Eds.), Cognitive diagnostic assessment for education. Theory and applications (pp. 319–340). Cambridge, UK: Cambridge University Press.
Lydersen, S., Pradhan, V., Senchaudhuri, P., & Laake, P. (2007). Choice of test for association in small sample unordered r x c tables. Statistics in Medicine, 26, 4328–4343.
Mankiw, N. G. (2012). Principles of economics (6th ed.). Mason, OH: South-Western Cengage Learning.
Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York, NY: Macmillan Publishing.
Minnameier, G. (2013). The inferential construction of knowledge in the domain of business and economics. In K. Beck & O. Zlatkin-Troitschanskaia (Eds.), Professional and VET learning: Vol. 2. From diagnostics to learning success. Proceedings in vocational education and training (pp. 141–156). Rotterdam, Netherlands: Sense Publishers.
Mislevy, R. J. (1994). Evidence and inference in educational assessment. Psychometrika, 59, 439–483.
Mislevy, R. J., & Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educational Measurement: Issues and Practice, 25, 6–20.
Newton, P. E., & Shaw, S. (2014). Validity in educational and psychological assessment. Los Angeles, CA: SAGE.
Parkes, J., & Zimmaro, D. (2016). Learning and assessing with multiple choice questions in college classrooms. New York, NY: Routledge.
Patton, M. Q. (2015). Qualitative research & evaluation methods (4th ed.). Thousand Oaks, CA: SAGE.
Pellegrino, J. W., Baxter, G. P., & Glaser, R. (1999). Addressing the “two disciplines” problem: Linking theories of cognition and learning with assessment and instructional practice. Review of Research in Education, 24, 307–353.
Pellegrino, J. W., Chudowsky, N., & Glaser, R. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academies Press.
Pellegrino, J. W., DiBello, L. V., & Goldman, S. R. (2016). A framework for conceptualizing and evaluating the validity of instructionally relevant assessments. Educational Psychologist, 51, 59–81.
Pellegrino, J. W., & Glaser, R. (1979). Cognitive correlates and components in the analysis of individual differences. Intelligence, 3, 187–215.
Puntambekar, S., & Hubscher, R. (2005). Tools for scaffolding students in a complex learning environment: What have we gained and what have we missed? Educational Psychologist, 40, 1–12.
Rulon, P. J. (1946). On the validity of educational tests. Harvard Educational Review, 16, 290–296.
Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educational Psychologist, 48, 73–86.
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Los Angeles, CA: SAGE.
Thelk, A. D., & Hoole, E. R. (2006). What are you thinking? Postsecondary student think-alouds of scientific and quantitative reasoning items. The Journal of General Education, 55, 17–39.
Toulmin, S. E. (1958). The uses of argument (updated ed.). Cambridge, UK: Cambridge University Press.
Turner, C., & Fiske, D. W. (1968). Item quality and appropriateness of response processes. Educational and Psychological Measurement, 28, 297–315.
Vernooij, A. (2000). Tracking down the knowledge structure of students. In L. Borghans, W. H. Gijselaers, R. G. Milter, & J. E. Stinson (Eds.), Educational innovation in economics and business V. Business education for the changing workplace (pp. 437–450). Dordrecht, Netherlands: Kluwer Academic Publishers.
Vidal Uribe, R. (2013). Measurement of learning outcomes in higher education: The case of CENEVAL in Mexico. In S. Blömeke, O. Zlatkin-Troitschanskaia, C. Kuhn, & J. Fege (Eds.), Modeling and measuring competencies in higher education (pp. 137–146). Rotterdam, Netherlands: Sense Publishers.
Walstad, W. B., Watts, M., & Rebeck, K. (2007). Test of understanding in college economics: Examiner’s manual (4th ed.). New York, NY: National Council on Economic Education.
Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Erlbaum.
Wöhe, G., & Döring, U. (2013). Einführung in die allgemeine Betriebswirtschaftslehre [Introduction to general business administration] (25th rev. ed.). München, Germany: Vahlen.
Zlatkin-Troitschanskaia, O., Förster, M., Brückner, S., & Happ, R. (2014). Insights from the German assessment of business and economics competence. In H. Coates (Ed.), Assessing learning outcomes: Perspectives for quality improvement (pp. 175–197). Frankfurt am Main, Germany: Lang.
Zumbo, B. D. (2009). Validity as contextualized and pragmatic explanation, and its implications for validation practice. In R. W. Lissitz (Ed.), The concept of validity. Revisions, new directions, and applications (pp. 65–82). Charlotte, NC: Information Age Pub.
Zumbo, B. D., & Chan, E. K. (2014). Validity and validation in social, behavioral, and health sciences (Vol. 54). New York, NY: Springer International Publishing.
© 2017 Springer International Publishing AG
Cite this chapter
Brückner, S., Pellegrino, J.W. (2017). Contributions of Response Processes Analysis to the Validation of an Assessment of Higher Education Students’ Competence in Business and Economics. In: Zumbo, B., Hubley, A. (eds) Understanding and Investigating Response Processes in Validation Research. Social Indicators Research Series, vol 69. Springer, Cham. https://doi.org/10.1007/978-3-319-56129-5_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-56128-8
Online ISBN: 978-3-319-56129-5