Contributions of Response Processes Analysis to the Validation of an Assessment of Higher Education Students’ Competence in Business and Economics

  • Chapter
Understanding and Investigating Response Processes in Validation Research

Part of the book series: Social Indicators Research Series (SINS, volume 69)

Abstract

Higher education domains such as Business and Economics (B&E) generally lack evidence that standardized tests of competency actually assess those aspects of knowledge and reasoning that constitute important targets for learning and instruction. Constructing such a validity argument benefits from a framework that guides the collection of relevant evidence regarding the interpretive meaning of test results, especially with respect to instruction and learning in B&E. Response processes analysis provides much-needed evidence within such a validity argument. The present paper illustrates the application of such a framework and the use of response processes analysis for assessment in the B&E higher education domain. It introduces three construct-relevant mental operations (elaboration of economic concepts, deductive inferences, and economic heuristics) and examines how the outcomes relate to claims about the cognitive, instructional, and inferential aspects of validity for the assessment tasks.

Notes

  1. Matching items require the respondent to pair economic facts, statements, and concepts with more or less accurate explanations. Sequencing items require the respondent to arrange economic statements, facts, or principles in a causal or chronological order, typically in the service of an economic analysis or optimization (e.g., ordering the steps for optimizing a company's production sequence); a schematic sketch of both formats follows these notes. For further information on these formats, see Parkes and Zimmaro (2016).

  2. According to Cohen (1988), ω = 0.1 corresponds to a small effect, ω = 0.3 to a medium effect, and ω = 0.5 to a large effect; a computational sketch follows these notes.

  3. Three of the items used were not included in the online rating, as they were newly developed in cooperation with experts to ensure that they were representative of the curriculum.
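
The two item formats described in note 1 can be made concrete with a short sketch. The following Python snippet is a hypothetical illustration, not material from the chapter: the item content and the `score` helper are invented, and real scoring rules may award partial credit.

```python
# Hypothetical illustration of the matching and sequencing item formats
# described in note 1 (content invented for demonstration purposes).

# Matching item: pair economic concepts with their explanations.
matching_key = {
    "opportunity cost": "the value of the best forgone alternative",
    "marginal cost": "the cost of producing one additional unit of output",
}

# Sequencing item: arrange the steps of a production optimization
# in causal order.
sequencing_key = [
    "analyze the current production sequence",
    "identify the bottleneck step",
    "reallocate capacity to the bottleneck",
    "evaluate the costs of the revised sequence",
]

def score(response, key):
    """Dichotomous scoring: full credit only for an exact match with the key."""
    return int(response == key)

print(score(sequencing_key, sequencing_key))  # 1 (correct ordering)
```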
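
For note 2, Cohen's ω for an r × c contingency table can be estimated as √(χ²/N). The sketch below assumes SciPy is available; the 2 × 3 table is hypothetical illustration data, not results reported in the chapter.

```python
# Estimate Cohen's effect size omega as sqrt(chi2 / N) for a contingency
# table and label it against Cohen's (1988) benchmarks (0.1 / 0.3 / 0.5).
import math
from scipy.stats import chi2_contingency

# Hypothetical 2 x 3 table, e.g., counts of three mental operations
# observed in two groups of respondents.
table = [[18, 7, 5],
         [9, 14, 12]]

chi2, p, dof, expected = chi2_contingency(table)
n = sum(sum(row) for row in table)
omega = math.sqrt(chi2 / n)

# Rough label relative to Cohen's benchmarks.
label = "small" if omega < 0.3 else "medium" if omega < 0.5 else "large"
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, omega = {omega:.2f} ({label} effect)")
```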

References

  • Afflerbach, P., & Cho, B.-Y. (2009). Identifying and describing constructively responsive comprehension strategies in new and traditional forms of reading. In S. E. Israel & G. G. Duffy (Eds.), Handbook of research on reading comprehension (pp. 69–90). New York, NY: Routledge.

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education [AERA, APA, & NCME]. (2014). Standards for educational and psychological testing. Washington, DC: American Psychological Association.

  • Angell, R. B. (1964). Reasoning and logic. New York, NY: Appleton-Century-Crofts.

  • Arts, J. (2007). Developing managerial expertise: Studies on managerial cognition and the implications for management education. Maastricht, Netherlands: University Library.

  • Baxter, G. P., & Glaser, R. (1998). Investigating the cognitive complexity of science assessments. Educational Measurement: Issues and Practice, 17, 37–45.

  • Beck, K. (1993). Dimensionen der ökonomischen Bildung. Meßinstrumente und Befunde. Abschlußbericht zum DFG-Projekt: Wirtschaftskundliche Bildung-Test (WBT). Normierung und internationaler Vergleich [Dimensions of economics literacy. Measuring instruments and findings. Closing report of the DFG project: Test of economic literacy (TEL). Standardization and international comparison]. Nürnberg, Germany: Universität Erlangen-Nürnberg.

  • Bielinska-Kwapisz, A., Brown, F. W., & Semenik, R. (2012). Interpreting standardized assessment test scores and setting performance goals in the context of student characteristics: The case of the major field test in business. Journal of Education for Business, 87, 7–13.

  • Borsboom, D., Mellenbergh, G. J., & van Heerden, J. (2004). The concept of validity. Psychological Review, 111, 1061–1071.

  • Brückner, S. (2013). Construct-irrelevant mental processes in university students’ responding to business and economic test items: Using symmetry based on verbal reports to establish the validity of test score interpretations. Brunswik Society Newsletter, 28, 16–20.

  • Brückner, S. (in press). Prozessbezogene Validierung anhand von mentalen Operationen bei der Bearbeitung wirtschaftswissenschaftlicher Testaufgaben [Process-related validation using mental operations during solving business and economics test items] (Doctoral dissertation). Johannes Gutenberg-Universität, Mainz. Landau, Germany: Verlag Empirische Pädagogik.

  • Brückner, S., & Kuhn, C. (2013). Die Methode des lauten Denkens und ihre Rolle für die Testentwicklung und Validierung [The think-aloud method and its significance in test development and validation]. In O. Zlatkin-Troitschanskaia, R. Nickolaus, & K. Beck (Eds.), Lehrerbildung auf dem Prüfstand (Sonderheft). Kompetenzmodellierung und Kompetenzmessung bei Studierenden der Wirtschaftswissenschaften und der Ingenieurwissenschaften [Teacher education under scrutiny (special issue). Modeling and measuring students’ competencies in business, economics and engineering] (pp. 26–48). Landau, Germany: Verlag Empirische Pädagogik.

  • Brückner, S., & Pellegrino, J. W. (2016). Integrating the analysis of mental operations into multilevel models to validate an assessment of higher education students’ competency in business and economics. Journal of Educational Measurement, 53, 293–312.

  • Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81–105.

  • Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6, 271–315.

  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: L. Erlbaum Associates.

  • Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Thousand Oaks, CA: SAGE Publications.

  • Davies, P. (2006). Threshold concepts. How can we recognise them? In J. Meyer & R. Land (Eds.), Overcoming barriers to student understanding. Threshold concepts and troublesome knowledge (pp. 70–84). London, UK: Routledge.

  • DiBello, L. V., Pellegrino, J. W., Gane, B. D., & Goldman, S. R. (2017). The contribution of student response processes to validity analyses for instructionally supportive assessments. In K. W. Ercikan & J. W. Pellegrino (Eds.), Validation of score meaning in the next generation of assessments. The use of response processes (pp. 85–99). London, UK: Routledge.

  • Ercikan, K., Arim, R., Law, D., Domene, J., Gagnon, F., & Lacroix, S. (2010). Application of think aloud protocols for examining and confirming sources of differential item functioning identified by expert reviews. Educational Measurement: Issues and Practice, 29, 24–35.

  • Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data (rev. ed.). Cambridge, MA: MIT Press.

  • Förster, M., Zlatkin-Troitschanskaia, O., Brückner, S., Happ, R., Hambleton, R. K., Walstad, W. B., et al. (2015). Validating test score interpretations by cross-national comparison: Comparing the results of students from Japan and Germany on an American test of economic knowledge in higher education. Zeitschrift für Psychologie, 223, 14–23.

  • Gorin, J. S. (2006). Test design with cognition in mind. Educational Measurement: Issues and Practice, 25, 21–35.

  • Größler, A., Wilhelm, O., Wittmann, W. W., & Milling, P. M. (2002). Measuring business knowledge for personnel selection in small and medium sized companies: Abschlussbericht zum Projekt: Die Erfassung von Wirtschaftswissen zur Personalauswahl in KMU (No. 44). Mannheim, Germany: Institut für Mittelstandsforschung der Universität Mannheim.

  • Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77, 81–112.

  • Howell, H., Phelps, G., Croft, A. J., Kirui, D., & Gitomer, D. (2013). Cognitive interviews as a tool for investigating the validity of content knowledge for teaching assessments (ETS Research Report No. RR-13-19). Princeton, NJ: ETS.

  • Kane, M. T. (2004). Certification testing as an illustration of argument-based validation. Measurement, 2, 135–170.

  • Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50, 1–73.

  • Krugman, P. R., & Wells, R. (2015). Economics (4th ed.). New York, NY: W.H. Freeman & Co Ltd.

  • Leighton, J. P. (2004). Avoiding misconception, misuse, and missed opportunities: The collection of verbal reports in educational achievement testing. Educational Measurement: Issues and Practice, 23, 6–15.

  • Leighton, J. P. (2013). Item difficulty and interviewer knowledge effects on the accuracy and consistency of examinee response processes in verbal reports. Applied Measurement in Education, 26, 136–157.

  • Leiser, D., & Aroch, R. (2009). Lay understanding of macroeconomic causation: The good-begets-good heuristic. Applied Psychology, 58, 370–384.

  • Lissitz, R. W., & Samuelsen, K. (2007). A suggested change in terminology and emphasis regarding validity and education. Educational Researcher, 36, 437–448.

  • Loevinger, J. (1957). Objective tests as instruments of psychological theory. Psychological Reports, 3, 635–694.

  • Luecht, R. M. (2007). Using information from multiple-choice distractors to enhance cognitive-diagnostic score reporting. In J. P. Leighton & M. J. Gierl (Eds.), Cognitive diagnostic assessment for education. Theory and applications (pp. 319–340). Cambridge, UK: Cambridge University Press.

  • Lydersen, S., Pradhan, V., Senchaudhuri, P., & Laake, P. (2007). Choice of test for association in small sample unordered r × c tables. Statistics in Medicine, 26, 4328–4343.

  • Mankiw, N. G. (2012). Principles of economics (6th ed.). Mason, OH: South-Western Cengage Learning.

  • Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York, NY: Macmillan Publishing.

  • Minnameier, G. (2013). The inferential construction of knowledge in the domain of business and economics. In K. Beck & O. Zlatkin-Troitschanskaia (Eds.), Professional and VET learning: Vol. 2. From diagnostics to learning success. Proceedings in vocational education and training (pp. 141–156). Rotterdam, Netherlands: Sense Publishers.

  • Mislevy, R. J. (1994). Evidence and inference in educational assessment. Psychometrika, 59, 439–483.

  • Mislevy, R. J., & Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educational Measurement: Issues and Practice, 25, 6–20.

  • Newton, P. E., & Shaw, S. (2014). Validity in educational and psychological assessment. Los Angeles, CA: SAGE.

  • Parkes, J., & Zimmaro, D. (2016). Learning and assessing with multiple choice questions in college classrooms. New York, NY: Routledge.

  • Patton, M. Q. (2015). Qualitative research & evaluation methods (4th ed.). Thousand Oaks, CA: SAGE.

  • Pellegrino, J. W., Baxter, G. P., & Glaser, R. (1999). Addressing the “two disciplines” problem: Linking theories of cognition and learning with assessment and instructional practice. Review of Research in Education, 24, 307–353.

  • Pellegrino, J. W., Chudowsky, N., & Glaser, R. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academies Press.

  • Pellegrino, J. W., DiBello, L. V., & Goldman, S. R. (2016). A framework for conceptualizing and evaluating the validity of instructionally relevant assessments. Educational Psychologist, 51, 59–81.

  • Pellegrino, J. W., & Glaser, R. (1979). Cognitive correlates and components in the analysis of individual differences. Intelligence, 3, 187–215.

  • Puntambekar, S., & Hubscher, R. (2005). Tools for scaffolding students in a complex learning environment: What have we gained and what have we missed? Educational Psychologist, 40, 1–12.

  • Rulon, P. J. (1946). On the validity of educational tests. Harvard Educational Review, 16, 290–296.

  • Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educational Psychologist, 48, 73–86.

  • Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Los Angeles, CA: SAGE.

  • Thelk, A. D., & Hoole, E. R. (2006). What are you thinking? Postsecondary student think-alouds of scientific and quantitative reasoning items. The Journal of General Education, 55, 17–39.

  • Toulmin, S. E. (1958). The uses of argument (updated ed.). Cambridge, UK: Cambridge University Press.

  • Turner, C., & Fiske, D. W. (1968). Item quality and appropriateness of response processes. Educational and Psychological Measurement, 28, 297–315.

  • Vernooij, A. (2000). Tracking down the knowledge structure of students. In L. Borghans, W. H. Gijselaers, R. G. Milter, & J. E. Stinson (Eds.), Educational innovation in economics and business V. Business education for the changing workplace (pp. 437–450). Dordrecht, Netherlands: Kluwer Academic Publishers.

  • Vidal Uribe, R. (2013). Measurement of learning outcomes in higher education: The case of Ceneval in Mexico. In S. Blömeke, O. Zlatkin-Troitschanskaia, C. Kuhn, & J. Fege (Eds.), Modeling and measuring competencies in higher education (pp. 137–146). Rotterdam, Netherlands: Sense Publishers.

  • Walstad, W. B., Watts, M., & Rebeck, K. (2007). Test of understanding in college economics: Examiner’s manual (4th ed.). New York, NY: National Council on Economic Education.

  • Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Erlbaum.

  • Wöhe, G., & Döring, U. (2013). Einführung in die allgemeine Betriebswirtschaftslehre [Introduction to general business administration] (25th rev. ed.). München, Germany: Vahlen.

  • Zlatkin-Troitschanskaia, O., Förster, M., Brückner, S., & Happ, R. (2014). Insights from the German assessment of business and economics competence. In H. Coates (Ed.), Assessing learning outcomes: Perspectives for quality improvement (pp. 175–197). Frankfurt am Main, Germany: Lang.

  • Zumbo, B. D. (2009). Validity as contextualized and pragmatic explanation, and its implications for validation practice. In R. W. Lissitz (Ed.), The concept of validity. Revisions, new directions, and applications (pp. 65–82). Charlotte, NC: Information Age Pub.

  • Zumbo, B. D., & Chan, E. K. (2014). Validity and validation in social, behavioral, and health sciences (Vol. 54). New York, NY: Springer International Publishing.

Author information

Correspondence to Sebastian Brückner.

Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Brückner, S., Pellegrino, J.W. (2017). Contributions of Response Processes Analysis to the Validation of an Assessment of Higher Education Students’ Competence in Business and Economics. In: Zumbo, B., Hubley, A. (eds) Understanding and Investigating Response Processes in Validation Research. Social Indicators Research Series, vol 69. Springer, Cham. https://doi.org/10.1007/978-3-319-56129-5_3

  • DOI: https://doi.org/10.1007/978-3-319-56129-5_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-56128-8

  • Online ISBN: 978-3-319-56129-5

  • eBook Packages: Social Sciences, Social Sciences (R0)
