
Effect of response format on cognitive reflection: Validating a two- and four-option multiple choice question version of the Cognitive Reflection Test

  • Miroslav Sirota
  • Marie Juanchich

Abstract

The Cognitive Reflection Test (CRT), which measures intuition inhibition and cognitive reflection, has become extremely popular because it reliably predicts reasoning performance, decision-making, and beliefs. Across studies, the response format of CRT items sometimes differs, based on the assumed construct equivalence of tests with open-ended versus multiple-choice items (the equivalence hypothesis). Evidence and theoretical reasons, however, suggest that the cognitive processes measured by these response formats and their associated performances might differ (the nonequivalence hypothesis). We tested the two hypotheses experimentally by assessing performance on tests with different response formats and by comparing their predictive and construct validity. In a between-subjects experiment (n = 452), participants answered stem-equivalent CRT items in an open-ended, a two-option, or a four-option response format and then completed tasks on belief bias, denominator neglect, and paranormal beliefs (benchmark indicators of predictive validity), as well as on actively open-minded thinking and numeracy (benchmark indicators of construct validity). We found no significant differences between the three response formats in the number of correct responses, the number of intuitive responses (except that the two-option version yielded more intuitive responses than the other versions), or the correlational patterns of the indicators of predictive and construct validity. All three test versions were similarly reliable, but the multiple-choice formats were completed more quickly. We speculate that the specific nature of the CRT items helps build construct equivalence among the different response formats. We recommend using the validated multiple-choice version of the CRT presented here, particularly the four-option CRT, for practical and methodological reasons. Supplementary materials and data are available at https://osf.io/mzhyc/.
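As an illustration of the kind of analysis that underlies the validity comparison described above, the sketch below shows how a correlation between CRT scores and a benchmark measure could be compared between two independent response-format groups with Fisher's r-to-z test. The correlations, sample sizes, and function name are hypothetical and do not reproduce the authors' reported analysis.

    import math
    from statistics import NormalDist

    def compare_correlations(r1, n1, r2, n2):
        """Two-tailed Fisher r-to-z test for the difference between two
        independent correlations (e.g., CRT-numeracy correlations in two
        response-format groups)."""
        z1, z2 = math.atanh(r1), math.atanh(r2)          # Fisher z transforms
        se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # SE of the difference
        z = (z1 - z2) / se
        p = 2 * (1 - NormalDist().cdf(abs(z)))           # two-tailed p value
        return z, p

    # Hypothetical values: open-ended group r = .45 (n = 150),
    # four-option group r = .41 (n = 151)
    z, p = compare_correlations(0.45, 150, 0.41, 151)
    print(f"z = {z:.2f}, p = {p:.3f}")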

Keywords

Cognitive Reflection Test · Cognitive reflection · Construct equivalence · Response formats · Multiple-choice format

Supplementary material

ESM 1 (13428_2018_1029_MOESM1_ESM.docx, 31 kb)


Copyright information

© Psychonomic Society, Inc. 2018

Authors and Affiliations

  1. Department of Psychology, University of Essex, Colchester, UK
