A Visualization Approach to Addressing Reviewer Bias in Holistic College Admissions

  • Poorna Talkad Sukumar
  • Ronald Metoyer


Expert intuition plays a significant role in decision making in professional domains such as medical diagnostics and holistic admissions review. Every case or application is viewed as unique, and human assessment is considered indispensable to its evaluation. However, these domains can be "low-validity environments": environments that do not offer adequate opportunity to observe regularities and develop sound intuitions. As a consequence, experts in them can be susceptible to cognitive biases. One example of a low-validity environment is the holistic review process in college admissions, in which every student application is individually reviewed and subjectively evaluated in its entirety by at least one reviewer. We conducted interviews and observations to study the holistic review process at a university in the United States, with the goal of designing information visualization tools to support the process. We list examples of potential reviewer biases we identified and present theoretical ideas on how these biases might be mitigated through visualization tools, including strategies that deliberately conflict with conventional principles of interface and visualization design. This chapter is intended as a discussion of the likely occurrence of biases in a domain where subjective evaluations are the norm, and of how, by understanding their manifestations, these biases can be countered using visualizations.


Keywords: Holistic Review · Expert Intuition · Application Review Process · Bias Mitigation · Narrative Fallacy



We wish to thank the admissions officers at the university where we conducted our studies for their cooperation and participation in our research study. This material is based upon work supported by the National Science Foundation under Grant No. 1816620.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. University of Notre Dame, Notre Dame, USA
