Data Visualization Literacy and Visualization Biases: Cases for Merging Parallel Threads

  • Hamid Mansoor
  • Lane Harrison


People are prone to many biases when viewing data visualizations. Recent visualization research has uncovered biases that manifest during visualization use, quantified their impact, and developed strategies for mitigating them. In a parallel thread, visualization research has investigated how to measure a person's data visualization literacy and how to examine the performance consequences of individual differences in these literacy measures. The aim of this chapter is to make a case for merging these threads. To bridge the gap, we highlight research in cognitive biases that has established relationships between the impact of biases and factors such as experience and cognitive ability. Drawing on prior work in visualization biases, we provide examples of how incorporating visualization literacy measures may have led to different results in these studies. As research continues to identify and quantify the biases that occur in visualizations, the impact of people's individual abilities may prove to be an important consideration for analysis and design.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Worcester Polytechnic Institute, Worcester, USA