Time-Preserving Visual Attention Maps
The visual attention paid to a static scene can be explored visually in the form of attention maps, also referred to as heat maps. Such maps are computed by aggregating eye movement data into a density field, which is then color coded to support the rapid identification of hot spots. Although many attempts have been made to enhance such visual attention maps, they typically do not integrate the time-varying visual attention into the static map. In this paper we investigate the problem of incorporating the dynamics of the visual attention paid to a static scene into a corresponding attention map. To reach this goal we first compute time-weighted, Voronoi-based density fields for each eye-tracked person, which are then aggregated or averaged over a group of participants. The density values are smoothed by a box reconstruction filter to generate aesthetically pleasing diagrams. To improve the readability of similar color values in the maps, we enrich them with interactively adaptable isolines that indicate the borders of hot spot regions at different density values. We illustrate the usefulness of our time-preserving visual attention maps in an application example: the analysis of visual attention in a previously conducted eye tracking study on route finding tasks in public transport maps.
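The pipeline sketched in the abstract can be illustrated with a minimal, self-contained example. This is not the authors' implementation; all function names are hypothetical, the Voronoi partition is discretized on a coarse grid, and a simple linear time weight (later fixations count more) is assumed purely for illustration.

```python
# Hypothetical sketch of a time-preserving attention map.
# A fixation is a tuple (x, y, duration, timestamp); all names are
# illustrative assumptions, not the paper's actual code.

def voronoi_density(fixations, width, height, t_max):
    """Discrete Voronoi-based density field: each grid cell takes the
    value of its nearest fixation, weighted by fixation duration and
    (linearly, as an assumed example) by when it occurred."""
    grid = [[0.0] * width for _ in range(height)]
    for gy in range(height):
        for gx in range(width):
            # nearest fixation under squared Euclidean distance
            fx, fy, dur, t = min(
                fixations,
                key=lambda f: (f[0] - gx) ** 2 + (f[1] - gy) ** 2)
            grid[gy][gx] = dur * (t / t_max)  # time weight: later = heavier
    return grid

def box_filter(grid, radius=1):
    """Smooth the density field with a (2r+1) x (2r+1) box
    reconstruction filter (edge cells average over fewer neighbors)."""
    h, w = len(grid), len(grid[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [grid[cy][cx]
                    for cy in range(max(0, y - radius), min(h, y + radius + 1))
                    for cx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def iso_bands(grid, thresholds):
    """Classify each cell into a density band; the band borders
    approximate the interactively adaptable isolines."""
    return [[sum(v >= t for t in thresholds) for v in row] for row in grid]
```

For a group of participants, one field per person would be computed and the fields averaged cell-wise before smoothing; an interactive viewer would then recompute `iso_bands` whenever the user adjusts the thresholds.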
I would like to thank Robin Woods from Communicarta Ltd for providing some of the metro maps shown in our eye tracking studies. Without those maps this work would not have been possible.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.