Time-Preserving Visual Attention Maps

  • Michael Burch
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 57)

Abstract

Exploring the visual attention paid to a static scene can be done by visual analysis in the form of attention maps, also referred to as heat maps. Such maps can be computed by aggregating eye movement data into a density field, which is then color-coded to support the rapid identification of hot spots. Although many attempts have been made to enhance such visual attention maps, they typically do not integrate the time-varying visual attention into the static map. In this paper we investigate the problem of incorporating the dynamics of the visual attention paid to a static scene into a corresponding attention map. To reach this goal we first compute time-weighted Voronoi-based density fields for each eye-tracked person, which are then aggregated or averaged over a group of participants. These density values are smoothed by a box reconstruction filter to generate aesthetically pleasing diagrams. To improve the readability of similar color values in the maps, we enrich them with interactively adjustable isolines indicating the borders of hot spot regions at different density values. We illustrate the usefulness of our time-preserving visual attention maps in an application example analyzing the visual attention recorded in a previously conducted eye tracking study on route-finding tasks in public transport maps.
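The pipeline sketched in the abstract (per-person time-weighted Voronoi density fields, averaging over participants, box-filter smoothing) can be illustrated with a minimal sketch. This is an assumption-laden reconstruction, not the paper's implementation: the exact time-weighting function is not specified in the abstract, so a simple linear recency weighting is used here, and the `time_weighted_density` and `box_smooth` names are hypothetical.

```python
import numpy as np

def time_weighted_density(fixations, width, height):
    """Time-weighted Voronoi-based density field for one participant.

    fixations: list of (x, y, t) tuples. Each grid cell takes the
    weight of its nearest fixation (the owner of the cell's Voronoi
    region). Later fixations receive higher weights via a linear
    recency weighting -- an assumption, since the abstract does not
    specify the weighting function.
    """
    xs = np.array([f[0] for f in fixations], dtype=float)
    ys = np.array([f[1] for f in fixations], dtype=float)
    ts = np.array([f[2] for f in fixations], dtype=float)
    # normalize timestamps into (0, 1] so later fixations weigh more
    w = (ts - ts.min() + 1.0) / (ts.max() - ts.min() + 1.0)
    gy, gx = np.mgrid[0:height, 0:width]
    # squared distance from every grid cell to every fixation
    d2 = (gx[..., None] - xs) ** 2 + (gy[..., None] - ys) ** 2
    nearest = d2.argmin(axis=-1)  # Voronoi owner per cell
    return w[nearest]

def box_smooth(field, k=3):
    """Box reconstruction filter: k x k moving average (edge-padded)."""
    pad = k // 2
    p = np.pad(field, pad, mode="edge")
    out = np.zeros_like(field)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + field.shape[0], dx:dx + field.shape[1]]
    return out / (k * k)

# Aggregate over a group of participants by averaging, then smooth.
participants = [
    [(2, 2, 0.0), (7, 7, 1.0)],
    [(3, 2, 0.0), (6, 7, 2.0)],
]
fields = [time_weighted_density(p, 10, 10) for p in participants]
attention = box_smooth(np.mean(fields, axis=0))
```

The final `attention` field would then be color-coded, and the interactively adjustable isolines at chosen density thresholds (e.g. via a standard contouring routine such as `matplotlib.pyplot.contour`) would mark the borders of hot spot regions.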

Notes

Acknowledgments

I would like to thank Robin Woods from Communicarta Ltd for providing some of the metro maps shown in our eye tracking studies. Without those maps this work would not have been possible.


Copyright information

© Springer International Publishing Switzerland 2016

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. VISUS, University of Stuttgart, Stuttgart, Germany