Visualizing Dynamic Ambient/Focal Attention with Coefficient \(K\)

Conference paper

Eye Tracking and Visualization (ETVIS 2015)

Part of the book series: Mathematics and Visualization (MATHVISUAL)

Abstract

We propose a straightforward method of visualizing ambient and focal fixations in both scanpath and heatmap visualizations using the coefficient \(\mathcal{K}\), a parametric measure derived from a traditionally eye-tracked time course of eye movements. The coefficient \(\mathcal{K}\) captures the difference between a fixation's duration and the amplitude of the saccade that follows it, with both quantities expressed in standard deviation units, thereby facilitating parametric statistical testing. Positive and negative values of \(\mathcal{K}\) indicate focal and ambient fixations, respectively, and are rendered with luminance variation depicting the relative intensity of focal fixation.
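As a concrete reading of this definition, a per-fixation value can plausibly be computed as \(\mathcal{K}_i = \frac{d_i - \mu_d}{\sigma_d} - \frac{a_{i+1} - \mu_a}{\sigma_a}\), where \(d_i\) is the duration of fixation \(i\), \(a_{i+1}\) is the amplitude of the saccade that follows it, and \(\mu\), \(\sigma\) are means and standard deviations over the recording. The sketch below illustrates this computation under those assumptions; it assumes durations and following-saccade amplitudes arrive as paired arrays, and the function name coefficient_k is illustrative, not the authors' API.

```python
import numpy as np

def coefficient_k(durations, amplitudes):
    """Per-fixation ambient/focal coefficient K (illustrative sketch).

    durations  -- fixation durations, one per fixation (e.g., in ms)
    amplitudes -- amplitude of the saccade following each fixation
                  (e.g., in degrees of visual angle), same length

    Returns an array of K_i values: K_i > 0 suggests a focal fixation
    (long fixation, short following saccade); K_i < 0 suggests ambient.
    """
    d = np.asarray(durations, dtype=float)
    a = np.asarray(amplitudes, dtype=float)
    # Standardize each measure to standard-deviation (z-score) units.
    zd = (d - d.mean()) / d.std(ddof=1)
    za = (a - a.mean()) / a.std(ddof=1)
    # K_i is the difference of the two standardized measures.
    return zd - za

# Example: three fixations and the saccades that follow them.
k = coefficient_k([250, 180, 420], [2.1, 8.7, 1.2])
print(k)  # positive entries: focal; negative entries: ambient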



Acknowledgements

We thank Dr. Helena Duchowska (MD, retired) for her help in reading the CXR images and pinpointing the anomalies contained therein.

This work was partially supported by a 2015 research grant “Influence of affect on visual attention dynamics during visual search” from the SWPS University of Social Sciences and Humanities.

Author information


Corresponding author

Correspondence to A. T. Duchowski.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Duchowski, A.T., Krejtz, K. (2017). Visualizing Dynamic Ambient/Focal Attention with Coefficient \(K\). In: Burch, M., Chuang, L., Fisher, B., Schmidt, A., Weiskopf, D. (eds.) Eye Tracking and Visualization. ETVIS 2015. Mathematics and Visualization. Springer, Cham. https://doi.org/10.1007/978-3-319-47024-5_13
