Eye Tracking for Dynamic, User-Driven Workflows

  • Laura A. McNamara
  • Kristin M. Divis
  • J. Daniel Morrow
  • David Perkins
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10284)

Abstract

Researchers at Sandia National Laboratories in Albuquerque, New Mexico, are engaged in the empirical study of human-information interaction in high-consequence national security environments. This focus emerged from our longstanding interactions with military and civilian intelligence analysts working across a broad array of domains, from signals intelligence to cybersecurity to geospatial imagery analysis. In this paper, we discuss how several years of work with Synthetic Aperture Radar (SAR) imagery analysts revealed the limitations of existing eye tracking systems for capturing gaze events in the dynamic, user-driven problem-solving strategies characteristic of geospatial analytic workflows, and why inductive study of those workflows requires eye tracking systems designed to accommodate them. We then discuss an ongoing project in which we are leveraging some of the unique properties of SAR image products to develop a prototype eye-tracking data collection and analysis system that supports inductive studies of visual workflows in SAR image analysis environments.
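
To make the underlying technical problem concrete: when an analyst pans and zooms freely, the stimulus moves under the gaze, so screen-space gaze samples must be re-projected into image-space coordinates before fixations on image content can be aggregated. The sketch below illustrates one way such a re-projection could work. It is not the authors' prototype; all names here (ViewportState, GazeSample, to_image_space) are hypothetical, and it assumes the image viewer logs its pan/zoom state over time.

```python
from dataclasses import dataclass

@dataclass
class ViewportState:
    """Hypothetical viewer state: which image pixel sits at the screen
    origin, and the zoom factor, at a given moment."""
    t: float          # timestamp (s) when this pan/zoom state took effect
    origin_x: float   # image x-coordinate displayed at screen pixel (0, 0)
    origin_y: float   # image y-coordinate displayed at screen pixel (0, 0)
    zoom: float       # screen pixels per image pixel

@dataclass
class GazeSample:
    t: float   # timestamp (s) of the gaze sample
    sx: float  # gaze x in screen pixels
    sy: float  # gaze y in screen pixels

def to_image_space(sample, states):
    """Re-project one screen-space gaze sample into image coordinates,
    using the most recent viewport state at or before the sample time."""
    prior = [s for s in states if s.t <= sample.t]
    if not prior:
        return None  # gaze logged before any viewport state was recorded
    state = max(prior, key=lambda s: s.t)
    return (state.origin_x + sample.sx / state.zoom,
            state.origin_y + sample.sy / state.zoom)

# Example: viewer zoomed 2x with image pixel (1000, 500) at the screen
# origin; a gaze point at screen (300, 200) lands on image pixel (1150, 600).
states = [ViewportState(t=0.0, origin_x=1000.0, origin_y=500.0, zoom=2.0)]
print(to_image_space(GazeSample(t=0.5, sx=300.0, sy=200.0), states))
```

In practice the viewport log and the eye tracker run on separate clocks, so aligning the two timestamp streams is itself a nontrivial step in any such pipeline.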

Keywords

Visual search · Synthetic Aperture Radar · Information foraging · Eye tracking · Imagery analysis

Acknowledgments

This research was funded by Sandia National Laboratories' Laboratory Directed Research and Development Program. Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Laura A. McNamara¹ (corresponding author)
  • Kristin M. Divis¹
  • J. Daniel Morrow¹
  • David Perkins¹

  1. Sandia National Laboratories, Albuquerque, USA