Eye Tracking in Virtual Reality

Living reference work entry in the Encyclopedia of Computer Graphics and Games.

Synonyms

Foveated rendering; Gaze prioritized graphics; Gaze tracking; Gaze-contingent displays

Definitions

Eye tracking techniques have many research applications in medicine, psychology, marketing, and human factors. They are also used as a human–computer interface for applications such as gaze-based typing and communication (Majaranta 2012; Ten Kate et al. 1979), driving safety (Chen et al. 2018; Grace et al. 1998; Kutila et al. 2007; Sinha et al. 2018), and gaming (Smith and Graham 2006; Tobii Gaming n.d.). Beyond serving as a research tool and human–computer interface in VR (virtual reality), gaze-based techniques are also used to enhance the graphics quality and performance of displays through gaze prioritized graphics, also known as foveated rendering. Furthermore, statistical models of eye tracking data are employed to provide eye movements for computer-generated avatars (Gemmell et al. 2000; Seele et al. 2017; Vinayagamoorthy et al. 2004).
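The core idea behind foveated rendering mentioned above is to spend rendering effort where the viewer is actually looking: full detail near the gaze point and progressively less detail in the periphery. A minimal illustrative sketch of that selection step is given below; the function name, the level labels, and the eccentricity thresholds are assumptions for illustration, not values from this entry or any particular renderer.

```python
import math

def foveation_level(gaze_dir, target_dir, thresholds=(5.0, 15.0)):
    """Choose a level of detail from the angular eccentricity (in degrees)
    between the normalized gaze direction and a target direction.
    Thresholds are illustrative, not perceptually calibrated values."""
    # Clamp the dot product to guard against floating-point drift
    # before taking the arccosine.
    dot = max(-1.0, min(1.0, sum(g * t for g, t in zip(gaze_dir, target_dir))))
    eccentricity = math.degrees(math.acos(dot))
    if eccentricity <= thresholds[0]:
        return "full"    # foveal region: full resolution and shading
    elif eccentricity <= thresholds[1]:
        return "medium"  # parafoveal region: reduced shading rate
    return "coarse"      # periphery: lowest level of detail
```

A real foveated renderer applies this kind of decision per tile or per object each frame, which is why the latency of the eye tracker matters (see Albert et al. 2017 in the references).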

Human sight is limited to...


References

  • Advani, S., Sustersic, J., Irick, K., Narayanan, V.: A multi-resolution saliency framework to drive foveation. In: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (2013)

  • Albert, R., Patney, A., Luebke, D., Kim, J.: Latency requirements for foveated rendering in virtual reality. ACM Trans. Appl. Percept. 14, 1–13 (2017)

  • Arabadzhiyska, E., Tursun, O., Myszkowski, K., Seidel, H., Didyk, P.: Saccade landing position prediction for gaze-contingent rendering. ACM Trans. Graph. 36, 1–12 (2017)

  • Arndt, S., Antons, J.N.: Enhancing video streaming using real-time gaze tracking. In: Proceedings of the 5th ISCA/DEGA Workshop on Perceptual Quality of Systems, pp. 6–9 (2016)

  • Baldauf, M., Fröhlich, P., Hutter, S.: KIBITZER: a wearable system for eye-gaze-based mobile urban exploration. In: Proceedings of the 1st Augmented Human International Conference, pp. 9–13 (2010)

  • Blascheck, T., Kurzhals, K., Raschke, M., Burch, M., Weiskopf, D., Ertl, T.: Visualization of eye tracking data: a taxonomy and survey. Comput. Graph. Forum. 36, 260–284 (2017)

  • Carnegie, K., Rhee, T.: Reducing visual discomfort with HMDs using dynamic depth of field. IEEE Comput. Graph. Appl. 35, 34–41 (2015)

  • Chen, L.B., Chang, W.J., Hu, W.W., Wang, C.K., Lee, D.H., Chiou, Y.Z.: A band-pass IR light photodetector for wearable intelligent glasses in a drowsiness-fatigue-detection system. In: Consumer Electronics (ICCE), 2018 IEEE International Conference on, pp. 1–2. IEEE (2018)

  • Duchowski, A.T.: Eye Tracking Methodology: Theory and Practice. Springer International Publishing AG, Cham (2017)

  • Duchowski, A.T.: Gaze-based interaction: a 30 year retrospective. Comput. Graph. 73, 59–69 (2018)

  • Duchowski, A.T., Jörg, S.: Eye animation. In: Müller, B., Wolf, S.I. (eds.) Handbook of Human Motion, Springer Nature, Cham, pp. 1–19 (2016)

  • Duchowski, A.T., Shivashankaraiah, V., Rawls, T., Gramopadhye, A.K., Melloy, B.J., Kanki, B.: Binocular eye tracking in virtual reality for inspection training. In: Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, pp. 89–96 (2000)

  • Duchowski, A.T., Medlin, E., Cournia, N., Gramopadhye, A., Melloy, B., Nair, S.: 3D eye movement analysis for VR visual inspection training. In: Proceedings of the Symposium on Eye Tracking Research & Applications – ETRA 02 (2002)

  • Duchowski, A.T., Cournia, N., Cumming, B., Mccallum, D., Gramopadhye, A., Greenstein, J., Sadasivan, S., Tyrrell, R.A.: Visual deictic reference in a collaborative virtual environment. In: Proceedings of the Eye Tracking Research & Applications Symposium on Eye Tracking Research & Applications – ETRA2004 (2004)

  • Duchowski, A.T., House, D.H., Gestring, J., Wang, R.I., Krejtz, K., Krejtz, I., Mantiuk, R., Bazyluk, B.: Reducing visual discomfort of 3D stereoscopic displays with gaze-contingent depth-of-field. In: Proceedings of the ACM Symposium on Applied Perception – SAP 14 (2014)

  • Gemmell, J., Toyama, K., Zitnick, C.L., Kang, T., Seitz, S.: Gaze awareness for video-conferencing: a software approach. IEEE MultiMedia. 7(4), 26–35 (2000)

  • Grace, R., Byrne, V., Bierman, D., Legrand, J.-M., Gricourt, D., Davis, B., Staszewski, J., Carnahan, B.: A drowsy driver detection system for heavy vehicles. In: 17th DASC. AIAA/IEEE/SAE. Digital Avionics Systems Conference (1998)

  • Greenwald, S.W., Loreti, L., Funk, M., Zilberman, R., Maes, P.: Eye gaze tracking with google cardboard using purkinje images. In: Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, pp. 19–22 (2016)

  • Haffegee, A., Barrow, R.: Eye tracking and gaze based interaction within immersive virtual environments. In: International Conference on Computational Science, pp. 729–736. Springer, Berlin/Heidelberg (2009)

  • Hollomon, M.J., Kratchounova, D., Newton, D.C., Gildea, K., Knecht, W.R.: Current status of gaze control research and technology literature. Technical report. Federal Aviation Administration, Washington, DC (2017)

  • Itoh, K., Hansen, J.P., Nielsen, F.R.: Cognitive modelling of a ship navigator based on protocol and eye-movement analysis. Trav. Hum. 61, 99–127 (1998)

  • Itoh, K., Tanaka, H., Seki, M.: Eye-movement analysis of track monitoring patterns of night train operators: effects of geographic knowledge and fatigue. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 44(27), 360–363 (2000)

  • Iwamoto, K., Katsumata, S., Tanie, K.: An eye movement tracking type head mounted display for virtual reality system: evaluation experiments of a prototype system. In: IEEE International Conference on Humans, Information and Technology, no. 1, pp. 13–18 (1994)

  • Jacob, R.J.: Eye tracking in advanced interface design. In: Barfield, W., Furness, T.A. (eds.) Virtual Environments and Advanced Interface Design, pp. 258–288. Oxford University Press, New York (1995)

  • Just, M.A., Carpenter, P.A.: Eye fixations and cognitive processes. Cogn. Psychol. 8, 441–480 (1976)

  • Kellnhofer, P., Didyk, P., Myszkowski, K., Hefeeda, M.M., Seidel, H.-P., Matusik, W.: GazeStereo3D: seamless disparity manipulations. ACM Trans. Graph. 35, 1–13 (2016)

  • Khamis, M., Oechsner, C., Alt, F., Bulling, A.: VRpursuits: interaction in virtual reality using smooth pursuit eye movements. In: Proceedings of the 2018 International Conference on Advanced Visual Interfaces – AVI ’18, Castiglione della Pescaia (2018)

  • Koulieris, G., Drettakis, G., Cunningham, D., Mania, K.: An automated high-level saliency predictor for smart game balancing. ACM Trans. Appl. Percept. 11, 1–21 (2015)

  • Koulieris, G., Drettakis, G., Cunningham, D., Mania, K.: Gaze prediction using machine learning for dynamic stereo manipulation in games. In: 2016 IEEE Virtual Reality (VR) (2016)

  • Kulshreshth, A., Laviola, J.J.: Dynamic stereoscopic 3D parameter adjustment for enhanced depth discrimination. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems – CHI 16 (2016)

  • Kutila, M., Jokela, M., Markkula, G., Rue, M.R.: Driver distraction detection with a camera vision system. In: 2007 IEEE International Conference on Image Processing (2007)

  • Lavoué, G., Cordier, F., Seo, H., Larabi, M.: Visual attention for rendered 3D shapes. Comput. Graph. Forum. 37, 191–203 (2018)

  • Levoy, M., Whitaker, R.: Gaze-directed volume rendering. ACM SIGGRAPH Comput. Graph. 24, 217–223 (1990)

  • Luebke, D., Erikson, C.: View-dependent simplification of arbitrary polygonal environments. In: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques – SIGGRAPH ‘97 (1997)

  • Luebke, D., Hallen, B.: Perceptually driven simplification for interactive rendering. In: Eurographics, pp. 223–234 (2001)

  • Majaranta, P.: Communication and text entry by gaze. In: Majaranta, P., et al. (eds.) Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies, pp. 63–77. IGI Global, Hershey (2012)

  • Mine, M.: Virtual Environment Interaction Techniques. UNC Chapel Hill Computer Science technical report TR95–018. University of North Carolina, Chapel Hill (1995)

  • Mon-Williams, M., Wann, J.P.: Binocular virtual reality displays: when problems do and don’t occur. Hum. Factors. 40, 42–49 (1998)

  • Murphy, H., Duchowski, A.T.: Gaze-contingent level of detail rendering. In: EuroGraphics (2001)

  • Murphy, H., Duchowski, A., Tyrrell, R.: Hybrid image/model-based gaze-contingent rendering. ACM Trans. Appl. Percept. 5, 1–21 (2009)

  • Murray, N., Roberts, D., Steed, A., Sharkey, P., Dickerson, P., Rae, J., Wolff, R.: Eye gaze in virtual environments: evaluating the need and initial work on implementation. Concurr. Comput. Pract. Exp. 21(11), 1437–1449 (2009)

  • Ohshima, T., Yamamoto, H., Tamura, H.: Gaze-directed adaptive rendering for interacting with virtual space. In: Proceedings of the IEEE Virtual Reality Annual International Symposium, pp. 103–110 (1996)

  • Paletta, L., Santner, K., Fritz, G., Mayer, H., Schrammel, J.: 3D attention: measurement of visual saliency using eye tracking glasses. In: CHI’13 Extended Abstracts on Human Factors in Computing Systems, pp. 199–204. ACM (2013)

  • Patney, A., Salvi, M., Kim, J., Kaplanyan, A., Wyman, C., Benty, N., Luebke, D., Lefohn, A.: Towards foveated rendering for gaze-tracked virtual reality. ACM Trans. Graph. 35, 1–12 (2016)

  • Pfeiffer, T.: Towards gaze interaction in immersive virtual reality: evaluation of a monocular eye tracking set-up. In: Virtuelle und Erweiterte Realität-Fünfter Workshop der GI-Fachgruppe VR/AR (2008)

  • Pfeiffer, T.: Measuring and visualizing attention in space with 3D attention volumes. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 29–36 (2012)

  • Pfeiffer, T., Memili, C.: Model-based real-time visualization of realistic three-dimensional heat maps for mobile eye tracking and eye tracking in virtual reality. In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, pp. 95–102 (2016)

  • Pfeiffer, T., Latoschik, M.E., Wachsmuth, I.: Evaluation of binocular eye trackers and algorithms for 3D gaze interaction in virtual reality environments. J. Virtual Real. Broadcast. 5(16), 1660 (2008)

  • Pfeuffer, K., Mayer, B., Mardanbegi, D., Gellersen, H.: Gaze + pinch interaction in virtual reality. In: Proceedings of the 5th Symposium on Spatial User Interaction, pp. 99–108 (2017)

  • Piumsomboon, T., Lee, G., Lindeman, R., Billinghurst, M.: Exploring natural eye-gaze-based interaction for immersive virtual reality. In: 2017 IEEE Symposium on 3D User Interfaces (3DUI), Los Angeles, pp. 36–39 (2017)

  • Pohl, D., Zhang, X., Bulling, A.: Combining eye tracking with optimizations for lens astigmatism in modern wide-angle HMDs. In: 2016 IEEE Virtual Reality (VR) (2016)

  • Poole, A., Ball, L.J.: Eye tracking in HCI and usability research. In: Encyclopedia of human computer interaction, vol. 1, Idea Group Reference, London, UK pp. 211–219 (2006)

  • Ramloll, R., Trepagnier, C., Sebrechts, M., Beedasy, J.: Gaze data visualization tools: opportunities and challenges. In: Proceedings of the 8th International Conference on Information Visualisation, pp. 173–180 (2004)

  • Rayner, K.: Eye movements in reading and information processing: 20 years of research. Psychol. Bull. 124(3), 372–422 (1998)

  • Rayner, K.: The 35th Sir Frederick Bartlett lecture: eye movements and attention in reading, scene perception, and visual search. Q. J. Exp. Psychol. 62(8), 1457–1506 (2009)

  • Roberts, D.J., Fairchild, A.J., Campion, S.P., Ohare, J., Moore, C.M., Aspin, R., Duckworth, T., Gasparello, P., Tecchia, F.: withyou – an experimental end-to-end telepresence system using video-based reconstruction. IEEE J. Sel. Top. Signal Process. 9, 562–574 (2015)

  • Roth, T., Weier, M., Hinkenjann, A., Li, Y., Slusallek, P.: An analysis of eye-tracking data in foveated ray tracing. In: 2016 IEEE Second Workshop on Eye Tracking and Visualization (ETVIS) (2016)

  • Roth, T., Weier, M., Hinkenjann, A., Li, Y., Slusallek, P.: A quality-centered analysis of eye tracking data in foveated rendering. J. Eye Mov. Res. 10(5), 1–12 (2017)

  • Rötting, M., Göbel, M., Springer, J.: Automatic object identification and analysis of eye movement recordings. MMI Interakt. 1(2), 1–7 (1999)

  • Sadasivan, S., Rele, R., Greenstein, J.S., Duchowski, A.T., Gramopadhye, A.K.: Simulating on-the-job training using a collaborative virtual environment with head slaved visual deictic reference. In: Proceedings of HCI International Annual Conference, pp. 22–27 (2005)

  • Schulz, C.M., Schneider, E., Fritz, L., Vockeroth, J., Hapfelmeier, A., Brandt, T., Kochs, E.F., Schneider, G.: Visual attention of anaesthetists during simulated critical incidents. Br. J. Anaesth. 106(6), 807–813 (2011)

  • Seele, S., Misztal, S., Buhler, H., Herpers, R., Schild, J.: Here’s looking at you anyway!: how important is realistic gaze behavior in co-located social virtual reality games? In: Proceedings of the Annual Symposium on Computer-Human Interaction in Play, pp. 531–540. ACM (2017)

  • Sinha, O., Singh, S., Mitra, A., Ghosh, S.K., Raha, S.: Development of a drowsy driver detection system based on EEG and IR-based eye blink detection analysis. In: Bera, R., Kumar, S., Chakraborty, S.S. (eds.) Advances in Communication, Devices and Networking, Springer Nature Pte Ltd., Singapore pp. 313–319 (2018)

  • Smith, J.D., Graham, T.C.: Use of eye movements for video game control. In: Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (2006)

  • Stellmach, S., Nacke, L., Dachselt, R.: 3D attentional maps: aggregated gaze visualizations in three-dimensional virtual environments. In: Proceedings of the International Conference on Advanced Visual Interfaces, pp. 345–348. ACM (2010)

  • Steptoe, W., Oyekoya, O., Murgia, A., Wolff, R., Rae, J., Guimaraes, E., Roberts, D., Steed, A.: Eye tracking for avatar eye gaze control during object-focused multiparty interaction in immersive collaborative virtual environments. In: 2009 IEEE Virtual Reality Conference (2009)

  • Swafford, N., Iglesias-Guitian, J., Koniaris, C., Moon, B., Cosker, D., Mitchell, K.: User, metric, and computational evaluation of foveated rendering methods. In: Proceedings of the ACM Symposium on Applied Perception – SAP ‘16 (2016)

  • Tanriverdi, V., Jacob, R.J.: Interacting with eye movements in virtual environments. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 265–272. ACM (2000)

  • Ten Kate, J.H., Frietman, E.E., Willems, W., Romeny, B.T.H., Tenkink, E.: Eye-switch controlled communication aids. In: Proceedings of the 12th International Conference on Medical & Biological Engineering, pp. 19–20 (1979)

  • Tobii Gaming: https://tobiigaming.com/ (n.d.)

  • Triesch, J., Sullivan, B.T., Hayhoe, M.M., Ballard, D.H.: Saccade contingent updating in virtual reality. In: Proceedings of the Symposium on Eye Tracking Research & Applications – ETRA 02 (2002)

  • Tsang, H.Y., Tory, M., Swindells, C.: eSeeTrack – visualizing sequential fixation patterns. IEEE Trans. Vis. Comput. Graph. 16(6), 953–962 (2010)

  • Vinayagamoorthy, V., Garau, M., Steed, A., Slater, M.: An eye gaze model for dyadic interaction in an immersive virtual environment: practice and experience. Comput. Graph. Forum. 23(1), 1–11 (2004)

  • Watson, B., Walker, N., Hodges, L., Worden, A.: Managing level of detail through peripheral degradation: effects on search performance with a head-mounted display. ACM Trans. Comput.-Hum. Interact. 4, 323–346 (1997)

  • Weibel, N., Fouse, A., Emmenegger, C., Kimmich, S., Hutchins, E.: Let’s look at the cockpit: exploring mobile eye-tracking for observational research on the flight deck. In: Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 107–114. ACM (2012)

  • Weier, M., Roth, T., Kruijff, E., Hinkenjann, A., Pérard-Gayot, A., Slusallek, P., Li, Y.: Foveated real-time ray tracing for head-mounted displays. Comput. Graph. Forum. 35, 289–298 (2016)

  • Zeleznik, R.C., Forsberg, A.S., Schulze, J.P.: Look-that-there: exploiting gaze in virtual reality interactions. Technical report, Technical Report CS-05 (2005)

  • Zha, H., Makimoto, Y., Hasegawa, T.: Dynamic gaze-controlled levels of detail of polygonal objects in 3-D environment modeling. In: Second International Conference on 3-D Digital Imaging and Modeling, pp. 321–330 (1999)


Author information

Correspondence to Mehmet Ilker Berkman.


Copyright information

© 2019 Springer Nature Switzerland AG

About this entry

Cite this entry

Berkman, M.I. (2019). Eye Tracking in Virtual Reality. In: Lee, N. (eds) Encyclopedia of Computer Graphics and Games. Springer, Cham. https://doi.org/10.1007/978-3-319-08234-9_170-1


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-08234-9

  • Online ISBN: 978-3-319-08234-9

  • eBook Packages: Springer Reference Computer Sciences; Reference Module Computer Science and Engineering
