Eye Tracking and Eye-Based Human–Computer Interaction

Chapter
Part of the Human–Computer Interaction Series (HCIS)

Abstract

Eye tracking has a long history in medical and psychological research as a tool for recording and studying human visual behavior. Real-time gaze-based text entry can also serve as a powerful means of communication and control for people with physical disabilities. Following recent technological advances and the advent of affordable eye trackers, there is growing interest in pervasive attention-aware systems and interfaces that have the potential to revolutionize mainstream human-technology interaction. In this chapter, we provide an introduction to the state of the art in eye tracking technology and gaze estimation. We discuss the challenges involved in using a perceptual organ, the eye, as an input modality. We review examples of real-life applications, together with design solutions derived from research results. We also discuss how to match user requirements and the key features of different eye tracking systems to find the best system for each task and application.
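
The chapter's two technical threads can be made concrete with short sketches. First, gaze estimation: a classic camera-based approach maps pupil-center/corneal-reflection (PCCR) eye features to screen coordinates through a polynomial fitted during a brief calibration in which the user fixates known on-screen points. The Python sketch below is a minimal textbook-style illustration assuming NumPy, 2-D feature vectors, and a quadratic model; it is not the method of any particular tracker.

    import numpy as np

    def design_matrix(features):
        """Quadratic terms [1, x, y, xy, x^2, y^2] for each 2-D eye-feature vector."""
        x, y = features[:, 0], features[:, 1]
        return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

    def calibrate(features, screen_points):
        """Least-squares fit of the feature-to-screen mapping from calibration data."""
        coeffs, *_ = np.linalg.lstsq(design_matrix(features), screen_points, rcond=None)
        return coeffs  # shape (6, 2): one coefficient column per screen axis

    def estimate_gaze(coeffs, features):
        """Map new eye-feature samples to estimated on-screen gaze points."""
        return design_matrix(features) @ coeffs

    # Illustrative use: a nine-point calibration, then estimation on a new sample.
    rng = np.random.default_rng(0)
    feats = rng.uniform(-1, 1, size=(9, 2))   # stand-in eye features
    screen = 500 * feats + 400                # stand-in ground-truth mapping
    C = calibrate(feats, screen)
    print(estimate_gaze(C, feats[:1]))        # close to screen[0]

Second, the input-modality challenge: because the eyes are always "on", a gaze-controlled interface must separate deliberate selections from ordinary viewing, the so-called Midas touch problem. A common mitigation is dwell-time selection, sketched below; the gaze-stream format, thresholds, and targets are illustrative assumptions rather than any real tracker's API.

    # Gaze samples are (timestamp_seconds, (x, y)) tuples; targets are dicts with
    # a screen position and a label. All values here are illustrative.
    DWELL_TIME = 0.8        # sustained fixation (s) required before a target "clicks"
    FIXATION_RADIUS = 40.0  # px; samples this close to a target count as fixating it

    def find_target(x, y, targets):
        """Return the target under the gaze point, or None."""
        for target in targets:
            if (abs(x - target["x"]) <= FIXATION_RADIUS
                    and abs(y - target["y"]) <= FIXATION_RADIUS):
                return target
        return None

    def dwell_select(gaze_samples, targets):
        """Select a target only after gaze rests on it for DWELL_TIME, so that
        merely looking around does not trigger anything (Midas touch)."""
        current, dwell_start = None, None
        for timestamp, (x, y) in gaze_samples:
            hit = find_target(x, y, targets)
            if hit is not current:             # gaze moved to a new target (or away)
                current, dwell_start = hit, timestamp
            elif hit is not None and timestamp - dwell_start >= DWELL_TIME:
                return hit                     # dwell threshold reached: select
        return None

    # Illustrative use: ~1 s of 30 Hz samples fixating the "yes" key selects it.
    keys = [{"label": "yes", "x": 100, "y": 100}, {"label": "no", "x": 300, "y": 100}]
    stream = [(t / 30.0, (102.0, 98.0)) for t in range(31)]
    selected = dwell_select(stream, keys)
    print(selected["label"] if selected else "no selection")  # -> "yes"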



Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  1. School of Information Sciences, University of Tampere, Tampere, Finland
  2. Perceptual User Interfaces, Max Planck Institute for Informatics, Saarbrücken, Germany
