Eye Animation

Reference work entry

Abstract

The synthesis of eye movements involves modeling saccades (rapid shifts of gaze), smooth pursuits (object-tracking motions), binocular rotations during vergence, and the coupling of eye and head rotations. Finer movements include dilation and constriction of the pupil (pupil unrest) as well as the small fluctuations (microsaccades, tremor, and drift, which we collectively call microsaccadic jitter) made during fixations, when gaze is held nearly steady. In this chapter, we focus on synthesizing physiologically plausible eye rotations, microsaccadic jitter, and pupil unrest. We review concepts relevant to the animation of eye motions and provide a procedural model of gaze that incorporates rotations adhering to Donders’ and Listing’s laws and the saccadic main sequence, along with gaze jitter and pupil unrest. Both microsaccadic jitter and pupil unrest are modeled with 1/f^α, or pink, noise.
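A common way to synthesize 1/f^α noise of the kind the abstract describes is spectral shaping: filter white Gaussian noise so its power spectrum falls off as 1/f^α (α = 1 gives pink noise). The sketch below is illustrative only, assuming NumPy; the function name, parameters, and the final application to a fixation point are not from the chapter, which uses its own procedural model.

```python
import numpy as np

def pink_noise(n, alpha=1.0, seed=0):
    """Generate n samples of 1/f^alpha noise by spectral shaping.

    White Gaussian noise is transformed to the frequency domain,
    its amplitude spectrum is scaled by f^(-alpha/2) so that power
    falls off as 1/f^alpha, and the result is transformed back.
    """
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                 # avoid division by zero at DC
    spectrum *= freqs ** (-alpha / 2.0)  # amplitude ~ f^(-alpha/2)
    signal = np.fft.irfft(spectrum, n)
    return signal / np.std(signal)       # normalize to unit variance

# Hypothetical use: perturb a fixation point with two independent
# jitter channels (horizontal and vertical gaze offsets, in arbitrary
# units to be scaled to the desired jitter amplitude).
jitter_x = pink_noise(1024, alpha=1.0, seed=1)
jitter_y = pink_noise(1024, alpha=1.0, seed=2)
```

A single-channel version of the same generator could likewise drive pupil-diameter fluctuations (pupil unrest), with the amplitude scaled to a plausible physiological range.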

Keywords

Eye movements · Saccades · Fixations · Microsaccadic jitter

Notes

Acknowledgments

This material is based in part upon work supported by the US National Science Foundation under Grant No. IIS-1423189. Any opinions, findings, and conclusions expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Clemson University, Clemson, USA
  2. School of Computing, Clemson University, Clemson, USA