
A Low-Cost Head and Eye Tracking System for Realistic Eye Movements in Virtual Avatars

Conference paper
MultiMedia Modeling (MMM 2014)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8325)

Abstract

A virtual avatar, or autonomous agent, is a digital representation of a human being that can be controlled by either a human or an artificially intelligent computer system. Increasingly, avatars are becoming realistic virtual human characters that exhibit human behavioral traits, body language, and eye and head movements. Because the interpretation of eye and head movements is an important part of nonverbal human communication, it is extremely important to reproduce these movements accurately in virtual avatars in order to avoid falling into the well-known “uncanny valley”. In this paper we present a low-cost hybrid real-time head and eye tracking system based on existing open-source software and commonly available hardware. Our evaluation indicates that the combined head and eye tracking is stable and accurate and allows a human user to robustly puppet a virtual avatar, potentially allowing an A.I. system to be trained to learn realistic human head and eye movements.
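
To make the idea of a hybrid tracker concrete, the minimal Python/NumPy sketch below shows one way a tracked head orientation and eye-in-head gaze angles could be combined into a single world-space gaze direction for an avatar's eyes. This is an illustrative sketch under our own assumptions (the angle conventions, function names, and sample values are hypothetical), not the implementation described in the paper.

    # Illustrative sketch only -- not the authors' implementation.
    # Assumed convention: the neutral gaze direction is +Z, yaw rotates about
    # the vertical Y axis, and pitch rotates about the lateral X axis.
    import numpy as np

    def rotation_matrix(yaw, pitch):
        """Rotation from yaw (about Y) and pitch (about X), angles in radians."""
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        r_yaw = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
        r_pitch = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
        return r_yaw @ r_pitch

    def world_gaze(head_yaw, head_pitch, eye_yaw, eye_pitch):
        """Combine head pose (from the head tracker) with eye-in-head gaze
        angles (from the eye tracker) into a world-space gaze direction."""
        eye_dir = rotation_matrix(eye_yaw, eye_pitch) @ np.array([0.0, 0.0, 1.0])
        return rotation_matrix(head_yaw, head_pitch) @ eye_dir

    # Hypothetical sample: head turned 10 deg left, eyes 5 deg right, 3 deg down.
    g = world_gaze(np.radians(10), 0.0, np.radians(-5), np.radians(-3))
    print("world gaze direction:", g / np.linalg.norm(g))

The resulting unit vector could then be used to orient an avatar's eyes each frame; any smoothing, calibration, or coordinate conventions used by the actual system are not reflected here.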

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Li, Y., Wei, H., Monaghan, D.S., O’Connor, N.E. (2014). A Low-Cost Head and Eye Tracking System for Realistic Eye Movements in Virtual Avatars. In: Gurrin, C., Hopfgartner, F., Hurst, W., Johansen, H., Lee, H., O’Connor, N. (eds) MultiMedia Modeling. MMM 2014. Lecture Notes in Computer Science, vol 8325. Springer, Cham. https://doi.org/10.1007/978-3-319-04114-8_39

  • DOI: https://doi.org/10.1007/978-3-319-04114-8_39

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-04113-1

  • Online ISBN: 978-3-319-04114-8

  • eBook Packages: Computer Science (R0)
