Point of Regard from Eye Velocity in Stereoscopic Virtual Environments Based on Intersections of Hypothesis Surfaces

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8955)

Abstract

A new method is proposed for utilising scene information in stereo eye tracking within stereoscopic 3D virtual environments. The approach aims to improve gaze tracking accuracy and to reduce the user engagement required by eye tracking calibration procedures. It derives the absolute Point of Regard (POR) from the angular velocity of the eyes, without user-engaged calibration of drift. The method maintains a hypothesis set for the 3D POR and reduces it by transforming the hypotheses during saccades and intersecting them with scene geometry during fixations. A basic implementation of this concept has been demonstrated in simulation, using the depth buffer of the scene and a particle representation of the hypothesis set. Future research will focus on optimisation of the algorithm and improved utilisation of scene information. The technique shows promise for improving gaze tracking techniques in general, including relative paradigms such as electrooculography.
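The core loop described in the abstract — rotate the hypothesis set during saccades, prune it against scene geometry during fixations — can be sketched as follows. This is not the authors' implementation: the velocity threshold, depth tolerance, and the `depth_fn` interface (a stand-in for a depth-buffer lookup) are illustrative assumptions.

```python
import numpy as np

# Assumed parameters, not taken from the paper.
SACCADE_THRESHOLD_DEG_S = 30.0  # eye angular speed above this => saccade
DEPTH_TOLERANCE = 0.05          # max distance from a scene surface to survive

def classify(angular_speed_deg_s):
    """Velocity-threshold classification of the current eye sample."""
    return "saccade" if angular_speed_deg_s > SACCADE_THRESHOLD_DEG_S else "fixation"

def rotate_particles(particles, eye_pos, rotation):
    """During a saccade, transform each POR hypothesis by the measured eye
    rotation (3x3 matrix) about the eye centre, so hypotheses track where
    the gaze ray could now point."""
    return (particles - eye_pos) @ rotation.T + eye_pos

def intersect_with_scene(particles, depth_fn):
    """During a fixation, keep only hypotheses lying (approximately) on
    visible scene geometry. `depth_fn` maps a 3D point to its signed
    distance from the nearest visible surface, e.g. obtained by comparing
    the point's projected depth against the scene's depth buffer."""
    dist = np.array([depth_fn(p) for p in particles])
    return particles[np.abs(dist) < DEPTH_TOLERANCE]
```

Repeated over successive fixations on different surfaces, only hypotheses consistent with every observed intersection survive, which is how the method converges on an absolute POR without explicit drift calibration.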





Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Fountain, J., Chalup, S.K. (2015). Point of Regard from Eye Velocity in Stereoscopic Virtual Environments Based on Intersections of Hypothesis Surfaces. In: Chalup, S.K., Blair, A.D., Randall, M. (eds) Artificial Life and Computational Intelligence. ACALCI 2015. Lecture Notes in Computer Science, vol 8955. Springer, Cham. https://doi.org/10.1007/978-3-319-14803-8_10

  • DOI: https://doi.org/10.1007/978-3-319-14803-8_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-14802-1

  • Online ISBN: 978-3-319-14803-8

  • eBook Packages: Computer Science (R0)
