Gaze-Based Interaction for VR Environments

  • Conference paper
  • In: Image Processing and Communications (IP&C 2019)

Abstract

In this paper we propose a gaze-based steering mechanism for VR headsets that utilizes eye tracking. Based on the foveal region traced by an eye tracker mounted in the VR headset, a visible 3D ray is cast toward the focal point of sight. The user can freely look around the virtual scene and interact with objects indicated by the eyes. The paper gives an overview of the proposed interaction system and addresses the effectiveness and precision of this interaction modality.
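
The abstract outlines the core mechanism: a ray is cast from the eye along the tracked gaze direction, and the object it hits becomes the target of interaction. The sketch below is a minimal Python illustration of that idea, not code from the paper or its implementation; the bounding-sphere scene, function names, and parameters are assumptions for demonstration only.

# Minimal sketch (an assumption, not the authors' code) of gaze-ray picking:
# a ray from the eye position along the tracked gaze direction is tested
# against bounding spheres, and the nearest hit is the object "indicated
# by the eyes".
import numpy as np

def pick_gazed_object(eye_pos, gaze_dir, objects):
    """Return the name of the closest object hit by the gaze ray, or None.

    objects: list of (name, center, radius) bounding spheres.
    """
    d = gaze_dir / np.linalg.norm(gaze_dir)  # normalize ray direction
    best, best_t = None, np.inf
    for name, center, radius in objects:
        oc = eye_pos - center
        # Ray-sphere intersection: t^2 + 2(oc.d)t + |oc|^2 - r^2 = 0
        b = np.dot(oc, d)
        c = np.dot(oc, oc) - radius ** 2
        disc = b * b - c
        if disc < 0:
            continue  # gaze ray misses this sphere
        t = -b - np.sqrt(disc)
        if 0 < t < best_t:  # keep the nearest hit in front of the eye
            best, best_t = name, t
    return best

# Usage: gazing straight ahead from the origin selects the "button" sphere.
scene = [("button", np.array([0.0, 0.0, 5.0]), 0.5),
         ("lamp",   np.array([2.0, 1.0, 4.0]), 0.3)]
print(pick_gazed_object(np.zeros(3), np.array([0.0, 0.0, 1.0]), scene))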



Author information

Correspondence to Adam Nowosielski.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Piotrowski, P., Nowosielski, A. (2020). Gaze-Based Interaction for VR Environments. In: Choraś, M., Choraś, R. (eds.) Image Processing and Communications. IP&C 2019. Advances in Intelligent Systems and Computing, vol 1062. Springer, Cham. https://doi.org/10.1007/978-3-030-31254-1_6
