
Gaze-Based Human-SmartHome-Interaction by Augmented Reality Controls

  • Conference paper
  • First Online:

Advances in Robot Design and Intelligent Control (RAAD 2016)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 540)

Abstract

The use of eye-tracking systems enables people with motor disabilities to interact with computers and thus with their environment. Combined with an optical see-through head-mounted display (OST-HMD), eye tracking allows interaction with virtual objects that are attached to real objects, or to actions that can be performed in the SmartHome environment. A user can thus trigger actions of real SmartHome actuators by gazing at the virtual objects shown in the OST-HMD. In this paper we propose a mobile system that combines a low-cost commercial eye tracker with a commercial OST-HMD. The system is intended for SmartHome applications. As a proof of concept, we control an LED strip light using gaze-based augmented reality controls. We present a calibration procedure for the OST-HMD and evaluate the influence of the OST-HMD on the accuracy of the eye tracking.
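The interaction pattern the abstract describes (gazing at a virtual control to trigger a real actuator) is commonly implemented with dwell-time selection: an action fires only after the gaze has rested on a control for a fixed interval, so that ordinary glances do not trigger it. The following is an illustrative sketch of that pattern, not the authors' implementation; the class names, the rectangular control regions, and the one-second dwell threshold are all assumptions made for the example.

```python
# Hypothetical sketch of dwell-time gaze selection over rectangular AR
# controls, as used in many gaze-based interfaces (not the paper's code).
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Control:
    """A virtual AR control occupying a rectangle in display coordinates."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


class DwellSelector:
    """Fires a control's action once gaze has rested on it for dwell_s seconds."""

    def __init__(self, controls: List[Control], dwell_s: float = 1.0):
        self.controls = controls
        self.dwell_s = dwell_s
        self._current: Optional[Control] = None  # control the gaze is on
        self._since = 0.0                        # time the gaze entered it

    def update(self, gx: float, gy: float, t: float) -> Optional[str]:
        """Feed one gaze sample (gx, gy) at time t; return a control name
        when its dwell threshold is reached, else None."""
        hit = next((c for c in self.controls if c.contains(gx, gy)), None)
        if hit is not self._current:
            # Gaze moved to a different control (or off all controls): reset.
            self._current, self._since = hit, t
            return None
        if hit is not None and t - self._since >= self.dwell_s:
            self._since = t  # re-arm so the control can fire again later
            return hit.name
        return None
```

In a SmartHome setting, the returned name would be mapped to an actuator command (e.g. switching the LED strip), with gaze samples supplied by the eye tracker after the OST-HMD calibration described in the paper.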



Acknowledgement

This work is supported by the German Ministry of Education and Research under grant 16SV7181.

Author information

Corresponding author

Correspondence to Tim Cottin.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Cottin, T., Nordheimer, E., Wagner, A., Badreddin, E. (2017). Gaze-Based Human-SmartHome-Interaction by Augmented Reality Controls. In: Rodić, A., Borangiu, T. (eds) Advances in Robot Design and Intelligent Control. RAAD 2016. Advances in Intelligent Systems and Computing, vol 540. Springer, Cham. https://doi.org/10.1007/978-3-319-49058-8_41


  • DOI: https://doi.org/10.1007/978-3-319-49058-8_41

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-49057-1

  • Online ISBN: 978-3-319-49058-8

  • eBook Packages: Engineering, Engineering (R0)
