
Part of the book series: SpringerBriefs in Computer Science ((BRIEFSCOMPUTER))


Abstract

The previous chapter compared gaze-controlled and finger-tracking pointing modalities, and we found that users had difficulty homing in on targets with the gaze-controlled interface, and even with the finger-tracking system, in the automotive environment. In this chapter, we propose an algorithm that can activate a target before the pointer reaches it. For a gaze-controlled interface, the target is activated as soon as a saccade launches near it, reducing the fixation duration required to activate a target. In the latter half of the chapter we discuss different fusion strategies for combining eye-gaze and finger-tracking systems. The previous chapter introduced a multimodal eye-gaze tracking system that combined eye-gaze tracking with a joystick and a LeapMotion controller; this chapter takes that concept forward with more sophisticated fusion models.
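The early-activation idea described above can be illustrated with a minimal sketch. The code below is a hypothetical reconstruction, not the chapter's actual algorithm: it assumes targets are screen-coordinate centres, and it activates the target nearest to a pointer or saccade-endpoint sample as soon as that sample falls within an assumed activation radius, so activation no longer requires the pointer to settle on the target centre.

```python
import math

# Assumed threshold (pixels); the chapter does not specify a value.
ACTIVATION_RADIUS = 60.0

def nearest_target(point, targets):
    """Return (index, distance) of the target centre closest to point."""
    distances = [math.dist(point, t) for t in targets]
    i = min(range(len(targets)), key=distances.__getitem__)
    return i, distances[i]

def early_activation(point, targets, radius=ACTIVATION_RADIUS):
    """Activate the nearest target if the pointer/gaze sample is already
    within `radius` of its centre; return None otherwise."""
    i, d = nearest_target(point, targets)
    return i if d <= radius else None
```

For a gaze interface, `point` would be the predicted saccade endpoint rather than the current fixation, which is what lets activation happen before the fixation completes.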
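As one concrete example of the fusion strategies discussed later in the chapter, a simple late-fusion rule combines the position estimates from the two modalities after each tracker has produced its own pointer coordinate. The weighted average and the weights below are illustrative assumptions, not the chapter's models:

```python
def fuse_pointers(gaze, finger, w_gaze=0.3, w_finger=0.7):
    """Late fusion sketch: combine the gaze and finger pointer estimates
    by a weighted average of their (x, y) coordinates. The weights are
    assumed values; in practice they could reflect each tracker's
    measured accuracy."""
    wsum = w_gaze + w_finger
    return ((w_gaze * gaze[0] + w_finger * finger[0]) / wsum,
            (w_gaze * gaze[1] + w_finger * finger[1]) / wsum)
```

Giving the finger tracker the larger weight here reflects the assumption that it is the more precise modality for fine positioning, while gaze contributes a fast coarse estimate; more sophisticated models would adapt these weights dynamically.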




Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Biswas, P. (2016). Intelligent Multimodal Systems. In: Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-40709-8_3


  • DOI: https://doi.org/10.1007/978-3-319-40709-8_3


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-40708-1

  • Online ISBN: 978-3-319-40709-8

  • eBook Packages: Computer Science (R0)
