Abstract
We describe a contact-less Human Computer Interaction (HCI) system that enables paraplegic users to operate computers without additional invasive hardware. The proposed system is multi-modal, combining visual and speech input. Visual input is provided through a standard web camera that captures images of the user's face; image-processing techniques track head movements, making it possible to interact with a computer through head motion. Speech input activates commonly used tasks that are normally triggered with the mouse or the keyboard. The performance of the proposed system was evaluated using a number of specially designed test applications. According to the quantitative results, most HCI tasks can be performed with the same ease and accuracy as with the touch pad of a portable computer.
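The head-motion control described above can be sketched as a mapping from tracked face position to relative cursor movement. The sketch below is illustrative only and not the authors' implementation: the function name, the gain, and the dead-zone threshold are all assumptions, and it presumes some face tracker reports the face centre (x, y) each frame.

```python
def head_to_cursor(face_xy, ref_xy, gain=4.0, dead_zone=5):
    """Convert head displacement from a calibrated reference point
    into a relative cursor move (dx, dy).

    A dead zone suppresses jitter from small involuntary head motion;
    a gain factor scales small head movements into larger cursor moves.
    All parameter values here are hypothetical.
    """
    dx = face_xy[0] - ref_xy[0]
    dy = face_xy[1] - ref_xy[1]
    if abs(dx) < dead_zone:
        dx = 0
    if abs(dy) < dead_zone:
        dy = 0
    return (int(gain * dx), int(gain * dy))

# Example: head moved 10 px right and 2 px down from the reference position;
# the vertical component falls inside the dead zone and is ignored.
print(head_to_cursor((330, 242), (320, 240)))  # (40, 0)
```

In a full system, the returned (dx, dy) would be fed to the operating system's cursor API each frame, while a speech recogniser (such as the Microsoft Speech SDK cited below the abstract on the original page) handles click and command events.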
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Frangeskides, F., Lanitis, A. (2008). Multi-modal Contact-Less Human Computer Interaction. In: Manolopoulos, Y., Filipe, J., Constantopoulos, P., Cordeiro, J. (eds) Enterprise Information Systems. ICEIS 2006. Lecture Notes in Business Information Processing, vol 3. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-77581-2_28
DOI: https://doi.org/10.1007/978-3-540-77581-2_28
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-77580-5
Online ISBN: 978-3-540-77581-2
eBook Packages: Computer Science (R0)