Abstract
It is envisaged that computers of the future will have smart interfaces, such as speech and vision, that facilitate natural and easy human-machine interaction. Gestures of the face and hands could become a natural way to control the operations of a computer or a machine, such as a robot. In this paper, we present a vision-based interface that tracks a person's facial features and eye-gaze point in real time. The system robustly tracks facial features, detects tracking failures, and recovers from them automatically. It is insensitive to lighting changes and to occlusion or distortion of the facial features. The system is user independent and calibrates automatically for each new user. An application of this technology to driver fatigue detection and to the evaluation of the ergonomic design of motor vehicles has been developed. Our human-machine interface has enormous potential in other applications that control machines and processes or measure human performance; for example, product possibilities exist in assistive technology for the disabled and in video game entertainment.
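The paper itself is only previewed here, but the track / detect-failure / recover loop the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a generic template-correlation tracker built on OpenCV, and the function names and thresholds (track_feature, reacquire, MATCH_THRESHOLD, SEARCH_RADIUS) are hypothetical.

```python
# Minimal sketch of correlation-based feature tracking with failure detection
# and recovery. Hypothetical parameters and strategy; not the paper's method.
import cv2
import numpy as np

MATCH_THRESHOLD = 0.6   # hypothetical: correlation below this counts as a tracking failure
SEARCH_RADIUS = 24      # hypothetical: half-size of the local search window, in pixels

def track_feature(frame_gray, template, last_xy):
    """Search near the last known position; return (new_xy, ok)."""
    h, w = template.shape
    x, y = last_xy
    x0, y0 = max(x - SEARCH_RADIUS, 0), max(y - SEARCH_RADIUS, 0)
    x1 = min(x + w + SEARCH_RADIUS, frame_gray.shape[1])
    y1 = min(y + h + SEARCH_RADIUS, frame_gray.shape[0])
    region = frame_gray[y0:y1, x0:x1]
    if region.shape[0] < h or region.shape[1] < w:
        return last_xy, False
    # Normalised cross-correlation discounts global brightness changes.
    scores = cv2.matchTemplate(region, template, cv2.TM_CCOEFF_NORMED)
    _, best, _, loc = cv2.minMaxLoc(scores)
    if best < MATCH_THRESHOLD:
        # Occlusion, distortion, or drift: declare a tracking failure.
        return last_xy, False
    return (x0 + loc[0], y0 + loc[1]), True

def reacquire(frame_gray, template):
    """Recovery step: search the whole frame for the feature (hypothetical strategy)."""
    scores = cv2.matchTemplate(frame_gray, template, cv2.TM_CCOEFF_NORMED)
    _, best, _, loc = cv2.minMaxLoc(scores)
    return loc if best >= MATCH_THRESHOLD else None
```

The confidence test and whole-frame reacquisition above merely stand in for whatever failure-detection and error-recovery mechanism the paper actually uses.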
Copyright information
© 1999 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Zelinsky, A. (1999). Visual Human-Machine Interaction. In: Foo, N. (eds) Advanced Topics in Artificial Intelligence. AI 1999. Lecture Notes in Computer Science, vol 1747. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46695-9_37
DOI: https://doi.org/10.1007/3-540-46695-9_37
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-66822-0
Online ISBN: 978-3-540-46695-6