Abstract
Humans use their faces, hands, and bodies as an integral part of their communication with others. For computers to interact intelligently with human users, they should be able to recognize emotions by analyzing the user's affective state, physiology, and behavior. Multimodal interfaces allow humans to interact with machines through multiple modalities such as speech, facial expression, gesture, and gaze. In this paper, we present an overview of research on face and body gesture analysis and recognition. To make human-computer interfaces truly natural, we need to develop technology that tracks human movement, body behavior, and facial expression, and interprets these movements affectively. Accordingly, we present a vision-based framework that combines face and body gestures for multimodal HCI.
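One common way to combine face and body modalities, as the abstract describes, is decision-level fusion: each modality's classifier produces a probability distribution over emotion classes, and the distributions are merged into a single prediction. The sketch below is illustrative only, not the paper's actual method; the emotion label set, the `fuse_decisions` helper, and the fusion weights are assumptions for the example.

```python
import numpy as np

# Illustrative emotion label set; the paper does not fix a particular one.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse_decisions(face_probs, body_probs, w_face=0.6, w_body=0.4):
    """Weighted decision-level fusion of two per-modality posteriors.

    face_probs, body_probs: class-probability arrays, one entry per emotion.
    Returns the fused distribution and the predicted emotion label.
    """
    face_probs = np.asarray(face_probs, dtype=float)
    body_probs = np.asarray(body_probs, dtype=float)
    fused = w_face * face_probs + w_body * body_probs
    fused /= fused.sum()  # renormalize so the result is a valid distribution
    return fused, EMOTIONS[int(np.argmax(fused))]

# Example: the face classifier leans toward "happiness",
# the body classifier toward "surprise"; fusion resolves the conflict.
face = [0.05, 0.05, 0.05, 0.55, 0.05, 0.25]
body = [0.05, 0.05, 0.05, 0.30, 0.05, 0.50]
dist, label = fuse_decisions(face, body)
```

Other fusion strategies (feature-level concatenation before a single classifier, or rule-based combination) are equally possible; decision-level fusion is shown here only because it keeps the two modality pipelines independent.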
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Gunes, H., Piccardi, M., Jan, T. (2004). Face and Body Gesture Analysis for Multimodal HCI. In: Masoodian, M., Jones, S., Rogers, B. (eds) Computer Human Interaction. APCHI 2004. Lecture Notes in Computer Science, vol 3101. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-27795-8_59
Print ISBN: 978-3-540-22312-2
Online ISBN: 978-3-540-27795-8