Abstract
A fully automated, multi-stage architecture for emotion recognition is presented. Faces are located using a tracker based on the ratio template algorithm [1]. Optical flow across the face is then computed using a multi-channel gradient model [2]. The resulting speed and direction information is averaged over different parts of the face, and ratios are taken to determine how facial parts move relative to one another. These features are fed into multi-layer perceptrons trained using back-propagation. The system assigns any facial expression to one of four categories: happiness, sadness, surprise, or disgust. All three key stages of the architecture are inspired by biological systems. The emotion recognition system runs in real time and has a range of applications in human-computer interaction.
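The pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the region names, the band-shaped masks, and the untrained MLP weights are all placeholder assumptions standing in for the tracker output, the learned back-propagation weights, and the real facial-region partition.

```python
import numpy as np

# Illustrative facial regions and output classes (names are assumptions,
# not taken from the paper's actual region partition).
REGIONS = ["left_brow", "right_brow", "mouth", "left_cheek", "chin"]
EMOTIONS = ["happiness", "sadness", "surprise", "disgust"]

def region_features(flow, masks):
    """Average the optical-flow field over each region, then take ratios
    of region speeds so features describe how parts move relative to
    one another (the abstract's ratio step)."""
    speeds, angles = [], []
    for m in masks:
        v = flow[m]                        # (n_pixels, 2) velocity samples
        mean_v = v.mean(axis=0)
        speeds.append(np.linalg.norm(mean_v))
        angles.append(np.arctan2(mean_v[0], mean_v[1]))
    speeds = np.array(speeds)
    ratios = speeds / (speeds.sum() + 1e-8)  # relative, not absolute, motion
    return np.concatenate([ratios, np.array(angles)])

def mlp_forward(x, w1, b1, w2, b2):
    """One-hidden-layer perceptron with a softmax output. In the real
    system the weights would be learned by back-propagation."""
    h = np.tanh(x @ w1 + b1)
    z = h @ w2 + b2
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
flow = rng.normal(size=(64, 64, 2))          # stand-in flow field
masks = [np.zeros((64, 64), dtype=bool) for _ in REGIONS]
for i, m in enumerate(masks):
    m[i * 12:(i + 1) * 12, :] = True         # toy horizontal bands

x = region_features(flow, masks)
w1 = rng.normal(scale=0.1, size=(x.size, 8)); b1 = np.zeros(8)
w2 = rng.normal(scale=0.1, size=(8, 4));      b2 = np.zeros(4)
p = mlp_forward(x, w1, b1, w2, b2)
print(EMOTIONS[int(p.argmax())], p.round(3))
```

With random weights the predicted class is meaningless; the point is the shape of the data flow: per-region averaging compresses a dense flow field into a handful of ratio features, which keeps the classifier small enough to run in real time.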
References
Sinha P.: Perceiving and Recognising Three-Dimensional Forms. PhD dissertation, M.I.T. (1995). Available at http://theses.mit.edu:80/Dienst/UI/2.0/Describe/ 0018.mit.theses%2f1995-70?abstract
Johnston A., McOwan P.W., Benton C.P.: Robust Velocity Computation from a Biologically Motivated Model of Motion Perception. Proceedings of the Royal Society of London, Series B, Vol. 266. (1999) 509–518
Picard R.W.: Towards Agents that Recognize Emotion. Actes Proceedings IMAGINA. (1998) 153–155
Bartlett M.S., Hager J.C., Ekman P., Sejnowski T.J.: Measuring Facial Expressions by Computer Image Analysis. Psychophysiology, Vol. 36. (1999) 253–263
Bartlett M.S., Donato G., Movellan J.R., Hager J.C., Ekman P., Sejnowski T.J.: Face Image Analysis for Expression Measurement and Detection of Deceit. Proceedings of the 6th Annual Joint Symposium on Neural Computation. (1999)
Ogden B.: Interactive Vision in Robot-Human Interaction. Progression Report. (2001) 42–55
Himer W., Schneider F., Kost G., Heimann H.: Computer-based Analysis of Facial Action: A New Approach. Journal of Psychophysiology, Vol 5(2). (1991) 189–195
Yacoob Y., Davis L.: Recognizing Facial Expressions by Spatio-Temporal Analysis. IEEE CVPR. (1993) 70–75
Rosenblum M., Yacoob Y., Davis L.: Human Emotion Recognition from Motion Using a Radial Basis Function Network Architecture. IEEE Workshop on Motion of Non-Rigid and Articulated Objects. (1994)
Lien J.J., Kanade T., Cohn J.F., Li C.: Automated Facial Expression Recognition Based on FACS Action Units. Third IEEE International Conference on Automatic Face and Gesture Recognition. (1998) 390–395
Essa I.A., Pentland A.P.: Coding, Analysis, Interpretation, and Recognition of Facial Expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19(7). (1997) 757–763
Tian Y., Kanade T., Cohn J.F.: Recognizing Action Units for Facial Expression Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23(2). (2001) 97–115
Ekman P., Friesen W.: Facial Action Coding System: A Technique for the Measurement of Facial Movement. Consulting Psychologists Press, Palo Alto, CA. (1978)
McOwan P.W., Benton C., Dale J., Johnston A.: A Multi-differential Neuromorphic Approach to Motion Detection, International Journal of Neural Systems, Vol. 9. (1999) 429–434
Scassellati B.: Eye Finding via Face Detection for a Foveated, Active Vision System. Proceedings of the Fifteenth National Conference on Artificial Intelligence. (1998)
Anderson K., McOwan P.W.: Robust Real-Time Face Tracker for Cluttered Environments. Submitted to Computer Vision & Image Understanding.
Tovée M.J.: An Introduction to the Visual System, Cambridge University Press. (1996)
Hietanen J.K.: Does your gaze direction and head orientation shift my visual attention?, Neuroreport, Vol. 10. (1999) 3443–3447
Hill H., Johnston A.: Categorising Sex and Identity from the Biological Motion of Faces. Current Biology, Vol. 11. (2001) 880–885
Lien J.J.J., Kanade T., Cohn J.F., Li C.C.: Detection, Tracking, and Classification of Subtle Changes in Facial Expression, Journal of Robotics and Autonomous Systems, Vol. 31. (2000) 131–146
Cohn J.F., Zlochower A., Lien J., Kanade T.: Automated Face Analysis by Feature Point Tracking Has High Concurrent Validity with Manual FACS Coding. Psychophysiology, Vol. 36. (1999) 35–43
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Anderson, K., McOwan, P.W. (2003). Real-Time Emotion Recognition Using Biologically Inspired Models. In: Kittler, J., Nixon, M.S. (eds) Audio- and Video-Based Biometric Person Authentication. AVBPA 2003. Lecture Notes in Computer Science, vol 2688. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44887-X_15
DOI: https://doi.org/10.1007/3-540-44887-X_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40302-9
Online ISBN: 978-3-540-44887-7