
Towards Real-Time Continuous Emotion Recognition from Body Movements

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 8212)

Abstract

Social psychological research indicates that bodily expressions convey important affective information, yet this modality is relatively neglected in the literature compared to facial expressions and speech. In this paper we propose a real-time system that continuously recognizes emotions from body movement data streams. Low-level 3D postural features and high-level kinematic and geometrical features are fed, after summarization (statistical values) or aggregation (feature patches), to a random forest classifier. In a first stage, the classifier was trained on the MoCap UCLIC affective gesture database, yielding an overall recognition rate of 78% under 10-fold cross-validation (leave-one-out). Subsequently, the trained classifier was tested on different subjects using continuous Kinect data, reaching a recognition rate of 72% in real time, which demonstrates the efficiency and effectiveness of the proposed system.
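As a rough illustration of the pipeline described above, the sketch below (Python with NumPy and scikit-learn, not the authors' implementation) summarizes per-frame postural features over a sliding window with simple statistics and feeds the resulting vector to a random forest classifier. The joint layout, window length, concrete features, and label set are illustrative assumptions, not the paper's actual feature definitions.

import numpy as np
from collections import deque
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["anger", "fear", "happiness", "sadness"]  # assumed label vocabulary
WINDOW = 90   # assumed window length in frames (~3 s at 30 fps)

def frame_features(joints):
    """Low-level features for one frame.

    `joints` is an (N, 3) array of 3D joint positions (e.g. a Kinect skeleton).
    The features here (joint distances to an assumed torso root and the
    bounding-box extent) are placeholders for the postural, kinematic and
    geometrical features used in the paper.
    """
    torso = joints[0]                                   # assume joint 0 is the torso root
    dists = np.linalg.norm(joints - torso, axis=1)      # pose shape relative to the torso
    spread = joints.max(axis=0) - joints.min(axis=0)    # overall body extent
    return np.concatenate([dists, spread])

def summarize(frames):
    """Summarize per-frame features over a window with basic statistics."""
    feats = np.stack([frame_features(f) for f in frames])
    return np.concatenate([feats.mean(0), feats.std(0), feats.min(0), feats.max(0)])

def train(windows, labels):
    """Train offline on labelled windows cut from a MoCap corpus."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(np.stack([summarize(w) for w in windows]), labels)
    return clf

def predict_stream(clf, frame_iter, hop=15):
    """Continuously label an incoming frame stream, once every `hop` frames."""
    buf = deque(maxlen=WINDOW)
    for i, joints in enumerate(frame_iter):
        buf.append(joints)
        if len(buf) == WINDOW and i % hop == 0:
            yield clf.predict([summarize(list(buf))])[0]

In practice, a Kinect skeleton stream would supply `joints` as roughly 20 3-D positions per frame at about 30 fps, so the window and hop sizes assumed above would correspond to a prediction every half second over the last three seconds of movement.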

Copyright information

© 2013 Springer International Publishing Switzerland

About this paper

Cite this paper

Wang, W., Enescu, V., Sahli, H. (2013). Towards Real-Time Continuous Emotion Recognition from Body Movements. In: Salah, A.A., Hung, H., Aran, O., Gunes, H. (eds) Human Behavior Understanding. HBU 2013. Lecture Notes in Computer Science, vol 8212. Springer, Cham. https://doi.org/10.1007/978-3-319-02714-2_20

  • DOI: https://doi.org/10.1007/978-3-319-02714-2_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-02713-5

  • Online ISBN: 978-3-319-02714-2

  • eBook Packages: Computer Science, Computer Science (R0)
