
Two-Layer Feature Selection Algorithm for Recognizing Human Emotions from 3D Motion Analysis

  • Ferdous Ahmed
  • Marina L. Gavrilova
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11542)

Abstract

Research on automatic recognition of human emotion from motion is gaining momentum, especially in the areas of virtual reality, robotics, behavior modeling, and biometric identity recognition. One of the key challenges is identifying emotion-specific features among the vast number of expressive descriptors of human motion. In this paper, we develop a novel framework for emotion classification using motion features. We combine a filter-based feature selection algorithm with a genetic algorithm to recognize four basic emotions: happiness, sadness, fear, and anger. The validity of the proposed framework was confirmed on a dataset containing 30 subjects performing expressive walking sequences. The proposed framework achieved a very high recognition rate, outperforming existing state-of-the-art methods in the literature.
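To make the two-layer idea concrete, the sketch below shows how a filter stage followed by a genetic algorithm could be chained for feature selection. The specific filter score (mutual information), the classifier used in the fitness function (an SVM), the GA operators, all parameter values, and the synthetic data are illustrative assumptions only, not the authors' actual implementation.

    # Minimal sketch of a two-layer feature selection pipeline:
    # layer 1 filters features by a relevance score, layer 2 refines the
    # surviving subset with a simple genetic algorithm. All names, parameters,
    # and the synthetic data below are illustrative assumptions, not the
    # authors' implementation.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Placeholder data standing in for motion descriptors (rows = walking
    # sequences, columns = expressive features) with 4 emotion labels.
    X, y = make_classification(n_samples=200, n_features=60, n_informative=15,
                               n_classes=4, random_state=0)

    # Layer 1: filter-based selection -- keep the top-k features by mutual information.
    k = 30
    mi = mutual_info_classif(X, y, random_state=0)
    keep = np.argsort(mi)[-k:]
    X_f = X[:, keep]

    # Layer 2: genetic algorithm over binary masks of the filtered features.
    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        scores = cross_val_score(SVC(), X_f[:, mask.astype(bool)], y, cv=3)
        return scores.mean()

    pop = rng.integers(0, 2, size=(20, k))          # random initial population
    for _ in range(15):                             # generations
        fit = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(fit)[-10:]]        # truncation selection
        children = []
        while len(children) < len(pop):
            a, b = parents[rng.choice(len(parents), 2, replace=False)]
            cut = rng.integers(1, k)                # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(k) < 0.05             # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.array(children)

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print("selected features:", keep[best.astype(bool)])

In the actual framework, the filter criterion, classifier, and genetic-algorithm configuration described in the paper would replace these placeholders.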

Keywords

Emotion recognition · Kinect sensor · Gait analysis · Human motion · Genetic algorithm · Feature selection

Acknowledgements

The authors would like to acknowledge partial support from the NSERC DG "Machine Intelligence for Biometric Security", the NSERC ENGAGE grant on Gait Recognition, and the NSERC SPG grant on Smart Cities.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Calgary, Calgary, Canada
