
Emotion Recognition System by Gesture Analysis Using Fuzzy Sets

  • Reshma Kar
  • Aruna Chakraborty
  • Amit Konar
  • Ramadoss Janarthanan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8298)

Abstract

Gestures have been called a leaky source of emotional information. They are also easy to capture from a distance with ordinary cameras, which makes them an important clue to a person's emotional state. In this paper we recognize a person's emotions by analyzing gestural information alone. Subjects are first trained by a professional actor to perform emotionally expressive gestures; the same actor trains the system to recognize the emotional context of gestures. The subjects' gestural performances are then evaluated by the system to identify the emotion class they express. The system achieves an accuracy of 94.4% with a training set of only one gesture per emotion, and it is also computationally efficient. Because it analyzes emotions from gestures alone, it is a significant step towards reducing the cost of emotion recognition; the same system can also be used for general gesture recognition. We propose new features and a new classification approach based on fuzzy sets, and we achieve state-of-the-art accuracy with minimal complexity: each motion trajectory along each axis generates only four displacement features, and only the six joint trajectories with the greatest motion among all joint trajectories are compared, since the regions that move most carry the most information about a gesture. The experiments were performed on data obtained from Microsoft Kinect sensors, and training and testing were independent of subject gender.
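The pipeline described in the abstract can be summarized as a minimal sketch: select the six most-moving joint-axis trajectories from Kinect skeleton data, extract four displacement features per trajectory, and classify with type-1 fuzzy sets built from one training gesture per emotion. The specific four features, the Gaussian membership functions, the fixed spread, and the mean aggregation below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def select_trajectories(skeleton, k=6):
    """skeleton: array (frames, joints, 3). Flatten to one 1-D trajectory per
    joint-axis pair and keep the k trajectories with the largest total motion."""
    frames, joints, axes = skeleton.shape
    trajs = skeleton.reshape(frames, joints * axes)       # one column per joint-axis trajectory
    motion = np.abs(np.diff(trajs, axis=0)).sum(axis=0)   # total frame-to-frame displacement
    top = np.argsort(motion)[-k:]                         # indices of the k most-moving trajectories
    return trajs[:, top]

def displacement_features(traj):
    """Four displacement features per trajectory (a hypothetical choice, since
    the abstract does not list them): mean and std of frame-to-frame displacement,
    maximum displacement, and net displacement over the gesture."""
    d = np.diff(traj)
    return np.array([np.abs(d).mean(), d.std(), np.abs(d).max(), traj[-1] - traj[0]])

def feature_vector(skeleton, k=6):
    trajs = select_trajectories(skeleton, k)
    return np.concatenate([displacement_features(trajs[:, i]) for i in range(trajs.shape[1])])

def train(one_shot_gestures, spread=0.2):
    """One reference gesture per emotion -> a prototype feature vector per class,
    interpreted as the centre of a Gaussian type-1 fuzzy set (spread is an
    assumed constant, not a value from the paper)."""
    return {emotion: feature_vector(g) for emotion, g in one_shot_gestures.items()}, spread

def classify(skeleton, prototypes, spread):
    """Compute the fuzzy membership of the test gesture in each emotion class,
    aggregate over features by the mean, and return the best-matching emotion."""
    x = feature_vector(skeleton)
    scores = {e: np.exp(-((x - c) ** 2) / (2 * spread ** 2)).mean()
              for e, c in prototypes.items()}
    return max(scores, key=scores.get)
```

Usage would follow the one-shot setup of the paper: `prototypes, s = train({"happy": g1, "sad": g2, ...})`, then `classify(test_gesture, prototypes, s)`, where each gesture is a `(frames, joints, 3)` Kinect skeleton recording.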

Keywords

Type-1 Fuzzy Sets · Gesture Recognition · One-shot Learning · Emotion Recognition

Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  • Reshma Kar (1)
  • Aruna Chakraborty (2)
  • Amit Konar (1)
  • Ramadoss Janarthanan (3)
  1. Department of Electronics and Tele-Communication Engineering, Jadavpur University, Kolkata, India
  2. Department of Computer Science & Engineering, St. Thomas’ College of Engineering & Technology, Kolkata, India
  3. Department of Computer Science & Engineering, TJS Engineering College, Chennai, India
