
Improving Gesture Recognition Using Multi-hypotheses Object Association

  • Sebastian Handrich
  • Ayoub Al-Hamadi
  • Omer Rashid
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7340)

Abstract

Gesture recognition plays an important role in Human-Computer Interaction (HCI), but in most HCI systems the user is restricted to using a single hand, or two hands under optimal conditions. Typical challenges include non-homogeneous backgrounds, hand-hand or hand-face overlap, and changes in brightness. In this work, we propose a novel approach that robustly resolves the ambiguities caused by hand overlap, based on multi-hypotheses object association. This association forms the basis for tracking, from which the hand trajectories are computed and features are extracted. In the gesture recognition phase, the extracted features are classified with Hidden Markov Models (HMMs).
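The pipeline summarised above (multi-hypotheses association of hand detections, trajectory-based features, HMM classification) can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: the Hypothesis class, the associate_multi_hypotheses beam-search helper, the predict callback, and the gesture_hmms objects (assumed to expose a score() method returning a log-likelihood, as in e.g. hmmlearn) are illustrative assumptions.

    import numpy as np

    class Hypothesis:
        """One candidate assignment of detected skin blobs to the two tracked hands."""
        def __init__(self, assignment, score):
            self.assignment = assignment   # e.g. {"left": blob_index, "right": blob_index}
            self.score = score             # cumulative log-likelihood of this assignment history

    def associate_multi_hypotheses(hypotheses, detections, predict, max_keep=10):
        """Expand every surviving hypothesis with every plausible blob-to-hand
        assignment for the current frame and keep only the best-scoring ones."""
        expanded = []
        for hyp in hypotheses:
            for li, left_det in enumerate(detections):
                for ri, right_det in enumerate(detections):
                    if li == ri and len(detections) > 1:
                        continue  # only allow a shared blob when the hands have merged
                    # Penalise the distance between each detection and the position
                    # predicted from the hypothesis' motion history.
                    s = (hyp.score
                         - np.linalg.norm(left_det - predict(hyp, "left"))
                         - np.linalg.norm(right_det - predict(hyp, "right")))
                    expanded.append(Hypothesis({"left": li, "right": ri}, s))
        expanded.sort(key=lambda h: h.score, reverse=True)
        return expanded[:max_keep]

    def classify_gesture(feature_sequence, gesture_hmms):
        """Return the gesture whose HMM assigns the highest log-likelihood to the
        feature sequence extracted from the committed hand trajectory."""
        return max(gesture_hmms, key=lambda name: gesture_hmms[name].score(feature_sequence))

In such a scheme, the tracker would call associate_multi_hypotheses once per frame with the current skin-blob centroids; when the hands separate again and the ambiguity is resolved, the highest-scoring hypothesis is committed and its trajectory features are passed to classify_gesture.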

Keywords

hand tracking, multi-hypotheses, HCI, gesture recognition

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Sebastian Handrich (1)
  • Ayoub Al-Hamadi (1)
  • Omer Rashid (1)

  1. Institute for Electronics, Signal Processing and Communications (IESK), Otto-von-Guericke-University Magdeburg, Germany
