Online Dynamic Gesture Recognition for Human Robot Interaction

  • Published in: Journal of Intelligent & Robotic Systems

Abstract

This paper presents an online dynamic hand gesture recognition system based on an RGB-D camera, which can automatically recognize hand gestures against complicated backgrounds. For background subtraction, we use a model-based method to perform human detection and segmentation in the depth map. Since robust hand tracking is crucial to the performance of hand gesture recognition, our system combines color and depth information during hand tracking. To extract spatio-temporal hand gesture sequences from the trajectory, a reliable gesture spotting scheme based on detecting changes in static postures is proposed. Discrete HMMs with Left-Right Banded (LRB) topology are then used to model and classify gestures from a multi-feature representation and quantization of the hand gesture sequences. Experimental evaluations on two self-built databases of dynamic hand gestures show the effectiveness of the proposed system. Furthermore, we develop a human-robot interaction system whose performance is demonstrated through interactive experiments in a dynamic environment.
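
The classification stage described above can be illustrated with a short, self-contained sketch. The snippet below is not the authors' implementation; it only shows, under simplified assumptions, how a spotted gesture sequence that has already been quantized into codebook symbols could be scored by a bank of discrete HMMs whose transition matrices follow the Left-Right Banded (LRB) topology, in which each state may only self-loop or advance one state. All parameter values, function names, and the toy gesture labels are illustrative placeholders.

```python
# Hedged sketch (not the paper's code): scoring a quantized gesture sequence
# against discrete HMMs with Left-Right Banded (LRB) transition matrices.
import numpy as np

def lrb_transition_matrix(n_states, p_stay=0.6):
    """LRB topology: each state may only self-loop or move one state forward."""
    A = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        A[i, i] = p_stay
        A[i, i + 1] = 1.0 - p_stay
    A[-1, -1] = 1.0  # final state is absorbing
    return A

def log_likelihood(obs, A, B, pi):
    """Forward algorithm (log domain) for a discrete-observation HMM.
    obs: codebook symbol indices; A: (N,N) transitions;
    B: (N,M) emission probabilities; pi: (N,) initial distribution."""
    eps = 1e-12
    log_alpha = np.log(pi + eps) + np.log(B[:, obs[0]] + eps)
    for o in obs[1:]:
        m = log_alpha.max()                    # log-sum-exp for stability
        alpha = np.exp(log_alpha - m) @ A      # propagate through transitions
        log_alpha = m + np.log(alpha + eps) + np.log(B[:, o] + eps)
    m = log_alpha.max()
    return m + np.log(np.exp(log_alpha - m).sum())

def classify(obs, models):
    """Return the gesture label whose HMM scores the spotted sequence highest."""
    return max(models, key=lambda g: log_likelihood(obs, *models[g]))

# Toy demo: two 4-state LRB models over an 8-symbol codebook with random
# (untrained) emission probabilities, purely to make the sketch runnable.
rng = np.random.default_rng(0)

def random_model(n_states=4, n_symbols=8):
    B = rng.random((n_states, n_symbols))
    B /= B.sum(axis=1, keepdims=True)
    pi = np.zeros(n_states)
    pi[0] = 1.0                                # LRB chains start in state 0
    return lrb_transition_matrix(n_states), B, pi

models = {"wave": random_model(), "circle": random_model()}
print(classify([0, 1, 1, 3, 5, 7], models))
```

In the paper's system the observation symbols come from quantizing multiple hand-trajectory features; here they are drawn at random purely to make the example runnable, and real per-gesture models would be trained (e.g. with Baum-Welch) on the self-built gesture databases rather than initialized randomly.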

Author information

Corresponding author

Correspondence to Yen-Lun Chen.

Additional information

The work described in this paper was partially supported by the National Natural Science Foundation of China (61005012), the Shenzhen Fundamental Research Program (JC201105190948A), the 2013 Outstanding Youth Innovation Fund of the Shenzhen Institute of Advanced Technology, and the Guangdong Innovative Research Team Program (201001D0104648280).

About this article

Cite this article

Xu, D., Wu, X., Chen, YL. et al. Online Dynamic Gesture Recognition for Human Robot Interaction. J Intell Robot Syst 77, 583–596 (2015). https://doi.org/10.1007/s10846-014-0039-4
