Static Hand Gesture Recognition for Human Robot Interaction

  • Josiane Uwineza (Email author)
  • Hongbin Ma
  • Baokui Li
  • Ying Jin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11741)


Human-robot interaction means enabling a robot to understand human actions while working in and sharing the same space as a human. To achieve this, communication between human and robot must be effective. The most common channels for this communication are speech and body gestures, such as full-body actions, hand and arm gestures, or head and facial gestures. Hand gestures in particular offer a natural and effective way for a human to communicate with a robot. However, differences in hand size and posture, lighting variation, and background complexity make hand gesture recognition a challenging problem. Many algorithms have been proposed and have achieved good results, yet issues such as poor computational scalability, non-trivial human intervention, and slow learning persist in this field. In this paper, these issues are addressed by combining three feature extraction methods (Haralick texture, Hu moments, and color histogram) with an extreme learning machine (ELM) for classification. The ELM results were compared with those of K-Nearest Neighbors, Random Forest, Linear Discriminant Analysis, and Convolutional Neural Networks, and the experiments were evaluated on the National University of Singapore (NUS) dataset II. ELM outperformed all of the above algorithms, reaching an accuracy of \(98.7\%\) with a running time of 109.7 s, which demonstrates the adequacy of the model.
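The classification stage described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes pre-extracted feature vectors (here only a color histogram is shown as a stand-in for the full Haralick/Hu/histogram combination), and the ELM follows the standard single-hidden-layer formulation, with random, untrained input weights and output weights obtained in closed form via the Moore-Penrose pseudoinverse. All function names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def color_histogram(img, bins=8):
    """Per-channel intensity histogram of an HxWx3 uint8 image,
    concatenated into one feature vector (illustrative stand-in
    for the paper's Haralick + Hu + histogram combination)."""
    return np.concatenate([
        np.histogram(img[..., c], bins=bins, range=(0, 256), density=True)[0]
        for c in range(3)
    ])

def elm_train(X, y, n_hidden=200, n_classes=10):
    """Train an ELM: random input weights and biases are drawn once
    and never updated; only the output weights beta are solved for."""
    n_features = X.shape[1]
    W = rng.normal(size=(n_features, n_hidden))   # random, untrained
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden layer
    T = np.eye(n_classes)[y]                      # one-hot targets
    beta = np.linalg.pinv(H) @ T                  # closed-form least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)
```

Because training reduces to one pseudoinverse rather than iterative gradient descent, this design choice is what gives ELM the fast learning speed the abstract contrasts with the other classifiers.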


Human robot interaction · Hu moments · Color histogram · Haralick texture · Extreme Learning Machine



This work is partially supported by the National Key Research and Development Program of China under Grant 2017YFF0205306, the National Natural Science Foundation of China under Grant 91648117, and the Beijing Natural Science Foundation under Grant 4172055.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Josiane Uwineza (1), Email author
  • Hongbin Ma (1)
  • Baokui Li (1)
  • Ying Jin (1)
  1. School of Automation, Beijing Institute of Technology, Beijing, People’s Republic of China
