
Integrated Deep Learning Structures for Hand Gesture Recognition

  • Conference paper

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 896))

Abstract

In this paper, object control via hand movements is proposed, enabling users to control distant objects even when they are far from the system. The method is based on detecting hands and predicting the state and direction of hand movement. This human-computer interface (HCI) serves as an assistive system for users positioned near or far from the objects they control. The model is specifically designed for controlling a computer mouse on large screens during formal presentations. Moving the hand left, right, up, or down moves the mouse pointer, and hand states send mouse button commands: closing the hand presses the mouse button, which remains pressed until the same hand is opened. In this system, the Single Shot MultiBox Detector (SSD) architecture is used for hand detection, and a Convolutional Neural Network (CNN) is used for predicting hand states. This integrated system allows users to control the mouse from a distance without any additional hardware. Test results show that the system is robust and accurate. It uses a single camera and helps users who are far from the computer during a presentation, for example to advance through slides.
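The control logic the abstract describes, mapping the detected hand's position to the cursor and closed/open hand states to button press/release, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the SSD detector and CNN classifier are assumed to exist upstream, and only the hypothetical coordinate mapping and click state machine are shown.

```python
# Sketch of the cursor mapping and click logic described in the abstract.
# Upstream, an SSD detector would localize the hand in each camera frame and
# a CNN would classify its state as "open" or "closed"; both are assumed here.

def map_hand_to_cursor(cx, cy, frame_w, frame_h, screen_w, screen_h):
    """Scale the hand's center in camera coordinates to screen coordinates."""
    return int(cx / frame_w * screen_w), int(cy / frame_h * screen_h)


class ClickStateMachine:
    """Closing the hand presses the button; it stays pressed until the
    same hand is opened again, matching the behavior in the abstract."""

    def __init__(self):
        self.pressed = False

    def update(self, hand_state):
        """Given "open" or "closed", return "press", "release", or None."""
        if hand_state == "closed" and not self.pressed:
            self.pressed = True
            return "press"
        if hand_state == "open" and self.pressed:
            self.pressed = False
            return "release"
        return None  # no change in button state
```

In a full system, the returned "press"/"release" events would be forwarded to an OS-level mouse API; keeping that mapping in a small state machine makes the one-press-per-close behavior easy to test in isolation.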



Author information

Correspondence to Senol Korkmaz.


Copyright information

© 2019 Springer Nature Switzerland AG


Cite this paper

Korkmaz, S. (2019). Integrated Deep Learning Structures for Hand Gesture Recognition. In: Aliev, R., Kacprzyk, J., Pedrycz, W., Jamshidi, M., Sadikoglu, F. (eds) 13th International Conference on Theory and Application of Fuzzy Systems and Soft Computing — ICAFS-2018. ICAFS 2018. Advances in Intelligent Systems and Computing, vol 896. Springer, Cham. https://doi.org/10.1007/978-3-030-04164-9_19
