Hand tracking and gesture recognition system for human-computer interaction using low-cost hardware

Published in: Multimedia Tools and Applications

Abstract

Human-Computer Interaction (HCI) is ubiquitous in our daily lives. It is usually achieved through a physical controller such as a mouse, keyboard or touch screen, which hinders a Natural User Interface (NUI) by placing a strong barrier between the user and the computer. Various hand tracking systems are available on the market, but they are complex and expensive. In this paper, we present the design and development of a robust, marker-less hand/finger tracking and gesture recognition system that uses low-cost hardware. We propose a simple but efficient method that allows robust and fast hand tracking despite complex backgrounds and motion blur. Our system translates the detected hands or gestures into different functional inputs and interfaces with other applications via several methods, enabling intuitive HCI and interactive motion gaming. We also developed sample applications that utilize the inputs from the hand tracking system. Our results show that an intuitive HCI and motion gaming system can be achieved with minimal hardware requirements.
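
The abstract does not detail the tracking pipeline, but to make the low-cost, marker-less claim concrete, the sketch below shows one common approach in the same spirit: skin-colour segmentation followed by contour and convexity-defect analysis with OpenCV and an ordinary webcam. This is an illustrative assumption, not the authors' actual method; the colour bounds, area cut-off, defect-depth threshold and finger-counting heuristic are all placeholder values.

```python
# Minimal sketch (NOT the authors' pipeline): marker-less hand detection from a
# plain webcam using skin-colour segmentation + convexity-defect finger counting.
# Requires OpenCV >= 4 (pip install opencv-python). All thresholds are assumptions.
import cv2
import numpy as np

# Assumed skin-colour bounds in YCrCb space; tune for lighting and skin tone.
SKIN_LOWER = np.array([0, 135, 85], dtype=np.uint8)
SKIN_UPPER = np.array([255, 180, 135], dtype=np.uint8)

def count_extended_fingers(frame_bgr):
    """Return (estimated finger count, annotated frame) for one BGR frame."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0, frame_bgr
    hand = max(contours, key=cv2.contourArea)        # assume the largest blob is the hand
    if cv2.contourArea(hand) < 3000:                 # ignore small skin-coloured noise
        return 0, frame_bgr

    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    valleys = 0
    if defects is not None:
        for start_i, end_i, far_i, depth in defects[:, 0]:
            p1, p2, far = hand[start_i][0], hand[end_i][0], hand[far_i][0]
            a = np.linalg.norm(p2 - p1)
            b = np.linalg.norm(far - p1)
            c = np.linalg.norm(far - p2)
            angle = np.arccos(np.clip((b**2 + c**2 - a**2) / (2 * b * c + 1e-6), -1, 1))
            # A deep, narrow valley between two hull points usually separates two fingers.
            if angle < np.pi / 2 and depth > 10000:  # depth is fixed-point (pixels * 256)
                valleys += 1
    cv2.drawContours(frame_bgr, [hand], -1, (0, 255, 0), 2)
    return (valleys + 1) if valleys else 0, frame_bgr

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                        # any low-cost webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        n, vis = count_extended_fingers(frame)
        cv2.putText(vis, f"fingers: {n}", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 0, 255), 2)
        cv2.imshow("hand tracking sketch", vis)
        if cv2.waitKey(1) & 0xFF == 27:              # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()
```

Run with any ordinary webcam: the script overlays the detected hand contour and an estimated finger count, which a host application could map to functional inputs in the way the paper describes.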


Acknowledgments

This work was supported in part by the National Research Foundation of Korea under Grant 2011-0009349. The authors wish to thank Mr. Dylan Zhu for the language editing. Thanks are also due to all reviewers for their comments and recommendations, which have greatly improved the manuscript.

Author information

Correspondence to Hyotaek Lim.

Cite this article

Yeo, HS., Lee, BG. & Lim, H. Hand tracking and gesture recognition system for human-computer interaction using low-cost hardware. Multimed Tools Appl 74, 2687–2715 (2015). https://doi.org/10.1007/s11042-013-1501-1
