Finger Tracking for Gestural Interaction in Mobile Devices

  • Matti Matilainen
  • Jari Hannuksela
  • Lixin Fan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7944)


In this paper we propose a finger tracking system suitable for gesture recognition in mobile devices. Initialisation of the system requires no additional I/O devices: the user covers the camera lens with his or her hand and then moves the hand to the operating distance. The statistical models used for hand segmentation are initialised from the first frames after the hand is removed from the lens. The segmentation does not need to be perfect because recognition does not rely on the hand contour. In our method, fingertips are found using template matching, which also produces false detections. These are pruned by searching for a path from each fingertip candidate to the estimated hand centre and discarding candidates whose paths do not meet predefined criteria. We evaluate the method against the fingertip detector proposed by Baldauf et al. [2], using seven test subjects who initialise the system and then wave their hand in front of the camera. In the tests we use a single handheld USB camera whose image quality matches that of recent front-facing cameras in mobile phones.
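The detect-then-prune pipeline described above can be illustrated with a minimal sketch. This is not the authors' implementation: the paper does not specify its templates or path criteria, so the SSD template matching, the centroid hand-centre estimate, the straight-line path test, and the coverage threshold below are all stand-in assumptions, run here on a toy binary hand mask rather than a camera frame.

```python
import numpy as np

def match_template(mask, template):
    """Sliding-window sum of squared differences (SSD); lower is a better match."""
    ih, iw = mask.shape
    th, tw = template.shape
    resp = np.full((ih - th + 1, iw - tw + 1), np.inf)
    for y in range(resp.shape[0]):
        for x in range(resp.shape[1]):
            resp[y, x] = np.sum((mask[y:y + th, x:x + tw] - template) ** 2)
    return resp

def fingertip_candidates(mask, template, max_ssd=0):
    """Centres of windows scoring within max_ssd (includes false detections)."""
    th, tw = template.shape
    resp = match_template(mask, template)
    ys, xs = np.nonzero(resp <= max_ssd)
    return [(y + th // 2, x + tw // 2) for y, x in zip(ys, xs)]

def hand_centre(mask):
    """Estimate the hand centre as the centroid of the segmented pixels."""
    ys, xs = np.nonzero(mask)
    return int(round(ys.mean())), int(round(xs.mean()))

def path_inside_hand(mask, tip, centre, min_coverage=0.9, n_samples=32):
    """Assumed pruning criterion: sample the straight line from a candidate
    fingertip to the hand centre and require that most samples lie on hand
    pixels.  Candidates disconnected from the hand fail this test."""
    ys = np.linspace(tip[0], centre[0], n_samples).round().astype(int)
    xs = np.linspace(tip[1], centre[1], n_samples).round().astype(int)
    return mask[ys, xs].mean() >= min_coverage

# Synthetic segmentation result: a palm, one raised finger, and a noise blob.
mask = np.zeros((20, 20))
mask[10:18, 4:14] = 1.0   # palm
mask[4:10, 8:11] = 1.0    # finger, 3 px wide
mask[2:4, 15:18] = 1.0    # segmentation noise far from the hand

# Toy fingertip template: background row above, finger pixels below.
template = np.array([[0., 0., 0.],
                     [1., 1., 1.],
                     [1., 1., 1.]])

cands = fingertip_candidates(mask, template)
centre = hand_centre(mask)
kept = [c for c in cands if path_inside_hand(mask, c, centre)]
print("candidates:", cands)
print("kept after pruning:", kept)   # the noise-blob candidate is discarded
```

In a real system the template would be a grayscale fingertip patch matched against the camera frame, and the path search could follow the segmented hand region rather than a straight line; the detect-then-prune structure stays the same.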


Keywords: computer vision · finger tracking · gesture recognition


References

  1. An, J.-H., Min, J.-H., Hong, K.-S.: Finger Gesture-Based Mobile User Interface Using a Rear-facing Camera. In: Park, J.J., Yang, L.T., Lee, C. (eds.) FutureTech 2011, Part II. CCIS, vol. 185, pp. 230–237. Springer, Heidelberg (2011)
  2. Baldauf, M., Zambanini, S., Fröhlich, P., Reichl, P.: Markerless Visual Fingertip Detection for Natural Mobile Device Interaction. In: Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services (MobileHCI), pp. 539–544 (2011)
  3. Ben Jmaa, A., Mahdi, W., Ben Hmadou, A.: A new approach for digit recognition based on hand gesture analysis. International Journal of Computer Science and Information Security, 108–115 (2009)
  4. Canny, J.: A Computational Approach to Edge Detection. IEEE Transactions on Pattern Analysis and Machine Intelligence 8(6), 679–698 (1986)
  5. Cheng, Y.: Mean shift, mode seeking, and clustering. IEEE Transactions on Pattern Analysis and Machine Intelligence 17(8), 790–799 (1995)
  6. Collins, R.T., Zhou, X., Teh, S.K.: An Open Source Tracking Testbed and Evaluation Web Site. In: IEEE International Workshop on Performance Evaluation of Tracking and Surveillance (2005)
  7. Davis, J.E., Gallo, O., Arteaga, S.M.: A camera-based pointing interface for mobile devices. In: IEEE International Conference on Image Processing, pp. 1420–1423 (2008)
  8. Frati, V., Prattichizzo, D.: Using Kinect for hand tracking and rendering in wearable haptics. In: IEEE World Haptics Conference, pp. 317–321 (2011)
  9. Hasanuzzaman, M., Ampornaramveth, V., Zhang, T., Bhuyian, M.A., Shirai, Y., Ueno, H.: Real-time Vision-based Gesture Recognition for Human Robot Interaction. In: Proceedings of the IEEE International Conference on Robotics and Biomimetics, pp. 413–418 (2004)
  10. Henrysson, A., Marshall, J., Billinghurst, M.: Experiments in 3D interaction for mobile phone AR. In: International Conference on Computer Graphics and Interactive Techniques in Australia and Southeast Asia, pp. 187–194 (2007)
  11. Kalal, Z., Mikolajczyk, K., Matas, J.: Tracking-Learning-Detection. IEEE Transactions on Pattern Analysis and Machine Intelligence 6(1), 3789–3792 (2010)
  12. Kölsch, M., Turk, M.: Fast 2D Hand Tracking with Flocks of Features and Multi-Cue Integration. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, p. 158 (2004)
  13. Li, Z., Jarvis, R.: Real-time Hand Gesture Recognition using a Range Camera. In: Australasian Conference on Robotics and Automation (2009)
  14. Malik, S., Laszlo, J.: Visual touchpad: a two-handed gestural input device. In: Proceedings of the 6th International Conference on Multimodal Interfaces, pp. 289–296 (2004)
  15. Oikonomidis, I., Kyriazis, N., Argyros, A.: Efficient model-based 3D tracking of hand articulations using Kinect. In: Proceedings of the British Machine Vision Conference, pp. 101.1–101.11 (2011)
  16. Oka, K., Sato, Y.: Real-Time Fingertip Tracking and Gesture Recognition. IEEE Computer Graphics and Applications, 64–71 (2002)
  17. Terajima, K., Komuro, T., Ishikawa, M.: Fast finger tracking system for in-air typing interface. In: Extended Abstracts on Human Factors in Computing Systems, pp. 3739–3744 (2009)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Matti Matilainen (1)
  • Jari Hannuksela (1)
  • Lixin Fan (2)

  1. Center for Machine Vision Research, Department of Computer Science and Engineering, University of Oulu, Finland
  2. Nokia Research Center Tampere, Tampere, Finland