Finger Gesture Recognition Based on 3D-Accelerometer and 3D-Gyroscope

  • Wenchao Ma
  • Junfeng Hu
  • Jun Liao
  • Zhencheng Fan
  • Jianjun Wu
  • Li Liu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11775)


Gesture-based interaction, as a natural form of human-computer interaction, has a wide range of applications in ubiquitous computing environments. Recent research shows that a user's arm and hand gestures can be identified with relative ease using motion sensors worn on the wrist, but it remains unclear to what extent finger gestures can be recognized. This paper presents a method for recognizing the bending of fingers based on signals from a 3D-accelerometer and 3D-gyroscope worn on the wrist. Time-domain and frequency-domain features are extracted, gestures are recognized by five classifiers, and the recognition results are compared. The maximal information coefficient is adopted to examine the effect of each feature on gesture classification, and a faster calculation method is derived that uses only the 30 features with the highest maximal information coefficients. The results can be applied to medical rehabilitation and to gesture-based control of consumer electronics.
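The pipeline described above, windowed time- and frequency-domain feature extraction followed by keeping only the features most informative for classification, can be sketched as follows. This is an illustrative stdlib-only sketch: the function names are hypothetical, a naive DFT stands in for a real FFT, and absolute Pearson correlation is used as a simple, plainly-labeled stand-in for the maximal information coefficient used in the paper.

```python
import math

def dft_magnitudes(x):
    """Naive DFT magnitude spectrum (stdlib only; a real pipeline would use an FFT)."""
    n = len(x)
    mags = []
    for k in range(n // 2):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

def window_features(x):
    """Time- and frequency-domain features for one sensor axis of one window."""
    n = len(x)
    mean = sum(x) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    rms = math.sqrt(sum(v * v for v in x) / n)
    mags = dft_magnitudes(x)
    # Dominant non-DC frequency bin and total spectral energy.
    dom_bin = max(range(1, len(mags)), key=lambda k: mags[k]) if len(mags) > 1 else 0
    energy = sum(m * m for m in mags) / n
    return [mean, std, rms, float(dom_bin), energy]

def pearson(a, b):
    """Pearson correlation of two equal-length sequences (0.0 if degenerate)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va and vb else 0.0

def rank_features(feature_matrix, labels, top_k=30):
    """Score each feature column against the class labels and return the
    indices of the top_k columns (stand-in for MIC-based ranking)."""
    d = len(feature_matrix[0])
    scores = []
    for j in range(d):
        col = [row[j] for row in feature_matrix]
        scores.append((abs(pearson(col, labels)), j))
    scores.sort(reverse=True)
    return [j for _, j in scores[:top_k]]
```

In a full system each windowed segment of the six sensor axes (3 accelerometer + 3 gyroscope) would be passed through `window_features`, the per-window feature vectors concatenated, and `rank_features` used once on training data to fix the reduced feature set before training the classifiers.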


Keywords: Gesture recognition · Feature extraction · Accelerometer · Gyroscope



This work was supported by grants from the Fundamental Research Funds for the Key Research Program of the Chongqing Science & Technology Commission (grant nos. cstc2017rgzn-zdyf0064, cstc2017rgzn-zdyfX0042), the Chongqing Provincial Human Resource and Social Security Department (grant no. cx2017092), and the Central Universities in China (grant nos. 2019CDJGFDSJ001, CQU0225001104447 and 2018CDXYRJ0030).



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Big Data & Software Engineering, Chongqing University, Chongqing, China
  2. KCT Smart Wearable Technology Chongqing Research Institute Co., Ltd., Chongqing, China
