Realization of the Gesture Interface by Multifingered Robot Hand

  • Pavlovsky Vladimir
  • Stepanova Elizaveta
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 636)

Abstract

The paper presents a theoretical mechanical model of a multifingered robot hand with 21 degrees of freedom. The main objective of the work is the creation of a gesture interface. The interface comprises a set of gestures, the synthesis of finger control schemes for 26 gestures, and a gesture recognition task solved by training a convolutional neural network. As a demonstration, we present recognition results for the 26 gestures obtained with the constructed convolutional network. For the 26 classes, 15,600 images were created at different distances and viewing angles. After training, the convolutional neural network achieves a classification accuracy of 76% on the test set.
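The recognition pipeline described above (convolutional features followed by a 26-way classification) can be sketched as a minimal forward pass in NumPy. This is purely illustrative, not the authors' network: the input size (28×28), filter count (8), kernel size (3×3), and random weights are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling over size x size windows."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy forward pass: one 28x28 grayscale gesture image -> scores for 26 classes.
image = rng.random((28, 28))
kernels = rng.standard_normal((8, 3, 3)) * 0.1            # 8 filters (random, untrained)
features = np.stack([np.maximum(conv2d(image, k), 0)      # convolution + ReLU
                     for k in kernels])                   # 8 x 26 x 26
pooled = np.stack([max_pool(f) for f in features])        # 8 x 13 x 13
flat = pooled.ravel()
W = rng.standard_normal((26, flat.size)) * 0.01           # fully connected layer, 26 classes
probs = softmax(W @ flat)                                 # class probabilities
print(probs.shape)
```

In a trained network the kernels and `W` would be fitted to the 15,600 labelled gesture images; here they are random, so the sketch only demonstrates the data flow from image to 26-class probability vector.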

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Keldysh Institute of Applied Mathematics, Moscow, Russia