Research on Motion Control System of 6-DOF Robotic Arm

  • Minglei Liu
  • Hongbo Zhou
  • Aiping Pang
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 582)

Abstract

Robot control technology is developing rapidly worldwide, and human–robot interaction is moving toward greater convenience and intuitiveness. In this paper, the operator's arm position and motion state are captured by motion sensing, which provides the information needed for robot motion control. This paper studies an interactive 6-DOF robotic arm control system that combines a vision device with a wearable leap-motion device. The system uses the rotation angles of the operator's shoulders and elbows to control the six joints of the robotic arm, and hand gestures to control the gripper claw. An experimental platform was built on which data and instructions were transmitted wirelessly. The experimental results show that the interactive robot, integrating visual and wearable motion-sensing devices, can effectively and intuitively control the robotic arm to grasp objects.
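The control mapping described in the abstract — six sensed human arm angles driving six robot joints, and a hand gesture driving the gripper — can be sketched as follows. This is an illustrative sketch only: the joint limits, the shoulder/elbow-to-joint assignment, and the gesture names are assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the abstract's angle-to-joint mapping.
# JOINT_LIMITS and the gesture vocabulary are illustrative assumptions.

JOINT_LIMITS = [(-90, 90)] * 6  # degrees; assumed symmetric servo limits


def clamp(value, low, high):
    """Keep a commanded angle inside the joint's mechanical limits."""
    return max(low, min(high, value))


def arm_to_joint_commands(shoulder_angles, elbow_angles):
    """Map sensed human arm angles (degrees) to six joint commands.

    shoulder_angles: three rotation angles sensed at the shoulder
    elbow_angles:    three rotation angles sensed at the elbow/forearm
    """
    raw = list(shoulder_angles) + list(elbow_angles)
    return [clamp(a, lo, hi) for a, (lo, hi) in zip(raw, JOINT_LIMITS)]


def gripper_command(gesture):
    """Map a recognized hand gesture to a gripper action (assumed names)."""
    return "close" if gesture == "fist" else "open"
```

In a real system these commands would be serialized and sent over the wireless link to the arm's servo controller; the clamping step protects the hardware when the sensed human pose exceeds the robot's joint range.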

Keywords

Robot control · Motion sensing · Motion control · 6-DOF robotic arm control system

Acknowledgements

This work was supported in part by the District Science Foundation Program "A Study on Soft Grab Method Based on Active Excitation and State Recognition of a Manipulator System" (60663005) and by the Talent Introduction Scientific Research Project of Guizhou University "Research on H∞ Comprehensive Control of Moving Objects" (201801).


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. School of Electrical Engineering, Guizhou University, Guiyang, China
