
A Real-Time Robot Motion Generation System Based on Human Gesture

  • Conference paper
  • Published in: Advances on Broadband and Wireless Computing, Communication and Applications (BWCCA 2018)

Abstract

When a communication robot conveys messages to human users, accompanying those messages with matching motions is an effective technique in human-robot interaction. Designing and implementing a set of useful robot motions, however, is a difficult problem: it requires developers to write instructions that accurately control the motors embedded in the robot. In this paper, we propose a method for generating robot motions from human gestures. The proposed method captures human gestures in real time with motion sensors, converts the acquired data into robot motion instructions, and applies them to a physical robot. We developed a prototype system that supports different motion sensors and robots, with lightweight data communication between a control node and a robot. We conducted experiments in which robots were controlled by human gestures to verify the effectiveness of the proposed method. The method can foster better human-robot communication environments while reducing the labor of robot motion development.
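The capture, convert, and apply pipeline the abstract describes can be sketched roughly as follows. This is a minimal illustration under stated assumptions: the joint names, servo limits, keypoint layout, and JSON message format are all hypothetical, since the abstract does not specify the prototype's actual instruction format.

```python
# Hypothetical sketch of the capture -> convert -> apply pipeline: three
# tracked arm keypoints from a motion sensor are converted into clamped
# joint angles and packed into a compact message for a robot.
import json
import math

# Assumed servo limits in degrees for a small humanoid arm (illustrative).
JOINT_LIMITS = {"shoulder": (0.0, 180.0), "elbow": (0.0, 135.0)}

def angle_between(u, v):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    cos = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.degrees(math.acos(cos))

def clamp(name, value):
    """Keep a target angle within the robot's assumed servo range."""
    lo, hi = JOINT_LIMITS[name]
    return max(lo, min(hi, value))

def gesture_to_command(shoulder, elbow, wrist):
    """Convert three tracked keypoints into a compact joint-angle message."""
    upper = [e - s for e, s in zip(elbow, shoulder)]  # upper-arm vector
    fore = [w - e for w, e in zip(wrist, elbow)]      # forearm vector
    elbow_angle = clamp("elbow", angle_between(upper, fore))
    # Shoulder pitch measured from the arm hanging straight down (-Y axis).
    shoulder_angle = clamp("shoulder", angle_between(upper, [0.0, -1.0, 0.0]))
    return json.dumps({"shoulder": round(shoulder_angle, 1),
                       "elbow": round(elbow_angle, 1)})

# Arm hanging straight down with a slightly bent elbow:
cmd = gesture_to_command((0, 0, 0), (0, -0.3, 0), (0.1, -0.55, 0))
```

The resulting compact string could then be sent from the control node to the robot over a lightweight publish/subscribe channel, in the spirit of the "lightweight data communication" the abstract mentions; the actual transport used by the prototype is not detailed here.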



Author information

Corresponding author: Hiroaki Nishino


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Haramaki, T., Goto, K., Tsutsumi, H., Yatsuda, A., Nishino, H. (2019). A Real-Time Robot Motion Generation System Based on Human Gesture. In: Barolli, L., Leu, FY., Enokido, T., Chen, HC. (eds) Advances on Broadband and Wireless Computing, Communication and Applications. BWCCA 2018. Lecture Notes on Data Engineering and Communications Technologies, vol 25. Springer, Cham. https://doi.org/10.1007/978-3-030-02613-4_12


  • DOI: https://doi.org/10.1007/978-3-030-02613-4_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-02612-7

  • Online ISBN: 978-3-030-02613-4

  • eBook Packages: Engineering, Engineering (R0)
