
Estimation and Recognition of Motion Segmentation and Pose IMU-Based Human Motion Capture

  • Conference paper
  • In: Robot Intelligence Technology and Applications 5 (RiTA 2017)

Abstract

Motion capture typically brings to mind systems of white markers distributed over a suit worn on the body, with software that records and reproduces the motion of a person or any other object. Such systems, however, are very expensive and can operate only in a large, fixed space surrounded by many cameras, so they are affordable mainly to large animation studios and graphic designers. In this paper, a wireless IMU-based motion capture system is researched and developed that is low-cost, moderately accurate, fast, portable, and easy to use; this is our main contribution. The full-featured hardware, a simple operating program, and the controlling software are built using low-cost components. The designed system is based on a network of small inertial measurement units (IMUs), called "nodes," distributed over the human body. The MCU is the core of each node's board: it collects measured data from the sensors, estimates orientation with a quaternion-based Madgwick orientation filter, and either transfers the data to the host via Wi-Fi or stores it in memory for later use. The nodes' processed data is then visualized by a program called "SHURIKEN launcher," which also controls the nodes' behavior. All of these activities are incorporated into the operation of the system in this project. Experiments on accuracy demonstrate the feasibility and advantages of the system, as well as a few shortcomings. The system's advantages and limitations, together with its hardware and software architecture, are discussed in detail.
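The per-node orientation estimation the abstract describes follows the standard Madgwick scheme: integrate the gyroscope quaternion rate while correcting drift with a gradient-descent step toward the accelerometer's gravity reading. The sketch below is a minimal, generic illustration of that filter (gyroscope + accelerometer variant) and is not the authors' firmware; the gain `beta` and sample period `dt` are illustrative placeholders.

```python
import numpy as np

def madgwick_update(q, gyro, accel, beta=0.1, dt=0.01):
    """One step of the quaternion-based Madgwick orientation filter
    (IMU variant: gyroscope rad/s + accelerometer). q = [w, x, y, z]."""
    qw, qx, qy, qz = q
    ax, ay, az = accel / np.linalg.norm(accel)  # normalise gravity reading

    # Objective function: predicted gravity direction minus measured one
    f = np.array([
        2.0 * (qx * qz - qw * qy) - ax,
        2.0 * (qw * qx + qy * qz) - ay,
        2.0 * (0.5 - qx * qx - qy * qy) - az,
    ])
    # Jacobian of f with respect to the quaternion components
    J = np.array([
        [-2.0 * qy,  2.0 * qz, -2.0 * qw, 2.0 * qx],
        [ 2.0 * qx,  2.0 * qw,  2.0 * qz, 2.0 * qy],
        [ 0.0,      -4.0 * qx, -4.0 * qy, 0.0],
    ])
    grad = J.T @ f
    norm = np.linalg.norm(grad)
    if norm > 0.0:
        grad /= norm  # unit step direction; zero when already aligned

    # Quaternion rate from gyroscope: 0.5 * q (x) (0, gx, gy, gz)
    gx, gy, gz = gyro
    q_dot = 0.5 * np.array([
        -qx * gx - qy * gy - qz * gz,
         qw * gx + qy * gz - qz * gy,
         qw * gy - qx * gz + qz * gx,
         qw * gz + qx * gy - qy * gx,
    ])

    # Fuse: integrate gyro rate minus the gradient-descent correction
    q = q + (q_dot - beta * grad) * dt
    return q / np.linalg.norm(q)  # keep unit norm
```

In a node's firmware loop this update would run once per sensor sample, with the resulting quaternion streamed to the host over Wi-Fi or logged to memory.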



Acknowledgements

This study was financially supported by Ho Chi Minh city University of Technology and Education, Vietnam.

Corresponding author

Correspondence to Nguyen Thanh Tan.


Copyright information

© 2019 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Luan, P.G., Tan, N.T., Thinh, N.T. (2019). Estimation and Recognition of Motion Segmentation and Pose IMU-Based Human Motion Capture. In: Kim, JH., et al. Robot Intelligence Technology and Applications 5. RiTA 2017. Advances in Intelligent Systems and Computing, vol 751. Springer, Cham. https://doi.org/10.1007/978-3-319-78452-6_32
