Wearable Sensor Integration and Bio-motion Capture: A Practical Perspective

  • Zhiqiang Zhang
  • Athanasia Panousopoulou
  • Guang-Zhong Yang


In the previous chapters, we discussed the fundamentals of BSN hardware and processing techniques, including multi-sensor fusion, context-aware sensing and autonomic sensing. In this chapter, we use bio-motion analysis as an exemplar to demonstrate how some of these methods are applied in practical settings involving multiple wearable sensors.

Motion capture (Mocap) and reconstruction is the process of recording the general body movement of a human subject or living being and translating that movement onto a 3D model, such that the model performs the same actions as the subject [1]. Mocap technology has been used in a variety of applications, from delivering realistic animation in film and entertainment to assessing the performance of professional athletes. Clinically, motion reconstruction systems are increasingly used to analyse the biomechanics of patients. Such analysis provides an objective measure of physical function to aid interventional planning, evaluate the outcomes of surgical procedures, and assess the efficacy of treatment and rehabilitation [2, 3]. Thus far, a number of motion-tracking technologies have been developed; they can be broadly classified into optical, mechanical and inertial-sensor-based tracking systems [4].
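At the core of inertial-sensor-based tracking, each body segment's orientation is propagated by integrating the gyroscope's angular rate through the quaternion kinematic differential equation q̇ = ½ q ⊗ [0, ω] (see ref. 29). The following is a minimal illustrative sketch in Python with NumPy, not the chapter's actual pipeline; the function names and the simple first-order Euler integration scheme are assumptions made for illustration only:

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    """Propagate orientation quaternion q by body angular rate omega (rad/s)
    over dt seconds using q_dot = 0.5 * q (x) [0, omega], then renormalise."""
    omega_quat = np.array([0.0, *omega])
    q_new = q + 0.5 * quat_multiply(q, omega_quat) * dt
    return q_new / np.linalg.norm(q_new)  # keep unit length

# Example: a segment rotating about its z-axis at 90 deg/s for 1 s, 1 ms steps.
q = np.array([1.0, 0.0, 0.0, 0.0])  # identity orientation
omega = np.array([0.0, 0.0, np.deg2rad(90.0)])
for _ in range(1000):
    q = integrate_gyro(q, omega, 0.001)
# q now approximates a 90-degree rotation about z:
# [cos(45 deg), 0, 0, sin(45 deg)] = [0.7071, 0, 0, 0.7071]
```

Integrating gyroscope data alone drifts over time; in practice, as in the Kalman and extended Kalman filter designs discussed in this chapter, accelerometer and magnetometer measurements are fused in to correct that drift.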


Keywords (machine-generated): Sensor Node · Kalman Filter · Extended Kalman Filter · Sink Node · Body Segment


References

  1. De Aguiar E, Theobalt C, Stoll C, Seidel HP, Marker-less deformable mesh tracking for human shape and motion capture, In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2007; 1–9.
  2. King B, Paulson L, Motion capture moves into new realms, Computer, 2007; 40(9): 13–16.
  3. Wong C, Zhang Z, Kwasnicki R, Liu J, Yang GZ, Motion reconstruction from sparse accelerometer data using PLSR, In: Proceedings of 9th International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2012; 178–183.
  4. Yun X, Bachmann E, Design, implementation, and experimental results of a quaternion-based Kalman filter for human body motion tracking, IEEE Transactions on Robotics, 2006; 22(6): 1216–1227.
  5. Vicon Motion Capture System,
  6. BTS SMART-D Motion Capture System,
  7. Qualisys Motion Capture Systems,
  8. OptiTrack Motion Capture Systems,
  9. Moeslund TB, Hilton A, Krüger V, A survey of advances in vision-based human motion capture and analysis, Computer Vision and Image Understanding, 2006; 104(2): 90–126.
  10. Vlasic D, Adelsberger R, Vannucci G, Barnwell J, Gross M, Matusik W, Popović J, Practical motion capture in everyday surroundings, ACM Transactions on Graphics (TOG), 2007; 26(3): 35.
  11. Hasler N, Rosenhahn B, Thormahlen T, Wand M, Gall J, Seidel HP, Markerless motion capture with unsynchronized moving cameras, In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2009; 224–231.
  12. Liu Y, Stoll C, Gall J, Seidel HP, Theobalt C, Markerless motion capture of interacting characters using multi-view image segmentation, In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2011; 1249–1256.
  13.
  14. Yoon J, Novandy B, Yoon CH, Park KJ, A 6-DOF gait rehabilitation robot with upper and lower limb connections that allows walking velocity updates on various terrains, IEEE/ASME Transactions on Mechatronics, 2010; 15(2): 201–215.
  15. Schabowsky CN, Godfrey SB, Holley RJ, Lum PS, Development and pilot testing of HEXORR: hand EXOskeleton rehabilitation robot, Journal of NeuroEngineering and Rehabilitation, 2010; 7(1): 36.
  16.
  17.
  18. ArmeoPower Arm Exoskeleton,
  19.
  20. Tao Y, Hu H, A novel sensing and data fusion system for 3D arm motion tracking in tele-rehabilitation, IEEE Transactions on Instrumentation and Measurement, 2008; 57(5): 1029–1040.
  21. Huang S, Sun S, Huang Z, Wu J, Ambulatory real-time micro-sensor motion capture, In: Proceedings of the 11th ACM International Conference on Information Processing in Sensor Networks, 2012; 107–108.
  22. Animazoo Mocap Suits,
  23. Roetenberg D, Luinge H, Slycke P, Xsens MVN: full 6DOF human motion tracking using miniature inertial sensors, Xsens Motion Technologies BV Tech. Rep., 2009.
  24. 3DSuit 3DTracker,
  25.
  26. Zhang Z, Wu J, A novel hierarchical information fusion method for three-dimensional upper limb motion estimation, IEEE Transactions on Instrumentation and Measurement, 2011; 60(11): 3709–3719.
  27. McAllister LB, A quick introduction to quaternions, Pi Mu Epsilon Journal, 1989; 9(1): 23–25.
  28. Inertial Labs Inc,
  29. Chou JCK, Quaternion kinematic and dynamic differential equations, IEEE Transactions on Robotics and Automation, 1992; 8(1): 53–64.
  30. Trawny N, Roumeliotis SI, Indirect Kalman filter for 3D attitude estimation, University of Minnesota, Dept. of Comp. Sci. & Eng., Tech. Rep., 2005.
  31. Kuipers JB, Quaternions and Rotation Sequences, Princeton University Press, 1999; 127–143.
  32. Sabatini AM, Quaternion-based strap-down integration method for applications of inertial sensing to gait analysis, Medical and Biological Engineering and Computing, 2005; 43(1): 94–101.
  33. Chen Z, Bayesian filtering: from Kalman filters to particle filters, and beyond, Statistics, 2003; 182(1): 1–69.
  34. Arulampalam MS, Maskell S, Gordon N, Clapp T, A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking, IEEE Transactions on Signal Processing, 2002; 50(2): 174–188.
  35. Koch W, On Bayesian tracking and data fusion: a tutorial introduction with examples, IEEE Aerospace and Electronic Systems Magazine, 2010; 25(7): 29–52.
  36. Hol JD, Schön TB, Luinge H, Slycke PJ, Gustafsson F, Robust real-time tracking by fusing measurements from inertial and vision sensors, Journal of Real-Time Image Processing, 2007; 2(2): 149–160.
  37. Jurman D, Jankovec M, Kamnik R, Topič M, Calibration and data fusion solution for the miniature attitude and heading reference system, Sensors and Actuators A: Physical, 2007; 138(2): 411–420.
  38. Shen SC, Chen CJ, Huang HJ, A new calibration method for MEMS inertial sensor module, In: Proceedings of 11th IEEE International Workshop on Advanced Motion Control, 2010.
  39. Zhang Z, Yang GZ, Calibration of miniature inertial and magnetic sensor units for robust attitude estimation, IEEE Transactions on Instrumentation and Measurement, 2014; 63(3): 711–718.
  40. Kraft E, A quaternion-based unscented Kalman filter for orientation tracking, In: Proceedings of the 6th International Conference on Information Fusion, 2003.
  41. Zhang ZQ, Ji LY, Huang ZP, Wu JK, Adaptive information fusion for human upper limb movement estimation, IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, 2012; 42(5): 1100–1108.
  42. Luinge H, Veltink P, Inclination measurement of human movement using a 3-D accelerometer with autocalibration, IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2004; 12(1): 112–121.
  43. Young A, Use of body model constraints to improve accuracy of inertial motion capture, In: Proceedings of International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2010; 180–186.
  44. Sabatini A, Quaternion-based extended Kalman filter for determining orientation by inertial and magnetic sensing, IEEE Transactions on Biomedical Engineering, 2006; 53(7): 1346–1356.
  45. Kang C, Park C, Attitude estimation with accelerometers and gyros using fuzzy tuned Kalman filter, In: Proceedings of European Control Conference, 2009; 3713–3718.
  46. Sun S, Meng X, Ji L, Wu J, Wong WC, Adaptive sensor data fusion in motion capture, In: Proceedings of 13th Conference on Information Fusion, 2010; 1–8.
  47. Zhang Z, Meng X, Wu J, Quaternion-based Kalman filter with vector selection for accurate orientation tracking, IEEE Transactions on Instrumentation and Measurement, 2012; 61(10): 2817–2824.
  48. Meng XL, Zhang ZQ, Sun SY, Wu JK, Wong WC, Biomechanical model-based displacement estimation in micro-sensor motion capture, Measurement Science and Technology, 2012; 23(5): 055101.
  49. Maróti M, Kusy B, Simon G, Lédeczi Á, The flooding time synchronization protocol, In: Proceedings of the 2nd International Conference on Embedded Networked Sensor Systems, 2004; 39–49.
  50. TinyOS operating system,

Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  • Zhiqiang Zhang¹
  • Athanasia Panousopoulou¹
  • Guang-Zhong Yang¹

  1. The Hamlyn Centre, Imperial College London, London, UK
