Personal and Ubiquitous Computing, Volume 22, Issue 2, pp 259–274

Robust orientation estimate via inertial guided visual sample consensus

Original Article

Abstract

This paper presents a novel orientation estimation approach named inertial guided visual sample consensus (IGVSAC). The method is designed for capturing the orientation of human body joints in free-living environments. Unlike traditional vision-based orientation estimation methods, in which outliers among putative image-pair correspondences are removed with hypothesize-and-verify schemes such as the computationally costly RANSAC, our approach exploits prior motion information (i.e., rotation and translation) deduced from the fast-responding inertial measurement unit (IMU) as the initial body pose to help the camera remove hidden outliers. In addition, the IGVSAC algorithm maintains estimation accuracy even in the presence of a large proportion of outliers, thanks to its ability to reject apparent mismatches. The orientation estimated by the visual sensor is, in turn, used to correct long-term IMU drift. We conducted extensive experiments to verify the effectiveness and robustness of the IGVSAC algorithm. Comparisons with the highly accurate VICON and OptiTrack motion tracking systems show that our system is well suited to capturing the orientation of human body joints.
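
To make the guidance idea concrete, the sketch below shows one way an IMU prior can replace RANSAC's random hypothesis sampling: the relative rotation and translation predicted by the IMU define an essential matrix, and correspondences whose Sampson distance to that single hypothesis is large are rejected as outliers. This is a minimal illustration in Python/NumPy under our own assumptions (calibrated, normalized image coordinates, a one-shot hypothesis test, and an illustrative threshold); the function names are hypothetical and this is not the authors' exact IGVSAC formulation.

    # Minimal sketch of IMU-guided outlier rejection (not the authors' exact algorithm).
    import numpy as np

    def skew(v):
        """Skew-symmetric matrix [v]_x such that skew(v) @ w == np.cross(v, w)."""
        return np.array([[0.0, -v[2], v[1]],
                         [v[2], 0.0, -v[0]],
                         [-v[1], v[0], 0.0]])

    def imu_guided_inliers(pts1, pts2, R_imu, t_imu, thresh=1e-3):
        """Classify putative correspondences using the IMU prior as the hypothesis.

        pts1, pts2 : (N, 2) normalized image coordinates in two frames
        R_imu      : (3, 3) relative rotation predicted by the IMU
        t_imu      : (3,)   relative translation direction predicted by the IMU
        thresh     : illustrative Sampson-distance inlier threshold
        """
        # Essential matrix implied by the inertial prior: E = [t]_x R
        E = skew(t_imu / np.linalg.norm(t_imu)) @ R_imu

        # Homogeneous coordinates, one correspondence per row
        x1 = np.hstack([pts1, np.ones((len(pts1), 1))])
        x2 = np.hstack([pts2, np.ones((len(pts2), 1))])

        # Sampson distance: first-order approximation of the geometric error
        # of the epipolar constraint x2^T E x1 = 0
        Ex1 = x1 @ E.T                              # row i is (E @ x1_i)^T
        Etx2 = x2 @ E                               # row i is (E^T @ x2_i)^T
        num = np.einsum('ij,ij->i', x2, Ex1) ** 2   # (x2_i^T E x1_i)^2
        den = Ex1[:, 0]**2 + Ex1[:, 1]**2 + Etx2[:, 0]**2 + Etx2[:, 1]**2
        return (num / den) < thresh                 # boolean inlier mask

The surviving inliers could then feed a standard relative-pose solver (e.g., the five- or eight-point algorithm), and the resulting visual estimate could in turn correct long-term IMU drift, mirroring the complementary structure the abstract describes.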

Keywords

Orientation estimate · Wearable sensors · Inertial measurement unit · Monocular camera · Sample and consensus

Acknowledgements

The authors would like to thank Guoli Song for his assistance in conducting the MC-IMU motion tracking experiments. The authors would also like to thank Dr. Bo Yang for his considerate suggestions, and the peer reviewers who gave valuable advice on this paper. This work was supported by the National Natural Science Foundation of China under contracts 61233007, 61673371, and 71661147005, and by the Youth Innovation Promotion Association, CAS (2015157).


Copyright information

© Springer-Verlag London Ltd. 2017

Authors and Affiliations

  1. Key Laboratory of Networked Control Systems, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China
  2. University of Chinese Academy of Sciences, Beijing, China
  3. Department of Mechanical, Aerospace and Biomedical Engineering, University of Tennessee, Knoxville, USA