Autonomous Robots, Volume 42, Issue 3, pp 615–629

Relative motion estimation using visual–inertial optical flow

Abstract

This paper proposes a method for measuring the motion of a moving rigid body with a hybrid visual–inertial sensor. The rotational velocity of the moving object is computed from visual optical flow by solving a depth-independent bilinear constraint, and its translational velocity is estimated from a dynamics constraint that relates scene depth to translational motion. Fusing inertial measurements recovers the scale of the translational velocity, which is unobservable from monocular optical flow alone. An iterative refinement scheme handles observation noise and outliers, and an extended Kalman filter is applied for motion tracking. The method is evaluated in simulation studies and practical experiments, and the results demonstrate its accuracy and robustness.
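The abstract does not reproduce the paper's equations, but the quantities it names fit the classical instantaneous motion-field model of Longuet-Higgins and Prazdny (1980). As a sketch under that assumption: a scene point at depth Z, projected to normalized image coordinates (x, y) and moving relative to the camera with translational velocity T = (T_x, T_y, T_z) and rotational velocity ω = (ω_x, ω_y, ω_z), generates the optical flow

  u = (x T_z - T_x) / Z + ω_x x y - ω_y (1 + x^2) + ω_z y
  v = (y T_z - T_y) / Z + ω_x (1 + y^2) - ω_y x y - ω_z x

Only the translational terms depend on Z, so cross-multiplying them eliminates depth and leaves a constraint that is bilinear in T and ω:

  (u - u_ω) (y T_z - T_y) = (v - v_ω) (x T_z - T_x)

where (u_ω, v_ω) is the purely rotational part of the flow. Because depth enters only through the ratio T/Z, monocular flow determines the direction of T but not its magnitude; this is the scale ambiguity that the inertial sensor resolves.

The following Python sketch solves this bilinear constraint by alternating least squares over a field of flow vectors. It is illustrative only: the alternation scheme, the initial guess, and the function name are assumptions rather than the paper's algorithm, and the inertial scale recovery, outlier rejection, and EKF tracking stages are omitted.

import numpy as np

def solve_bilinear(x, y, u, v, iters=10):
    # Recover rotational velocity w and translation direction T from
    # optical flow (u, v) at normalized image coordinates (x, y), using
    # the depth-independent bilinear constraint
    #   (u - u_w)(y*Tz - Ty) = (v - v_w)(x*Tz - Tx).
    # Illustrative sketch; not the paper's solver.
    T = np.array([0.0, 0.0, 1.0])        # initial guess: forward motion
    w = np.zeros(3)
    for _ in range(iters):
        # Step 1: with T fixed, the constraint is linear in w (M w = c).
        a = y * T[2] - T[1]
        b = x * T[2] - T[0]
        M = np.stack([x * y * a - (1 + y**2) * b,
                      -(1 + x**2) * a + x * y * b,
                      y * a + x * b], axis=1)
        c = u * a - v * b
        w, *_ = np.linalg.lstsq(M, c, rcond=None)
        # Step 2: with w fixed, subtract the rotational flow; the
        # constraint becomes homogeneous and linear in T (N T = 0).
        ur = u - (w[0] * x * y - w[1] * (1 + x**2) + w[2] * y)
        vr = v - (w[0] * (1 + y**2) - w[1] * x * y - w[2] * x)
        N = np.stack([vr, -ur, ur * y - vr * x], axis=1)
        # Null-space direction: a unit vector, determined only up to sign.
        T = np.linalg.svd(N)[2][-1]
    return w, T                          # T carries no metric scale

Each alternation step is an ordinary linear least-squares problem, so the loop is cheap; a robust reweighting of the rows of M and N would play the role the paper assigns to its iterative refinement against noisy and outlying flow vectors.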

Keywords

Motion measurement · Dynamic scene analysis · Visual–inertial perception · Smart camera · Wearable robotics

Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. Department of Electrical Engineering and Computer Science, Wichita State University, Wichita, USA
  2. Department of Mechanical, Aerospace and Biomedical Engineering, The University of Tennessee, Knoxville, USA
