Efficient velocity estimation for MAVs by fusing motion from two frontally parallel cameras

  • Zhi Gao
  • Bharath Ramesh
  • Wen-Yan Lin
  • Pengfei Wang
  • Xu Yan
  • Ruifang Zhai
Original Research Paper

Abstract

Efficient velocity estimation is crucial for the robust operation of the navigation control loops of micro aerial vehicles (MAVs). Motivated by research on how animals exploit their visual topographies to perform rapid locomotion, we propose a bio-inspired method that applies the quasi-parallax technique to estimate the velocity of an MAV equipped with a forward-looking stereo camera, without GPS. Unlike existing optical flow-based methods, our method achieves efficient metric velocity estimation without requiring depth information from either additional distance sensors or stereopsis. In particular, the quasi-parallax technique, which extracts maximal benefit from the configuration of two frontally parallel cameras, leverages pairs of parallel visual rays to eliminate rotational flow for translational velocity estimation; the rotational and translational velocity estimates are then refined iteratively and alternately. Our method fuses the motion information from the two frontally parallel cameras without performing correspondence matching, achieving enhanced robustness and efficiency. Extensive experiments on synthetic and real scenes demonstrate the effectiveness and efficiency of our method.
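To make the geometry behind quasi-parallax concrete, the following is a minimal Python sketch of the rotation-cancellation property described above, written against the standard instantaneous-motion (Longuet-Higgins and Prazdny) flow model. It is an illustrative sketch under stated assumptions, not the authors' implementation; all function and variable names are hypothetical.

    import numpy as np

    # Instantaneous-motion flow model for a calibrated camera (focal length 1):
    # at normalized image point p = (x, y) with scene depth Z,
    #   u(p) = (1/Z) * A(p) @ t + B(p) @ w
    # where t is the translational and w the rotational velocity of the camera.

    def A(x, y):
        # Translational flow matrix; its contribution scales with 1/Z.
        return np.array([[-1.0, 0.0, x],
                         [0.0, -1.0, y]])

    def B(x, y):
        # Rotational flow matrix; note it is independent of scene depth Z.
        return np.array([[x * y, -(1.0 + x * x), y],
                         [1.0 + y * y, -x * y, -x]])

    # Two frontally parallel cameras share one orientation, so the visual rays
    # through the SAME coordinates (x, y) in both views are parallel and the
    # rotational term B(p) @ w is identical in both images. Subtracting the two
    # flows at shared coordinates therefore cancels rotation, leaving a purely
    # translational field; the known stereo baseline then fixes metric scale.

    def quasi_parallax(flow_left, flow_right):
        # Rotation-free flow difference at shared image coordinates.
        return flow_left - flow_right

    # Toy check: the rotational flow component cancels exactly.
    w = np.array([0.01, -0.02, 0.005])   # shared angular velocity (rad/s)
    x, y = 0.3, -0.2                     # a sample normalized image point
    rot_left = B(x, y) @ w               # rotational flow, left camera
    rot_right = B(x, y) @ w              # identical in the right camera
    assert np.allclose(quasi_parallax(rot_left, rot_right), 0.0)

In practice the two flow fields would come from an optical flow estimator run independently on each camera, which is why no left-right correspondence matching is needed.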

Keywords

Velocity estimation · Micro aerial vehicle · Quasi-parallax · Optical flow

Supplementary material

11554_2018_752_MOESM1_ESM.mp4 (8.9 MB)
Supplementary material 1 (mp4 9087 KB)

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. Temasek Laboratories, National University of Singapore, Singapore, Singapore
  2. Advanced Digital Sciences Center (ADSC), Singapore, Singapore
  3. Department of Computer Science, School of Informatics, Huazhong Agricultural University, Wuhan, China
