
Journal of Real-Time Image Processing, Volume 14, Issue 3, pp 637–646

Visual odometry based on the Fourier transform using a monocular ground-facing camera

  • Merwan Birem
  • Richard Kleihorst
  • Norddin El-Ghouti
Special Issue Paper

Abstract

This paper presents a visual odometry method that estimates the location and orientation of a robot. The approach is based on the Fourier transform, which extracts the translation between regions of consecutive images captured by a ground-facing camera. The proposed method is especially suited to surfaces on which no distinct visual features are present. Because it is independent of the kinematics of the vehicle, the approach is also resistant to wheel slippage. The method has been tested on different experimental platforms and evaluated against ground truth, including a successful loop-closing test, to demonstrate its general applicability and performance.
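
The core operation behind such a Fourier-based approach is phase correlation: the relative shift between two image patches appears as a peak in the inverse Fourier transform of their normalised cross-power spectrum. The following minimal NumPy sketch (an illustrative example, not the authors' implementation) shows how a translation estimate between two ground-facing patches could be computed under that assumption.

import numpy as np

def estimate_translation(patch_prev, patch_curr):
    # Return the (dy, dx) shift, in pixels, that maps patch_prev onto patch_curr.
    F1 = np.fft.fft2(patch_prev)
    F2 = np.fft.fft2(patch_curr)
    # Normalised cross-power spectrum; the small constant avoids division by zero.
    cross_power = F2 * np.conj(F1)
    cross_power /= np.abs(cross_power) + 1e-12
    # The inverse transform peaks at the relative translation.
    correlation = np.abs(np.fft.ifft2(cross_power))
    peak = np.array(np.unravel_index(np.argmax(correlation), correlation.shape),
                    dtype=float)
    # Shifts larger than half the patch size wrap around to negative values.
    size = np.array(correlation.shape, dtype=float)
    peak[peak > size / 2] -= size[peak > size / 2]
    return peak  # (dy, dx)

# Quick check on synthetic texture: roll a random patch by a known amount.
rng = np.random.default_rng(0)
ground = rng.random((128, 128))
shifted = np.roll(ground, shift=(5, -3), axis=(0, 1))
print(estimate_translation(ground, shifted))  # expected output: roughly [ 5. -3.]

Chaining such per-frame shift estimates, scaled by the camera's ground resolution, gives an incremental estimate of the robot's trajectory.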

Keywords

Visual odometry · Vision · Ground-facing camera

Notes

Acknowledgements

This work was supported by the Flanders Make Research Center and partially funded by the GPS-Positioning project.


Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. Flanders Make, Lommel, Belgium
