
Egomotion Estimation Under Planar Motion with an RGB-D Camera

  • Conference paper
  • In: Intelligence Science and Big Data Engineering. Visual Data Engineering (IScIDE 2019)
  • Part of the book series: Lecture Notes in Computer Science (LNIP, volume 11935)

Abstract

In this paper, we propose a method for egomotion estimation of an indoor mobile robot undergoing planar motion with an RGB-D camera. Our approach targets corridor-like structured scenarios and exploits prior knowledge of the environment: when at least one vertical plane is detected in the depth data, egomotion is estimated from the normal of that vertical plane and a single point correspondence; when no vertical plane is available, a 2-point homography-based algorithm using only point correspondences estimates the egomotion. The proposed method is then used in a frame-to-frame visual odometry framework. We evaluate the algorithm on synthetic data and demonstrate its application on real-world data. The experiments show that, compared with state-of-the-art methods, the proposed approach is efficient and robust for egomotion estimation in Manhattan-like environments.
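
Although the full derivation is in the paper, the plane-and-point case admits a compact illustration: under planar motion the camera rotates only about the vertical axis and translates in the ground plane, so the yaw angle can be read off from how the detected vertical plane's normal turns between frames, and the in-plane translation then follows from a single depth-backed point correspondence. The Python sketch below illustrates this idea only; it is not the paper's exact solver. It assumes a y-up camera frame, unit plane normals, and 3-D points obtained from the depth map, and the helper names (yaw_rotation, estimate_from_plane_and_point) are hypothetical.

    import numpy as np

    def yaw_rotation(theta):
        """Rotation about the vertical (y) axis, the only rotation allowed under planar motion."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, 0.0, s],
                         [0.0, 1.0, 0.0],
                         [-s, 0.0, c]])

    def estimate_from_plane_and_point(n1, n2, p1, p2):
        """Illustrative planar-motion estimate from one vertical-plane normal and one point.

        n1, n2 : unit normals of the same vertical plane observed in frames 1 and 2
        p1, p2 : the same 3-D point (back-projected with depth) in frames 1 and 2
        Assumed motion model: n2 = R(theta) @ n1 and p2 = R(theta) @ p1 + t, with t = [tx, 0, tz].
        """
        # Yaw: compare the in-plane (x-z) direction of the plane normal in the two frames.
        theta = np.arctan2(n1[2], n1[0]) - np.arctan2(n2[2], n2[0])
        R = yaw_rotation(theta)
        # Translation: one point correspondence fixes t once R is known.
        t = p2 - R @ p1
        t[1] = 0.0  # enforce the planar-motion constraint (no vertical translation)
        return R, t

In a frame-to-frame visual odometry pipeline, a minimal solver of this kind would typically be wrapped in a robust estimation loop (e.g. RANSAC) over many correspondences; the sketch omits that, as well as the paper's 2-point homography-based fallback used when no vertical plane is visible.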



Author information

Corresponding author: Xuelan Mu.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Mu, X., Hou, Z., Zhang, Y. (2019). Egomotion Estimation Under Planar Motion with an RGB-D Camera. In: Cui, Z., Pan, J., Zhang, S., Xiao, L., Yang, J. (eds) Intelligence Science and Big Data Engineering. Visual Data Engineering. IScIDE 2019. Lecture Notes in Computer Science, vol 11935. Springer, Cham. https://doi.org/10.1007/978-3-030-36189-1_6

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-36189-1_6


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-36188-4

  • Online ISBN: 978-3-030-36189-1

  • eBook Packages: Computer Science, Computer Science (R0)
