Abstract
We present a real-time feature-based SLAM (Simultaneous Localization and Mapping) system for fisheye cameras with a large field of view (FoV). Large-FoV cameras benefit large-scale outdoor SLAM because they increase the visual overlap between consecutive frames and capture more pixels belonging to the static parts of the environment. However, current feature-based SLAM systems such as PTAM and ORB-SLAM support only the pinhole camera model. To fill this gap, we propose a novel SLAM system based on a cubemap model that exploits the full FoV without introducing the distortion of the fisheye lens, which greatly benefits the feature-matching pipeline. In the initialization and point-triangulation stages, we adopt a unified vector-based representation to handle matches across multiple cube faces efficiently, and on top of this representation we propose and analyze a novel inlier-checking metric. In the optimization stage, we design and test a novel multi-pinhole reprojection error metric that outperforms other metrics by a large margin. We evaluate our system comprehensively on a public dataset as well as a self-collected dataset containing challenging real-world sequences. The results suggest that our system is more robust and accurate than other feature-based fisheye SLAM approaches. The CubemapSLAM system has been released publicly.
This work is partially supported by the National Natural Science Foundation (61872200), the Natural Science Foundation of Tianjin (17JCQNJC00300) and the National Key Research and Development Program of China (2016YFC0400709).
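The cubemap model described in the abstract maps each viewing ray to one of several virtual pinhole faces of a cube, so that standard pinhole feature pipelines apply on each face without fisheye distortion. The sketch below illustrates only this core face-selection and projection step; it is not the authors' implementation, and the five-face layout, focal length `f`, and face naming are illustrative assumptions (back-facing rays are not handled here).

```python
def bearing_to_cubemap(v, f=300.0):
    """Map a bearing vector to (face, u, w) pixel coordinates on a cubemap.

    Illustrative sketch of the piecewise-pinhole idea: each cube face acts
    as a pinhole camera with focal length f and principal point at the face
    center (each face is 2f x 2f pixels). Only the five forward-looking
    faces of a large-FoV fisheye are covered; rays with z <= 0 whose
    dominant axis is z are not handled in this sketch.
    """
    x, y, z = v
    ax, ay, az = abs(x), abs(y), abs(z)
    # Choose the cube face whose axis dominates the bearing direction.
    if az >= ax and az >= ay and z > 0:
        face, a, b, depth = "front", x, y, z
    elif ax >= ay:
        face = "right" if x > 0 else "left"
        a, b, depth = (-z if x > 0 else z), y, ax
    else:
        face = "down" if y > 0 else "up"
        a, b, depth = x, (-z if y > 0 else z), ay
    # Ordinary pinhole projection onto the selected face.
    u = f * a / depth + f
    w = f * b / depth + f
    return face, u, w
```

Because each face is a genuine pinhole view, descriptors such as ORB can be extracted and matched on the faces directly; the unified vector-based representation then lets matches that straddle face boundaries be handled in a single 3D frame.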
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Wang, Y. et al. (2019). CubemapSLAM: A Piecewise-Pinhole Monocular Fisheye SLAM System. In: Jawahar, C., Li, H., Mori, G., Schindler, K. (eds) Computer Vision – ACCV 2018. ACCV 2018. Lecture Notes in Computer Science, vol 11366. Springer, Cham. https://doi.org/10.1007/978-3-030-20876-9_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-20875-2
Online ISBN: 978-3-030-20876-9
eBook Packages: Computer Science (R0)