
Time-of-Flight Depth Datasets for Indoor Semantic SLAM

Conference paper in Robotics Research (Springer Proceedings in Advanced Robotics)

Abstract

This paper introduces a medium-scale point cloud dataset for semantic SLAM (Simultaneous Localization and Mapping) acquired with a SwissRanger time-of-flight camera. An indoor environment with relatively stable lighting conditions is considered for mapping and localization. The camera is mounted on a mobile tripod and captures images at prearranged locations in the environment. These prearranged locations serve as ground truth for estimating the deviation of the poses computed by SLAM, and also as initial pose estimates for the ICP (Iterative Closest Point) algorithm. Notably, no inertial measurement unit or visual odometry technique is used, since data from time-of-flight cameras are noisy and sensitive to external conditions such as lighting, transparent surfaces, and parallel overlapping surfaces. In addition, a large collection of household object point clouds is acquired in order to label the scene with semantic information. Beyond the mapping and plane detection performed with a publicly available toolkit, the main contributions of this paper are the complete SLAM dataset with pose files, the point clouds of household objects, and a novel context-based similarity score for evaluating SLAM algorithms.

This work is supported by the Labex IMobS3.
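
To make the evaluation protocol described in the abstract concrete, the sketch below shows how prearranged tripod poses can serve both as initial estimates for ICP and as ground truth for measuring the deviation of the registered poses, followed by a RANSAC plane detection step on a single scan. This is only an illustrative sketch using the Open3D library, not the authors' pipeline (which relies on the publicly available toolkit linked in Note 1); the file names, the 4x4 pose file format, and the error measures are assumptions.

```python
# Minimal sketch (not the authors' pipeline): refine a prearranged tripod pose with ICP,
# compare the result against that same pose used as ground truth, and detect a dominant
# plane. File names and the 4x4 pose file format are illustrative assumptions.
import numpy as np
import open3d as o3d


def load_pose(path):
    """Load a 4x4 homogeneous transform stored as 16 whitespace-separated numbers."""
    return np.loadtxt(path).reshape(4, 4)


def icp_refine(source, target, init_T, max_corr_dist=0.05):
    """Point-to-point ICP seeded with an initial relative pose estimate."""
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist, init_T,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation


def pose_error(T_est, T_ref):
    """Translational (m) and rotational (deg) difference between two poses."""
    delta = np.linalg.inv(T_ref) @ T_est
    trans_err = np.linalg.norm(delta[:3, 3])
    cos_angle = np.clip((np.trace(delta[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return trans_err, np.degrees(np.arccos(cos_angle))


if __name__ == "__main__":
    # Hypothetical layout: scan_XXX.pcd with the prearranged relative pose in pose_XXX.txt.
    source = o3d.io.read_point_cloud("scan_001.pcd")
    target = o3d.io.read_point_cloud("scan_000.pcd")
    T_prior = load_pose("pose_001.txt")  # prearranged pose: ICP seed and ground truth

    T_icp = icp_refine(source, target, T_prior)
    t_err, r_err = pose_error(T_icp, T_prior)
    print(f"deviation from prearranged pose: {t_err:.3f} m, {r_err:.2f} deg")

    # Plane detection on a single scan via RANSAC (e.g. floor or tabletop).
    plane_model, inliers = target.segment_plane(distance_threshold=0.02,
                                                ransac_n=3, num_iterations=1000)
    print(f"dominant plane {plane_model} with {len(inliers)} inlier points")
```

Seeding ICP directly with the prearranged poses is what allows the pipeline to dispense with inertial measurement units and visual odometry, as noted in the abstract.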


Notes

  1. http://slam6d.sourceforge.net/
  2. http://kos.informatik.uni-osnabrueck.de/3Dscans/


Acknowledgements

This work is supported by the French government research program Investissements d’Avenir through the RobotEx Equipment of Excellence (ANR-10-EQPX-44) and the IMobS3 Laboratory of Excellence (ANR-10-LABX-16-01), by the European Union through the program Regional competitiveness and employment 2007–2013 (ERDF - Auvergne region), and by the Auvergne region.

Author information


Corresponding author

Correspondence to Paul Checchin.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Ghorpade, V.K., Borrmann, D., Checchin, P., Malaterre, L., Trassoudaine, L. (2020). Time-of-Flight Depth Datasets for Indoor Semantic SLAM. In: Amato, N., Hager, G., Thomas, S., Torres-Torriti, M. (eds) Robotics Research. Springer Proceedings in Advanced Robotics, vol 10. Springer, Cham. https://doi.org/10.1007/978-3-030-28619-4_48
