Localization of 3D objects using model-constrained SLAM

Abstract

Accurate, real-time camera localization relative to an object is needed for high-quality Augmented Reality applications. However, static object tracking is not an easy task in an industrial context, where objects may be textured or untextured, have sharp edges or occluding contours, and be relatively small or too large to be entirely observable from a single viewpoint. This paper presents a localization solution built on a keyframe-based SLAM algorithm. The solution only requires the video stream of a single 2D camera (color or grayscale) and prior knowledge of a 3D mesh model of the object to localize (also called the object of interest in this document). The 3D model provides an absolute constraint that drastically reduces SLAM drift. This constraint is based on 3D-oriented contour points, called edgelets, dynamically extracted from the model using analysis-by-synthesis on the graphics hardware. The model constraint is then expressed through two different formalisms in the SLAM optimization process. Dynamic edgelet generation ensures the genericity of our tracking method, since it makes it possible to localize both polyhedral and curved objects. The proposed solution is easy to deploy, requires no manual intervention on the model, and runs in real time on HD video streams. It is thus well suited to high-quality Augmented Reality experiences. Videos are available as supplementary material.
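To make the model constraint described in the abstract concrete, the sketch below shows the standard one-dimensional residual used in contour-based tracking: a 3D edgelet point is projected into the image, and the error is measured along the normal of the image edge it should lie on. This is an illustrative sketch only, not the paper's implementation; the function names, the pinhole projection model, and the choice of a signed point-to-line distance are assumptions, and in a full system this residual would be minimized jointly with the SLAM reprojection terms in a constrained bundle adjustment.

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of a 3D point X with intrinsics K and pose (R, t)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def edgelet_residual(K, R, t, X, n, m):
    """Signed point-to-line distance between the projection of a 3D edgelet
    point X and the image edge it is matched to.

    n : unit normal of the observed image edge (perpendicular to the contour)
    m : any point on the observed image edge

    Only the error component along the edge normal is kept: displacement
    along the contour itself is unobservable, so the constraint is 1D.
    """
    p = project(K, R, t, X)
    return float(n @ (p - m))
```

For example, with an identity pose and a focal length of 100 pixels, an edgelet point at depth 2 on the optical axis projects to the image origin; if the matched edge passes 0.5 pixels away along its normal, the residual is 0.5 pixels (signed). Minimizing many such residuals over the camera poses and map points is what anchors the SLAM reconstruction to the model.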



Acknowledgements

This work was partly funded by the French research program FUI through the projects NASIMA and SEEMAKE. The authors would also like to thank their project partners Diotasoft and Faurecia for providing the car seat sequence.

Author information

Correspondence to Angelique Loesch.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (avi 152285 KB)

About this article

Cite this article

Loesch, A., Bourgeois, S., Gay-Bellile, V. et al. Localization of 3D objects using model-constrained SLAM. Machine Vision and Applications 29, 1041–1068 (2018). https://doi.org/10.1007/s00138-018-0951-x

Keywords

  • Simultaneous localization and mapping
  • Constrained bundle adjustment
  • Occluding contours
  • Memory consumption
  • Real time
  • Augmented Reality