
Markerless Inside-Out Tracking for 3D Ultrasound Compounding

  • Benjamin Busam
  • Patrick Ruhkamp
  • Salvatore Virga
  • Beatrice Lentes
  • Julia Rackerseder
  • Nassir Navab
  • Christoph Hennersperger
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11042)

Abstract

Tracking of rotation and translation of medical instruments plays a substantial role in many modern interventions and is essential for 3D ultrasound compounding. Traditional external optical tracking systems are often subject to line-of-sight issues, in particular when the region of interest is difficult to access. The introduction of inside-out tracking systems aims to overcome these issues. We propose a markerless tracking system based on visual SLAM to enable tracking of ultrasound probes in an interventional scenario. To achieve this goal, we mount a miniature multi-modal (mono, stereo, active depth) vision system on the object of interest and relocalize its pose within an adaptive map of the operating room. We compare state-of-the-art algorithmic pipelines and apply the idea to transrectal 3D ultrasound (TRUS). The obtained volumes are compared to reconstructions using a commercial optical tracking system as well as a robotic manipulator. Feature-based binocular SLAM is identified as the most promising method and is tested extensively in challenging clinical environments and for the use case of prostate US biopsies.
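The compounding step itself follows the standard freehand 3D ultrasound pipeline: each B-mode frame is placed into a world-fixed voxel grid by chaining the per-frame camera pose from the inside-out tracker with a fixed image-to-camera calibration. The Python sketch below illustrates this pose composition under simple assumptions; the function name compound_volume, the calibration matrix T_cam_img, the pixel_spacing_mm parameter, and the nearest-neighbour voxel binning are illustrative choices and not the paper's actual implementation.

# Minimal sketch of pixel-based 3D ultrasound compounding from tracked poses.
# Assumptions (not from the paper): 4x4 homogeneous transforms, a fixed
# calibration T_cam_img mapping US image coordinates (mm) into the camera
# frame, and simple nearest-neighbour binning into an isotropic voxel grid.
import numpy as np

def compound_volume(frames, T_cam_img, pixel_spacing_mm,
                    voxel_size_mm=0.5, grid_shape=(256, 256, 256),
                    grid_origin_mm=(0.0, 0.0, 0.0)):
    """frames: iterable of (T_world_cam, us_image) pairs.
    T_world_cam: 4x4 camera pose from the inside-out tracker (e.g. SLAM).
    us_image:    2D grayscale ultrasound frame (H x W).
    Returns an averaged intensity volume on a regular voxel grid."""
    acc = np.zeros(grid_shape, dtype=np.float64)   # summed intensities
    cnt = np.zeros(grid_shape, dtype=np.int64)     # hit counts per voxel
    origin = np.asarray(grid_origin_mm, dtype=np.float64)

    for T_world_cam, img in frames:
        h, w = img.shape
        # Pixel centres in image-plane coordinates (mm); z = 0 in the image frame.
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        pts_img = np.stack([u.ravel() * pixel_spacing_mm[0],
                            v.ravel() * pixel_spacing_mm[1],
                            np.zeros(u.size),
                            np.ones(u.size)])        # 4 x N homogeneous
        # Chain calibration and tracking: image -> camera -> world.
        pts_world = (T_world_cam @ T_cam_img @ pts_img)[:3]
        # Nearest-neighbour voxel indices; discard points outside the grid.
        idx = np.round((pts_world.T - origin) / voxel_size_mm).astype(int)
        inside = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
        idx, vals = idx[inside], img.ravel()[inside]
        np.add.at(acc, (idx[:, 0], idx[:, 1], idx[:, 2]), vals)
        np.add.at(cnt, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)

    # Average overlapping contributions; untouched voxels stay zero.
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)

In practice, T_cam_img would come from a freehand ultrasound calibration procedure and T_world_cam from the SLAM relocalization; temporal synchronization of poses and image timestamps as well as hole filling in the volume are omitted in this sketch.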

Keywords

3D ultrasound imaging · Line-of-sight avoidance · Visual inside-out tracking · SLAM · Computer assisted interventions


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Benjamin Busam (1, 2)
  • Patrick Ruhkamp (1, 2)
  • Salvatore Virga (1)
  • Beatrice Lentes (1)
  • Julia Rackerseder (1)
  • Nassir Navab (1, 3)
  • Christoph Hennersperger (1)
  1. Computer Aided Medical Procedures, Technische Universität München, Munich, Germany
  2. FRAMOS GmbH, Taufkirchen, Germany
  3. Computer Aided Medical Procedures, Johns Hopkins University, Baltimore, USA
