Learning needle tip localization from digital subtraction in 2D ultrasound

  • Cosmas Mwikirize
  • John L. Nosher
  • Ilker Hacihaliloglu
Original Article

Abstract

Purpose

This paper addresses the localization of needles inserted in-plane or out-of-plane during challenging ultrasound-guided interventions in which the shaft and tip have low intensity. Our approach combines a novel digital subtraction scheme, which enhances the low-level intensity changes caused by tip movement in the ultrasound image, with a state-of-the-art deep learning scheme for tip detection.

Methods

As the needle tip moves through tissue, it causes subtle spatiotemporal variations in intensity. Relying on these intensity changes, we formulate a foreground detection scheme that enhances the tip from consecutive ultrasound frames. The tip is further enhanced by solving a spatial total variation regularization problem with the split Bregman method. Lastly, we filter out irrelevant motion events with a deep learning-based, end-to-end, data-driven method that models the appearance of the needle tip in ultrasound images, yielding the needle tip detection.
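As a rough illustration of the two enhancement steps described above, the sketch below subtracts consecutive frames and applies split-Bregman total variation smoothing. It is a minimal sketch under stated assumptions: the simple positive frame difference stands in for the paper's foreground detection scheme, the function name enhance_tip, the tv_weight value, and the use of scikit-image's denoise_tv_bregman are illustrative choices, and the trained tip detector is not reproduced here.

```python
# Minimal sketch of the enhancement stage, assuming grayscale B-mode
# frames as NumPy float arrays in [0, 1]. Not the authors' implementation.
import numpy as np
from skimage.restoration import denoise_tv_bregman

def enhance_tip(prev_frame: np.ndarray, curr_frame: np.ndarray,
                tv_weight: float = 5.0) -> np.ndarray:
    """Highlight subtle intensity changes caused by tip motion.

    prev_frame, curr_frame : consecutive ultrasound frames.
    tv_weight              : split-Bregman TV weight (smaller = smoother);
                             value here is an assumption, not from the paper.
    """
    # Digital subtraction: keep only positive intensity changes, which
    # correspond to newly appearing foreground (the advancing tip).
    diff = np.clip(curr_frame - prev_frame, 0.0, None)

    # Spatial total-variation regularization via the split Bregman solver
    # to suppress speckle while preserving the compact tip response.
    return denoise_tv_bregman(diff, weight=tv_weight)

# The enhanced frame would then be passed to a trained detector (a
# YOLO-style network in the paper) that filters irrelevant motion events
# and returns a bounding box around the needle tip.
```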

Results

The detection model is trained and evaluated on an extensive ex vivo dataset collected with 17G and 22G needles inserted in-plane and out-of-plane in bovine, porcine and chicken phantoms. We use 5000 images extracted from 20 video sequences for training and 1000 images from 10 sequences for validation. The overall framework is evaluated on 700 images from 20 sequences not used in training and validation, and achieves a tip localization error of 0.72 ± 0.04 mm and an overall processing time of 0.094 s per frame (~ 10 frames per second).
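For reference, the reported localization error is the Euclidean distance in millimetres between the predicted and ground-truth tip positions; the snippet below shows one way to compute it. The pixel spacing of 0.15 mm and the coordinate values are hypothetical, not taken from the paper.

```python
import numpy as np

def tip_error_mm(pred_px, gt_px, pixel_spacing_mm):
    """Euclidean distance between predicted and ground-truth tip positions, in mm."""
    pred = np.asarray(pred_px, dtype=float) * pixel_spacing_mm
    gt = np.asarray(gt_px, dtype=float) * pixel_spacing_mm
    return float(np.linalg.norm(pred - gt))

# Hypothetical example: (row, column) pixel coordinates, 0.15 mm/pixel.
print(f"{tip_error_mm((120, 85), (123, 87), 0.15):.2f} mm")
```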

Conclusion

The proposed method is faster and more accurate than state-of-the-art methods and is resilient to spatiotemporal redundancies. The promising results demonstrate its potential for accurate needle localization in challenging ultrasound-guided interventions.

Keywords

Needle tip localization · Ultrasound · Deep learning · Minimally invasive procedures

Notes

Acknowledgements

This work was accomplished with funding support from the North American Spine Society 2017 Young Investigator Award.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

This article does not contain patient data.

Supplementary material

Supplementary material 1 (MP4 51745 kb)


Copyright information

© CARS 2019

Authors and Affiliations

  1. Department of Biomedical Engineering, Rutgers University, Piscataway, USA
  2. Department of Radiology, Rutgers Robert Wood Johnson Medical School, New Brunswick, USA