A Training Tool for Ultrasound-Guided Central Line Insertion with Webcam-Based Position Tracking

  • Mark Asselin
  • Tamas Ungi
  • Andras Lasso
  • Gabor Fichtinger
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11042)


PURPOSE: This paper describes an open-source ultrasound-guided central line insertion training system. Modern clinical guidelines are increasingly recommending ultrasound guidance for this procedure due to the decrease in morbidity it provides. However, there are no adequate low-cost systems for helping new clinicians train their inter-hand coordination for this demanding procedure. METHODS: This paper details a training platform which can be recreated with any standard ultrasound machine using inexpensive components. We describe the hardware, software, and calibration procedures with the intention that a reader can recreate this system themselves. RESULTS: The reproducibility and accuracy of the ultrasound calibration for this system was examined. We found that across the ultrasound image the calibration error was less than 2 mm. In a small feasibility study, two participants performed 5 needle insertions each with an average of slightly above 2 mm error. CONCLUSION: We conclude that the accuracy of the system is sufficient for clinician training.


Keywords: Open-source · Webcam tracking · Central line insertion · Medical training

1 Introduction

Central line insertion is the placement of a catheter, usually through a major vein in the neck, for administering medication and fluids directly into the heart. This common procedure is routinely performed to directly monitor central venous pressure, to deliver large volumes of fluids, or to infuse solutions that would harm peripheral veins.

In many countries, the standard of care for central line insertion includes the use of ultrasound (US) guidance [1]. Ultrasound helps the operator find the optimal needle insertion location on the first insertion attempt, and helps prevent accidental puncture of the carotid artery. US is also used to visualize the patient’s anatomy and provide guidance during insertion of the needle. To insert a needle under US guidance, a clinician must simultaneously manipulate an ultrasound probe and the needle, one in each hand. Maintaining this coordination amidst the many steps of a venous cannulation is a daunting task for new clinicians. The problem is compounded by a lack of accessible, practical training tools with which medical students and clinician trainees can practice this coordination. In this paper we detail an inexpensive and portable system designed to foster this skill in new clinicians through real-time position tracking of the instruments for virtual reality visualization.

1.1 Standard Procedure for Central Line Insertion

To perform central line insertion, a clinician will first select a vein for catheterization. Typical sites include the internal jugular vein (in the neck), the femoral vein (in the thigh), or the subclavian vein (in the upper chest). In this paper we focus on internal jugular vein insertions, but the skills developed apply equally to using US guidance at any of the three sites [2].

Once the clinician has selected the insertion site, they will examine the patient’s anatomy and attempt to discriminate between the vein and other nearby structures, including arteries, nerves and the surrounding tissues. This step is crucially important. Accidental cannulation of the artery is a serious complication in this procedure with the potential to cause significant morbidity or mortality [3]. Other serious complications include pneumothorax (collapsed lung), infection, air embolus, and losing the guidewire into the vasculature. To help avoid these complications, many modern clinical guidelines suggest the use of ultrasound when performing central line insertion. US guidance is especially effective for helping to discern the artery from the vein. In a 900-patient randomized study, Karakitsos et al. compared the use of ultrasound against anatomical landmarks for central line insertion. They found a significant reduction in access time, as well as significant reductions in many of the common complications [4]. For these reasons, modern clinical standards are recommending the use of US guidance for this procedure.

There are two common techniques for positioning the ultrasound probe relative to the vein for the needle insertion. The first is an “out of plane” insertion, where the imaging plane intersects the vein at a right angle. Out of plane insertion provides excellent visualization of the vein and the artery, helping to prevent accidental arterial cannulation; the two vessels can be distinguished by their relative positions within the anatomy. The drawback of the out of plane method is that the operator must advance the needle and the probe iteratively, being very careful not to advance the needle ahead of the US imaging plane. If this were to happen, the operator would lose visualization of the advancing needle’s path.

The second common technique for central line insertion is an “in plane” insertion where the US plane is parallel to the vessel. This technique has the advantage of continuous needle tip visualization, at the expense of making it more difficult to distinguish the artery from the vein. Hybrid techniques have been suggested where the clinician holds the probe at an oblique angle relative to the vein. This is intended to combine the advantages of the in plane and out of plane insertions [5]. In this paper we demonstrate our visualization with the out of plane approach, though it can be easily used for the in plane or oblique approaches by rotating the probe.

1.2 Training Challenges

One of the major challenges faced by new clinicians learning to use US guidance for needle insertion is the development of the requisite hand coordination. Clinicians must be able to simultaneously control the US probe in one hand, and the needle in the other. We have found in earlier studies that 3-dimensional visualization of the ultrasound image and the needle in real time is an effective training tool in learning coordination skills in ultrasound-guided needle placement [6]. This training setup requires position tracking of the ultrasound and the needle. Position tracking has additional advantages besides enabling 3-dimensional visualization as a training tool. Tracking can be used for the quantification of trainee skills for objective competency assessment [7], and for providing real time information to the trainee on the next procedure steps to perform in the early phases of training [8].

Although position tracking of the ultrasound and needle has many advantages during central line insertion training, such tracking currently requires expensive and complicated hardware. In this paper, we aim to show how a tracking system can be built for central line training using only open-source software and an inexpensive webcam for optical tracking. We evaluate the reproducibility and accuracy of the system and perform a small feasibility study.

2 Methods

2.1 Hardware

One major barrier to training new clinicians in US-guided central line catheterization is the high cost of specialized, non-portable hardware. In creating this system we used only off-the-shelf components that are robust and relatively inexpensive to obtain. The design of every custom tool we used is open-sourced, and the tools can be printed on any inexpensive 3D printer. Excluding the computer and the US machine, the total hardware cost for this system is ~$200 US. The system can be built around any computer and any ultrasound machine; we endeavor to describe the system assembly in enough detail to allow it to be replicated easily. Additional instructions, source files, screenshots and information are available on the project’s GitHub page¹.

In our experiments, we used a modern Lenovo laptop computer and a Telemed USB ultrasound machine with an L12 linear probe (Telemed Ltd., Lithuania). We have found this portable ultrasound machine to be well suited to US training applications. We also used an Intel RealSense D415 depth camera (Intel, California, USA). We chose this camera in particular because it has a fixed-focus lens; we have found in the past that webcam autofocus can cause interruptions in tracking. Another advantage of this camera is its integrated depth sensor, capable of producing a point cloud of the scene in front of it; we envision several possible extensions to this system that would make use of this feature.

In addition to the components we purchased, we needed to design and manufacture the several tools shown in Fig. 1. The STL models and source files for all of these tools are open source, accessible on the project’s GitHub page and in the PLUS model repository². Each tool carries a black and white marker to be used with the ArUco marker tracking toolkit [9]. The first tool (A) is a clip that attaches an ArUco marker to the US probe; this clip also has a built-in dimple for performing pivot calibration. The middle tool (B) is a marker plane that rigidly fixes an ArUco marker to the syringe. The hockey-stick-shaped tool (C) is a tracked stylus used to perform the calibration needed to visualize the US image in 3D space. We designed these components in the Autodesk Fusion 360 CAD software (Autodesk, California, USA) and 3D printed the resulting STL models on an inexpensive printer (Qidi Tech, Rui’an, China).
Fig. 1.

Open source 3D printed tools with ArUco markers for tracking. A: ultrasound probe with marker bracket, embedded pivot calibration dimple is circled in red. B: tracked syringe mounted to steel needle. C: tracked stylus for US calibration, note the pointed tip. (Color figure online)

An important consideration when creating these tracked tools is the orientation of the marker with respect to the tracker, which strongly affects tracking accuracy. The goal is to keep the plane of the ArUco marker close to perpendicular to a ray drawn from the tracker to the center of the marker. This consideration must be balanced against ergonomic constraints and the need to avoid marker occlusion. We have found 3D printing useful in solving this problem because it enables the rapid creation of iterative prototypes; it typically takes multiple prototypes to arrive at a satisfactory design.
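As a rough way to reason about this design constraint, the incidence angle of a candidate marker pose can be computed directly. The helper below is a hypothetical sketch (not part of the released tools), assuming positions are expressed in millimeters in the tracking camera’s coordinate system:

```python
import numpy as np

def marker_incidence_angle_deg(marker_center, marker_normal, camera_origin=np.zeros(3)):
    """Angle (degrees) between the marker plane normal and the ray from the
    tracking camera to the marker center. 0 degrees means the marker faces
    the camera head-on, which is the best case for pose estimation."""
    ray = marker_center - camera_origin
    ray = ray / np.linalg.norm(ray)
    n = marker_normal / np.linalg.norm(marker_normal)
    # The marker faces the camera when its normal points back along the ray,
    # so take the absolute value of the dot product before the arccos.
    cos_a = abs(np.dot(n, ray))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
```

For example, a marker 300 mm in front of the camera and facing it directly gives an angle of 0 degrees; tilting it by 45 degrees yields 45. Keeping this angle small during bracket design is one way to formalize the perpendicularity goal described above.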

2.2 System Design

To capture real-time US frames and tracking data we used the PLUS toolkit [10]. In order to track the tools using the Intel RealSense webcam, we used the OpticalMarkerTracking device built into PLUS [11]. This software device allows tracking to be performed using any RGB webcam, including the webcams built into modern laptops. It leverages the ArUco marker tracking toolkit to enable distortion correction of the camera image and pose computation of the black and white patterns shown above.
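A PLUS device configuration for such a setup might look like the fragment below. This is an illustrative sketch only: the device and tool Ids, marker IDs, marker sizes, dictionary name, and camera calibration file name are placeholders, and the exact attribute names should be verified against the PLUS OpticalMarkerTracking documentation rather than taken from here.

```xml
<PlusConfiguration version="2.6">
  <DataCollection StartupDelaySec="1.0">
    <DeviceSet Name="Webcam marker tracking"
               Description="ArUco marker tracking with an RGB webcam" />
    <!-- Hypothetical OpticalMarkerTracking device: one RGB camera,
         three tracked tools (probe, needle/syringe, stylus) -->
    <Device Id="TrackerDevice"
            Type="OpticalMarkerTracking"
            CameraCalibrationFile="RealSenseD415_calibration.yml"
            MarkerDictionary="ARUCO_MIP_36h12"
            ToolReferenceFrame="Tracker">
      <DataSources>
        <DataSource Type="Tool" Id="Probe"  MarkerId="0" MarkerSizeMm="40" />
        <DataSource Type="Tool" Id="Needle" MarkerId="1" MarkerSizeMm="40" />
        <DataSource Type="Tool" Id="Stylus" MarkerId="2" MarkerSizeMm="40" />
      </DataSources>
    </Device>
  </DataCollection>
</PlusConfiguration>
```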

We built the visualization and training software on top of 3D Slicer, a widely used open-source application framework for medical image computing. Specifically, we leveraged the functionality of SlicerIGT [12], the image-guided therapy extension built for 3D Slicer. Using these two tools, the system was assembled without writing any custom software. Instead, we created a Slicer scene through configuration of Slicer widgets in Slicer’s graphical user interface. We then saved the scene into an MRML file, an XML-based file format for medical computing. The MRML scene can then be loaded into Slicer on any computer (Fig. 2), providing an easy distribution mechanism for software developed in this manner.
Fig. 2.

The complete training system in use.

2.3 Calibration

One of the critical steps in building any tracked ultrasound system is calibrating the US image with respect to the position sensor mounted on the US probe, a process typically referred to as ultrasound calibration. To calibrate this training tool, we used a fiducial-based registration procedure. The general idea of this method is to track the positions of the stylus and probe, using corresponding points in each frame of reference to determine the transformation between the two coordinate systems. The process begins by computing the tip of the stylus in its own coordinate system via pivot calibration. Then, a sampling of points distributed across the US image is collected along with the corresponding points in 3D space; we typically use 6–10 such points in our calibrations. In Fig. 3, the selection of a sample point is shown: the position of the stylus tip is recorded in the US image (top left quadrant) and in 3D space (top right quadrant), while the frame of video data from the webcam-based marker tracking is shown in the bottom left quadrant for reference. A more detailed description of this calibration process can be found in the SlicerIGT tracked ultrasound calibration tutorial³.
Fig. 3.

Selection of points during US calibration. Top left: stylus tip position in US image coordinates. Top right: stylus tip position in 3D space. Bottom left: image of stylus & US probe from which tracking data was computed.
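The two numerical steps described above, pivot calibration and fiducial (landmark) registration, can be sketched in a few lines of NumPy. This is a generic illustration of the standard algorithms (least-squares pivoting and Arun’s SVD method), not the exact SlicerIGT implementation:

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Recover the stylus tip in the marker frame (and the fixed pivot point
    in tracker coordinates) from poses recorded while pivoting the stylus
    about its tip. Solves R_i @ tip + t_i = pivot in a least-squares sense."""
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, t) in enumerate(zip(rotations, translations)):
        A[3*i:3*i+3, 0:3] = R           # unknown tip (marker frame)
        A[3*i:3*i+3, 3:6] = -np.eye(3)  # unknown pivot (tracker frame)
        b[3*i:3*i+3] = -t
    x, _res, _rank, _sv = np.linalg.lstsq(A, b, rcond=None)
    tip_in_marker, pivot_in_tracker = x[:3], x[3:]
    rms = np.sqrt(np.mean((A @ x - b) ** 2))
    return tip_in_marker, pivot_in_tracker, rms

def fiducial_registration(points_fixed, points_moving):
    """Rigid landmark registration (Arun's SVD method): find R, t such that
    R @ moving + t best matches fixed, as when registering stylus-tip points
    picked in the US image to the same points in tracker space."""
    cf = points_fixed.mean(axis=0)
    cm = points_moving.mean(axis=0)
    H = (points_moving - cm).T @ (points_fixed - cf)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = cf - R @ cm
    residuals = points_fixed - (points_moving @ R.T + t)
    fre = np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))  # RMS FRE
    return R, t, fre
```

The RMS values returned by these two functions correspond to the pivot RMS error and the fiducial registration error (FRE) reported in Table 1.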

2.4 Calibration Verification

To verify the reproducibility of our US calibration, we performed a sequence of 5 calibrations. We then placed virtual points in 5 regions of interest in the US frame: the center and each of the four corners. The center was selected because it is typically where the needle insertion target will be, and the corners because any rotational error in the calibration is most significant there. We transformed each of these points to physical space using all 5 of the US calibrations, resulting in 5 clusters of 5 points each. For each cluster, we took the center of mass as our best approximation of the true physical-space position of the point, and then computed the average distance of the points in the cluster from that approximation.
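This reproducibility metric is straightforward to compute once the image-to-probe transforms are available as homogeneous matrices; a minimal sketch (the function name is ours, not from the released software):

```python
import numpy as np

def cluster_spread_mm(image_point, image_to_probe_transforms):
    """Map one US image point (image coordinates, mm) through several
    calibration transforms (4x4 homogeneous matrices) and return the mean
    distance of the mapped points from their centroid, i.e. the
    reproducibility of the calibration at that image location."""
    p = np.append(image_point, 1.0)  # homogeneous coordinates
    mapped = np.array([(T @ p)[:3] for T in image_to_probe_transforms])
    centroid = mapped.mean(axis=0)   # best estimate of the true position
    return float(np.mean(np.linalg.norm(mapped - centroid, axis=1)))
```

Evaluating this function at the image center and the four corners, for the 5 recorded calibrations, yields the per-region values of Table 2.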

Lastly, we tested the system by having 2 users, one experienced with ultrasound and the other an intermediate operator, perform 5 needle insertions each. For each insertion, the operator targeted a 2 mm steel sphere implanted into a clear plastisol phantom. To assess their accuracy, we measured the maximum distance between the center of their needle tip and the closest side of the steel sphere. During the insertion the users were requested not to look directly at the phantom, relying only on the display of the training system.

3 Results

For each of the 5 calibration trials we recorded the root mean square (RMS) error of the pivot and fiducial registrations (Table 1). Note that these RMS errors are not a metric of accuracy; however, they are a good measure of the reproducibility of the system.
Table 1.

RMS error from each pivot calibration and its corresponding fiducial registration.

Calibration #    Pivot RMS Error (mm)    FRE (RMS, mm)
Mean (STD)       0.51 (0.06)             1.58 (0.23)

Using each of the 5 US calibrations, we mapped 5 fiducials into their 3D positions using the image-to-probe transformation. The average distance of the 5 fiducials in each region from their center of mass is summarized in Table 2. Then, using the best US calibration, two participants performed 5 needle localizations each on a simulated phantom using only the training system for guidance. The average distance from the simulated target for each participant is shown in Table 3.
Table 2.

US calibration errors.

Region of Interest    Average Distance (mm)
Top Left
Top Right
Center
Bottom Left
Bottom Right

Table 3.

Target localization errors.

Participant    Average distance from target, mm (SD)
1              2.16 (1.10)
2              2.32 (0.82)


4 Discussion

Overall, the errors in the calibration of the system fall within an acceptable target range for use in US guided needle insertion training.

Participants noted that an advantage of the clear phantom is the immediate spatial feedback it provides after each localization: participants could look at the phantom and quickly see where the needle was placed with respect to the target (Fig. 4). The authors feel that this may be effective feedback for honing skill with this technique.
Fig. 4.

Needle localization seen through clear phantom.

4.1 Limitations of Methods

The measurement of the needle-to-target-sphere distance using calipers is subject to optical distortion in the clear phantom. To mitigate this, the phantom was designed with flat sides to minimize the lens effect. Ideally, we would have measured the needle-to-sphere distance using X-ray or CT imaging, but these modalities were infeasible within the confines of this preliminary study.

4.2 Potential Improvements

Our lab is currently developing a system called Central Line Tutor, which provides guidance to trainees learning the sequence of steps for performing US-guided central line insertion. It would be a straightforward exercise to integrate the two platforms, providing a complete low-cost toolkit for central line insertion training.

5 Conclusion

We have demonstrated the feasibility of using a webcam-based system for training new clinicians’ hand coordination for ultrasound-guided central line insertion. Our training platform focuses on developing the inter-hand coordination required for the needle insertion portion of the procedure.


  1. Project GitHub page:
  2. PLUS Toolkit open source model catalog, accessible at:
  3. SlicerIGT tracked ultrasound calibration tutorial:



Acknowledgments. This work was funded, in part, by NIH/NIBIB and NIH/NIGMS (via grant 1R01EB021396-01A1 - Slicer + PLUS: Point-of-Care Ultrasound) and by CANARIE’s Research Software Program. Gabor Fichtinger is supported as a Cancer Care Ontario Research Chair in Cancer Imaging. Mark Asselin is supported by an NSERC USRA.


References

  1. Frykholm, P., et al.: Clinical guidelines on central venous catheterization. Acta Anaesthesiol. Scand. 58, 508–524 (2014)
  2. Rigby, I., Howes, D., Lord, J., Walker, I.: Central Venous Access. Resuscitation Education Consortium/Kingston Resuscitation Institute
  3. Gillman, L.M., Blaivas, M., Lord, J., Al-Kadi, A., Kirkpatrick, A.W.: Ultrasound confirmation of guidewire position may eliminate accidental arterial dilation during central venous cannulation. Scand. J. Trauma Resusc. Emerg. Med. 18, 39–42 (2010)
  4. Karakitsos, D., et al.: Real-time ultrasound guided catheterization of the internal jugular vein: a prospective comparison with the landmark technique in critical care patients. Crit. Care 10(6), R162 (2006)
  5. Phelan, M., Hagerty, D.: The oblique view: an alternative approach for ultrasound-guided central line placement. J. Emerg. Med. 37(4), 403–408 (2008)
  6. Keri, Z., et al.: Training for ultrasound-guided lumbar puncture on abnormal spines using an augmented reality system. Can. J. Anesth. 62(7), 777–784 (2015)
  7. Clinkard, D., et al.: The development and validation of hand motion analysis to evaluate competency in central line catheterization. Acad. Emerg. Med. 22(2), 212–218 (2015)
  8. Hisey, R., Ungi, T., Holden, M., Baum, Z., Keri, Z., Fichtinger, G.: Real-time workflow detection using webcam video for providing real-time feedback in central venous catheterization training. In: SPIE Medical Imaging 2018, Houston, Texas, USA, 10–15 February 2018
  9. Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F.J., Marín-Jiménez, M.J.: Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recogn. 47(6), 2280–2292 (2014)
  10. Lasso, A., Heffter, T., Rankin, A., Pinter, C., Ungi, T., Fichtinger, G.: PLUS: open-source toolkit for ultrasound-guided intervention systems. IEEE Trans. Biomed. Eng. 61(10), 2527–2537 (2014)
  11. Asselin, M., Lasso, A., Ungi, T., Fichtinger, G.: Towards webcam-based tracking for interventional navigation. In: SPIE Medical Imaging 2018, Houston, Texas, USA, 10–15 February 2018
  12. Ungi, T., Lasso, A., Fichtinger, G.: Open-source platforms for navigated image-guided interventions. Med. Image Anal. 33, 181–186 (2016)

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

Mark Asselin, Tamas Ungi, Andras Lasso, Gabor Fichtinger
Laboratory for Percutaneous Surgery, Queen’s University, Kingston, Canada
