Medical Robotics for Ultrasound Imaging: Current Systems and Future Trends

Abstract

Purpose of Review

This review provides an overview of robotic ultrasound systems that have emerged over the past five years, highlighting their current status and future directions. The systems are categorized based on their level of robot autonomy (LORA).

Recent Findings

Teleoperated systems show the highest level of technical maturity. Collaborative assisting and autonomous systems are still in the research phase, with a focus on ultrasound image processing and force adaptation strategies. However, key elements such as clinical studies and appropriate safety strategies are still missing. Future research will likely focus on artificial intelligence and virtual/augmented reality to improve image understanding and ergonomics.

Summary

A review of robotic ultrasound systems is presented in which technical specifications are outlined first. Thereafter, the literature of the past five years is subdivided into teleoperation, collaborative assistance, and autonomous systems based on LORA. Finally, future trends for robotic ultrasound systems are reviewed with a focus on artificial intelligence and virtual/augmented reality.

Introduction

Ultrasound has become an indispensable medical imaging modality for both diagnostics and interventions. As a radiation-free, portable, widely available, and real-time capable imaging technique, it has significant advantages compared to other modalities such as computed tomography (CT) or magnetic resonance imaging (MRI). Additionally, real-time volumetric ultrasound (four-dimensional, 4D) has recently gained attention as new matrix array probes provide sufficiently high frame rates for many medical applications. However, ultrasound is a strongly user-dependent modality that requires highly skilled and experienced sonographers for proper examinations. Apart from identifying the correct field of view, which demands continuous focus on the ultrasound station screen, and holding the probe manually with appropriate pressure, the examiner must also adjust several imaging settings on the ultrasound station. This unergonomic examination process may lead to work-related musculoskeletal disorders [1, 2]. Furthermore, manual guidance of the probe makes reproducible image acquisition almost impossible. While spatially and temporally separated image acquisition and diagnostics are common practice for MRI and CT, sonographers must perform both at the same time, making the examination mentally more demanding.

Robotic ultrasound is the fusion of a robotic system and an ultrasound station with its probe attached to the robot end-effector. This combination might overcome the disadvantages of ultrasound by means of a teleoperated, a collaborative assisting, or even an autonomous system. A range of commercial and research systems have been developed over the past two decades for different medical fields, and many were summarized in previous reviews [3, 4]. This review, however, focuses on the most recent systems, with an emphasis on findings published in the last five years, highlighting the current status and future directions of robotic ultrasound. We use the level of robot autonomy (LORA) [5] to organize the sections of this review into teleoperated, collaborative assisting, and autonomous systems. In addition, each described system was objectively assigned a LORA between one and nine after defining the task to be performed autonomously by the robotic ultrasound system as: the ultrasound acquisition of a specific anatomical region of interest (ROI), including the initial placement of the ultrasound probe. The LORA values correspond to the following terms (further information on the levels in Fig. 6, Appendix 1):

Teleoperation:

  1. Teleoperation
  2. Assisted Teleoperation

Collaborative assistance:

  3. Batch Processing
  4. Decision support

Autonomous systems:

  5. Shared control with human initiative
  6. Shared control with robot initiative
  7. Executive control
  8. Supervisory control
  9. Full autonomy

This review starts by presenting the technical specifications and requirements for these systems with a focus on ultrasound imaging and safety considerations of the robot. The reviewed systems are then categorized into teleoperation, collaborative assistance, and autonomous systems. Finally, an outlook on future directions of robotic ultrasound systems combined with artificial intelligence (AI) or virtual/augmented reality (VR/AR) is provided, as these technologies have gained increased attention in recent years. AI-based applications can achieve exceptional performance in medical image understanding, which could be crucial for increasing the autonomy of robotic ultrasound systems. VR/AR, on the other hand, may augment the physician’s perception with subsurface targets and critical structures while also potentially improving 3D understanding.

Technical Specifications

Ultrasound Imaging

Using a robot to perform ultrasound imaging poses task-specific challenges for the imaging system. If the task of the robotic ultrasound system requires visual servoing (the process of controlling robot motion based on image information [6, 7]), online data access is mandatory. In the case of two-dimensional (2D) ultrasound images, data can usually be accessed by grabbing the frames at the display output of the ultrasound system. In contrast, volumetric data offer the distinct advantage of covering entire anatomical structures, whose motion paths can then be used for automated robotic control. However, three-dimensional (3D) data are more complex and therefore require a dedicated streaming interface. Robotic ultrasound imaging might also require remote or even automatic control of the imaging parameters, which are usually adjusted manually on the ultrasound system. Remote control, just like direct data access, is typically not enabled by commercial diagnostic systems and thus requires the development of open platforms or close collaboration with manufacturers for integration.

Force Sensitivity and Safety Considerations

Medical robotic ultrasound sets special safety requirements beyond the established industry standards of human-robot collaboration, in which direct contact between the robot and humans is typically to be avoided. Patients, who are purposely touched by the moving robot tool, are in an unprotected position, may be physically weakened, and have no quick means of escape from the danger area. The potential hazards to patient and personnel during robot operation include clamping, squeezing, impact, and pressing in various ways. These hazards can be detected by extensive technical precautions on the robot system and should be prevented or stopped at the onset of a potential injury.

Safety technologies usually rely either on external force/torque sensors mounted on the end-effector or, in the case of lightweight robots, on torque sensors integrated in all joints, realizing proprioceptive sensing. While the former does not allow collision checks of the arm links, the latter calculates the contact force at the end-effector and possible collision forces at the individual arm links by means of a dynamic model fed with the joint torque measurements. Moreover, this technique enables impedance/admittance-controlled motion modes that mimic the behavior of a multidimensional spring-damper system, allowing safer human-robot interaction. Lightweight robots also have the advantage of carrying less kinetic energy and thus potentially reducing the risk of injury. Camera surveillance and the integration of external proximity sensors can also reduce the risks but are more expensive to implement and maintain and can be adversely affected by interruptions of the direct line-of-sight. In addition, research is also being conducted on mechanical safety concepts that intrinsically protect against hazards [8, 9].
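
The spring-damper behavior mentioned above can be illustrated with a minimal one-dimensional admittance control sketch. The virtual mass, damping, and stiffness values, as well as the interface (measured contact force in, virtual position out), are illustrative assumptions and not taken from any reviewed system.

```python
# Minimal 1-D admittance control sketch: the commanded probe motion obeys a
# virtual spring-damper-mass law M*a + D*v + K*x = f_meas - f_des.
# M, D, K are illustrative values, not parameters of a real system.
def admittance_step(f_meas, f_des, x, v, dt, M=2.0, D=40.0, K=300.0):
    """One semi-implicit Euler step; returns the new virtual position/velocity."""
    a = (f_meas - f_des - D * v - K * x) / M
    v = v + a * dt
    x = x + v * dt
    return x, v

# A constant 5 N excess contact force deflects the virtual spring towards its
# steady-state value (f_meas - f_des) / K.
x, v = 0.0, 0.0
for _ in range(5000):  # 5 s at a 1 kHz control rate
    x, v = admittance_step(f_meas=10.0, f_des=5.0, x=x, v=v, dt=0.001)
```

In a real system, the virtual position would be converted into a Cartesian motion command, so that the probe yields compliantly to any excess contact force instead of pressing rigidly against the patient.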

Dynamic concepts of injury prevention include velocity profiles adapted to the distance from the patient and the blocking of protected areas against robot movement. Additionally, the anticipation and handling of collisions in the application context through a structured real-time process could be used to prevent adverse events [10]. The fast and often short-lived nature of collisions requires maximal detection and data processing speed. The main challenge of collision detection is monitoring signals with high sensitivity while also avoiding false alarms.
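
The sensitivity/false-alarm trade-off can be reduced to a toy example: a low-pass filtered external-torque residual compared against a threshold. The filter constant and threshold values below are invented for illustration; real systems use considerably more elaborate disturbance observers.

```python
# Sketch of residual-based collision detection: a low-pass filtered external
# torque residual is compared against a threshold. The smoothing factor alpha
# trades detection speed (sensitivity) against noise robustness (false alarms).
# Threshold and alpha are illustrative values.
def detect_collision(residuals, threshold=2.0, alpha=0.3):
    """Return the first sample index where the filtered residual exceeds
    the threshold, or None if no collision is detected."""
    filtered = 0.0
    for i, r in enumerate(residuals):
        filtered = alpha * abs(r) + (1.0 - alpha) * filtered
        if filtered > threshold:
            return i
    return None

# Sensor noise stays below the threshold; a sustained torque spike starting
# at sample 50 is flagged shortly after onset due to the filtering delay.
signal = [0.5] * 50 + [5.0] * 10
hit = detect_collision(signal)
```

A larger alpha detects the spike earlier but also lets single noisy samples through, which is exactly the sensitivity-versus-false-alarm dilemma described above.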

Safety aspects are often not the primary focus in many research projects. Nevertheless, meeting these safety requirements should be considered already during the conception and development phases of a project to ensure safe operation and facilitate a subsequent product certification.

Teleoperation

The operator dependency of ultrasound imaging means that receiving a reliable diagnosis generally depends on the availability of an expert sonographer. Considering the shortage of trained experts especially in remote regions, access to ultrasound imaging can be very limited, increasing travel and waiting times with potential negative effects on patient outcomes. Another problem is the physical strain of manually handling the probe [1, 2]. Remote control of the ultrasound probe using robotic technology (LORA one and two) holds the potential to solve these problems. In this section, the most recent systems are categorized into custom design and commercially available robotic hardware and summarized in Table 1.

Table 1 Overview of teleoperated and collaborative robotic ultrasound systems and their respective components, published between 2015 and 2020

Custom Design Robots

The only commercially available teleoperated ultrasound solutions to date are the MGIUS-R3 (MGI Tech Co.) system [11] and the MELODY (AdEchoTech) system [12]. The former system consists of a six degrees of freedom (DOF) robotic arm including a force sensor and the ultrasound probe. A dummy probe (simple model made from plastic) at the physician site allows controlling the actual probe at the remote site. A single study was conducted to assess the feasibility of examining a patient with COVID-19, highlighting its advantage regarding the eliminated infection risk for the physician [13]. MELODY consists of a specialized robotic probe holder at the patient site (Fig. 1a) with three passive DOF for positioning, three active DOF for rotating the probe, and a force sensor. Coarse translational positioning of the robot is handled by a human assistant, while fine adjustments of probe orientation are remotely controlled by the expert sonographer via a haptic device with force feedback. MELODY has already been used for cardiac [14], abdominal [15, 16], obstetric [15, 17•], pelvic, and vascular telesonography [15] in over 300 patients.

Fig. 1

Overview of different teleoperated robotic ultrasound systems. a MELODY system used in an abdominal exam (picture courtesy S. Avgousti, Cyprus University of Technology). b ReMeDi system used in a cardiac exam (figure by M. Giuliani et al. [21••] under CC-BY license). c TOURS system as utilized for remote exams on the International Space Station (reprinted from [23•], copyright [2018], with permission from Elsevier). d Teleoperated ultrasound platform with haptic device while acquiring an imaging phantom (figure by K. Mathiassen et al. [26] under CC-BY license)

The novel ReMeDi (Remote Medical Diagnostician) system is based on a detailed analysis of user requirements with a focus on safety, dexterity, and accurate tactile feedback [18, 19]. The kinematically redundant robotic arm (Fig. 1b) features seven active DOF and an additional force-torque sensor and was specially designed to reproduce all necessary movements of a human examiner [20]. In contrast to MELODY, ReMeDi does not rely on a human assistant. This system has successfully been tested in 14 patients for remote cardiac exams [21••].

The TOURS (Tele-Operated UltRasound System) features a compact robotic probe manipulator (Fig. 1c) with three active DOF for remote control of probe orientation via a dummy probe without haptic feedback [22]. Translation is handled manually by an assistant at the patient site. TOURS has been tested over long distances for abdominal, pelvic, vascular, and obstetric exams in over 100 patients [22]. The system has also been successfully employed for remote ultrasound scans on the International Space Station [23•].

In [24], a specially designed robot with six DOF and a force sensor was controlled using a dummy probe for probe rotations and a conventional keyboard for translational motion. Feasibility was demonstrated in a healthy volunteer. A compact parallel telerobotic system with six DOF for fine positioning of the probe and haptic feedback for remote control was presented in [25] but has not yet been tested in vivo.

Commercial Robots

In [26], the six DOF UR5 robot (Universal Robots) was used to develop a general, low-cost robotic ultrasound platform. The integrated torque measurements were enhanced with an external force sensor, and a haptic device was used for remote control (Fig. 1d). The system meets the technical requirements for teleoperated ultrasound, but has not been evaluated in vivo [26]. A similar study using the UR5 robot investigated filtering haptic commands and reducing velocity to improve safety [27].

A new control approach was presented in [28, 29] using a lightweight anthropomorphic robot (WAM, Barrett Technology) with seven DOF and remote control with a haptic device. To achieve smooth transitions between free movement and patient contact, an external force sensor and a 3D time-of-flight camera were integrated. The architecture was validated in a pelvic exam of a healthy volunteer with the examiner located in the same room.

In [30], a ProSix C4 robot (Epson) without force sensors was proposed for acquiring ultrasound images for 3D volume reconstruction using remote control of the probe via joystick. Safety and surveillance relied on visual inspection by the operator via camera. The authors tested their setup for a vascular scan on a healthy volunteer.

Summary

The past five years have proven the feasibility of performing remote ultrasound exams of various anatomical regions at varying distances. Patients and examiners generally accept this new technology [21••], which could improve access to care, for example, by reducing waiting times for a consultation in remote locations that lack experienced sonographers [31].

Collaborative Assistance

Research in the field of collaborative robotic ultrasound assistance typically aims to enable physicians to perform standard ultrasound imaging procedures faster, more precisely, and more reproducibly. In addition, collaborative therapy-guided interventions may be performed with reduced assisting personnel or even alone. In this review, collaborative assisting robotic ultrasound systems comprise systems with a LORA of three or four, which can thus perform a certain action and partially even suggest a task plan. This section introduces applications and functionality of such systems, while Table 1 shows an overview of the most important recent systems.

Collaborative Image Acquisition

Janvier et al. [32] reconstructed the iliac artery using a six DOF CRS F3 robot (Thermo CRS) with an attached linear probe: the scan path over the ROI was manually taught, and the vessel surface structure was reconstructed from multiple automatically replayed robotic cross-sectional ultrasound scans. The authors compared the ultrasound volume reconstruction to computed tomography angiographies of a phantom and in vivo. Jiang et al. [33] optimized ultrasound image quality by adjusting the in-plane and out-of-plane orientation of the ultrasound probe. To this end, an initial confidence map of the ultrasound image was analyzed, and a subsequent fan motion was automatically performed with a force-sensitive LBR iiwa robot (KUKA). Virga et al. [34•] developed a method for the correction of contact pressure-induced soft-tissue deformation in 3D ultrasound images. The image-based process estimates displacement fields in a graph-based approach relying solely on the ultrasound images and the applied force measured by the robot. Zhang et al. [35] applied the concept of synthetic tracked aperture ultrasound (STRATUS) to extend the effective aperture size by means of robotic movements (Fig. 2a). The system accurately tracks the orientation and translation of the probe and improves image quality, especially in deeper regions, by synthesizing sub-apertures captured from each probe pose into a high-resolution image. The probe was moved by an operator, while a virtual wall constraining the motion to the desired image plane was mimicked by force feedback control based on an external force-torque sensor.

Fig. 2

Overview of system components for collaborative assisting robotic ultrasound systems. a The STRATUS system including a UR5 robot and an ultrasound probe interconnected by a six DOF force-torque sensor (copyright © [2016] IEEE. Reprinted with permission from [35]). b Near infrared imaging sensors combined with an ultrasound probe for bimodal vessel imaging in the forearm to guide venipuncture (reproduced from [37•] with permission from Springer Nature). c Setup for a flexible needle steering system of two Viper s650 robots (Adept) with needle holder and ultrasound probe (copyright © [2015] IEEE. Reprinted with permission from [38]). d LBR iiwa robot with ultrasound probe on custom mount with needle holder used for facet joint insertion (reproduced from [39•] with permission from Springer Nature)

Collaborative Therapy Guidance

A system for needle insertion and needle guidance during the ablation of liver tumors was developed by Li et al. [36], utilizing a robotic ultrasound system with real-time imaging and respiratory motion compensation. Chen et al. [37•] reported the use of automatic image segmentation, reconstruction, and motion tracking algorithms for the ultrasound probe, which is mechanically connected to near infrared sensors and forms a portable device (Fig. 2b). The system is intended to perform robotic venipuncture but has so far only been validated for manually guided procedures in forearm vessels. Robotized insertion and steering of a flexible needle in a phantom under 3D ultrasound guidance, with one robot for needle steering and a second robot for ultrasound imaging (Fig. 2c), were performed by Chatelain et al. [38]. In 2018, Esteban et al. reported the first clinical trial of a robotized spine facet joint insertion system in [39•], performing a force-compliant sweep over the spine region with automatic volume reconstruction to facilitate intrainterventional insertion planning and subsequent precise needle prepositioning over the target. The system consists of a calibrated probe holder with a needle guide mounted on an LBR iiwa robot (Fig. 2d). A navigation assistant for markerless automatic motion compensation in a custom femur drilling LBR robot (KUKA) was developed by Torres et al. [40] and evaluated on a bone phantom. The dynamic bone position and orientation were registered intrainterventionally using the images of a manually operated, optically tracked ultrasound probe and a preinterventional CT scan in which the target was defined.

Summary

Recent research has covered the optimization of probe alignment, 3D tissue reconstruction, anatomical target recognition, vessel segmentation, and tracking. Intensive work has been done to replace external force sensors, adapt force control for lightweight robots, improve motion compensation and trajectory planning, accelerate real-time imaging, and refine calibration. The resulting systems provide more comfort and less fatigue for the operator and improved image quality compared to conventional ultrasound.

Autonomous Systems

Autonomous systems in the field of robotic ultrasound may be considered systems that independently generate a task plan and consequently control and move the robot to acquire ultrasound images for diagnostic or interventional tasks. This section first reviews autonomous image acquisition systems and then systems for autonomous therapy guidance in the medical fields of minimally invasive procedures, high-intensity focused ultrasound (HIFU), and radiation therapy. The systems described in this section may have a LORA between five and nine; however, the highest LORA observed in this review is seven. The systems are presented in Table 2.

Table 2 Overview of autonomous robotic ultrasound systems and their respective components, published between 2015 and 2020

Autonomous Image Acquisition

Autonomous image acquisition systems are categorized into the following three main objectives: (1) using robotic ultrasound systems to create a volumetric image by combining several images and spatial information, (2) autonomous trajectory planning and probe positioning, and (3) optimizing image quality by probe position adjustment.

3D Image Reconstruction

A robotic ultrasound system to reconstruct peripheral arteries within the leg using 2D ultrasound images and an automatic vessel tracking algorithm was developed in [41]. The physician initially places the probe on the leg such that a cross-section of the vessel is visible. Thereafter, the vessel center is detected, and the robotic arm moves autonomously such that the vessel center stays in the horizontal center of the image. A force-torque sensor placed between the probe holder and the end-effector allows a constant pressure to be kept during the scan. The 3D reconstruction was performed online during the acquisition. Huang et al. [42] presented a more autonomous system that encompasses a depth camera to identify the patient and independently plan the scan path of the ultrasound robot. After spatial calibration, the system could autonomously identify the skin within the image and scan along the coronal plane using a normal vector-based approach for probe positioning (Fig. 3a). Two force sensors placed at the bottom of the probe ensured proper acoustic coupling during image acquisition.
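
The centering behavior described for [41] can be illustrated with a toy visual-servoing step. The intensity-thresholding centroid below merely stands in for the actual vessel tracking algorithm, and the gain and image size are arbitrary.

```python
import numpy as np

# Toy visual-servoing sketch: detect a bright vessel cross-section by simple
# thresholding (standing in for the real tracking algorithm) and compute a
# lateral velocity command driving its centroid to the horizontal image center.
def vessel_center(image, threshold=0.5):
    """Centroid (row, col) of pixels above threshold, or None if none exist."""
    rows, cols = np.nonzero(image > threshold)
    if rows.size == 0:
        return None
    return rows.mean(), cols.mean()

def lateral_command(image, gain=0.01):
    """Proportional control: the command moves the probe so the vessel
    drifts towards the horizontal center of the image. Gain is arbitrary."""
    center = vessel_center(image)
    if center is None:
        return 0.0
    _, col = center
    error = col - image.shape[1] / 2.0  # pixels off-center
    return -gain * error

# Synthetic frame: bright vessel cross-section left of the image center,
# so the resulting command is positive (move towards the vessel).
frame = np.zeros((100, 100))
frame[40:60, 20:30] = 1.0
cmd = lateral_command(frame)
```

The real system additionally regulates the contact force via the force-torque sensor, so image-based lateral control and force control act in orthogonal directions.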

Fig. 3

Overview of different robotic ultrasound systems for autonomous image acquisition. a A robotic ultrasound system autonomously scanning along a lumbar phantom (left) and the reconstructed ultrasound volume from 2D images (right) (copyright © [2019] IEEE. Reprinted with permission from [42]). b System setup including transformations (arrows) between robot, camera, ultrasound probe, and patient (left). MRI atlas displaying the generic trajectory (dotted red line) to image the aorta (right) (copyright © [2016] IEEE. Reprinted with permission from [44•]). c Robotic ultrasound system and phantom (left) with the target (red) in the ultrasound image (top right). A confidence map is calculated, and the current and desired configuration (red and green line, respectively) are calculated (bottom right) (copyright © [2016] IEEE. Reprinted with permission from [49])

Trajectory Planning and Probe Positioning

Hennersperger et al. [43] developed a robotic ultrasound system using an LBR iiwa robot that can autonomously execute trajectories based on start and end points selected by a physician in preinterventional images such as MRI or CT. Given the start and end points within the MRI data, the trajectory was calculated by computing the closest surface point and combining it with the corresponding surface normal direction. Drawbacks of this method are the need for patients to hold their breath and the necessity of preinterventional image acquisition prior to selecting start and end points. The same research group overcame these drawbacks and used the system for quantitative assessment of the diameter of the abdominal aorta [44•]. Based on an MRI atlas and its registration to the current patient, the robot follows a generic trajectory to cover the abdominal aorta (Fig. 3b). An online force adaptation approach allowed measuring the aortic diameter even while the patient was breathing during acquisition. The system setup proposed by Graumann et al. [45] was similar but with the main objective to autonomously compute a trajectory covering a volume of interest within previously obtained images such as CT, MRI, or even ultrasound. The robotic ultrasound system could cover the volume with single or multiple parallel scan trajectories. Kojcev et al. [46] evaluated the reproducibility of measurements obtained from the system's ultrasound volumes compared to expert-operated 2D ultrasound acquisition.
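
The pose construction from a surface point and its normal can be sketched as follows. The frame convention (probe z-axis along the inward surface normal, x-axis along the scan direction) is a plausible choice for illustration, not the authors' exact implementation.

```python
import numpy as np

# Sketch: build a probe pose for one sampled point of a scan trajectory from
# the skin-surface point and its outward normal. Frame convention (z-axis into
# the patient, x-axis along the scan direction) is an illustrative assumption.
def probe_pose(surface_point, surface_normal, scan_direction):
    """Return a 4x4 homogeneous pose: z-axis along the inward normal,
    x-axis along the scan direction projected onto the surface plane."""
    z = -np.asarray(surface_normal, dtype=float)
    z /= np.linalg.norm(z)
    d = np.asarray(scan_direction, dtype=float)
    x = d - np.dot(d, z) * z          # project scan direction onto the plane
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                # complete the right-handed frame
    pose = np.eye(4)
    pose[:3, 0], pose[:3, 1], pose[:3, 2] = x, y, z
    pose[:3, 3] = surface_point
    return pose

# Flat patch facing up: the probe z-axis points down into the patient.
T = probe_pose([0.1, 0.0, 0.3], [0, 0, 1], [1, 0, 0])
```

Sampling such poses along the line between the selected start and end points yields a trajectory of contact poses that a Cartesian controller can then execute with force adaptation.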

Von Haxthausen et al. [47•] developed a system that, after manual initial placement of the probe, can control the robot to follow peripheral arteries, where vessel detection is realized using convolutional neural networks (CNNs).

A system that provides an automatic probe position adjustment with respect to an object of interest was proposed in [48]. Their approach is based on visual servoing using image features (image moments). The authors used a 3D ultrasound probe and extracted features from the three orthogonal planes to servo in- and out-of-plane motions.

Image Quality Improvement

Since ultrasound imaging suffers from high user dependency, there is strong interest in autonomously improving image quality by means of robotic probe positioning. Chatelain et al. dedicated several publications to this topic. The authors proposed a system that can automatically adjust the in-plane rotation for image quality improvement while using a tracking algorithm for a specific anatomical target [49]. The main objective was to keep the object horizontally centered within the ultrasound image while scanning for the best acoustic window on the object (Fig. 3c). However, out-of-plane motion is not controlled. Their subsequent work [50•] utilized the same approach but for an ultrasound volume instead of a 2D image, which in turn could provide tracking and image quality improvement in all six DOF.
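
The idea of steering towards the better acoustic window can be reduced to a toy example: a per-column confidence profile (standing in for a full confidence map) yields a weighted offset from which a small corrective in-plane rotation is derived. The gain and the confidence model are invented for illustration.

```python
# Toy sketch of confidence-driven in-plane adjustment: a per-column confidence
# profile (a stand-in for a full 2D confidence map) gives a weighted lateral
# offset, converted into a small corrective rotation command. The gain and
# the confidence values are illustrative.
def inplane_correction(column_confidence, gain=0.002):
    """Signed rotation command steering the probe towards the side of the
    image with higher confidence (i.e., better acoustic coupling)."""
    n = len(column_confidence)
    total = sum(column_confidence)
    if total == 0:
        return 0.0
    centroid = sum(i * c for i, c in enumerate(column_confidence)) / total
    offset = centroid - (n - 1) / 2.0  # confidence-weighted off-center shift
    return gain * offset

# Confidence skewed to the right half of the image yields a positive command;
# a uniform profile yields no correction.
profile = [0.2] * 50 + [0.9] * 50
angle = inplane_correction(profile)
```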

Summary

Several systems and approaches have been proposed to provide autonomous image acquisition with respect to 3D image reconstruction, trajectory planning, probe positioning, and image quality improvement. A key component for initial autonomous probe placement is a depth camera to capture relative positions of robot and patient. Mostly, preinterventional images such as CT or MRI were used to calculate the trajectory needed to image the desired volume of interest. To improve image quality during acquisition, the systems rely on ultrasound image processing and force information. Even though some studies provide in vivo results, safety aspects with respect to the workflow are rarely considered within the reviewed articles.

Autonomous Therapy Guidance

This subsection presents systems that eliminate the need for human intervention for imaging during therapy. Using an autonomous system has the benefit that the physician can concentrate on the interventional task while a robot performs ultrasound imaging. To realize this, ultrasound images need to be interpreted automatically so that the ROI can be continuously tracked and visualized for guidance.

Minimally Invasive Procedures/Needle Guidance

In [51•], the authors proposed an autonomous catheter tracking system for endovascular aneurysm repair (EVAR). As illustrated in Fig. 4a, an LBR iiwa robot with a 2D ultrasound probe is used to acquire ultrasound images. In a preinterventional CT, the vessel structure of interest is segmented and subsequently registered to the intrainterventional ultrasound images. During the intervention, a catheter is inserted into the abdominal aorta by a physician, and the endovascular tool is guided to the ROI. The robot follows the catheter using a tracking algorithm and a force control law so that the catheter tip remains continuously visible in the ultrasound images. For needle placement tasks such as biopsies, Kojcev et al. [52] proposed an autonomous dual-robot system (Fig. 4b). The system can perform both ultrasound imaging and needle insertion. In this phantom study, two LBR iiwa robots are used, one holding the needle and the other the ultrasound probe. Preinterventional planning data is registered to the robot coordinate system in the initialization phase using image registration. The physician selects the ROI on surface images of the patient acquired by RGB-D (depth) cameras mounted on the robots. The robots move the ultrasound probe and the needle to the ROI and start tracking of the predefined target as well as of the needle to perform the insertion as planned. A dual-robot system provides higher flexibility than a one-robot system as used in [39•, 53], but its setup is more complicated to implement.

Fig. 4

Examples of autonomous therapy guidance systems. a Autonomous robotized catheter tracking for EVAR with an LBR iiwa robot. Robot ultrasound setup (top), ultrasound image (bottom left), and 3D vessel model (bottom right) (copyright © [2019] IEEE. Reprinted with permission from [51•]). b Dual-robot system with two LBR iiwa robots performing both target tracking and needle insertion in a water bath phantom (reproduced from [52] with permission from Springer Nature)

High-Intensity Focused Ultrasound

Another application field is tumor treatment with HIFU. In [54], a 2D ultrasound probe and the HIFU transducer are mounted on a six DOF robotic arm. The HIFU focus is adapted by using speckle tracking to determine the offset between target and HIFU focus. While this phantom study only considered one-dimensional (1D) motion, the authors plan to extend the system to 2D motion. In the system developed by An et al. [55], an optically tracked 2D ultrasound probe is handheld, and a YK400XG robot (YAMAHA) holds the HIFU transducer. The robot adapts the HIFU focus to the target position identified in the ultrasound images. In contrast to other systems, the treatment transducer, but not the ultrasound probe, is robot controlled. Another approach is proposed in [56], where a tracking accuracy study is performed. Here, two 2D ultrasound probes mounted on the HIFU transducer are used to track the target position using image registration with preinterventional image data. So far, the ultrasound probes and the transducer are static, but the authors plan to use a dual-robot system to achieve higher flexibility in the future.
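
The speckle-tracking principle used in [54] to estimate target displacement can be sketched as a lag search maximizing the cross-correlation between a reference echo line and the current one. This integer-lag toy version only illustrates the idea; real implementations work on RF data with sub-sample interpolation.

```python
# Sketch of 1-D speckle tracking: the lag maximizing the cross-correlation
# between a reference echo line and the current line approximates the axial
# displacement of the target. Purely illustrative of the principle, not the
# implementation of any reviewed system.
def speckle_shift(reference, current, max_lag=10):
    """Integer-sample lag maximizing the cross-correlation score."""
    best_lag, best_score = 0, float("-inf")
    n = len(reference)
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                score += reference[i] * current[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# The current line is the reference shifted by 3 samples, so the estimated
# lag recovers that displacement.
ref = [0, 0, 1, 3, 1, 0, 0, 0, 0, 0]
cur = [0, 0, 0, 0, 0, 1, 3, 1, 0, 0]
lag = speckle_shift(ref, cur, max_lag=5)
```

The estimated displacement would then be fed to the robot or beam-steering controller to keep the HIFU focus on the moving target.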

Radiation Therapy

In radiation therapy, tumors are treated using ionizing radiation. Treatment of soft-tissue tumors in particular is challenging due to organ motion [6]. Various approaches have been proposed to track tumor motion and adapt the radiation beam using ultrasound guidance [57, 58•]. However, in the treatment room, the probe needs to be placed on the patient for image acquisition. To help the operator with this task, Şen et al. [59] proposed autonomous robotic ultrasound-guided patient alignment. Kuhlemann et al. [60] proposed a robotic camera-based patient localization approach in which a depth camera is used to localize the patient within the treatment room and to register the body surfaces from the preinterventional CT and the depth camera. In addition, optimal ultrasound view ports were calculated from the preinterventional CT. For treatment delivery, Schlüter et al. [61] proposed the use of a kinematically redundant robot (LBR iiwa) to avoid beam interferences caused by the robotic system and developed strategies for automatic ultrasound probe placement [62••]. In addition, safety aspects need to be considered [63] to prevent collisions and ensure that robot forces do not exceed acceptable values.

Summary

Autonomous therapy guidance systems are highly application-specific and depend on ultrasound image analysis capabilities. While robotic motion compensation can already be performed using force-sensitive robots, the automatic detection of target motion in 2D and 3D ultrasound images is still under active research. Furthermore, most evaluations have been limited to phantom experiments, highlighting the need for more realistic in vivo studies.

Trends and Future Directions

Trends in robotic ultrasound focus on enhancing the autonomy of image acquisition, diagnosis, and therapy guidance. More advanced solutions are needed to replace manual steps such as selecting start and end points on or in the patient’s body. This could be achieved with a body atlas of segmented organs based on MRI data. Furthermore, the capability to compensate for high-dimensional target motion and deformation should be improved to avoid losing target visibility in the ultrasound images. The integration of ultrasound robots into the clinical workflow is also still under investigation. In this context, the interaction between robot, operator, and patient, as well as safety aspects such as collision avoidance, must be improved and evaluated in in vivo studies. This could be achieved by using robots with at least six DOF and internal force sensors, and by additionally employing AI for robot navigation and image analysis. Another approach is the use of VR and AR to create virtual environments and project the ultrasound image information directly into the operator’s field of view.

Towards Intelligent Systems Using Artificial Intelligence

Even though several groups are working towards autonomous systems (Table 2), the highest LORA observed in this review was seven. This might change over the next years due to the recent emergence of technologies in the field of AI.

From our point of view, there are two main application areas in which AI can increase the autonomy of robotic ultrasound systems in the future: image understanding and robot navigation. For image understanding, CNNs have recently shown exceptional performance in medical image analysis [64] and have been successfully applied to ultrasound images [65]. An intelligent image understanding system can aim for enhanced navigation (e.g., automatic landmark detection [66]), for diagnosis based on the acquired images (e.g., autonomous detection of a specific disease [67]), or for the identification of the individually optimal therapy [68]. Regarding autonomous robot navigation, deep reinforcement learning (DRL) [69] led to breakthroughs in robot learning such as human-aware path planning [70], object manipulation [71], and obstacle avoidance in complex dynamic environments [72]. Additionally, DRL has provided promising results for landmark detection in ultrasound images [73] and hence might also be of interest for image understanding. These approaches might play a key role in solving the ultrasound probe placement task completely autonomously, which remains one of the open challenges in autonomous robotic ultrasound system development.
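To make the reinforcement-learning idea concrete, the sketch below trains a tabular Q-learning agent to move a scan position toward a landmark index on a 1D grid. It is a deliberately simplified stand-in for the deep RL landmark-search agents cited above: no neural network, and a synthetic environment of our own invention.

```python
import numpy as np

def train_landmark_policy(n_states=11, landmark=7, episodes=500,
                          alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning: the agent is rewarded only when its scan
    position reaches the landmark; a small step penalty discourages
    wandering. Actions: 0 = move left, 1 = move right."""
    rng = np.random.default_rng(seed)
    q = np.zeros((n_states, 2))
    for _ in range(episodes):
        s = int(rng.integers(n_states))              # random start position
        for _ in range(50):
            a = int(rng.integers(2)) if rng.random() < eps else int(q[s].argmax())
            s2 = min(max(s + (1 if a == 1 else -1), 0), n_states - 1)
            r = 1.0 if s2 == landmark else -0.01
            q[s, a] += alpha * (r + gamma * q[s2].max() - q[s, a])
            s = s2
            if s == landmark:
                break
    return q

def greedy_search(q, start, landmark, max_steps=50):
    """Follow the learned greedy policy from `start` until the landmark."""
    s, n = start, q.shape[0]
    for _ in range(max_steps):
        if s == landmark:
            break
        s = min(max(s + (1 if q[s].argmax() == 1 else -1), 0), n - 1)
    return s
```

The deep variants replace the Q-table with a network that maps ultrasound image patches to action values, but the acquisition loop has the same shape.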

Virtual Reality and Augmented Reality

In VR, a purely digital environment is generated with or without full user immersion, while AR refers to a real-world environment enhanced with overlaid virtual content. Previous research has combined these technologies with robotic ultrasound. Regarding VR, ultrasound data were displayed on graphical user interfaces for navigation [51•, 74, 75]. The virtual scenes were extended with 3D models of the robot that controlled the ultrasound probe for treatment guidance [76] and for simulation and/or verification of the robot setup (Fig. 5a) [77, 78]. Visualizing these virtual environments on head-mounted displays (HMDs) is a logical next step towards a fully immersive experience. Regarding AR, the real scene was enhanced with 2D ultrasound images (Fig. 5b) [79], 3D ultrasound images [80], and tumor models from reconstructed ultrasound volumes [81,82,83]. The AR display technologies involved projection onto the organ surface [81], video see-through devices (specifically, remote consoles for surgical robots [82] and HMDs [83]), and optical see-through HMDs (specifically, HoloLens glasses [80]). These AR setups have a high potential to improve ergonomics since sonographers can look at the patient while acquiring ultrasound images. New developments in ultrasound probes, non-linear image registration, and VR/AR technologies (specifically, visualization techniques, sensor integration, and user interaction) open new opportunities in robotic ultrasound to enhance the physician’s perception of subsurface targets and critical structures and to improve 3D understanding.
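The basic video see-through overlay can be illustrated as a simple alpha blend of the ultrasound image into the camera frame. This is a sketch only: registering the ultrasound plane to the camera view via calibrated tracking, which the systems above require, is omitted, and the placement is given directly as a pixel offset.

```python
import numpy as np

def overlay_ultrasound(frame, us, top_left, alpha=0.5):
    """Alpha-blend a grayscale ultrasound image `us` into an RGB camera
    `frame` at pixel position `top_left` = (row, col). Toy version of a
    video see-through AR overlay; no registration or distortion handling."""
    out = frame.astype(float)                # astype copies, frame untouched
    y, x = top_left
    h, w = us.shape
    region = out[y:y + h, x:x + w]
    # Broadcast the grayscale image over the three color channels.
    out[y:y + h, x:x + w] = (1 - alpha) * region + alpha * us[..., None]
    return out.astype(np.uint8)
```

With alpha = 0.5, a mid-gray ultrasound pixel over a dark scene yields a half-intensity overlay; projector-based and optical see-through setups achieve the same perceptual effect optically rather than in the frame buffer.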

Fig. 5

Examples of VR and AR in robotic ultrasound. a Virtual radiotherapy scenario showing a linear accelerator and the robotic ultrasound acquiring data from a patient (copyright [2016] John Wiley & Sons, Inc. Used with permission from [78] and John Wiley & Sons, Inc.). b 2D ultrasound image superimposed on a laparoscopic video image (reprinted from [79], copyright [2014] with permission from Elsevier)

Conclusions

This review provides an overview of robotic ultrasound systems published within the last five years. Based on a standardized classification scheme for the autonomy level of a robotic system, each system was rated and categorized as a teleoperated, a collaborative assisting, or an autonomous system.

Teleoperated systems are sufficiently mature to perform remote examinations over varying distances, as evidenced by the commercial systems available today. Current research on collaborative assisting systems focuses on supporting the sonographer during the examination through probe positioning, navigation, and more intuitive visualizations. These systems may improve the quality of ultrasound acquisitions while providing more comfort and decreasing the mental load for the sonographer. As in other disciplines, autonomous systems are of special interest for robotic ultrasound as they could ultimately eliminate operator dependency. The review showed a wide variety of potential application fields, while research in these areas still focuses on ultrasound image processing and force adaptation strategies. In our opinion, a missing step is research on robust and reliable navigation and safety strategies for closed-loop applications to eventually reach full autonomy. The highest LORA of seven observed in this review shows that fully autonomous operation has not yet been achieved in robotic ultrasound. At the same time, many groups have declared a higher level of autonomy as a future project goal.

Future trends such as AI have the potential to increase autonomy of these platforms, with published work showing the promising capabilities of this technology in the fields of image understanding and robot navigation. At the same time, VR and AR technologies may improve ergonomics as well as spatial and anatomical understanding as these techniques allow displaying not only of important structures but also of the generated ultrasound image within the area of interest.

Overall, current robotic ultrasound systems show the potential to improve examination and intervention quality and to provide a more ergonomic work environment for sonographers with reduced workload. However, especially in this applied medical context, clinical studies are mandatory to assess the ultimate improvements in clinical outcomes.

References

Papers of particular interest, published recently, have been highlighted as: • Of importance; •• Of major importance.

  1. 1.

    Evans K, Roll S, Baker J. Work-related musculoskeletal disorders (WRMSD) among registered diagnostic medical sonographers and vascular technologists: a representative sample. J Diagn Med Sonogr. 2009;25:287–99. https://doi.org/10.1177/8756479309351748.

  2. 2.

    Harrison G, Harris A. Work-related musculoskeletal disorders in ultrasound: can you reduce risk? Ultrasound. 2015;23:224–30. https://doi.org/10.1177/1742271X15593575.

    Article  Google Scholar 

  3. 3.

    Priester AM, Natarajan S, Culjat MO. Robotic ultrasound systems in medicine. IEEE Trans Ultrason Ferroelectr Freq Control. 2013;60:507–23. https://doi.org/10.1109/TUFFC.2013.2593.

    Article  Google Scholar 

  4. 4.

    Swerdlow DR, Cleary K, Wilson E, Azizi-Koutenaei B, Monfaredi R. Robotic arm-assisted sonography: review of technical developments and potential clinical applications. AJR Am J Roentgenol. 2017;208:733–8. https://doi.org/10.2214/AJR.16.16780.

    Article  Google Scholar 

  5. 5.

    Beer JM, Fisk AD, Rogers WA. Toward a framework for levels of robot autonomy in human-robot interaction. J Hum Robot Interact. 2014;3:74–99. https://doi.org/10.5898/JHRI.3.2.Beer.

    Article  Google Scholar 

  6. 6.

    Elek R, Nagy TD, Nagy DA, Takács B, Galambos P, Rudas I, et al. Robotic platforms for ultrasound diagnostics and treatment. 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE; 2017. p. 1752–1757. https://doi.org/10.1109/SMC.2017.8122869.

  7. 7.

    Yip M, Das N. Robot autonomy for surgery. In: Patel R, editor. The Encyclopedia of Medical Robotics; 2018. p. 281–313. https://doi.org/10.1142/9789813232266_0010.

  8. 8.

    Wang S, Housden RJ, Noh Y, Singh A, Lindenroth L, Liu H, et al. Analysis of a customized clutch joint designed for the safety management of an ultrasound robot. Appl Sci. 2019;9:1900. https://doi.org/10.3390/app9091900.

    Article  Google Scholar 

  9. 9.

    Sandoval J, Laribi MA, Zeghloul S, Arsicault M, Guilhem J-M. Cobot with prismatic compliant joint intended for doppler sonography. Robotics. 2020;9:14. https://doi.org/10.3390/robotics9010014.

    Article  Google Scholar 

  10. 10.

    Haddadin S, De Luca A, Albu-Schaffer A. Robot collisions: a survey on detection, isolation, and identification. IEEE Trans Robot. 2017;33:1292–312. https://doi.org/10.1109/TRO.2017.2723903.

    Article  Google Scholar 

  11. 11.

    MGIUS-R3 robotic ultrasound system [Internet]. Available from: https://en.mgitech.cn/products/instruments_info/11/.

  12. 12.

    Vieyres P, Novales C, Rivas R, Vilcahuaman L, Sandoval Arévalo JS, Clark T, et al. The next challenge for WOrld wide Robotized Tele-Echography eXperiment (WORTEX 2012): from engineering success to healthcare delivery. Congreso Peruano de Ingeniería Biomédica, Bioingeniería, Biotecnología y Física Médica: Tumi II 2013. 2013. p. 205–10. Available from: http://www.congreso.pucp.edu.pe/tumi/docs/papers_revision.pdf.

  13. 13.

    Wang J, Peng C, Zhao Y, Ye R, Hong J, Huang H, et al. Application of a robotic tele-echography system for COVID-19 pneumonia. J Ultrasound Med. 2021;40:385–90. https://doi.org/10.1002/jum.15406.

  14. 14.

    Avgousti S, Panayides AS, Jossif AP, Christoforou EG, Vieyres P, Novales C, et al. Cardiac ultrasonography over 4G wireless networks using a tele-operated robot. Healthc Technol Lett. 2016;3:212–7. https://doi.org/10.1049/htl.2016.0043.

    Article  Google Scholar 

  15. 15.

    Georgescu M, Sacccomandi A, Baudron B, Arbeille PL. Remote sonography in routine clinical practice between two isolated medical centers and the university hospital using a robotic arm: a 1-year study. Telemed J E Health. 2016;22:276–81. https://doi.org/10.1089/tmj.2015.0100.

    Article  Google Scholar 

  16. 16.

    Adams SJ, Burbridge BE, Badea A, Langford L, Vergara V, Bryce R, et al. Initial experience using a telerobotic ultrasound system for adult abdominal sonography. Can Assoc Radiol J. 2017;68:308–14. https://doi.org/10.1016/j.carj.2016.08.002.

    Article  Google Scholar 

  17. 17.

    • Adams SJ, Burbridge BE, Badea A, Kanigan N, Bustamante L, Babyn P, et al. A crossover comparison of standard and telerobotic approaches to prenatal sonography. J Ultrasound Med. 2018;37:2603–12. https://doi.org/10.1002/jum.14619Describes the feasibility of a commercial telerobotic system to remotely perform prenatal sonographic examinations.

    Article  Google Scholar 

  18. 18.

    Stollnberger G, Moser C, Giuliani M, Stadler S, Tscheligi M, Szczesniak-Stanczyk D, et al. User requirements for a medical robotic system: enabling doctors to remotely conduct ultrasonography and physical examination. 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE; 2016. p. 1156–61. https://doi.org/10.1109/ROMAN.2016.7745254.

  19. 19.

    Arent K, Cholewiński M, Chojnacki Ł, Domski W, Drwięga M, Jakubiak J, et al. Selected topics in design and application of a robot for remote medical examination with the use of ultrasonography and ascultation from the perspective of the REMEDI project. J Autom Mob Robot Intell Syst. 2017;11:82–94. Available from: http://www.jamris.org/images/ISSUES/ISSUE-2017-02/82_94%20Arent.pdf.

  20. 20.

    Kurnicki A, Stańczyk B. Manipulator control system for remote USG examination. J Autom Mob Robot Intell Syst. 2019;13:48–59. https://doi.org/10.14313/JAMRIS/2-2019/18.

    Article  Google Scholar 

  21. 21.

    •• Giuliani M, Szczęśniak-Stańczyk D, Mirnig N, Stollnberger G, Szyszko M, Stańczyk B, et al. User-centred design and evaluation of a tele-operated echocardiography robot. Health Technol (Berl). 2020;10:649–65. https://doi.org/10.1007/s12553-019-00399-0Describes the complete design process and systematic evaluation of a teleoperated system for remote ultrasound exams.

    Article  Google Scholar 

  22. 22.

    Arbeille P, Zuj K, Saccomandi A, Ruiz J, Andre E, de la Porte C, et al. Teleoperated echograph and probe transducer for remote ultrasound investigation on isolated patients (study of 100 cases). Telemed J E Health. 2016;22:599–607. https://doi.org/10.1089/tmj.2015.0186.

    Article  Google Scholar 

  23. 23.

    • Arbeille P, Chaput D, Zuj K, Depriester A, Maillet A, Belbis O, et al. Remote echography between a ground control center and the International Space Station using a tele-operated echograph with motorized probe. Ultrasound Med Biol. 2018;44:2406–12. https://doi.org/10.1016/j.ultrasmedbio.2018.06.012Describes the first teleoperated ultrasound exams conducted on a space station from ground control on earth.

    Article  Google Scholar 

  24. 24.

    Guan X, Wu H, Hou X, Teng Q, Wei S, Jiang T, et al. Study of a 6DOF robot assisted ultrasound scanning system and its simulated control handle. 2017 IEEE International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM). IEEE; 2017. p. 469–474. https://doi.org/10.1109/ICCIS.2017.8274821.

  25. 25.

    Monfaredi R, Wilson E, Azizi Koutenaei B, Labrecque B, Leroy K, Goldie J, et al. Robot-assisted ultrasound imaging: overview and development of a parallel telerobotic system. Minim Invasive Ther Allied Technol. 2015;24:54–62. https://doi.org/10.3109/13645706.2014.992908.

    Article  Google Scholar 

  26. 26.

    Mathiassen K, Fjellin JE, Glette K, Hol PK, Elle OJ. An ultrasound robotic system using the commercial robot UR5. Front Robot AI. 2016;3:1. https://doi.org/10.3389/frobt.2016.00001.

  27. 27.

    Geng C, Xie Q, Chen L, Li A, Qin B. Study and analysis of a remote robot-assisted ultrasound imaging system. 2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC). IEEE; 2020. p. 389–393. https://doi.org/10.1109/ITNEC48623.2020.9084796.

  28. 28.

    Santos L, Cortesão R. A dynamically consistent hierarchical control architecture for robotic-assisted tele-echography with motion and contact dynamics driven by a 3D time-of-flight camera and a force sensor. 2015 IEEE International Conference on Robotics and Automation (ICRA). IEEE; 2015. p. 2931–937. https://doi.org/10.1109/ICRA.2015.7139600.

  29. 29.

    Santos L, Cortesão R. Computed-torque control for robotic-assisted tele-echography based on perceived stiffness estimation. IEEE Trans Automat Sci Eng. 2018;15:1337–54. https://doi.org/10.1109/TASE.2018.2790900.

    Article  Google Scholar 

  30. 30.

    Huang Q, Lan J. Remote control of a robotic prosthesis arm with six-degree-of-freedom for ultrasonic scanning and three-dimensional imaging. Biomed Signal Process Control. 2019;54:101606. https://doi.org/10.1016/j.bspc.2019.101606.

    Article  Google Scholar 

  31. 31.

    Boman K, Olofsson M, Berggren P, Sengupta PP, Narula J. Robot-assisted remote echocardiographic examination and teleconsultation: a randomized comparison of time to diagnosis with standard of care referral approach. JACC Cardiovasc Imaging. 2014;7:799–803. https://doi.org/10.1016/j.jcmg.2014.05.006.

    Article  Google Scholar 

  32. 32.

    Janvier M-A, Merouche S, Allard L, Soulez G, Cloutier G. A 3-D ultrasound imaging robotic system to detect and quantify lower limb arterial stenoses: in vivo feasibility. Ultrasound Med Biol. 2014;40:232–43. https://doi.org/10.1016/j.ultrasmedbio.2013.08.010.

    Article  Google Scholar 

  33. 33.

    Jiang Z, Grimm M, Zhou M, Esteban J, Simson W, Zahnd G, et al. Automatic normal positioning of robotic ultrasound probe based only on confidence map optimization and force measurement. IEEE Robot Autom Lett. 2020;5:1342–9. https://doi.org/10.1109/LRA.2020.2967682.

    Article  Google Scholar 

  34. 34.

    • Virga S, Göbl R, Baust M, Navab N, Hennersperger C. Use the force: deformation correction in robotic 3D ultrasound. Int J Comput Assist Radiol Surg. 2018;13:619–27. https://doi.org/10.1007/s11548-018-1716-8Describes a method to estimate and correct the induced deformation based solely on the tracked ultrasound images and information about the applied force.

    Article  Google Scholar 

  35. 35.

    Zhang HK, Finocchi R, Apkarian K, Boctor EM. Co-robotic synthetic tracked aperture ultrasound imaging with cross-correlation based dynamic error compensation and virtual fixture control. 2016 IEEE International Ultrasonics Symposium (IUS). IEEE; 2016. p. 1–4. https://doi.org/10.1109/ULTSYM.2016.7728522.

  36. 36.

    Li D, Cheng Z, Chen G, Liu F, Wu W, Yu J, et al. A multimodality imaging-compatible insertion robot with a respiratory motion calibration module designed for ablation of liver tumors: a preclinical study. Int J Hyperth. 2018;34:1194–201. https://doi.org/10.1080/02656736.2018.1456680.

    Article  Google Scholar 

  37. 37.

    • Chen AI, Balter ML, Maguire TJ, Yarmush ML. 3D near infrared and ultrasound imaging of peripheral blood vessels for real-time localization and needle guidance. Med Image Comput Comput Assist Interv. 2016;9902:388–96. https://doi.org/10.1007/978-3-319-46726-9_45Describes a portable device to detect peripheral blood vessels for cannula insertion combining near infrared stereo vision, ultrasound and real-time image analysis.

    Article  Google Scholar 

  38. 38.

    Chatelain P, Krupa A, Navab N. 3D ultrasound-guided robotic steering of a flexible needle via visual servoing. 2015 IEEE International Conference on Robotics and Automation (ICRA). IEEE; 2015. p. 2250–2255. https://doi.org/10.1109/ICRA.2015.7139497.

  39. 39.

    • Esteban J, Simson W, Requena Witzig S, Rienmüller A, Virga S, Frisch B, et al. Robotic ultrasound-guided facet joint insertion. Int J Comput Assist Radiol Surg. 2018;13:895–904. https://doi.org/10.1007/s11548-018-1759-xDescribes the first clinical trial using an ultrasound-based robotic guiding system to perform facet joint insertions.

    Article  Google Scholar 

  40. 40.

    Torres PMB, Gonçalves PJS, Martins JMM. Robotic motion compensation for bone movement, using ultrasound images. Ind Robot. 2015;42:466–74. https://doi.org/10.1108/IR-12-2014-0435.

    Article  Google Scholar 

  41. 41.

    Merouche S, Allard L, Montagnon E, Soulez G, Bigras P, Cloutier G. A robotic ultrasound scanner for automatic vessel tracking and three-dimensional reconstruction of b-mode images. IEEE Trans Ultrason Ferroelectr Freq Control. 2016;63:35–46. https://doi.org/10.1109/TUFFC.2015.2499084.

    Article  Google Scholar 

  42. 42.

    Huang Q, Lan J, Li X. Robotic arm based automatic ultrasound scanning for three-dimensional imaging. IEEE Trans Ind Inf. 2019;15:1173–82. https://doi.org/10.1109/TII.2018.2871864.

    Article  Google Scholar 

  43. 43.

    Hennersperger C, Fuerst B, Virga S, Zettinig O, Frisch B, Neff T, et al. Towards MRI-based autonomous robotic US acquisitions: a first feasibility study. IEEE Trans Med Imaging. 2017;36:538–48. https://doi.org/10.1109/TMI.2016.2620723.

    Article  Google Scholar 

  44. 44.

    • Virga S, Zettinig O, Esposito M, Pfister K, Frisch B, Neff T, et al. Automatic force-compliant robotic ultrasound screening of abdominal aortic aneurysms. 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE; 2016. p. 508–513. https://doi.org/10.1109/IROS.2016.7759101Describes an approach that does not rely on preinterventional images for trajectory planning as the system uses an MRI atlas including a generic trajectory.

  45. 45.

    Graumann C, Fuerst B, Hennersperger C, Bork F, Navab N. Robotic ultrasound trajectory planning for volume of interest coverage. 2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE; 2016. p. 736–741. https://doi.org/10.1109/ICRA.2016.7487201.

  46. 46.

    Kojcev R, Khakzar A, Fuerst B, Zettinig O, Fahkry C, DeJong R, et al. On the reproducibility of expert-operated and robotic ultrasound acquisitions. Int J Comput Assist Radiol Surg. 2017;12:1003–11. https://doi.org/10.1007/s11548-017-1561-1.

    Article  Google Scholar 

  47. 47.

    • von Haxthausen F, Hagenah J, Kaschwich M, Kleemann M, García-Vázquez V, Ernst F. Robotized ultrasound imaging of the peripheral arteries - a phantom study. Curr Dir Biomed Eng. 2020;6:20200033. https://doi.org/10.1515/cdbme-2020-0033Describes the first system that is based on CNNs to control the robot movement for scanning peripheral arteries.

  48. 48.

    Nadeau C, Krupa A, Petr J, Barillot C. Moments-based ultrasound visual servoing: from a mono- to multiplane approach. IEEE Trans Robot. 2016;32:1558–64. https://doi.org/10.1109/TRO.2016.2604482.

    Article  Google Scholar 

  49. 49.

    Chatelain P, Krupa A, Navab N. Confidence-driven control of an ultrasound probe: target-specific acoustic window optimization. 2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE; 2016. p. 3441–3446. https://doi.org/10.1109/ICRA.2016.7487522.

  50. 50.

    • Chatelain P, Krupa A, Navab N. Confidence-driven control of an ultrasound probe. IEEE Trans Robot. 2017;33:1410–24. https://doi.org/10.1109/TRO.2017.2723618Describes methods for target tracking and image quality optimization for ultrasound volumes.

    Article  Google Scholar 

  51. 51.

    • Langsch F, Virga S, Esteban J, Göbl R, Navab N. Robotic ultrasound for catheter navigation in endovascular procedures. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE; 2019. p. 5404–5410. https://doi.org/10.1109/IROS40897.2019.8967652Describes a study performing autonomous catheter and target tracking in vivo and in a phantom for EVAR.

  52. 52.

    Kojcev R, Fuerst B, Zettinig O, Fotouhi J, Lee SC, Frisch B, et al. Dual-robot ultrasound-guided needle placement: closing the planning-imaging-action loop. Int J Comput Assist Radiol Surg. 2016;11:1173–81. https://doi.org/10.1007/s11548-016-1408-1.

    Article  Google Scholar 

  53. 53.

    Zettinig O, Fuerst B, Kojcev R, Esposito M, Salehi M, Wein W, et al. Toward real-time 3D ultrasound registration-based visual servoing for interventional navigation. 2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE; 2016. p. 945–950. https://doi.org/10.1109/ICRA.2016.7487226.

  54. 54.

    Chanel L-A, Nageotte F, Vappou J, Luo J, Cuvillon L, de Mathelin M. Robotized High Intensity Focused Ultrasound (HIFU) system for treatment of mobile organs using motion tracking by ultrasound imaging: an in vitro study. 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE; 2015. p. 2571–2575. https://doi.org/10.1109/EMBC.2015.7318917.

  55. 55.

    An CY, Syu JH, Tseng CS, Chang C-J. An ultrasound imaging-guided robotic HIFU ablation experimental system and accuracy evaluations. Appl Bionics Biomech. 2017;2017:5868695. https://doi.org/10.1155/2017/5868695.

  56. 56.

    Seo J, Koizumi N, Mitsuishi M, Sugita N. Ultrasound image based visual servoing for moving target ablation by high intensity focused ultrasound. Int J Med Robot. 2017;13:e1793. https://doi.org/10.1002/rcs.1793.

  57. 57.

    De Luca V, Harris E, Lediju Bell MA, Tanner C. Challenge on Liver Ultrasound Tracking CLUST 2015 [Internet]. Available from: https://clust.ethz.ch/.

  58. 58.

    • De Luca V, Banerjee J, Hallack A, Kondo S, Makhinya M, Nouri D, et al. Evaluation of 2D and 3D ultrasound tracking algorithms and impact on ultrasound-guided liver radiotherapy margins. Med Phys. 2018;45:4986–5003. https://doi.org/10.1002/mp.13152Describes and evaluates different tracking algorithms for 2D and 3D ultrasound.

    Article  Google Scholar 

  59. 59.

    Şen HT, Lediju Bell MA, Zhang Y, Ding K, Boctor E, Wong J, et al. System integration and in vivo testing of a robot for ultrasound guidance and monitoring during radiotherapy. IEEE Trans Biomed Eng. 2017;64:1608–18. https://doi.org/10.1109/TBME.2016.2612229.

    Article  Google Scholar 

  60. 60.

    Kuhlemann I, Jauer P, Schweikard A, Ernst F. Patient localization for robotized ultrasound-guided radiation therapy. Imaging and Computer Assistance in Radiation Therapy, ICART 2015, 18th International Conference on Medical Image Computing and Computer-Assisted Intervention - MICCAI’15. 2015. p. 105–112. Available from: https://hal.archives-ouvertes.fr/hal-01264358/document.

  61. 61.

    Schlüter M, Fürweger C, Schlaefer A. Optimizing robot motion for robotic ultrasound-guided radiation therapy. Phys Med Biol. 2019;64:195012. https://doi.org/10.1088/1361-6560/ab3bfb.

    Article  Google Scholar 

  62. 62.

    •• Schlüter M, Gerlach S, Fürweger C, Schlaefer A. Analysis and optimization of the robot setup for robotic-ultrasound-guided radiation therapy. Int J Comput Assist Radiol Surg. 2019;14:1379–87. https://doi.org/10.1007/s11548-019-02009-wDescribes optimization strategies for automatic positioning of a seven DOF robot in a radiotherapy scenario.

    Article  Google Scholar 

  63. 63.

    Seitz PK, Baumann B, Johnen W, Lissek C, Seidel J, Bendl R. Development of a robot-assisted ultrasound-guided radiation therapy (USgRT). Int J Comput Assist Radiol Surg. 2020;15:491–501. https://doi.org/10.1007/s11548-019-02104-y.

    Article  Google Scholar 

  64. 64.

    Litjens G, Kooi T, Bejnordi BE, Setio AAA, Ciompi F, Ghafoorian M, et al. A survey on deep learning in medical image analysis. Med Image Anal. 2017;42:60–88. https://doi.org/10.1016/j.media.2017.07.005.

    Article  Google Scholar 

  65. 65.

    Liu S, Wang Y, Yang X, Lei B, Liu L, Li SX, et al. Deep learning in medical ultrasound analysis: a review. Engineering. 2019;5:261–75. https://doi.org/10.1016/j.eng.2018.11.020.

    Article  Google Scholar 

  66. 66.

    Tuysuzoglu A, Tan J, Eissa K, Kiraly AP, Diallo M, Kamen A. Deep adversarial context-aware landmark detection for ultrasound imaging. In: Frangi AF, Schnabel JA, Davatzikos C, Alberola-López C, Fichtinger G, editors. Medical image computing and computer assisted intervention -- MICCAI 2018. Cham: Springer International Publishing; 2018. p. 151–158. https://doi.org/10.1007/978-3-030-00937-3_18.

    Google Scholar 

  67. 67.

    Li H, Weng J, Shi Y, Gu W, Mao Y, Wang Y, et al. An improved deep learning approach for detection of thyroid papillary cancer in ultrasound images. Sci Rep. 2018;8:6600. https://doi.org/10.1038/s41598-018-25005-7.

    Article  Google Scholar 

  68. 68.

    Diamant A, Chatterjee A, Vallières M, Shenouda G, Seuntjens J. Deep learning in head & neck cancer outcome prediction. Sci Rep. 2019;9:2764. https://doi.org/10.1038/s41598-019-39206-1.

    Article  Google Scholar 

  69. 69.

    Mnih V, Kavukcuoglu K, Silver D, Rusu AA, Veness J, Bellemare MG, et al. Human-level control through deep reinforcement learning. Nature. 2015;518:529–33. https://doi.org/10.1038/nature14236.

    Article  Google Scholar 

  70. 70.

    Chen YF, Everett M, Liu M, How JP. Socially aware motion planning with deep reinforcement learning. 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE; 2017. p. 1343–1350. https://doi.org/10.1109/IROS.2017.8202312.

  71. 71.

    Yuan W, Stork JA, Kragic D, Wang MY, Hang K. Rearrangement with nonprehensile manipulation using deep reinforcement learning. 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE; 2018. p. 270–277. https://doi.org/10.1109/ICRA.2018.8462863.

  72. 72.

    Wang Y, He H, Sun C. Learning to navigate through complex dynamic environment with modular deep reinforcement learning. IEEE Trans Games. 2018;10:400–12. https://doi.org/10.1109/TG.2018.2849942.

    Article  Google Scholar 

  73. 73.

    Alansary A, Oktay O, Li Y, Folgoc LL, Hou B, Vaillant G, et al. Evaluating reinforcement learning agents for anatomical landmark detection. Med Image Anal. 2019;53:156–64. https://doi.org/10.1016/j.media.2019.02.007.

    Article  Google Scholar 

  74. 74.

    Bhattad S, Escoto A, Malthaner R, Patel R. Robot-assisted, ultrasound-guided minimally invasive navigation tool for brachytherapy and ablation therapy: initial assessment. In: Webster III RJ, Yaniv ZR, editors. Medical Imaging 2015: Image-guided procedures, robotic interventions, and Modeling. SPIE; 2015. p. 94150N. https://doi.org/10.1117/12.2082472.

  75. 75.

    Samei G, Tsang K, Kesch C, Lobo J, Hor S, Mohareri O, et al. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med Image Anal. 2020;60:101588. https://doi.org/10.1016/j.media.2019.101588.

    Article  Google Scholar 

  76. 76.

    Lim S, Jun C, Chang D, Petrisor D, Han M, Stoianovici D. Robotic transrectal ultrasound guided prostate biopsy. IEEE Trans Biomed Eng. 2019;66:2527–37. https://doi.org/10.1109/TBME.2019.2891240.

    Article  Google Scholar 

  77. 77.

    Pîslă D, Gherman B, Gîrbacia F, Vaida C, Butnariu S, Gîrbacia T, et al. Optimal planning of needle insertion for robotic-assisted prostate biopsy. In: Borangiu T, editor. Advances in robot design and intelligent control. Cham: Springer International Publishing; 2016. p. 339–346. https://doi.org/10.1007/978-3-319-21290-6_34.

    Google Scholar 

  78. 78.

    Schlosser J, Gong RH, Bruder R, Schweikard A, Jang S, Henrie J, et al. Robotic intrafractional US guidance for liver SABR: system design, beam avoidance, and clinical imaging. Med Phys. 2016;43:5951–63. https://doi.org/10.1118/1.4964454.

    Article  Google Scholar 

  79. 79.

    Hughes-Hallett A, Pratt P, Mayer E, Di Marco A, Yang G-Z, Vale J, et al. Intraoperative ultrasound overlay in robot-assisted partial nephrectomy: first clinical experience. Eur Urol. 2014;65:671–2. https://doi.org/10.1016/j.eururo.2013.11.001.

    Article  Google Scholar 

  80. 80.

    von Haxthausen F, Böttger S, Kleemann M, Ernst F, Schweikard A. Robotics from the bench: research for ultrasound automation with augmented reality visualization. Proceedings on Minimally Invasive Surgery. 2019. 219. Available from: https://doi.org/10.18416/MIC.2019.

  81. 81.

    Edgcumbe P, Singla R, Pratt P, Schneider C, Nguan C, Rohling R. Follow the light: projector-based augmented reality intracorporeal system for laparoscopic surgery. J Med Imaging (Bellingham). 2018;5:021216. https://doi.org/10.1117/1.JMI.5.2.021216.

  82.

    Edgcumbe P, Singla R, Pratt P, Schneider C, Nguan C, Rohling R. Augmented reality imaging for robot-assisted partial nephrectomy surgery. In: Zheng G, Liao H, Jannin P, Cattin P, Lee S-L, editors. Medical imaging and augmented reality. Cham: Springer International Publishing; 2016. p. 139–150. https://doi.org/10.1007/978-3-319-43775-0_13.

  83.

    Shen J, Zemiti N, Taoum C, Aiche G, Dillenseger J-L, Rouanet P, et al. Transrectal ultrasound image-based real-time augmented reality guidance in robot-assisted laparoscopic rectal surgery: a proof-of-concept study. Int J Comput Assist Radiol Surg. 2020;15:531–43. https://doi.org/10.1007/s11548-019-02100-2.

Funding

Open Access funding enabled and organized by Projekt DEAL.

Author information

Contributions

All authors contributed to the study conception, study design, original draft preparation, and review and editing. Felix von Haxthausen performed supervision and figure composition, as well as the literature search and data analysis for the section on autonomous image acquisition. Sven Böttger performed the literature search and data analysis for the section on collaborative assistance. Daniel Wulff performed the literature search and data analysis for the section on autonomous therapy guidance. Jannis Hagenah performed the literature search and data analysis for the section on intelligent systems using artificial intelligence. Verónica García-Vázquez performed the literature search and data analysis for the section on virtual and augmented reality. Svenja Ipsen performed project administration and supervision, as well as the literature search and data analysis for the section on teleoperation.

Corresponding author

Correspondence to Felix von Haxthausen.

Ethics declarations

Conflict of Interest

Felix von Haxthausen reports that his own work is referenced in this review paper (references 47 and 80). Sven Böttger reports that his own work is referenced in this review paper (reference 80). Jannis Hagenah reports that his own work is referenced in this review paper (reference 47). Verónica García-Vázquez reports that her own work is referenced in this review paper (reference 47). Daniel Wulff and Svenja Ipsen have nothing to disclose.

Human and Animal Rights and Informed Consent

This article does not contain any studies with human or animal subjects performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article belongs to the Topical Collection on Medical and Surgical Robotics

Appendix 1

Fig. 6 The applied taxonomy for levels of robot autonomy (figure by J. M. Beer et al. [5] under CC-BY license)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

von Haxthausen, F., Böttger, S., Wulff, D. et al. Medical Robotics for Ultrasound Imaging: Current Systems and Future Trends. Curr Robot Rep (2021). https://doi.org/10.1007/s43154-020-00037-y

Keywords

  • Telesonography
  • Collaborative robotics
  • Autonomous image acquisition
  • Autonomous therapy guidance
  • Intelligent systems
  • Virtual/augmented reality