An Overview of Systems and Techniques For Autonomous Robotic Ultrasound Acquisitions

Preprint in IEEE Transactions on Medical Robotics and Bionics, April 2021. DOI: 10.1109/TMRB.2021.3072190



This is the author's version of an article that has been published in this journal. Changes were made to this version by the publisher prior to publication.
The final version of record is available at http://dx.doi.org/10.1109/TMRB.2021.3072190

An Overview of Systems and Techniques for Autonomous Robotic Ultrasound Acquisitions

Keyu Li, Graduate Student Member, IEEE, Yangxin Xu, Graduate Student Member, IEEE, and Max Q.-H. Meng∗, Fellow, IEEE

Abstract—Ultrasound (US) acquisition in standard clinical procedures requires manual operation of the probe based on the interpretation of the image. A robotic system for autonomous US acquisitions holds great promise to relieve the workload of sonographers, improve access to care, yield more standardized imaging results, and circumvent the need for direct patient contact. In this paper, we provide an overview of the current status of systems and techniques for autonomous robotic US acquisitions in the literature. Specifically, we first present a framework to standardize the increasing levels of autonomy in robotic ultrasonography and briefly discuss the non-autonomous and semi-autonomous systems. Then, we systematically review the existing autonomous robotic systems developed for extracorporeal US acquisitions based on their hardware components, workflow and applications. Besides, we summarize three important topics related to autonomous US acquisitions, namely scanning path planning, contact force control and image quality optimization, and introduce the current methods and techniques proposed on each topic. Finally, we discuss the challenges and provide some potential directions for future research. We believe that the presented efforts in autonomous robotic US acquisitions will pave the way for a promising future of medical care and benefit a broad spectrum of clinical procedures.

Index Terms—Medical robotics, robotic system and software, robotic ultrasound, ultrasound imaging.

This work was partially supported by National Key R&D program of China with Grant No. 2019YFB1312400, Hong Kong RGC GRF grant #14210117, Hong Kong RGC TRS grant T42-409/18-R and Hong Kong RGC GRF grant #14211420 awarded to Max Q.-H. Meng.
K. Li and Y. Xu are with the Department of Electronic Engineering, The Chinese University of Hong Kong, Hong Kong (e-mail: kyli@link.cuhk.edu.hk; yxxu@link.cuhk.edu.hk).
Max Q.-H. Meng is with the Department of Electronic and Electrical Engineering of the Southern University of Science and Technology in Shenzhen, China, on leave from the Department of Electronic Engineering, The Chinese University of Hong Kong, Hong Kong, and also with the Shenzhen Research Institute of the Chinese University of Hong Kong in Shenzhen, China (e-mail: max.meng@ieee.org).
∗Corresponding author.

Fig. 1. Representative commercial products for robot-assisted US acquisitions, including the automated breast US (ABUS) systems (a) Invenia ABUS 2.0 (General Electric, USA) [13] and (b) ACUSON S2000 ABVS (Siemens, Germany) [14], and the robotic transcranial Doppler (TCD) systems (c) Delica Robotic TCD headband (Shenzhen Delica Medical Equipment Company, China) [15] and (d) Lucid M1 TCD system (Neural Analytics, USA) [16].

I. INTRODUCTION

ULTRASOUND (US) imaging, first used as a diagnostic tool in the 1940s, has become one of the most commonly used diagnostic modalities in the world [1][2]. Due to its advantages of portability, non-invasiveness, low cost and real-time capability over other imaging techniques, US imaging is widely accepted in clinical use in a broad range of medical disciplines such as cardiology [3], urology [4], neurology [5], obstetrics and gynecology [6].

In standard US examinations, a skilled sonographer is required to manually operate a US probe to scan the patient based on the interpretation of US images and mental construction of the target anatomy, which imposes a heavy physical and cognitive burden on the sonographer. As a result, the quality and repeatability of the acquisitions are highly dependent on the operator [7], and it is difficult for people in remote or rural areas, where a trained sonographer may not be available, to undergo such examinations. Also, the excessive workload has been exposing sonographers to health risks such as regional pain and musculoskeletal disorders, since they have to continuously exert a significant force onto the patient during imaging [8][9]. Moreover, frontline sonographers are vulnerable to infectious diseases due to pathogen exposure caused by direct patient contact, especially during a pandemic such as COVID-19 [10][11]. This highlights the need to develop automated or remotely operated systems for US acquisitions to cope with insufficient medical resources and reduce the risk of infection for medical staff [12].

With improved accuracy, dexterity, maneuverability and perception capabilities, modern robotic manipulators are capable of performing precise force and position control of the probe, holding great promise to reduce the physical and cognitive burden on the sonographer, improve access to care, yield more standardized imaging results, and circumvent the need for direct contact between medical staff and patients. Since the first tele-echography systems were developed at the end of the 20th century [17][18], burgeoning efforts have been made to automate US acquisitions in a broad range of clinical applications from diagnosis to interventions [19][20].
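As a concrete illustration of the force regulation such manipulators make possible, a one-dimensional admittance law maps the force-tracking error at the probe tip to a commanded probe velocity. The sketch below is illustrative only: the desired force, damping and saturation values are assumptions for the example, not parameters of any system cited in this review.

```python
def admittance_step(f_measured, f_desired=5.0, damping=200.0, v_max=0.01):
    """One cycle of a 1-DoF admittance law along the probe axis.

    The probe advances while the measured contact force (N) is below
    the desired force and retracts when it is above, so the contact
    force converges toward f_desired. damping (N*s/m) maps the force
    error to a commanded velocity (m/s), clipped for patient safety.
    All numeric values are hypothetical defaults for illustration.
    """
    error = f_desired - f_measured       # positive error: press harder
    v = error / damping                  # admittance: v = F_err / B
    return max(-v_max, min(v_max, v))    # saturate the commanded velocity
```

Run at the force-sensor rate, the returned velocity would be commanded along the probe axis; practical systems add filtering and full 6-DoF impedance behavior on top of such a law.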

Copyright (c) 2021 IEEE. Personal use is permitted. For any other purposes, permission must be obtained from the IEEE by emailing pubs-permissions@ieee.org.

Several teleoperated systems have been successfully applied in the remote US examination and diagnosis of the lung, heart, and vasculature during the outbreak of COVID-19, supported by the 5G communication network [12][21][22].

In industry, there have been a few commercialization attempts in robot-assisted US acquisitions for specific applications, including the automated breast US (ABUS) and robotic transcranial Doppler (TCD) systems, as shown in Fig. 1. Introduced a few decades ago, ABUS has become a promising new tool to detect cancer in women with dense breasts [23][24][25][26]. A technologist is required to manually place the probe on the patient and apply an appropriate pressure during the scan, and the system can easily scan the whole breast with a single sweep using a large-footprint transducer [27]. 3D reconstruction is automatically performed after acquisition to allow clinicians to view any slice in the breast. Transcranial Doppler (TCD) is a non-invasive tool to measure cerebral blood flow velocity in the major intracranial arteries [28]. Robotically assisted TCD systems are usually designed as a wearable headband or headset that contains the TCD probes and the robotic drive, as shown in Fig. 1 (c)(d). After the robot is initiated by a trained operator, it can automatically adjust the probes to detect and track the intracranial vasculature, which provides the potential to obtain consistent and reproducible TCD recordings for extended periods [15][16][29]. The robotic TCD systems have been successfully applied in the detection of pulmonary vessel dilatation in patients with COVID-19 pneumonia [30].

Recently, researchers have introduced six levels of autonomy for medical robots characterized by increasing autonomous capabilities [31]. These concepts have been applied to analyse different medical robotic systems such as surgical robots [32] and robotic colonoscopy systems [33]. In order to provide a comprehensive overview of the current status of autonomous robotic ultrasonography, in this work, we first provide a framework to standardize the increasing levels of autonomy in robotic ultrasonography and briefly introduce the non-autonomous and semi-autonomous systems, and then conduct a systematic review of the literature on the systems and techniques developed for autonomous extracorporeal US acquisitions.

The rest of this article is organized as follows. In Section II, we define the levels of autonomy in robotic US acquisition systems and present our search methodology for the review of autonomous systems. Section III provides an introduction to existing robotic systems for autonomous US acquisitions based on their system design, workflow and target applications. In Section IV, we present a detailed discussion of related topics and methods in the autonomous systems and compare their features, before we discuss the challenges and provide some future perspectives in Section V.

II. METHODOLOGY

A. Autonomy of Robotic US Acquisitions

Following the concept of autonomy levels in medical robotics [31], we define the increasing levels of autonomy in robotic ultrasonography to introduce the prior art based on the levels of computer assistance provided by the systems, i.e., to what extent the robotic system may improve ease of use, relieve operator burden, and reduce user-dependency in US acquisitions.

The simplest level is defined as "manual probe manipulation", which is associated with level 0 or no autonomy in [31]. The user directly instructs the probe motion, and the robot holding the US probe automatically computes the control actions to accomplish the required probe motion. This level includes two kinds of systems, i.e., the tele-echography systems [34][35][36][37] and the "teach/replay" systems [38][39][40][41]. In the teleoperated systems, a remote sonographer can manipulate a robot at the examination site through a joystick in real time to perform remote US examinations on people in medically underserved areas. In the "teach/replay" systems, an experienced sonographer physically maneuvers the US probe in the teaching phase and the probe motions are recorded; the robotic system at the patient location can then automatically execute the US examinations following the demonstrated trajectories. Compared with the teleoperated systems, the "teach/replay" mode avoids the need for real-time transmission of US images and complex remote robot control, thereby reducing the bandwidth and communication requirements in remote US examinations [40]. However, since the probe motion is completely instructed by the user in the above systems, the workload of the operator is still heavy and the imaging quality still depends on the operator. Besides, the complex remote operation may result in information overload on the remote clinician, and has limited ability to perform tasks at the patient bedside [20].

Level 1, or robotic assistance, is defined as "human-robot cooperation" following the classification in [20] to describe the systems that allow the robot and the operator to have shared control over the probe motion. Salcudean et al. [17][18][42] and Abolmaesumi et al. [43][44][45] first utilized US image-based visual servoing to control up to three degrees-of-freedom (DoFs) of a conventional 2D US probe to realize shared control in teleoperated ultrasonography. The visual servoing techniques allow the robot to automatically track the desired image features and compensate for unwanted patient motions during teleoperation, thus partially reducing operator burden and user-dependency in the acquisitions. Şen et al. [46] designed a cooperative control strategy using virtual fixture constraints to help an inexperienced operator reproduce the planned US acquisition during radiation therapy. They further incorporated visual servoing to dynamically update the virtual fixtures based on real-time US images [47]. Fang et al. [48] developed a dual force sensing system to comply with the motion of the operator while applying a stable contact force under admittance control to alleviate the physical burdens on the sonographers. User studies were performed in [49] to study the effect of the robotic force assistance on US image stability. Different from the teleoperated systems, the human-robot cooperation systems do not rely entirely on remote control by a medical expert, but allow a less experienced operator to perform US imaging with the assistance of the robot. Moreover, as the operator is included in the loop of in-person control, the safety of the patient can be better monitored


by the on-site clinician, and the scanning protocols can be easily modified through patient-clinician communication [48]. Therefore, the unique features of the human-robot cooperation systems make them more likely to be adapted to clinical practice compared with the remotely controlled ones.

We further define level 1.5 to describe the systems that enable fully automatic control of the probe but require manual operation to initialize the probe position. Examples include the systems that use visual servoing to autonomously stabilize an existing target in the image [50][51][52]. The systems of levels 1 and 1.5 are considered semi-autonomous systems, as manual operation of the probe is still involved in the acquisition process.

Level 2, or task autonomy, in robotic ultrasonography is used to describe the systems that can perform autonomous US acquisition along a manually planned path. The operator only indicates the scanning strategy in the planning phase, and no manual operation is required during the robotic acquisition. While the overall procedure is further simplified, the predefined scanning path may limit the flexibility of the acquisition due to the presence of tissue deformation and patient movement. Therefore, automatic correction of the manually planned path or online probe adjustment based on sensor data is incorporated in some systems to improve the flexibility of the execution. Level 3, or conditional autonomy, describes the systems that can autonomously plan and execute the US acquisition without any instruction from the human operator (but under the operator's supervision). The system should be able to respond to sensor data and make corresponding decisions to obtain US images for its target application, which offers the highest level of assistance to the user.

Several review articles have been written to date that focus on the non-autonomous and semi-autonomous systems for robotic US acquisitions [19][20][23][53], which are at levels 0, 1, and 1.5 following our standardization. However, the autonomous systems of level 2 and above have not been systematically discussed and analysed in the literature. Therefore, our review will mainly focus on the systems of levels 2 and 3 for autonomous US acquisitions.

TABLE I
INCLUSION AND EXCLUSION CRITERIA FOR AUTONOMOUS ROBOTIC US ACQUISITIONS

Study design
• Inclusion: articles in English or translated to English; published before November 23, 2020; full-text available.
• Exclusion: non-English articles without translation to English; no full-text.

Acquisition purpose
• Inclusion: articles on autonomous robotic acquisitions for extracorporeal US imaging of human tissues for medical use (e.g., diagnosis, biometric measurement and interventional support).
• Exclusion: non-medical use; autonomous acquisitions for other imaging modalities (e.g., X-ray, CT, MRI); robot-assisted interventions or surgeries under US guidance; intracorporeal US acquisitions.

Acquisition method
• Inclusion: articles that describe systems or techniques for autonomous US acquisitions conducted using physical robotic systems; details of the robotic system and acquisition methods are presented; the acquisition process is fully automatic without human intervention.
• Exclusion: articles that do not include robotic systems, including US image analysis and simulated US acquisitions; articles without a clear description of the robotic system or methods for acquisition; the acquisition process is fully or partially controlled by human operators.

Acquisition result
• Inclusion: articles that present the US images acquired by the robotic system.
• Exclusion: articles that only describe the robot design without displaying the acquired images.

Fig. 2. Workflow of existing systems for autonomous robotic US acquisitions.

B. Search Methodology

In order to find relevant literature sources, a search was conducted by specifying the following keywords: ("ultrasound" OR "ultrasonography") AND ("robot*") AND ("automatic" OR "autonomous") AND ("imaging" OR "acquisition" OR "scanning" OR "screening") in the IEEE Xplore, PubMed and Google Scholar databases. We adopted a set of inclusion/exclusion criteria, shown in Table I, to screen the resulting records. After reviewing the titles and abstracts of the results, we removed duplicate, non-English and irrelevant articles, and accessed the full text of the remaining articles to verify whether they met the above criteria. In addition, works from the same group that had been published multiple times without significant modifications were manually screened to include the most representative ones. While this review focuses on autonomous US acquisitions, some technological developments in the semi-autonomous systems related to topics involved in autonomous US acquisitions are also discussed in this article.

Table II presents a comprehensive description of existing autonomous robotic systems for extracorporeal US acquisitions. First, the hardware components, including the robotic manipulator, US devices and additional sensors, are summarized for each study. Second, due to the differences in the scanning algorithms, the controlled degrees of freedom (DoFs) of the probe and whether preoperative imaging data is required are also reported to provide an indicator of the

application range of each method. Third, since the challenges and requirements of the systems are application-specific, we also report the type of the imaging results (i.e., 2D- or 3D-US) and the anatomical and clinical applications for each study. The different evaluation metrics used in these methods are summarized and discussed in Section III-C.

The general workflow of existing systems for autonomous robotic US acquisitions is illustrated in Fig. 2. The scanning path is usually planned before acquisition to cover the region of interest (ROI) in the patient. In some methods, preoperative medical imaging data of the patient, such as Computed Tomography (CT) and Magnetic Resonance Imaging (MRI), is used for planning of the US acquisition. Other methods plan the scanning path based on the real-time perception of the patient obtained from external sensors (e.g., an RGB-D camera), or directly plan the movement of the probe based on the acquired US images. During acquisition, the position and orientation of the probe are continuously controlled by a robotic manipulator to scan the patient following the planned path. The contact force between the probe and the patient is monitored and controlled in real time to maintain acoustic coupling and ensure patient safety. In some of the reported systems, the acquired US images are processed to detect anatomical structures or reconstruct a 3D volume, and are used as feedback to update the scanning path or adjust the pose of the probe. After acquisition, the system is evaluated with application-specific metrics and the acquired data is used for various clinical and anatomical applications.

Fig. 3. Representative autonomous robotic systems for 2D-US acquisitions. (a) A robotic system for autonomous liver screening [56]. (b) A robotic platform for imaging of the cervix [57]. (c) A robotic system for fetal US scan with a passive mechanism [67]. (d) A robotic system for breast US acquisition based on US feedback [68]. (e) A robotic system for autonomous tissue scanning under free-form motion for intraoperative support [69].

III. ROBOTIC SYSTEMS FOR AUTONOMOUS US ACQUISITIONS

A. Systems for Extracorporeal 2D-US Acquisitions

In conventional 2D-US scans, the sonographer manually operates a probe to find the desired imaging plane that clearly shows the anatomical structures and contains valuable information for identifying abnormalities or performing biometric measurements [71][72]. The main challenge of autonomous robotic 2D-US acquisitions lies in the interpretation of the acquired US images and the corresponding robot control strategies to properly position the probe.
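The workflow of Fig. 2 can be condensed into a generic acquisition loop: plan a path over the ROI, then repeatedly command the probe pose, regulate the contact force, acquire an image, and optionally refine the remaining path from image feedback. The following Python sketch is purely illustrative; every callable is a hypothetical placeholder for components that differ across the reviewed systems.

```python
def autonomous_scan(plan_path, move_probe, regulate_force, grab_image,
                    update_path, n_steps=100):
    """Generic acquisition loop of an autonomous robotic US system.

    plan_path:      returns a list of target probe poses over the ROI
                    (from CT/MRI, an RGB-D camera, or US images).
    move_probe:     commands one probe position/orientation target.
    regulate_force: maintains acoustic coupling at a safe contact force.
    grab_image:     acquires one B-mode frame.
    update_path:    feedback step that may correct the remaining path.
    All callables are hypothetical placeholders, not a real API.
    """
    path = plan_path()
    images = []
    k = 0
    while k < n_steps and k < len(path):
        move_probe(path[k])              # position/orientation control
        regulate_force()                 # real-time contact force control
        img = grab_image()               # acquire a US frame
        images.append(img)
        path = update_path(path, img)    # image feedback on the plan
        k += 1
    return images
```

In a real system each iteration runs at the control rate of the manipulator, and the collected frames feed the application-specific evaluation (e.g., 3D compounding or biometric measurement).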

TABLE II
SUMMARY OF AUTONOMOUS ROBOTIC SYSTEMS FOR EXTRACORPOREAL US ACQUISITIONS

Columns: Reference | Robotic manipulator | US devices | Additional sensors | Controlled DoFs of the US probe | Preoperative image required | Imaging result | Anatomical application | Clinical application

Nakadate et al. (2010) [54] | A custom-made robot composed of a 6-DoF parallel link manipulator and a passive serial link arm [55] | Pro Sound II SSD-6500SV (ALOKA Co., Ltd., Japan) with a linear probe | None | 6 | No | 2D-US | Carotid artery | Diagnosis
Mustafa et al. (2013) [56] | MELFA RV-1 (Mitsubishi Electric, Ltd., Japan) (6 DoFs) | A diagnostic US machine (Hitachi Aloka Medical, Ltd., Japan) | A web camera and a six-axis force sensor | 6 | No | 2D-US | Liver | Diagnosis
Pahl et al. (2015) [57] | A custom-made platform (5 DoFs) | APLIO MX SSA-780A (Toshiba Corporation, Japan) with a convex probe | None | 5 (3 translation, 2 rotation) | No | 2D-US | Cervix | Diagnosis
Merouche et al. (2015) [58] | F3 articulated robot (CRS Robotics Corporation, Canada) (6 DoFs) | SonixTouch (Analogic Corporation, USA) with a linear probe L14-7 | A six-axis force/torque sensor (Industrial Automation, USA) | 3 (translation) | No | 3D-US | Lower limb arteries | Biometric measurement
Virga et al. (2016) [59] | KUKA LBR iiwa R800 (KUKA Roboter GmbH, Germany) (7 DoFs) | Sonix RP (Analogic Corporation, USA) with a convex probe 4DC7-3/40 | RGB-D camera (Kinect, Microsoft Corporation, USA) | 4 (3 translation, 1 rotation) | No | 3D-US | Abdominal aortic aneurysms | Biometric measurement
Hennersperger et al. (2016) [60] | KUKA iiwa (KUKA Roboter GmbH, Germany) (7 DoFs) | Sonix RP (Analogic Corporation, USA) with a convex probe 4DC7-3/40 | RGB-D camera (Kinect, Microsoft Corporation, USA) | 6 | Yes | 3D-US | General | Interventional support
Graumann et al. (2016) [61] | KUKA iiwa LBR5 (KUKA Roboter GmbH, Germany) (7 DoFs) | Ultrasonix (Analogic Corporation, USA) with a convex probe C5-2 | RGB-D camera (Kinect, Microsoft Corporation, USA) | 6 | Yes | 3D-US | General | General purpose
Kojcev et al. (2016) [62] | KUKA LBR iiwa7 R800 (KUKA Roboter GmbH, Germany) (7 DoFs) | SonixTable (Analogic Corporation, USA) with a convex probe C5-2/60 | RGB-D camera (RealSense SR300, Intel Corporation, USA) | 6 | No | 3D-US | General | Interventional support
Göbl et al. (2017) [63] | KUKA LBR iiwa R800 (KUKA Roboter GmbH, Germany) (7 DoFs) | Sonix RP (Analogic Corporation, USA) with a convex probe 4DC7-3/40 | RGB-D camera (Kinect, Microsoft Corporation, USA) | 6 | Yes | 3D-US | General | General purpose
Kojcev et al. (2017) [64] | KUKA LBR iiwa7 R800 (KUKA AG, Germany) (7 DoFs) | SonixTablet (Analogic Corporation, USA) with a linear probe L14-5 | RGB-D camera (RealSense F200, Intel Corporation, USA) | 6 | No | 3D-US | Thyroid | Biometric measurement
Huang et al. (2018) [65] | A 3D translating device (Leetro Automation, Ltd., China) (3 DoFs) | Sonix RP (Analogic Corporation, USA) with a linear probe L9-4/38 and a convex probe C7-3/50 | RGB-D camera (Kinect, Microsoft Corporation, USA) and two force sensors (IMS-Y-Z03, I-Motion Inc., China) | 3 (translation) | No | 3D-US | General | General purpose
Huang et al. (2018) [66] | Epson C4-A601S (Seiko Epson Corporation, Japan) (6 DoFs) | Sonix RP (Analogic Corporation, USA) with a linear probe L9-4/38 | RGB-D camera (Kinect, Microsoft Corporation, USA) and two force sensors (RFP-601, Yubo, Inc., China) | 6 | No | 3D-US | General | General purpose
Tsumura and Iwata (2020) [67] | A custom-made platform with two linear stages and a 3-DoF passive mechanism (5 DoFs) | LOGIQ S8 (General Electric, USA) with a convex probe C1-5 | A three-axis force sensor (USL06-H5, Tec Gihan, Japan) | 5 (3 translation, 2 rotation) | No | 2D-US | Fetus | Diagnosis
Welleweerd et al. (2020) [68] | KUKA LBR Med7 R800 (KUKA Roboter GmbH, Germany) (7 DoFs) | Siemens X300 (Siemens AG, Germany) with a linear probe VF13-5 | None | 6 | No | 2D-US | Breast | Diagnosis
Zhan et al. (2020) [69] | Da Vinci surgical robot (Intuitive Surgical Inc., USA) (7 DoFs) | ProSound Alpha 10 (Hitachi Aloka Medical Ltd., Japan) with a linear probe UTS-533 | A stereo endoscopic camera (da Vinci Research Kit) | 6 | No | 2D-US | General | Interventional support
Kaminski et al. (2020) [70] | Franka Emika Panda (Franka Emika GmbH, Germany) (7 DoFs) | Verasonics Vantage 128 (Verasonics, Inc., USA) with a linear probe ATL L7-4 | RGB-D camera (RealSense D435i, Intel Corporation, USA) | 3 (translation) | No | 3D-US | Thyroid | Diagnosis
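Several of the systems summarized in Table II adjust the probe online from US image feedback, for example by keeping a segmented vessel centered in the B-mode image as in [58]. A minimal proportional sketch of such a correction is given below; the binary-mask input, gain and pixel-to-meter scale are assumptions for illustration, not any cited system's actual controller.

```python
def centering_correction(lumen_mask, gain=0.1, px_to_m=5e-5):
    """Lateral probe correction (m) that recenters a segmented vessel.

    lumen_mask: 2D list of 0/1 values, a hypothetical binary
    segmentation of the vessel lumen in the B-mode image. The
    correction is proportional to the horizontal offset of the lumen
    centroid from the image center; its sign tells the robot which
    way to translate the probe.
    """
    cols = [c for row in lumen_mask for c, v in enumerate(row) if v]
    if not cols:
        return 0.0                       # vessel lost: hold position
    centroid_x = sum(cols) / len(cols)   # lumen centroid column (px)
    center_x = len(lumen_mask[0]) / 2.0  # image center column (px)
    return gain * (centroid_x - center_x) * px_to_m
```

In a closed loop, this correction would be applied at the frame rate, while the force controller independently maintains the probe-skin contact.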


As the first attempt towards autonomous robotic 2D-US acquisitions, Nakadate et al. [54] developed a robot composed of a 6-DoF parallel link manipulator and a passive serial link arm [55], and used it in combination with real-time image processing algorithms to automatically acquire a clear 2D-US image of the longitudinal section of the carotid artery for diagnostic purposes. Several hand-crafted features are used to detect the anatomical landmarks in the acquired images, and a scanning strategy is designed accordingly to search for the desired imaging plane of the carotid artery.

Mustafa et al. [56] used a commercial 6-DoF robotic manipulator with a web camera to autonomously screen the liver (see Fig. 3 (a)). The epigastric region of the patient is located based on the RGB image of the patient surface obtained by the web camera, similar to [73], and a scanning protocol was designed to mimic the clinical 2D-US examination of the liver. Although the quality of the acquired images cannot be guaranteed, this work shows the potential of using robotic systems to autonomously acquire meaningful US images of the liver for diagnosis.

Pahl et al. [57] proposed a cage-like robotic platform attached to the patient bed to acquire 2D images of the cervix, as shown in Fig. 3 (b). The system can scan along manually defined paths with 3 translational DoFs and 2 rotational DoFs to acquire coronal and sagittal US images of the cervix. Similarly, Tsumura and Iwata [67] developed a 5-DoF cage-like robotic system for fetal ultrasonography (see Fig. 3 (c)). Different from [57], the robot can only actively move the probe in the horizontal plane, and a passive mechanism is used to adjust the height and 2-DoF orientation of the probe and apply a constant force on the patient. The passive probe holding mechanism avoids directly pushing the probe toward the patient with actuators and allows safe interaction between the US probe and the abdominal surface, thereby improving the safety of the fetus and the pregnant woman. However, some limitations of the system were also reported, including the limited scan range and shadow artifacts in the acquired US images [67]. In addition, the cage-like design in [57] and [67] may make the patient feel uncomfortable during the robotic US examination, which should be taken into consideration in future mechanical designs in order to improve the clinical acceptance of such systems.

Welleweerd et al. [68] used a 7-DoF robotic manipulator with a linear probe attached to its end-effector to acquire 2D-US images of the breast along a pre-defined trajectory, assuming that the patient is in a prone position, as shown in Fig. 3 (d). The robot can manipulate the probe to maintain contact with the breast surface based on the acquired US images. For safe trajectory following, the authors used the safety-aware, intrinsically passive (SAIP) control framework [74] to adjust the impedance control parameters according to the energy and power of the system. Similar to [43][44][45], which use US image-based tracking to guide the probe, this work also uses real-time US feedback for visual servoing to adjust the probe pose, in order to maintain contact with the patient surface and prevent excessive deformation of the breast.

Another system, proposed by Zhan et al. [69], focused on scanning a tissue under free-form motion for intraoperative applications (see Fig. 3 (e)). A da Vinci surgical robot with an attached linear US probe is used to scan the tissue along a pre-defined scanning trajectory, and the image of the tissue surface captured by a stereo endoscopic camera is processed to estimate the tissue motion, reconstruct the 3D surface and correct the scanning trajectory in real time.

B. Systems for Extracorporeal 3D-US Acquisitions

Compared with 2D-US imaging, 3D-US imaging can visualize the entire target anatomy and allow clinicians to view any slice in the compounded volume, which can provide a better point of reference for diagnosis [75] and allow more repeatable biometric measurements [76]; the volume data can also be fused with other 3D imaging modalities [77]. Besides, an overview of the target anatomy of interest may assist in planning surgery or interventions [58], or provide intraoperative support [60]. However, the image quality and repeatability of freehand 3D ultrasound are usually susceptible to the varying forces exerted by the operator and the motion of the patient, as the induced inhomogeneous tissue deformation will impair the volumetric reconstruction [78][79]. The enhanced functionality and maneuverability of modern robotic manipulators allow precise position and force control, hence robots are advantageous for the probe manipulation tasks in 3D-US acquisitions. Therefore, a number of robotic systems have been designed to automatically acquire a 3D image that covers an ROI, especially for biometric measurement and interventional applications.

Merouche et al. [58] proposed a robotic US system to automatically scan the lower limb arteries and reconstruct a 3D volume for assessment of the length and severity of a stenosis. As shown in Fig. 4 (a), a linear probe is held by a 6-DoF robotic manipulator to scan along the lower limb artery, during which the vessel lumen is segmented from the B-mode image in real time and the position of the probe is adjusted accordingly to maintain the artery in the center of the image. After the acquisition, a reconstructed 3D volume of the vessel segment is used to measure the vessel cross-sectional areas and quantify stenoses.

Virga et al. [59] developed a robotic system for quantitative assessment of the abdominal aortic aneurysm (see Fig. 4 (b)). A 7-DoF lightweight robot is used to manipulate a convex probe, and an RGB-D camera is fixed at the ceiling to obtain the point cloud of the patient. After a planned path in an MRI atlas is transferred to the RGB-D image, the US acquisitions can be performed automatically and a 3D volume is subsequently compounded for aortic diameter measurements. The optimal contact force is estimated in a patient-specific manner before acquisition, and the out-of-plane rotation of the probe is adaptively controlled during the scan to optimize the image quality.

Similarly, Hennersperger et al. [60] developed a robotic system for autonomous 3D-US acquisitions based on pre-interventional plans defined in the MRI data of the patient. The system consists of a 7-DoF robotic manipulator, a diagnostic US machine and an RGB-D camera mounted at the ceiling (see Fig. 4 (c)). A closed-loop calibration
Copyright (c) 2021 IEEE. Personal use is permitted. For any other purposes, permission must be obtained from the IEEE by emailing pubs-permissions@ieee.org.
Fig. 4. Representative autonomous robotic systems for 3D-US acquisitions. (a) A robotic system for US imaging of the lower limb arteries [58]. (b) Automatic force-compliant US screening of the abdominal aortic aneurysm [59]. (c) MRI-based autonomous robotic US acquisitions [60]. (d) A robotic system to scan a volume of interest defined in CT images [61]. (e) A dual-robot system for US-guided needle placement [62]. (f) Autonomous 3D-US imaging and measurement of thyroid lobes [64]. (g) A 3-DoF robotic system [65] and (h) a 6-DoF robotic system [66] for 3D-US imaging of an ROI detected in the RGB-D image. (i) Autonomous 3D-US imaging of the thyroid [70].

update is conducted based on the registration between the pre-interventional MRI and acquired US data of the patient to refine the scanning trajectory, and the robotic US acquisitions can follow the pre-interventional plan with an accuracy of 2.46 ± 0.96 mm.

Graumann et al. [61] used a 7-DoF robotic arm with an RGB-D camera and a US probe mounted at its end-effector to autonomously scan a pre-defined volume of interest in the patient, as shown in Fig. 4 (d). A volume of interest is selected in a preoperative tomographic image and transferred to the real-time RGB-D point cloud data. Afterwards, a scanning path is automatically planned for coverage of the volume, which will be discussed in detail in Section IV-A. This trajectory planning method was later used in a dual-robot system for US-guided needle placement [62], where one robotic arm was used for autonomous US imaging and the other for needle insertion (see Fig. 4 (e)). Before the needle insertion, an ROI is selected in the RGB-D image of the patient, and the robot automatically performs the US scan along a self-generated path to cover the region of interest. Then the reconstructed 3D volume is used for US-based visual servoing during the needle insertion. This work demonstrates the feasibility of using robotically acquired 3D-US images for interventional support.

In the system developed by Göbl et al. [63], the scanning path is automatically planned in the pre-interventional CT data to both cover the region of interest inside the patient and optimize the acoustic window, based on consideration of the inner anatomy and estimation of acoustic transmission. During acquisition, an RGB-D camera is positioned above the patient to calibrate the CT image to the world coordinate system, and a 7-DoF robotic manipulator is used to manipulate the US probe.

Kojcev et al. [64] compared the reproducibility of thyroid lobe length measurements using 3D-US images autonomously acquired by a robotic system with that in expert-operated 2D-US acquisitions (see Fig. 4 (f)). They used a 7-DoF lightweight robot with an RGB-D camera mounted at its end-effector to scan the thyroid lobe following a scanning trajectory defined in the RGB-D image. Then a 3D volume is reconstructed and the measurements are conducted by clinicians. This study showed that robotic US acquisitions can allow more repeatable measurement results compared with expert-operated sonography.

Huang et al. [65] used a 3D translating device as the manipulator to hold a probe for 3D-US acquisitions in general applications (see Fig. 4 (g)). An RGB-D camera is attached to the manipulator to obtain depth and color information of the patient. The region of interest is automatically segmented from the RGB image and a scanning path is planned to cover the region. Two force sensors attached to the probe are used to measure the contact force during the scan, and a 3D volume of the target is reconstructed after the scan. This system can complete autonomous 3D-US acquisitions for different tissues, but the orientation of the probe is fixed. In an advanced version of the system [66], a 6-DoF robotic arm is used and the orientation of the probe is adjusted to be perpendicular to the tissue surface, which can extend the application range of the system (see Fig. 4 (h)). Experiments on phantoms of different tissues (i.e., thyroid, breast, lumbar) and a real human forearm were conducted to demonstrate the effectiveness of the system to perform autonomous 3D-US imaging for general purposes.

Kaminski et al. [70] developed a robotic system to scan the thyroid following a manually planned trajectory. As shown in Fig. 4 (i), 3D perception of the patient is conducted using an RGB-D camera attached at the end-effector of the robot before acquisition, and the current observation is registered with a template model for transfer of the preplanned trajectory in the horizontal plane. Then the robotic arm controls the 3-DoF translation of the probe to scan along the scanning path while exerting a constant force in the vertical direction, and a 3D-US volume is compounded using the collected data after acquisition.

C. Evaluation Metrics

Due to the different purposes and requirements of existing autonomous US acquisition systems, the evaluation metrics used in these systems are usually application-specific. We summarize the evaluation criteria used in the reported systems as follows.

1) Visibility of Target Anatomy: In the systems for diagnostic purposes, the evaluation is usually based on the visibility of the target anatomy in the acquired images. In [54], the robotic acquisition is judged as successful or not according to whether a clear image of the intima is obtained. In [56], the system is manually evaluated according to the diagnosability and clarity of the image and the completeness of the scanning trajectory. Other methods also qualitatively assess the visibility of the target tissue or features in the image by manual evaluation [57][67][65][66].

2) Accuracy of Biometric Measurements: In the systems for biometric measurement applications, the evaluation metrics are


based on the accuracy of the measurement results using the robotically acquired US images. Some systems use manual measurements in standard expert-operated US scans as the reference to evaluate the accuracy of robotically acquired 3D-US volumes [59][64]. In [58], the measurement results using robotically acquired 3D-US volumes are compared with the computer-aided design (CAD) files used for prototyping the phantoms and with computed tomography angiography (CTA). In addition, [65] uses phantoms with regularly sized and spaced features to evaluate the accuracy of measurements in the acquired images.

3) Accuracy of Executing Pre-interventional Plans: The systems that perform US acquisitions for interventional support usually pay more attention to the agreement between the actual US acquisition and the pre-interventional plans. In [61], the system is quantitatively evaluated based on the coverage of pre-defined volumes of interest in the CT image. In [60], registration errors between the acquired 3D-US volume and the preoperative MRI data are used to evaluate the accuracy with which the system follows the pre-interventional plan.

4) Quality of US Images: In some of the reported works, the system performance is also evaluated based on the general US image quality. Some methods qualitatively evaluate the US image quality based on the presence of shadowing artifacts [67][63], and [63] also quantitatively assesses the image quality based on the US intensity in the image during acquisition. In [68] and [59], the image quality is evaluated by calculating the confidence value in the region of interest based on the ultrasound confidence map [80], which can estimate the per-pixel uncertainty in attenuated or shadowed regions of the US image.

5) Repeatability of Acquisitions: In [70], the repeatability of the 2D-US scans is evaluated by measuring the cross-correlation value between frames acquired in subsequent US scans. In [64], the standard deviation of thyroid size measurements performed in robotically acquired images is used to indirectly assess the repeatability of image acquisitions. As [69] focuses on overcoming tissue motion during the autonomous acquisition, the similarity between US images acquired at the same position in the presence of tissue deformation is measured based on the normalized cross-correlation to evaluate the system performance.

IV. TOPICS AND METHODS

In the aforementioned robotic systems for autonomous US acquisitions, some common research topics are involved and different methods have been proposed with particular concerns. Three main topics, namely scanning path planning, contact force control and image quality optimization, are summarized in Table III and discussed in this section.

A. Scanning Path Planning

Scanning path planning is defined as the planning of a set of waypoints that specify the desired position and orientation of the US probe during the autonomous robotic US acquisitions. In order to efficiently and accurately acquire US images of the target anatomy with autonomous robotic systems, it is a prerequisite to plan the scanning path before acquisition based on the specific requirements of the target application, such as finding a desired imaging plane or covering a selected ROI. Generally, the scanning path planning methods in existing systems are based on the global information of the target tissue obtained from preoperative medical images (e.g., MRI or CT) or the surface information obtained from external sensors (e.g., RGB-D camera), or directly based on the real-time US images acquired by the robot.

In some reviewed systems, the scanning path is manually specified in the robot workspace by the human operator before the robotic acquisition [57][58]. Since the predefined trajectory does not consider real-time tissue information and may limit the flexibility of the acquisition, online probe adjustment based on US feedback is incorporated in [58] to improve the imaging results, which will be discussed in Section IV-C.

Some attempts have been made to plan the scanning path in a semi-autonomous manner, by automatically adjusting a manually planned path to fit the tissue surface. In [67], the manually specified scanning path in the horizontal plane is automatically adjusted to follow the abdominal surface using a carefully designed passive mechanism, which assumes the shape of the abdomen to be an oval hemisphere. In [68], the authors proposed to reconstruct the tissue surface using pre-interventional MRI scans (the CAD file of a phantom was actually used in their experiments), and transfer the predefined path onto the surface. Similarly, [69][64] project the predefined 2D scanning path onto the 3D surface of the patient extracted using RGB-D cameras. In [60], the scanning path is manually planned in the pre-interventional MRI data of the patient, and automatically transferred to the robot workspace based on the surface registration between the pre-interventional MRI and a real-time RGB-D image of the patient. In [59], the preplanned path in a generic MRI atlas is transferred to the live RGB-D image of the patient based on a deformable registration. Similarly, in [70], the scanning path is planned on a template thyroid model and transferred to the robot workspace based on registration between the model and the real-time RGB-D images. Compared with [60], the methods in [59] and [70] are more versatile as they utilize statistical or template anatomical images and do not require individual tomographic data.

There are also a number of methods that can realize automatic planning of the scanning path to cover a manually selected ROI. In these methods, since the user only needs to select an ROI without giving a specific path, the ease of use is further improved. In [61], after a volume of interest is manually selected in a preoperative tomographic image of the patient, it can be transferred into the RGB-D image of the patient by surface registration, and the scanning path can be automatically planned to cover the volume of interest. The same planning method is also used in [62], where the region of interest is directly selected in the RGB-D image of the patient. In addition to covering a selected region of interest, [63] can also automatically plan the scanning path to optimize the image quality. Given a target point inside the human body, the method can find the probe pose with the optimal acoustic window (i.e., the anatomical structure on the travelling path of the US wave) based on the pre-interventional CT image


TABLE III
TOPICS AND METHODS IN AUTONOMOUS ROBOTIC US ACQUISITIONS

Scanning path planning:
- Manually plan the scanning path in the robot workspace [57][58]
- Manually plan the scanning path and automatically adjust it to the tissue surface with a passive mechanism [67]
- Manually plan the scanning path and automatically adjust it to the tissue surface based on preoperative images [68]
- Manually plan the scanning path and automatically adjust it to the tissue surface based on RGB-D images [69][64]
- Manually plan the scanning path in preoperative images, and automatically transfer it to the robot workspace based on RGB-D images [59][60][70]
- Manually select a region of interest in preoperative images or RGB-D images, and automatically plan to cover the selected region [61][62]
- Manually select a region of interest in preoperative images, and automatically plan to cover the region and optimize the acoustic window [63]
- Automatically plan the scanning path based on RGB-D images [56][65][66]
- Automatically plan the scanning path based on real-time US images [54]

Contact force control:
- Use a six-axis force sensor mounted at the tip of the robotic manipulator, and apply a constant force on the patient [56][58]
- Use two force sensors mounted at the front face of the probe, and keep the two force measurements in a range [65][66]
- Use built-in force sensors in the robotic manipulator, and apply a constant force on the patient [60][61][62][63][64][70]
- Use built-in force sensors in the robotic manipulator, and apply a patient-specific optimal contact force on the patient [59]
- Do not use force sensors; apply a constant force on the patient with a passive constant-force spring [67]

Image quality optimization:
- Before acquisition: plan the scanning path to optimize the acoustic window based on preoperative images [63]
- During acquisition: online update of the scanning path based on tissue motion estimation from RGB-D images [69]
- During acquisition: online update of the scanning path based on registration between the acquired US images and preoperative images [60]
- During acquisition: online probe adjustment based on US image features [56][54][58]
- During acquisition: online probe adjustment based on the ultrasound confidence map [80] [68][59]
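To make the "automatically plan to cover the selected region" rows of the table concrete, a coverage path over a segmented ROI is often generated as a boustrophedon (serpentine) sweep. The following is a minimal sketch only, not code from any of the cited systems; the function name `serpentine_sweep` and its parameters are hypothetical, and a real planner would additionally lift the 2D waypoints onto the patient surface (e.g., via the RGB-D point cloud) and assign probe orientations and forces.

```python
import numpy as np

def serpentine_sweep(roi_mask, spacing_px):
    """Plan boustrophedon sweep waypoints covering a binary ROI mask.

    roi_mask: 2D bool array, e.g., an ROI segmented from an RGB-D image.
    spacing_px: gap between sweep lines, matched to the probe footprint.
    Returns a list of (row, col) endpoints, one pair per sweep line,
    alternating direction so consecutive sweeps connect efficiently.
    """
    rows = np.where(roi_mask.any(axis=1))[0]
    waypoints = []
    for i, r in enumerate(range(rows.min(), rows.max() + 1, spacing_px)):
        cols = np.where(roi_mask[r])[0]
        if cols.size == 0:          # sweep line misses the ROI entirely
            continue
        start, end = cols.min(), cols.max()
        if i % 2:                   # alternate direction on every other sweep
            start, end = end, start
        waypoints.append((r, start))
        waypoints.append((r, end))
    return waypoints
```

In practice the sweep spacing would be chosen from the probe width and the overlap required for 3D compounding, and each image-space waypoint would then be mapped to a 6-DoF probe pose on the body surface.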

of the organ, by modelling both the geometric and physical constraints of the US acquisition. First, the probe poses that can guarantee the coverage of the target point are selected as candidates based on the geometrical constraints. Then an acoustic transmission model based on the tomographic data is established to estimate the US attenuation, and the probe pose that maximizes the image quality is finally chosen. Although this approach is limited by its high computational cost, it opens the way for integrating image quality optimization into the planning stage, which will also be discussed in Section IV-C.

Only a few systems allow fully automatic planning of the scanning path. In [56], [65] and [66], the region of interest is automatically extracted from the RGB-D image of the patient surface using image processing techniques. In [56], the human umbilicus and mammary papillae are detected from RGB images as landmarks to locate the liver region and determine the initial scanning position. Then a scanning protocol is implemented to mimic the standard acquisition steps in general clinical US imaging of the liver. Although a clear path is not given by this method, it provides a general idea for automatically planning the scanning path to mimic clinical procedures. In [65] and [66], color features are used to segment the region of interest from the RGB-D image, and the scanning path is automatically planned to cover the region of interest through a number of sweeps and follow the tissue surface. In [54], the scanning path is incrementally planned based on the real-time US images, in order to search for the desired view of the carotid artery. The authors developed real-time image processing algorithms to detect the carotid landmarks, and carefully designed an automatic path planning workflow to plan the probe motion accordingly. Since the future motion of the probe is planned in real time based on the US feedback, the method has the inherent advantage of compensating for patient motion. However, since the hand-crafted image features and well-engineered scanning strategies are designed for the specific imaging purposes, this method may be difficult to generalize to other organs.

Among the aforementioned planning methods, [69] and [60] also support online update of the planned path based on the real-time tissue information obtained using RGB-D cameras or the acquired US images to account for tissue motion during acquisition, which will also be discussed in Section IV-C.

B. Contact Force Control

During the acquisition process, it is essential to maintain proper contact between the US probe and the patient to improve image quality and ensure patient safety. Excessive contact force may deform the target anatomy and even hurt the patient, while insufficient force cannot ensure good acoustic coupling and would result in poor image quality. Therefore, contact force control during the acquisition process is implemented in most of the reviewed systems.

In [56][58], a six-axis force sensor is attached to the tip of the manipulator to detect the contact force between the probe and the patient, and a constant force is applied in the normal direction of the patient surface. In [65][66], two force sensors are mounted at the front face of the US probe, and the pose of the robot is automatically adjusted to ensure that both force measurements fall into the range of [1 N, 8 N]. While this contact force control method can help maintain proper contact between the probe and the skin surface, the attachment of force sensors to the front face of the probe will inevitably introduce small shadow artifacts in the acquired images. Many systems use the built-in force-torque sensors of the robotic manipulator to estimate the Cartesian force acting on the end-effector, and apply a constant contact force along the direction of the probe [59][60][61][62][63][64][70].

Different from the above methods that actively apply a contact force on the patient with actuators, [67] does not


implement force sensing and feedback control, but uses a passive mechanism that adopts a constant-force spring of 1.0 kgf to generate a constant force in the vertical direction to avoid accidentally exerting excessive force on the patient, thereby improving patient safety in fetal US scans.

In the majority of the reported methods, the desired contact force applied on the patient is manually parameterized based on experience. In view of clinical integration, however, the parameterization of the contact force needs to be tailored to the individual characteristics and specific target applications to satisfy the safety requirements and improve image quality. In [67], the optimal contact force is determined according to clinical survey data in fetal US imaging. In [59], the optimal contact force is estimated in a patient-specific manner before acquisition based on the ultrasound confidence map [80] in order to optimize the image quality (which will also be discussed in Section IV-C). This adaptive force estimation can make it easier for the robotic system to cope with diverse patient populations.

C. Image Quality Optimization

In the past few decades, the imaging quality of clinical US devices has greatly improved due to advances in transducer design [81] and image enhancement techniques [82]. However, since acoustic propagation is easily affected by the properties of the medium, it is important to correctly position the US probe with respect to the patient to ensure good image quality in extracorporeal US imaging tasks. For instance, since US waves cannot penetrate air and bone, the probe should be in close contact with the skin and the acoustic window should be appropriately selected to avoid strong reflectors such as the ribs. Therefore, methods to quantitatively evaluate and optimize the image quality during the acquisition are needed.

Currently, there is no uniform standard to assess the quality of US images. In general, existing methods for US image quality evaluation can be divided into two categories. One is to predict US image quality based on prior knowledge of the target anatomy, which is usually obtained from the tomographic data of the patient, such as CT and MRI. A number of methods have been proposed to synthesize US images from tomographic data [83][84][85]. Since the 3D tomographic image of the target organ can provide information about its spatial anatomy, including the location of strong reflectors such as bones and gases, this information can be combined with the physical model of ultrasonic propagation to predict the quality of US images.

The other category directly processes the acquired US images to evaluate the US image quality. Several groups have investigated the automatic quality assessment of US images for specific applications, such as breast US imaging [86], fetal biometric measurement [87][88][89], and knee arthroscopy [90]. These studies mainly use the correlation between the content of the obtained images and the target image to represent the image quality (e.g., whether the obtained image contains the target anatomical structures). The quality assessment can then be used to assist in the specific image analysis tasks. Based on the physical properties of US propagation, some methods have been proposed for more generic US image quality assessment that are applicable to a broader range of US imaging applications. For instance, the ultrasonic attenuation is estimated in the obtained images in some studies [91][92][93][94]. Karamalis et al. [80] proposed the ultrasound confidence map, which is a general approach to estimate the pixel-wise uncertainty in US images using the random walks framework [95]. This method utilizes the propagation properties of US in soft tissues to evaluate the image quality, and can provide a spatial map of the relative confidence in real time. A variant of the ultrasound confidence map was presented in [96].

In some of the reported autonomous US acquisition systems, the optimization of image quality is conducted before acquisition (i.e., in the scanning path planning phase) or during acquisition (i.e., by online update of the scanning path or probe adjustment), corresponding to the above two types of image quality evaluation methods. In [63], based on the US attenuation model introduced in [84], the authors used pre-interventional tomographic data of the target organ to plan the scanning path that optimizes the acoustic window. Using the relationship between the X-ray attenuation coefficient and ultrasonic propagation physics, the US intensity at a selected point inside the patient body can be predicted before performing the actual acquisition, and the optimal probe pose on the tissue surface can be selected to minimize the US attenuation. Therefore, the planned scanning path can avoid strong reflectors such as the ribs based on prior knowledge of the target organ, thereby optimizing the US image quality in the planning phase.

Other systems optimize the US image quality during the acquisition process by updating the scanning path or adjusting the probe pose in real time. In [69][60], the preplanned scanning path is updated online based on real-time estimation of tissue motion. In [69], the scanning path is updated to adapt to the tissue surface in real time by estimating a rigid transformation of the tissue from RGB-D images captured by an endoscopic stereo camera. The US image quality is optimized during acquisition as the tissue tracking method helps maintain proper contact between the probe and the tissue. [60] updates the planned path based on multi-modality registration between the real-time US image and the preoperative MRI to overcome patient movement. Since the above methods can provide motion compensation, the pre-interventional plan can be executed more stably and accurately to obtain images of the selected target in the presence of tissue motion, which is crucial in interventional applications. Some systems adjust the probe pose in real time based on US feedback to optimize image quality. In [56][54], the brightness of the B-mode images is used to detect the shadow artifacts caused by improper contact with the patient, and the probe orientation is adjusted accordingly to improve the image quality. In [58], since the system is aimed at reconstructing a 3D-US image of the lower limb artery, the probe position is adjusted in real time to maintain the artery at the center of the B-mode image based on the segmentation of the artery lumen using a fast-marching method. In [68][59], the image quality is estimated in real time by computing the average confidence value in


the region of interest using the ultrasound confidence map [80]. A visual servoing approach is used in [68] to control the in-plane rotation and the height of the probe to keep the probe in contact with the breast and optimize the image quality, while [59] adjusts the out-of-plane rotation of the probe in real time based on the confidence change compared with previous frames. In addition, the system in [59] also estimates the optimal contact force for each individual based on the confidence map to optimize the image quality. Besides, the systems that incorporate contact force control (discussed in Section IV-B) can inherently maintain contact between the probe and the tissue, thereby implicitly optimizing the image quality.

The optimization of US image quality has also been studied in many semi-autonomous robotic US imaging systems to improve ease of use and reduce user-dependency. Abolmaesumi et al. [43][44][45] proposed several feature extraction algorithms to track the carotid artery in real-time US images, and used a visual servoing controller to automatically adjust the in-plane motions of the probe during remote US examinations. Since the US image servoing can automatically maintain the carotid artery in the image center, unwanted patient motions can be compensated during the teleoperation. More visual features have been explored in US-based visual servoing to consider more complex shaped objects in robot-assisted target tracking tasks [97][98][51][99]. Şen et al. [47] proposed a registration-based method to dynamically update the desired probe position in the cooperative system based on the difference between the real-time US image and a reference US image. Chatelain et al. [100] incorporated the ultrasound confidence map in a visual servoing framework to control the in-plane rotation of the probe in robotic tele-echography tasks. This work was further extended to full control of the probe [52], which integrates the force, position and confidence control of the probe in a redundant framework for tele-echography and target tracking tasks. Besides, Jiang et al. [101] used the ultrasound confidence map and force feedback to estimate the optimal orientation of the probe at the point of contact, thereby improving the image quality at a given position. Although proposed for different target applications, we believe that the ideas in these methods are general and can be integrated into current autonomous systems

majority of existing autonomous systems are still at level 2, which rely on a human in the loop to manually plan the acquisition. Therefore, the imaging results of these systems are still operator-dependent. Only four systems (i.e., [54][56][65][66]) have been developed at level 3, but their clinical effectiveness needs to be further investigated and compared with the semi-autonomous systems.

Besides, as the level of autonomy increases, the task of robotic acquisition becomes more complex and sophisticated, which requires the system to have better perception and decision-making capabilities. Several limitations have been reported in existing autonomous systems, including the inability to adapt to large tissue motions [66] and suboptimal image quality [56][60]. To this end, better algorithms need to be developed and incorporated into current systems to respond to various sensor data and make corresponding decisions, in order to replicate the skills of an expert sonographer.

Moreover, safety is a paramount issue for autonomous US acquisition systems to improve the clinical acceptance of this technology. Compared with systems under continuous control of a human operator, the increased autonomy of autonomous systems may bring a higher risk of patient injury due to malfunction [31]. Although most existing autonomous systems have incorporated force sensing to improve patient safety, excessive force may be accidentally exerted on the patient due to sensor or controller failure [67][68]. Using more sensors to monitor the system may improve the safety and reliability of the acquisition, but it would also increase the burden of data processing. Future studies should pay more attention to these issues, and conduct more clinical trials to certify the safety and reliability of autonomous systems.

B. Expansion of the Application Paradigms

Autonomous robotic systems have been developed for US acquisition in a broad range of medical applications, as demonstrated in Section III. Their target anatomies include the thyroid, carotid artery, breast, liver, cervix, fetus, lumbar, forearm, lower limb arteries, and abdominal aortic aneurysms. All existing works use general-purpose US probes in their systems, probably because of the low cost. Future studies are expected to explore the applications of autonomous robotic systems for US imaging of other organs, such as the lung,
to achieve better imaging results. heart, kidneys, urinary tract and gastrointestinal (GI) tract.
For these new applications, the autonomous robotic acquisition
V. C HALLENGES AND F UTURE D IRECTIONS will face different difficulties and challenges due to the diverse
transducer types and imaging requirements. Therefore, it is
A. Challenges of the Increased Autonomy of great practical relevance to identify the clinical needs and
Abundant research has demonstrated the feasibility of using develop dedicated systems for specific target applications.
autonomous robotic systems as an alternative to the stan- Besides, intracorporeal US acquisition such as intravascular
dard handheld technique in extracorporeal US acquisitions. and intracardiac ultrasonography [102] is another promising
Compared with the non-autonomous and semi-autonomous application for autonomous robotic systems, which has differ-
systems, the autonomous systems do not need an on-site or off- ent challenges from extracorporeal US acquisitions, as the task
site human operator to directly manipulate the probe, which involves inserting a specialty transducer into the body to obtain
can significantly simplify the workflow, reduce operator work- images of the internal organs. This topic is outside the scope of
load and improve the reproducibility of the acquisition. Also, this review, however, the basic ideas of scanning path planning,
the operation of the autonomous systems does not require contact force control and image quality optimization addressed
the patient site to have remote communication capabilities. in existing extracorporeal systems may be transferred and
However, according to our standardization in Section II, the applied in the intracorporeal applications.

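Most of the surveyed systems regulate a roughly constant contact force along the probe axis (Section IV-B). A minimal sketch of such a regulator is shown below; the PI gains, target force, velocity limit, and linear-spring tissue model are illustrative assumptions rather than parameters of any surveyed system:

```python
import numpy as np

class AxialForceRegulator:
    """Minimal PI regulator that commands an axial probe velocity to
    hold a desired contact force. All gains and limits are illustrative
    placeholders, not values from any of the surveyed systems."""

    def __init__(self, f_desired=5.0, kp=0.002, ki=0.001, v_max=0.01):
        self.f_desired = f_desired   # target contact force [N]
        self.kp, self.ki = kp, ki    # proportional / integral gains
        self.v_max = v_max           # axial velocity limit [m/s]
        self._integral = 0.0

    def step(self, f_measured, dt):
        """Return the commanded axial velocity (positive = press deeper)."""
        error = self.f_desired - f_measured
        self._integral += error * dt
        v = self.kp * error + self.ki * self._integral
        return float(np.clip(v, -self.v_max, self.v_max))

# Simulate contact with a linear-spring tissue model (stiffness k)
reg = AxialForceRegulator(f_desired=5.0)
k, depth, dt = 1000.0, 0.0, 0.01            # N/m, m, s
for _ in range(2000):
    force = k * max(depth, 0.0)             # spring stands in for a sensor
    depth += reg.step(force, dt) * dt       # integrate commanded velocity
# after the loop the indentation settles so that force is close to 5 N
```

In practice this loop would run inside the robot's impedance or admittance controller, with the spring model replaced by the force/torque reading at the probe mount, and the velocity limit acting as a basic safety bound.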

C. Hardware Design

Existing autonomous US acquisition systems usually use rigid robots and conventional mechanical actuators, which are often expensive [64] and contain large, bulky machinery that may cause potential damage to the patient [67][103]. To this end, future improvements can be made in the hardware design to make the robotic system more compact, more portable, less expensive, and safer for human-robot interaction. Novel actuation methods for robotic US systems can be investigated, such as pneumatic [103], fluidic [104] and magnetic actuation [105]. Recent developments in soft robotics have demonstrated the capability of soft robots to perform dexterous, high-level tasks in a wide range of medical applications [106]. In [103], a portable and wearable soft robotic system with soft pneumatic actuators was developed for US probe steering, allowing more compliant human-robot interaction during the acquisition. Similarly, soft robotic end-effectors have been developed for teleoperated fetal US imaging [104] and US-guided interventions [107]. Compared with rigid robots, soft robots have the inherent advantages of smaller size, lighter weight, simpler control methods, higher flexibility, and better compatibility with human tissues. Combined with these advances in soft robotics, we can anticipate that the safety, practicability and clinical acceptance of future robotic systems for autonomous US acquisitions could be significantly improved.

Besides, only one of the autonomous systems utilized clinical survey data in the hardware design of the robotic system [67]. Future studies are suggested to improve the safety and comfort of the robot design by taking into consideration its clinical acceptance by patients. To this end, more clinical statistical data from diverse patient populations should be collected in clinical US imaging to find the optimal solutions.

D. Tissue Motion Compensation

Most of the existing robotic systems for autonomous US acquisitions require the patients to remain still during the scan. However, this setup may cause discomfort to the patient; furthermore, the system may fail to work properly if the patient unintentionally moves during the examination. Therefore, active compensation for patient movement and tissue deformation can help improve the robustness and patient comfort during the robotic acquisition, especially in intraoperative imaging applications [69]. A large number of visual servoing methods have been proposed to control the US probe in real time to compensate for undesired tissue motions in applications such as teleoperated sonography [45], target tracking [108] and surgical navigation [109]. Various features have been used for tissue tracking in the US image, such as speckle information [98][50], image moments [51] and intensity [99]. These methods can be integrated into existing autonomous US acquisition systems to further improve the imaging performance under tissue motions.

E. Contact Force Optimization

In order to ensure proper contact with the patient, most existing methods apply a constant contact force onto the patient. However, since the soft tissues of the human body are highly deformable, the contact force should achieve a balance between maintaining good acoustic coupling and avoiding excessive deformation of the tissue, which remains an important yet challenging task. Large tissue deformation in the acquired images will strongly impair the quality of 3D reconstruction [78] and the accuracy of subsequent diagnoses and measurements. Also, the optimal contact force for different patients should depend on individual characteristics, such as tissue stiffness [67][59]. To this end, future researchers are suggested to take patient-specific characteristics into account in the design of contact force control strategies, as well as to estimate and compensate for the tissue deformation caused by the external pressure [110], to achieve more adaptive control of the contact force.

F. Real-Time US Image Interpretation

Another direction for future research is to combine the interpretation of US images with the autonomous control of the US probe to make robotic US acquisition more intelligent. With the rapid developments in artificial intelligence and computer vision, various data-driven algorithms have been proposed and applied in US image analysis, yielding frontier results in classification, detection, and segmentation tasks for different anatomical structures [111]. Recently, a growing number of machine learning approaches have been developed for image-based probe guidance in simulated US acquisitions for fetal [112] and spinal imaging [113]. Li et al. [114] proposed a deep reinforcement learning solution for autonomous control of a US probe towards the desired imaging plane based on real-time US image feedback. In future studies, these methods are expected to be integrated into real robotic systems to automatically interpret the US images and guide the movement of the probe in real time, thereby mimicking the visual search and navigation strategies of sonographers.

G. Evaluation Criteria

Currently, there are no uniform evaluation criteria for autonomous US acquisition systems, and many existing studies only qualitatively validate the effectiveness of their methods. Also, there is a lack of uniform standards regarding the reading techniques for reconstructed 3D-US volumes, making it difficult to quantitatively evaluate robotic US imaging results [24]. Future studies should investigate more standardized evaluation metrics to validate the systems based on their clinical applications and requirements.

Furthermore, none of the reported systems includes patient satisfaction in its evaluation criteria. In order to further justify the clinical acceptance of this technology, the performance of these systems needs to be assessed in terms of patient safety, comfort and satisfaction. For instance, feedback from the patients can be included in the system evaluation [115]. In addition, more studies are needed to compare the results of robotic acquisitions with standard expert-operated acquisitions to demonstrate the feasibility and effectiveness of their methods.
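One concrete metric along these lines is the volumetric overlap between structures segmented from two repeated acquisitions of the same subject, reported for instance as a Dice coefficient, which turns "reproducibility" into a single number. The segmentation masks below are synthetic placeholders, not data from any surveyed system:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice overlap between two binary segmentation masks (1.0 = identical).
    A possible building block for quantitative reproducibility reporting."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * intersection / denom if denom else 1.0

# Two repeated "acquisitions" of the same lesion, offset by one voxel
scan1 = np.zeros((32, 32, 32), dtype=bool)
scan2 = np.zeros_like(scan1)
scan1[10:20, 10:20, 10:20] = True
scan2[11:21, 10:20, 10:20] = True
score = dice_coefficient(scan1, scan2)   # 0.9: high but imperfect overlap
```

A full evaluation protocol would pair such overlap scores with clinically meaningful endpoints (measurement error, diagnostic agreement) and with the patient-centered criteria discussed above.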


VI. CONCLUSIONS

Combining the advantages of robotic manipulators and US imaging, an increasing number of robotic systems have been developed to automate extracorporeal US acquisitions in a broad range of medical applications. We present an overview of the literature on this research subject, covering the state-of-the-art systems and techniques for autonomous robotic US acquisitions in various clinical applications. The standardization of autonomy levels in robotic ultrasonography is discussed based on the level of computer assistance provided by the robotic systems. The important features of existing autonomous systems are described and compared, and related research topics and methods are summarized and discussed. The development of autonomous robotic US acquisitions has demonstrated the potential of using robots to autonomously obtain reproducible and diagnosable imaging results without the operation of specialized personnel.

We strongly believe that the presented efforts in autonomous robotic US acquisitions are important to pave the way for a promising future of US-based medical care. This technology is expected to be combined with advances in other fields such as soft robotics and AI-powered US image analysis, and will hopefully benefit a broad spectrum of clinical procedures in the medical domain.

REFERENCES

[1] P. G. Newman and G. S. Rozycki, “The history of ultrasound,” Surgical Clinics of North America, vol. 78, no. 2, pp. 179–195, 1998.
[2] K. K. Shung, “Diagnostic ultrasound: past, present, and future,” J Med Biol Eng, vol. 31, no. 6, pp. 371–374, 2011.
[3] K. J. Schmailzl and O. Ormerod, Ultrasound in Cardiology. Blackwell Science Oxford, 1994.
[4] W. Peeling and G. Griffiths, “Imaging of the prostate by ultrasound,” The Journal of Urology, vol. 132, no. 2, pp. 217–224, 1984.
[5] G. Leinenga, C. Langton, R. Nisbet, and J. Götz, “Ultrasound treatment of neurological diseases—current and emerging applications,” Nature Reviews Neurology, vol. 12, no. 3, p. 161, 2016.
[6] P. W. Callen, “Ultrasonography in obstetrics and gynecology,” 1988.
[7] W. A. Berg, J. D. Blume, J. B. Cormack, and E. B. Mendelson, “Operator dependence of physician-performed whole-breast US: lesion detection and characterization,” Radiology, vol. 241, no. 2, pp. 355–365, 2006.
[8] G. Brown, “Work related musculoskeletal disorders in sonographers,” BMUS Bulletin, vol. 11, no. 3, pp. 6–13, 2003.
[9] M. Muir, P. Hrynkow, R. Chase, D. Boyce, and D. Mclean, “The nature, cause, and extent of occupational musculoskeletal injuries among sonographers: recommendations for treatment and prevention,” Journal of Diagnostic Medical Sonography, vol. 20, no. 5, pp. 317–325, 2004.
[10] A. Kaminski, A. Payne, S. Roemer, D. Ignatowski, and B. K. Khandheria, “Answering to the call of critically ill patients: limiting sonographer exposure to COVID-19 with focused protocols,” Journal of the American Society of Echocardiography, 2020.
[11] G.-Z. Yang, B. J. Nelson, R. R. Murphy, H. Choset, H. Christensen, S. H. Collins, P. Dario, K. Goldberg, K. Ikuta, N. Jacobstein et al., “Combating COVID-19—the role of robotics in managing public health and infectious diseases,” 2020.
[12] S. Wu, D. Wu, R. Ye, K. Li, Y. Lu, J. Xu, L. Xiong, Y. Zhao, A. Cui, Y. Li et al., “Pilot study of robot-assisted teleultrasound based on 5G network: a new feasible strategy for early imaging assessment during COVID-19 pandemic,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 67, no. 11, pp. 2241–2248, 2020.
[13] General Electric Company, “Invenia ABUS 2.0: Look differently at dense breast tissue,” http://www.gehealthcare.com/products/ultrasound/somo-v-abus-breast-imaging/invenia-abus, accessed: 2019-12-20.
[14] Siemens Healthineers, “ACUSON S2000 ABVS Ultrasound System: A comprehensive solution for breast ultrasound,” https://www.siemens-healthineers.com/ultrasound/products-by-family#02571109, accessed: 2019-12-20.
[15] D. Z. Khan, M. M. Placek, P. Smielewski, K. P. Budohoski, F. Anwar, P. J. Hutchinson, M. Bance, M. Czosnyka, and A. Helmy, “Robotic semi-automated transcranial Doppler assessment of cerebrovascular autoregulation in post-concussion syndrome: methodological considerations,” Neurotrauma Reports, vol. 1, no. 1, pp. 218–231, 2020.
[16] S. Esmaeeli, C. M. Hrdlicka, A. B. Bastos, J. Wang, S. Gomez-Paz, K. A. Hanafy, V.-A. Lioutas, C. S. Ogilvy, A. J. Thomas, S. Shaefi et al., “Robotically assisted transcranial Doppler with artificial intelligence for assessment of cerebral vasospasm after subarachnoid hemorrhage,” Journal of Neurocritical Care, vol. 13, no. 1, pp. 32–40, 2020.
[17] S. E. Salcudean, G. Bell, S. Bachmann, W.-H. Zhu, P. Abolmaesumi, and P. D. Lawrence, “Robot-assisted diagnostic ultrasound–design and feasibility experiments,” in International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 1999, pp. 1062–1071.
[18] S. E. Salcudean, W. H. Zhu, P. Abolmaesumi, S. Bachmann, and P. D. Lawrence, “A robot system for medical ultrasound,” in Robotics Research. Springer, 2000, pp. 195–202.
[19] A. M. Priester, S. Natarajan, and M. O. Culjat, “Robotic ultrasound systems in medicine,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 60, no. 3, pp. 507–523, 2013.
[20] D. R. Swerdlow, K. Cleary, E. Wilson, B. Azizi-Koutenaei, and R. Monfaredi, “Robotic arm–assisted sonography: review of technical developments and potential clinical applications,” American Journal of Roentgenology, vol. 208, no. 4, pp. 733–738, 2017.
[21] J. Wang, C. Peng, Y. Zhao, R. Ye, J. Hong, H. Huang, and L. Chen, “Application of a robotic tele-echography system for COVID-19 pneumonia,” Journal of Ultrasound in Medicine, vol. 40, no. 2, pp. 385–390, 2021.
[22] R. Ye, X. Zhou, F. Shao, L. Xiong, J. Hong, H. Huang, W. Tong, J. Wang, S. Chen, A. Cui et al., “Feasibility of a 5G-based robot-assisted remote ultrasound system for cardiopulmonary assessment of patients with coronavirus disease 2019,” Chest, vol. 159, no. 1, pp. 270–281, 2021.
[23] H. J. Shin, H. H. Kim, and J. H. Cha, “Current status of automated breast ultrasonography,” Ultrasonography, vol. 34, no. 3, p. 165, 2015.
[24] M. Zanotel, I. Bednarova, V. Londero, A. Linda, M. Lorenzon, R. Girometti, and C. Zuiani, “Automated breast ultrasound: basic principles and emerging clinical applications,” La Radiologia Medica, vol. 123, no. 1, pp. 1–12, 2018.
[25] R. F. Brem, L. Tabár, S. W. Duffy, M. F. Inciardi, J. A. Guingrich, B. E. Hashimoto, M. R. Lander, R. L. Lapidus, M. K. Peterson, J. A. Rapelyea et al., “Assessing improvement in detection of breast cancer with three-dimensional automated breast US in women with dense breast tissue: the SomoInsight study,” Radiology, vol. 274, no. 3, pp. 663–673, 2014.
[26] A. S. Tagliafico, M. Calabrese, G. Mariscotti, M. Durando, S. Tosto, F. Monetti, S. Airaldi, B. Bignotti, J. Nori, A. Bagni et al., “Adjunct screening with tomosynthesis or ultrasound in women with mammography-negative dense breasts: interim report of a prospective comparative trial,” J Clin Oncol, vol. 34, no. 16, pp. 1882–1888, 2016.
[27] M. A. Durand and R. J. Hooley, “Implementation of whole-breast screening ultrasonography,” Radiologic Clinics, vol. 55, no. 3, pp. 527–539, 2017.
[28] J. Naqvi, K. H. Yap, G. Ahmad, and J. Ghosh, “Transcranial Doppler ultrasound: a review of the physical principles and major applications in critical care,” International Journal of Vascular Medicine, vol. 2013, 2013.
[29] F. Zeiler and P. Smielewski, “Application of robotic transcranial Doppler for extended duration recording in moderate/severe traumatic brain injury: first experiences,” Critical Ultrasound Journal, vol. 10, no. 1, pp. 1–8, 2018.
[30] A. S. Reynolds, A. G. Lee, J. Renz, K. DeSantis, J. Liang, C. A. Powell, C. E. Ventetuolo, and H. D. Poor, “Pulmonary vascular dilatation detected by automated transcranial Doppler in COVID-19 pneumonia,” American Journal of Respiratory and Critical Care Medicine, vol. 202, no. 7, pp. 1037–1039, 2020.
[31] G.-Z. Yang, J. Cambias, K. Cleary, E. Daimler, J. Drake, P. E. Dupont, N. Hata, P. Kazanzides, S. Martel, R. V. Patel et al., “Medical robotics—regulatory, ethical, and legal considerations for increasing levels of autonomy,” Science Robotics, vol. 2, no. 4, p. 8638, 2017.
[32] T. Haidegger, “Autonomy for surgical robots: concepts and paradigms,” IEEE Transactions on Medical Robotics and Bionics, vol. 1, no. 2, pp. 65–76, 2019.
[33] J. W. Martin, B. Scaglioni, J. C. Norton, V. Subramanian, A. Arezzo, K. L. Obstein, and P. Valdastri, “Enabling the future of colonoscopy


with intelligent and autonomous magnetic manipulation,” Nature Machine Intelligence, vol. 2, no. 10, pp. 595–606, 2020.
[34] A. M. Mikula-Curtis, J. L. Marshall, and L. M. Bruno, “Ultrasound imaging system with touch-pad pointing device,” Oct. 24 2000, US Patent 6,135,958.
[35] A. V. Gonzales, P. Cinquin, J. Troccaz, A. Guerraz, B. Hennion, F. Pellissier, P. Thorel, F. Courreges, A. Gourdon, G. Poisson et al., “TER: a system for robotic tele-echography,” in International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 2001, pp. 326–334.
[36] A. Vilchis, J. Troccaz, P. Cinquin, K. Masuda, and F. Pellissier, “A new robot architecture for tele-echography,” IEEE Transactions on Robotics and Automation, vol. 19, no. 5, pp. 922–926, 2003.
[37] T. Essomba, L. Nouaille, M. Laribi, G. Poisson, and S. Zeghloul, “Design process of a robotized tele-echography system,” in Applied Mechanics and Materials, vol. 162. Trans Tech Publ, 2012, pp. 384–393.
[38] F. Pierrot, E. Dombre, E. Dégoulange, L. Urbain, P. Caron, S. Boudet, J. Gariépy, and J.-L. Mégnien, “Hippocrate: a safe robot arm for medical applications with force feedback,” Medical Image Analysis, vol. 3, no. 3, pp. 285–300, 1999.
[39] M.-A. Janvier, L.-G. Durand, M.-H. R. Cardinal, I. Renaud, B. Chayer, P. Bigras, J. De Guise, G. Soulez, and G. Cloutier, “Performance evaluation of a medical robotic 3D-ultrasound imaging system,” Medical Image Analysis, vol. 12, no. 3, pp. 275–290, 2008.
[40] G. P. Mylonas, P. Giataganas, M. Chaudery, V. Vitiello, A. Darzi, and G.-Z. Yang, “Autonomous eFAST ultrasound scanning by a robotic manipulator using learning from demonstrations,” in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2013, pp. 3251–3256.
[41] M. Victorova, D. Navarro-Alarcon, and Y.-P. Zheng, “3D ultrasound imaging of scoliosis with force-sensitive robotic scanning,” in 2019 Third IEEE International Conference on Robotic Computing (IRC). IEEE, 2019, pp. 262–265.
[42] S. E. Salcudean, G. S. Bell, P. D. Lawrence, A. Marko, and M. Jameson, “Robotically assisted medical ultrasound,” Jul. 30 2002, US Patent 6,425,865.
[43] P. Abolmaesumi, S. Salcudean, and W. Zhu, “Visual servoing for robot-assisted diagnostic ultrasound,” in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (Cat. No. 00CH37143), vol. 4. IEEE, 2000, pp. 2532–2535.
[44] P. Abolmaesumi, M. R. Sirouspour, and S. Salcudean, “Real-time extraction of carotid artery contours from ultrasound images,” in Proceedings 13th IEEE Symposium on Computer-Based Medical Systems. CBMS 2000. IEEE, 2000, pp. 181–186.
[45] P. Abolmaesumi, S. E. Salcudean, W.-H. Zhu, M. R. Sirouspour, and S. P. DiMaio, “Image-guided control of a robot for medical ultrasound,” IEEE Transactions on Robotics and Automation, vol. 18, no. 1, pp. 11–23, 2002.
[46] H. T. Şen, M. A. L. Bell, I. Iordachita, J. Wong, and P. Kazanzides, “A cooperatively controlled robot for ultrasound monitoring of radiation therapy,” in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2013, pp. 3071–3076.
[47] H. T. Şen, A. Cheng, K. Ding, E. Boctor, J. Wong, I. Iordachita, and P. Kazanzides, “Cooperative control with ultrasound guidance for radiation therapy,” Frontiers in Robotics and AI, vol. 3, p. 49, 2016. [Online]. Available: https://www.frontiersin.org/article/10.3389/frobt.2016.00049
[48] T.-Y. Fang, H. K. Zhang, R. Finocchi, R. H. Taylor, and E. M. Boctor, “Force-assisted ultrasound imaging system through dual force sensing and admittance robot control,” International Journal of Computer Assisted Radiology and Surgery, vol. 12, no. 6, pp. 983–991, 2017.
[49] R. Finocchi, F. Aalamifar, T. Y. Fang, R. H. Taylor, and E. M. Boctor, “Co-robotic ultrasound imaging: a cooperative force control approach,” in Medical Imaging 2017: Image-Guided Procedures, Robotic Interventions, and Modeling, vol. 10135. International Society for Optics and Photonics, 2017, p. 1013510.
[50] A. Krupa, G. Fichtinger, and G. D. Hager, “Real-time tissue tracking with B-mode ultrasound using speckle and visual servoing,” in International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 2007, pp. 1–8.
[51] R. Mebarki, A. Krupa, and F. Chaumette, “2-D ultrasound probe complete guidance by visual servoing using image moments,” IEEE Transactions on Robotics, vol. 26, no. 2, pp. 296–306, 2010.
[52] P. Chatelain, A. Krupa, and N. Navab, “Confidence-driven control of an ultrasound probe,” IEEE Transactions on Robotics, vol. 33, no. 6, pp. 1410–1424, 2017.
[53] R. Elek, T. D. Nagy, D. A. Nagy, B. Takács, P. Galambos, I. Rudas, and T. Haidegger, “Robotic platforms for ultrasound diagnostics and treatment,” in 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 2017, pp. 1752–1757.
[54] R. Nakadate, J. Solis, A. Takanishi, E. Minagawa, M. Sugawara, and K. Niki, “Implementation of an automatic scanning and detection algorithm for the carotid artery by an assisted-robotic measurement system,” in 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2010, pp. 313–318.
[55] R. Nakadate, H. Uda, H. Hirano, J. Solis, A. Takanishi, E. Minagawa, M. Sugawara, and K. Niki, “Development of assisted-robotic system designed to measure the wave intensity with an ultrasonic diagnostic device,” in 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2009, pp. 510–515.
[56] A. S. B. Mustafa, T. Ishii, Y. Matsunaga, R. Nakadate, H. Ishii, K. Ogawa, A. Saito, M. Sugawara, K. Niki, and A. Takanishi, “Development of robotic system for autonomous liver screening using ultrasound scanning device,” in 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO). IEEE, 2013, pp. 804–809.
[57] C. Pahl and E. Supriyanto, “Design of automatic transabdominal ultrasound imaging system,” in 2015 20th International Conference on Methods and Models in Automation and Robotics (MMAR). IEEE, 2015, pp. 435–440.
[58] S. Merouche, L. Allard, E. Montagnon, G. Soulez, P. Bigras, and G. Cloutier, “A robotic ultrasound scanner for automatic vessel tracking and three-dimensional reconstruction of B-mode images,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 63, no. 1, pp. 35–46, 2015.
[59] S. Virga, O. Zettinig, M. Esposito, K. Pfister, B. Frisch, T. Neff, N. Navab, and C. Hennersperger, “Automatic force-compliant robotic ultrasound screening of abdominal aortic aneurysms,” in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2016, pp. 508–513.
[60] C. Hennersperger, B. Fuerst, S. Virga, O. Zettinig, B. Frisch, T. Neff, and N. Navab, “Towards MRI-based autonomous robotic US acquisitions: a first feasibility study,” IEEE Transactions on Medical Imaging, vol. 36, no. 2, pp. 538–548, 2016.
[61] C. Graumann, B. Fuerst, C. Hennersperger, F. Bork, and N. Navab, “Robotic ultrasound trajectory planning for volume of interest coverage,” in 2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2016, pp. 736–741.
[62] R. Kojcev, B. Fuerst, O. Zettinig, J. Fotouhi, S. C. Lee, B. Frisch, R. Taylor, E. Sinibaldi, and N. Navab, “Dual-robot ultrasound-guided needle placement: closing the planning-imaging-action loop,” International Journal of Computer Assisted Radiology and Surgery, vol. 11, no. 6, pp. 1173–1181, 2016.
[63] R. Göbl, S. Virga, J. Rackerseder, B. Frisch, N. Navab, and C. Hennersperger, “Acoustic window planning for ultrasound acquisition,” International Journal of Computer Assisted Radiology and Surgery, vol. 12, no. 6, pp. 993–1001, 2017.
[64] R. Kojcev, A. Khakzar, B. Fuerst, O. Zettinig, C. Fahkry, R. DeJong, J. Richmon, R. Taylor, E. Sinibaldi, and N. Navab, “On the reproducibility of expert-operated and robotic ultrasound acquisitions,” International Journal of Computer Assisted Radiology and Surgery, vol. 12, no. 6, pp. 1003–1011, 2017.
[65] Q. Huang, B. Wu, J. Lan, and X. Li, “Fully automatic three-dimensional ultrasound imaging based on conventional B-scan,” IEEE Transactions on Biomedical Circuits and Systems, vol. 12, no. 2, pp. 426–436, 2018.
[66] Q. Huang, J. Lan, and X. Li, “Robotic arm based automatic ultrasound scanning for three-dimensional imaging,” IEEE Transactions on Industrial Informatics, vol. 15, no. 2, pp. 1173–1182, 2018.
[67] R. Tsumura and H. Iwata, “Robotic fetal ultrasonography platform with a passive scan mechanism,” International Journal of Computer Assisted Radiology and Surgery, pp. 1–11, 2020.
[68] M. Welleweerd, A. de Groot, S. de Looijer, F. Siepel, and S. Stramigioli, “Automated robotic breast ultrasound acquisition using ultrasound feedback,” in 2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020, pp. 9946–9952.
[69] J. Zhan, J. Cartucho, and S. Giannarou, “Autonomous tissue scanning under free-form motion for intraoperative tissue characterisation,” in 2020 IEEE International Conference on Robotics and Automation (ICRA), 2020, pp. 11147–11154.
[70] J. T. Kaminski, K. Rafatzand, and H. K. Zhang, “Feasibility of robot-assisted ultrasound imaging with force feedback for assessment of


thyroid diseases,” in Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions, and Modeling, vol. 11315. International Society for Optics and Photonics, 2020, p. 113151D.
[71] C. F. Baumgartner, K. Kamnitsas, J. Matthew, T. P. Fletcher, S. Smith, L. M. Koch, B. Kainz, and D. Rueckert, “SonoNet: real-time detection and localisation of fetal standard scan planes in freehand ultrasound,” IEEE Transactions on Medical Imaging, vol. 36, no. 11, pp. 2204–2215, 2017.
[72] M. K. Karmakar and K. J. Chin, Spinal Sonography and Applications of Ultrasound for Central Neuraxial Blocks. New York, NY: McGraw-Hill Education, 2017. [Online]. Available: accessanesthesiology.mhmedical.com/content.aspx?aid=1141735352
[73] A. S. B. Mustafa, T. Ishii, Y. Matsunaga, R. Nakadate, H. Ishii, K. Ogawa, A. Saito, M. Sugawara, K. Niki, and A. Takanishi, “Human abdomen recognition using camera and force sensor in medical robot system for automatic ultrasound scan,” in 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2013, pp. 4855–4858.
[74] G. Raiola, C. A. Cardenas, T. S. Tadele, T. De Vries, and S. Stramigioli, “Development of a safety- and energy-aware impedance controller for collaborative robots,” IEEE Robotics and Automation Letters, vol. 3, no. 2, pp. 1237–1244, 2018.
[75] A. Gee, R. Prager, G. Treece, and L. Berman, “Engineering a freehand 3D ultrasound system,” Pattern Recognition Letters, vol. 24, no. 4-5, pp. 757–777, 2003.
[76] S. J. Obst, R. Newsham-West, and R. S. Barrett, “In vivo measurement of human Achilles tendon morphology using freehand 3-D ultrasound,” Ultrasound in Medicine & Biology, vol. 40, no. 1, pp. 62–70, 2014.
[77] M. H. Mozaffari and W.-S. Lee, “Freehand 3-D ultrasound imaging: a systematic review,” Ultrasound in Medicine & Biology, vol. 43, no. 10, pp. 2099–2124, 2017.
[78] S. Virga, R. Göbl, M. Baust, N. Navab, and C. Hennersperger, “Use the force: deformation correction in robotic 3D ultrasound,” International Journal of Computer Assisted Radiology and Surgery, vol. 13, no. 5, pp. 619–627, 2018.
Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, 2020.
[91] G. P. Penney, J. M. Blackall, M. Hamady, T. Sabharwal, A. Adam, and D. J. Hawkes, “Registration of freehand 3D ultrasound and magnetic resonance liver images,” Medical Image Analysis, vol. 8, no. 1, pp. 81–91, 2004.
[92] W. Wein, B. Roper, and N. Navab, “Integrating diagnostic B-mode ultrasonography into CT-based radiation treatment planning,” IEEE Transactions on Medical Imaging, vol. 26, no. 6, pp. 866–879, 2007.
[93] H. Kim and T. Varghese, “Hybrid spectral domain method for attenuation slope estimation,” Ultrasound in Medicine & Biology, vol. 34, no. 11, pp. 1808–1819, 2008.
[94] Y. Yu and J. Wang, “Backscatter-contour-attenuation joint estimation model for attenuation compensation in ultrasound imagery,” IEEE Transactions on Image Processing, vol. 19, no. 10, pp. 2725–2736, 2010.
[95] L. Grady, “Random walks for image segmentation,” IEEE Transactions on Pattern Analysis & Machine Intelligence, no. 11, pp. 1768–1783, 2006.
[96] T. Klein and W. M. Wells, “RF ultrasound distribution-based confidence maps,” in Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015, N. Navab, J. Hornegger, W. M. Wells, and A. Frangi, Eds. Cham: Springer International Publishing, 2015, pp. 595–602.
[97] W. Bachta and A. Krupa, “Towards ultrasound image-based visual servoing,” in Proceedings 2006 IEEE International Conference on Robotics and Automation (ICRA 2006). IEEE, 2006, pp. 4112–4117.
[98] A. Krupa, G. Fichtinger, and G. D. Hager, “Full motion tracking in ultrasound using image speckle information and visual servoing,” in Proceedings 2007 IEEE International Conference on Robotics and Automation. IEEE, 2007, pp. 2458–2464.
[99] C. Nadeau and A. Krupa, “Intensity-based ultrasound visual servoing: modeling and validation with 2-D and 3-D probes,” IEEE Transactions on Robotics, vol. 29, no. 4, pp. 1003–1015, 2013.
[100] P. Chatelain, A. Krupa, and N. Navab, “Optimization of ultrasound im-
[79] M. W. Gilbertson and B. W. Anthony, “Force and position control
age quality via visual servoing,” in 2015 IEEE International Conference
system for freehand ultrasound,” IEEE Transactions on Robotics,
on Robotics and Automation (ICRA), May 2015, pp. 5997–6002.
vol. 31, no. 4, pp. 835–849, 2015.
[101] Z. Jiang, M. Grimm, M. Zhou, J. Esteban, W. Simson, G. Zahnd, and
[80] A. Karamalis, W. Wein, T. Klein, and N. Navab, “Ultrasound confi-
N. Navab, “Automatic normal positioning of robotic ultrasound probe
dence maps using random walks,” Medical image analysis, vol. 16,
based only on confidence map optimization and force measurement,”
no. 6, pp. 1101–1112, 2012.
IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 1342–1349,
[81] K. Nakamura, Ultrasonic transducers: Materials and design for sen-
2020.
sors, actuators and medical applications. Elsevier, 2012.
[102] N. G. Pandian, “Intravascular and intracardiac ultrasound imaging. an
[82] S. H. C. Ortiz, T. Chiu, and M. D. Fox, “Ultrasound image enhance- old concept, now on the road to reality.” Circulation, vol. 80, no. 4,
ment: A review,” Biomedical Signal Processing and Control, vol. 7, pp. 1091–1094, 1989.
no. 5, pp. 419–428, 2012. [103] H. Ren, X. Gu, and K. L. Tan, “Human-compliant body-attached
[83] Y. Zhu, D. Magee, R. Ratnalingam, and D. Kessel, “A virtual ultra- soft robots towards automatic cooperative ultrasound imaging,” in
sound imaging system for the simulation of ultrasound-guided needle 2016 IEEE 20th International Conference on Computer Supported
insertion procedures,” in Proceedings of Medical Image Understanding Cooperative Work in Design (CSCWD). IEEE, 2016, pp. 653–658.
and Analysis, 2006, pp. 61–65. [104] L. Lindenroth, R. J. Housden, S. Wang, J. Back, K. Rhode, and H. Liu,
[84] W. Wein, S. Brunke, A. Khamene, M. R. Callstrom, and N. Navab, “Au- “Design and integration of a parallel, soft robotic end-effector for extra-
tomatic ct-ultrasound registration for diagnostic imaging and image- corporeal ultrasound,” IEEE Transactions on Biomedical Engineering,
guided intervention,” Medical image analysis, vol. 12, no. 5, pp. 577– 2019.
585, 2008. [105] Y. Xu, K. Li, Z. Zhao, and M. Q.-H. Meng, “A novel system for
[85] O. Kutter, R. Shams, and N. Navab, “Visualization and gpu-accelerated closed-loop simultaneous magnetic actuation and localization of wce
simulation of medical ultrasound from ct images,” Computer methods based on external sensors and rotating actuation,” IEEE Transactions
and programs in biomedicine, vol. 94, no. 3, pp. 250–266, 2009. on Automation Science and Engineering, 2020.
[86] J. Schwaab, Y. Diez, A. Oliver, R. Martı́, J. van Zelst, A. Gubern- [106] C. Lee, M. Kim, Y. J. Kim, N. Hong, S. Ryu, H. J. Kim, and S. Kim,
Mérida, A. B. Mourri, J. Gregori, and M. Günther, “Automated quality “Soft robot review,” International Journal of Control, Automation and
assessment in three-dimensional breast ultrasound images,” Journal of Systems, vol. 15, no. 1, pp. 3–15, 2017.
Medical Imaging, vol. 3, no. 2, p. 027002, 2016. [107] L. Lindenroth, A. Soor, J. Hutchinson, A. Shafi, J. Back, K. Rhode, and
[87] L. Zhang, N. J. Dudley, T. Lambrou, N. Allinson, and X. Ye, “Auto- H. Liu, “Design of a soft, parallel end-effector applied to robot-guided
matic image quality assessment and measurement of fetal head in two- ultrasound interventions,” in 2017 IEEE/RSJ International Conference
dimensional ultrasound image,” Journal of Medical Imaging, vol. 4, on Intelligent Robots and Systems (IROS). IEEE, 2017, pp. 3716–
no. 2, p. 024001, 2017. 3721.
[88] L. Wu, J.-Z. Cheng, S. Li, B. Lei, T. Wang, and D. Ni, “Fuiqa: [108] P. A. Patlan-Rosales and A. Krupa, “Automatic palpation for quanti-
Fetal ultrasound image quality assessment with deep convolutional tative ultrasound elastography by visual servoing and force control,”
networks,” IEEE transactions on cybernetics, vol. 47, no. 5, pp. 1336– in 2016 IEEE/RSJ International Conference on Intelligent Robots and
1349, 2017. Systems (IROS). IEEE, 2016, pp. 2357–2362.
[89] Z. Lin, S. Li, D. Ni, Y. Liao, H. Wen, J. Du, S. Chen, T. Wang, [109] O. Zettinig, B. Frisch, S. Virga, M. Esposito, A. Rienmüller, B. Meyer,
and B. Lei, “Multi-task learning for quality assessment of fetal head C. Hennersperger, Y.-M. Ryang, and N. Navab, “3d ultrasound
ultrasound images,” Medical image analysis, vol. 58, p. 101548, 2019. registration-based visual servoing for neurosurgical navigation,” Inter-
[90] M. Antico, D. Vukovic, S. M. Camps, F. Sasazawa, Y. Takeda, A. T. national journal of computer assisted radiology and surgery, vol. 12,
Le, A. T. Jaiprakash, J. Roberts, R. Crawford, D. Fontanarosa et al., no. 9, pp. 1607–1619, 2017.
“Deep learning for us image quality assessment based on femoral [110] R. Shams, “Deformation estimation and assessment of its accuracy in
cartilage boundaries detection in autonomous knee arthroscopy,” IEEE ultrasound images,” Ph.D. dissertation, Concordia University, 2017.

Copyright (c) 2021 IEEE. Personal use is permitted. For any other purposes, permission must be obtained from the IEEE by emailing pubs-permissions@ieee.org.
This is the author's version of an article that has been published in this journal. Changes were made to this version by the publisher prior to publication.
The final version of record is available at http://dx.doi.org/10.1109/TMRB.2021.3072190
JOURNAL OF LATEX CLASS FILES, VOL. 14, NO. 8, AUGUST 2015 15

Max Q.-H. Meng received the Ph.D. degree in electrical and computer engineering from the University of Victoria, Victoria, BC, Canada, in 1992.
He was with the Department of Electrical and Computer Engineering, University of Alberta, Edmonton, AB, Canada, where he served as the Director of the Advanced Robotics and Teleoperation Laboratory, holding the positions of Assistant Professor, Associate Professor, and Professor in 1994, 1998, and 2000, respectively. In 2001, he joined The Chinese University of Hong Kong, where he served as the Chairman of the Department of Electronic Engineering, holding the position of Professor. He is affiliated with the State Key Laboratory of Robotics and Systems, Harbin Institute of Technology, and is the Honorary Dean of the School of Control Science and Engineering, Shandong University, China. He is currently with the Department of Electronic and Electrical Engineering, Southern University of Science and Technology, on leave from the Department of Electronic Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China, and also with the Shenzhen Research Institute of the Chinese University of Hong Kong, Shenzhen, China. His research interests include robotics, medical robotics and devices, perception, and scenario intelligence. He has published about 600 journal and conference papers and led more than 50 funded research projects to completion as PI.
Dr. Meng is an elected member of the Administrative Committee (AdCom) of the IEEE Robotics and Automation Society. He is a recipient of the IEEE Millennium Medal, a fellow of the Canadian Academy of Engineering, and a fellow of HKIE. He has served as an editor for several journals and also as the General and Program Chair for many conferences, including the General Chair of IROS 2005 and the General Chair of ICRA 2021 to be held in Xi'an, China.
Keyu Li received the B.Eng. degree in communication engineering from Harbin Institute of Technology, Weihai, China, in 2019. She is currently pursuing the Ph.D. degree in the Department of Electronic Engineering, The Chinese University of Hong Kong (CUHK), Hong Kong.
Her research focuses on artificial intelligence and its application in medical robotic systems, supervised by Prof. Max Q.-H. Meng.
Yangxin Xu received the B.Eng. degree in electrical engineering and its automation from Harbin Institute of Technology, Weihai, China, in 2017. He is currently pursuing the Ph.D. degree in the Department of Electronic Engineering, The Chinese University of Hong Kong (CUHK), Hong Kong.
His research focuses on magnetic actuation and localization methods and hardware implementation for wireless robotic capsule endoscopy, supervised by Prof. Max Q.-H. Meng.
Mr. Xu received the Best Conference Paper Award from the 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), Kuala Lumpur, Malaysia, in 2018.

View publication stats Copyright (c) 2021 IEEE. Personal use is permitted. For any other purposes, permission must be obtained from the IEEE by emailing pubs-permissions@ieee.org.

You might also like