
Industry Focus

Winter 22

Robotics & Automation


A curated collection of top articles,
Thought Leaders and Insights from
Industry
TABLE OF
CONTENTS

3 NEWS
Monitoring the Effects of Climate
Change with Reef Robots

7 ARTICLE
What is Bionic Engineering?

12 NEWS
Keeping Semiconductor
Manufacturing Clean with Robotics

16 THOUGHT LEADERS
Robotic Microswimmers: The Next
Biomedical Breakthrough

25 THOUGHT LEADERS
Social distancing surveillance: how
robots will keep you in line

31 THOUGHT LEADERS
Pediatric Robotic Surgery: past,
present and potential

37 NEWS
SEA-KIT Unveils New H-class USV for
Ocean Survey

39 ARTICLE
3D Printing Polymers with Graphene

43 NEWS
Fully Autonomous Inspection Robot
Systems for Sewer Pipe Networks
News

Monitoring the Effects of Climate Change with Reef Robots
By Ben Pilkington

Monitoring coral reefs’ degrading health is an essential task for environmental science, but it is still technically difficult. A team of computer scientists from the Woods Hole Oceanographic Institution, Massachusetts, was recently awarded a grant to develop image collection and acoustic analysis technologies for robots to automatically gather information about reef ecosystems.

Image Credit: V_E/Shutterstock.com

The grant, awarded by the National Science Foundation and shared with Syracuse
University, is intended to help the organizations advance robots’ abilities to study coral
reefs. They plan to do this by developing autonomous underwater vehicles that can find
their way around complex underwater environments and gather data.

The Woods Hole team is led by computer scientist Yogesh Girdhar, and is trying to
create a robot that can measure multiple properties of the reef over a long period of
time. The robot needs to take readings on the reef’s biomass and biodiversity, as well as
record information about the behavior of organisms living or traveling through the reef
system.

Read this article online 3



Coral reefs play an important role in the well-being of our planet: they support a healthy
ocean ecosystem by creating habitats for numerous and varied animal species. About a
quarter of all marine organisms rely on a coral reef at some point in their lives, and reefs
also protect human habitats from storms and erosion.

In 2020, a report on the status of the world’s coral reefs by the Global Coral Reef
Monitoring Network valued reefs’ benefits to human society at around $2.7 trillion each
year. But despite the important role they play, reefs are declining all over the world. This
is caused by global warming, ocean acidification, pollution, and damage from industrial
fishing and shipping.

The scale and importance of the problem of reef health around the world have inspired many scientists, researchers, and agencies to actively seek solutions to protect reefs. The key to solving a problem like this is understanding reef health, but technology that can help us effectively monitor reefs’ changing circumstances is still not available.

Girdhar said:

“The tools we have right now to study coral reefs are pretty primitive. The robots and the sensors we have at the moment can’t capture the spatial and temporal diversity of a reef at the same time. We want to amplify the capability of scientists in the field and the tools they’re using.”

To overcome this, Girdhar and his team are working with a $1.5 million grant entitled “An
Ecologically Curious RObot for Monitoring Coral Reef Biodiversity.” The grant was made as
part of the US National Science Foundation’s National Robotics Initiative 3.0 program, which is
intended to support foundational research to integrate the next generation of robots with
human attributes like safety and independence.

The team is working with a concept that includes two input systems: image collection and
acoustic analysis. The robot will process this information in a way analogous to a “curious
diver,” that is, by exploring caves and passages that seem likely to give up more interesting
data further in.
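The “curious diver” behavior can be illustrated with a toy information-gain heuristic: the robot repeatedly visits the candidate site whose features it has seen least often. Everything below (site names, feature labels, the novelty score) is invented for illustration and is not from the WHOI project:

```python
from collections import Counter

def expected_novelty(site_features, seen_counts, total_seen):
    """Score a site by how unfamiliar its features are given what has been seen."""
    if not site_features:
        return 0.0
    score = 0.0
    for feature in site_features:
        seen_frac = seen_counts.get(feature, 0) / total_seen if total_seen else 0.0
        score += 1.0 - seen_frac  # rarely seen features are more "interesting"
    return score / len(site_features)

def choose_next_site(sites, observed):
    """Pick the candidate site with the highest expected novelty."""
    seen = Counter(observed)
    total = len(observed)
    return max(sites, key=lambda name: expected_novelty(sites[name], seen, total))

sites = {
    "open_sand":  ["sand", "sand", "sand"],
    "reef_crest": ["coral", "fish", "sand"],
    "cave":       ["eel", "sponge", "coral"],
}
observed = ["sand", "sand", "coral", "fish"]
print(choose_next_site(sites, observed))  # "cave": mostly unseen features
```

A real exploration policy would trade novelty off against travel cost and energy, but the core idea — steer toward places likely to yield new data — is the same.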


The imaging system enables scientists to accurately identify individual species and locations of interest in a geospatially complex environment. However, cameras have limited range underwater, where little light reaches. Acoustic signals, on the other hand, work well underwater and can provide a good overall picture of a reef’s health; however, they cannot provide enough information to identify individual species.

The Woods Hole team is combining the two input systems to get the best of both worlds in their robot investigator. This will enable the robot to build an accurate, detailed picture of the reef ecosystem, its health, and its functions, much as a trained scientist diving the reef would.

The team also needs to overcome the problem of the robot itself disturbing animals that
live around the reef. The presence of a moving robot is enough to disrupt animals’
behavior and cause them to scatter to other parts of the reef or hide in gaps.

To address this, the robot has been designed with a “hopping” motion system that
makes it move quickly over short distances, then settle down on the seabed or the reef itself
to gather data. Moving in short bursts also enables the robot to use as little energy as
possible and greatly extend its possible mission length. This gives it advantages over
most other swimming robots used for environmental analysis and enables the robot to
gather much more information over a longer period of time.

“We envision putting the robot in a reef, and having it come back in a week or month
with a detailed understanding of how biodiversity is distributed across the reef in space
and time,” Girdhar said.

“It will really give coral reef scientists more bang for the buck over each deployment.”

The curious robot is one of many new bio-inspired robots being used for environmental monitoring applications. For example, the Envirobot, built by the Biorobotics Laboratory in Switzerland, has a motion system inspired by eels and detects water pollution in real-world conditions. Robot jellyfish have also been designed to passively float in the ocean and gather data to help us understand climate change.

Continue reading: Modeling the Effect of Climate Change on Oceans with OcéanIA


References and Further Reading


Dwivedi, R. (2019). How Can Robots Help Fight Climate Change? AZO Robotics.
[Online] Available at: https://www.azorobotics.com/Article.aspx?ArticleID=294.
Pilkington, B. (2022). Can Soft Robots Become Truly Sustainable? AZO Robotics.
[Online] Available at: https://www.azorobotics.com/article.aspx?ArticleID=453.
Souter, D., Planes, S., Wicquart, J., et al. (2020). Status of the Coral Reefs of the World.
Global Coral Reef Monitoring Network. [Online] Available at: https://gcrmn.net/.
WHOI (2021). Development of a curious robot to study coral reef ecosystems awarded
$1.5 million by the National Science Foundation. Woods Hole Oceanographic
Institution. [Online] Available at: https://www.whoi.edu/press-room/news-
release/development-of-a-curious-robot-to-study-coral-reef-ecosystems-awarded-1-5-
million-by-the-national-science-foundation.

Disclaimer: The views expressed here are those of the author, expressed in their private capacity, and do not necessarily represent the views of AZoM.com Limited T/A AZoNetwork, the owner and operator of this website. This disclaimer forms part of the Terms and conditions of use of this website.

Written by
Ben Pilkington
Ben Pilkington is a freelance writer who is interested in society and technology.
He enjoys learning how the latest scientific developments can affect us and
imagining what will be possible in the future. Since completing graduate studies
at Oxford University in 2016, Ben has reported on developments in computer
software, the UK technology industry, digital rights and privacy, industrial
automation, IoT, AI, additive manufacturing, sustainability, and clean
technology.



Article

What is Bionic Engineering?


By Reginald Davey

While it may seem to belong to the realm of science fiction, recent advances in the
field of bionic engineering and bionics have provided the potential for augmenting
the human body, repairing damage to limbs, and even replacing entire sections of the
human body. This article will look at the field of bionic engineering and recent
advances in bionic technologies.

Image Credit: PHOTOCREO Michal Bednarek/Shutterstock.com

Bionics
Bionics is a field of engineering that studies and develops mechanical systems that
accurately mimic living organisms' function or parts. Biological structures, methods, and
systems are applied to the design of engineering systems and modern technologies.

The inspiration for bionic engineering comes from the observation that evolutionary
pressures force biological organisms to adapt and develop structures and processes
which possess the optimal efficiency for survival.

Bionics is an interdisciplinary field that combines engineering and life sciences. Related
interdisciplinary fields include biophysics, biomechanics, cybernetics, biocybernetics,
information theory, biomedical engineering, and bioengineering.


A large overlap exists between these fields, making a sharp distinction difficult: they draw on the same basic information and differ mainly in their application and use.

The field’s origins can be traced back to 1960, when around seven hundred physicists,
engineers, biologists, biophysicists, psychologists, and psychiatrists attended a
congress in Dayton, Ohio. Some notable historical examples of biomimetic engineering
include cat’s eye reflectors and Velcro.

Today, bionics and bionic engineering are pushing the boundaries of biology and
engineering with the development of numerous biomimetic devices that are changing
the future of areas such as physiotherapy, robotics, and medical science.

Bionic Vision
The bionic eye is a potential breakthrough technology that can enhance the vision of
patients with eye conditions and partial and complete blindness.

Bioelectric implants can capture visual data and transmit signals to cells in the visual cortex, which the brain then interprets as images. Two main challenges exist with this technology: the complexity of mimicking retinal function and consumer demand for miniature, implantable devices.

Despite the challenges with design and meeting consumer demands, the bionic vision market contains numerous prototypes and commercial products.

A notable example is the Argus II, developed by Second Sight Medical Products, a prosthetic device that consists of a microelectronic array implanted in the retina, a wearable camera, and an image processing unit.

Images captured by the camera, which can be integrated into wearable glasses, are transmitted to the processing unit, which wirelessly transmits signals to the implanted microelectronic array. The array converts these signals into electrical impulses that stimulate the retinal cells, acting as a vital link between the object and the optic nerve and bypassing damaged photoreceptors.
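The camera-to-electrode signal chain described above can be sketched as a simple downsampling step: each electrode is driven by the average brightness of its patch of the captured frame. The grid size and amplitude mapping below are illustrative assumptions, not the actual Argus II parameters:

```python
# Illustrative sketch of a retinal-prosthesis signal chain: a captured frame is
# reduced to a coarse grid matching the implanted electrode array, and each
# cell's mean brightness becomes a stimulation amplitude (values invented).

def frame_to_stimulation(frame, grid_rows, grid_cols, max_amplitude=1.0):
    """Average-pool a 2-D brightness frame (values 0..1) down to the array grid."""
    rows, cols = len(frame), len(frame[0])
    rh, cw = rows // grid_rows, cols // grid_cols
    grid = []
    for gr in range(grid_rows):
        row = []
        for gc in range(grid_cols):
            cells = [frame[r][c]
                     for r in range(gr * rh, (gr + 1) * rh)
                     for c in range(gc * cw, (gc + 1) * cw)]
            row.append(max_amplitude * sum(cells) / len(cells))
        grid.append(row)
    return grid

frame = [[1.0, 1.0, 0.0, 0.0],
         [1.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 0.0, 0.0],
         [0.0, 0.0, 0.0, 0.0]]
print(frame_to_stimulation(frame, 2, 2))  # [[1.0, 0.0], [0.0, 0.0]]
```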

Auditory Bionics
Bionic devices can help people with partial or profound hearing loss. Cochlear implants,
auditory midbrain implants, and auditory brainstem implants are the three main classes
of this technology. An artificial link between the brain and the auditory source is created
via a microelectronic array implanted in either the brain stem or the cochlea.

Compared to bionic devices that aid sight, auditory bionics is a more mature commercial
technology field. The market has greater global adoption, a larger innovative ecosystem,
and more products currently on the market.

Many companies produce auditory bionics products, including MED-EL, Advanced Bionics, Cochlear Limited, and several smaller, regional companies.

Bionic Limbs
According to the WHO, around 15% of the world’s population live with some form of disability, whether hereditary or stemming from injury or accident. Around 190 million people worldwide have a severe functional difficulty.

For around one hundred years, prosthetic limbs have been the norm for providing some degree of functional independence for patients. In recent decades, bionic limbs have started to replace conventional prosthetics, which, despite advances in technology and lighter, more advanced materials, still suffer from limitations.

Bionic limbs are interfaced with the patient’s neuromuscular system, allowing enhanced control of limbs that mimic biological functions. Grasping, bending, and flexing are controlled by the brain, with movement signals routed along an electronic pathway that bypasses the damaged peripheral nerves connecting the brain and the mechatronic limb.
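The electronic pathway described above can be sketched, in a wholly illustrative way, as a loop that reads a muscle (EMG) signal, classifies it into an intent, and routes a command to the mechatronic limb. The thresholds and intent names here are invented for illustration and do not describe any real device:

```python
# Illustrative EMG-to-command pathway: normalized muscle activation is
# classified into a limb intent, bypassing damaged peripheral nerves.
# Thresholds and intents are invented for this sketch.

def classify_intent(emg_level):
    """Map a normalized EMG activation level (0..1) to a limb command."""
    if emg_level < 0.2:
        return "relax"
    if emg_level < 0.6:
        return "flex"
    return "grasp"

def control_loop(emg_samples):
    """Translate a stream of EMG readings into limb commands."""
    return [classify_intent(s) for s in emg_samples]

print(control_loop([0.1, 0.4, 0.8]))  # ['relax', 'flex', 'grasp']
```

Real systems decode intent from multi-channel signals with trained models rather than fixed thresholds, but the signal-in, command-out structure is the same.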

The Bionic Engineering Lab at the University of Utah is one example of an academic institution developing next-generation bionic limbs.

Examples of research include the Ergo Knee Expo, a research exoskeleton designed to study human-robot interaction and ergonomics; an adaptive positioning ankle that uses a novel non-backdrivable actuation system; and the OpenLegs Bionics CAD model, an open-source project that makes powered prostheses more accessible.

Perhaps the most notable bionic device under development at the Bionic Engineering
Lab is an advanced bionic leg design. Instead of being controlled by signals from the
brain, it utilizes artificial intelligence to control itself.

Additionally, MIT has announced the founding of an interdisciplinary bionics center with
the aid of $24 million in funding from Lisa Yang, a philanthropist and former investment
banker. The K. Lisa Yang Center for Bionics will focus on three bionic technologies
during its first four years of operation: bionic limb reconstruction, digital nervous
systems, and brain-controlled limb exoskeletons.

Bionic Design in Non-Medical Industries

Beyond medical devices, bionic engineering has vast potential for multiple industries,
including human and animal-like biomimicking robots and military and construction
exoskeletons.

A notable example is the morphing wing, modeled on bird wings, which can change its shape during flight to optimize mission performance. Other research is investigating the perception, decision-making, implementation, and interaction capabilities of autonomous robotic aircraft using bionics.

The Future
The field of bionic engineering is progressing rapidly, and the integration of technologies such as artificial intelligence, together with the development of advanced materials, is pushing the technology forward into the 21st century. The future of bionic engineering is exciting and has vast potential for medicine and industry.


Further Reading
Roth, R.R. (1983) The Foundation of Bionics [online] Perspectives in Biology and
Medicine 26(2) | muse.jhu.edu. Available at:
https://muse.jhu.edu/article/403622/summary
University of Utah (2021) Bionic Engineering Lab, [online] Available at:
https://belab.mech.utah.edu/


Michalowski, J (2021) New bionics center established at MIT with $24 million gift
[online] MIT News | news.mit.edu. Available at: https://news.mit.edu/2021/new-
bionics-center-established-mit-24-million-gift-0923
Frost & Sullivan (2017) Bionics: A Step into the Future [online] Alliance of Advanced
BioMedical Engineering | aabme.asme.org. Available at:
https://aabme.asme.org/posts/bionics-a-step-into-the-future
The SMC Society (2022) Autonomous Bionic Robotic Aircrafts [online] ieeesmc.org.
Available at: https://ieeesmc.org/technical-activities/systems-science-and-
engineering/autonomous-bionic-robotic-aircraft


Written by
Reginald Davey
Reg Davey is a freelance copywriter and editor based in Nottingham in the
United Kingdom. Writing for AZoNetwork represents the coming together of
various interests and fields he has been interested and involved in over the
years, including Microbiology, Biomedical Sciences, and Environmental Science.



News

Keeping Semiconductor
Manufacturing Clean with Robotics
By Taha Khan

Robotics and semiconductor manufacturing are both evolving rapidly. In this article, the benefits of using robots in the semiconductor manufacturing sector, as well as the challenges associated with their use, are discussed.

Image Credit: Wichy/Shutterstock.com

As robots become more commonplace in industrial automation and manufacturing, their potential application in the global semiconductor industry has received much interest.
Semiconductor manufacturing is a complicated process. Robot systems, among other
technologies, are crucial in automated semiconductor production. Due to complex
assemblies and small parts such as wafers and chips, manufacturers are designing
flexible robots that can handle delicate semiconductor fabrication processes.

Using robots to handle microscopic parts has become essential as semiconductor manufacturers introduce micro and nanotechnology. Likewise, moving silicon wafers at speed without causing damage is a critical challenge for these machines in semiconductor fabrication.


These tasks become more complex with variable wafer sizes. To avoid causing any damage to these delicate components, advanced clean robots are used for handling motions.

Why Are Robots Used in the Semiconductor Industry?


Semiconductor fabrication requires a high level of purity and cleanliness within the
manufacturing environment. Robot manufacturers play a crucial role in meeting these
needs, helping to ensure a safe, impurity-free workflow.

For highly sensitive applications, these systems can offer an ultra-high performance line
of cleanroom robot arms. Such installations can manipulate cassettes, silicon wafers, photomasks, and panels across the semiconductor value chain with smooth trajectories, high speed, and exceptional accuracy.

The design of these robots depends upon the task they are required to perform. Since the handling of semiconductors is a very delicate process, each system component directly involved in handling these semiconductors must be critically assessed to ensure precise movements.

For example, rather than having one system designed for all tasks, specific robots are made for handling, inspection, and cleaning purposes, while others are designed for front-end tasks like wafer fabrication.

Clean Semiconductor Manufacturing


Robot manufacturers have revolutionized the semiconductor industry. The technological
advancement achieved with the help of these systems has helped catalyze
semiconductor fabrication.

The latest robots offer reliability, flexibility, and high-speed handling of wafers or
cassettes for front-end semiconductor fabrication processes like cleaning, lithography,
etching, and polishing. Similarly, they perform back-end tasks like testing, packaging,
and assembly.


One structural feature common to these robot arms is a glossy polyurethane surface covering. To meet cleanroom requirements, modern designs have to meet ISO standards, including ISO 14644-1 Class 2/3.

Several companies offer advanced robots for keeping semiconductor manufacturing clean, including Mitsubishi, KUKA, Kawasaki, and many more.

Kawasaki has introduced two series of clean robots: the NTS series and the TTS series. These high-precision systems run very smoothly on advanced drive systems. In addition to high precision, the TTS series has a rigid telescopic mechanism allowing high-speed handling in both low and high positions.

Both the NTS and TTS series can access two or three FOUPs of an EFEM without a track. The NTS series includes two robots, the four-axis NTS10 and the five-axis NTS20; similarly, the TTS series includes the four-axis TTS10 and the five-axis TTS20. Both series are compliant with the SEMI F47 and SEMI S2 standards.

Challenges for Semiconductor Manufacturing


In the semiconductor industry, technology is advancing at a very rapid pace. To keep up with these innovations, the robotics industry faces the considerable challenge of advancing in step.

Although fields like soft robotics, micro- and nanorobotics, artificial intelligence (AI), and machine learning have boosted the development of this sector, there is still room for improvement as far as applications in semiconductor manufacturing are concerned.

For example, AI and machine learning could make these robots more fully automated, reducing manual labor, which will eventually help keep the environment cleaner. Furthermore, this will also reduce the cost of hiring skilled staff to operate these machines.

Inspection and testing are critical tasks in semiconductor production, and vision-enabled robots help guarantee that these steps are executed correctly.

Since semiconductor production involves numerous small and fragile parts, high-resolution cameras are required to locate and inspect them.
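The locate-and-inspect step can be illustrated with a minimal sum-of-squared-differences template search over a grayscale image; production machine-vision pipelines are far more robust, so treat this purely as a sketch of the idea:

```python
# Minimal template-matching sketch: slide a small template over a grayscale
# image and return the position with the lowest sum of squared differences.
# Image and template values here are invented toy data.

def locate(image, template):
    """Return (row, col) of the best template match by lowest SSD."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum((image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 8, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [8, 9]]
print(locate(image, template))  # (1, 1)
```

Higher camera resolution means more pixels per part, which makes matches like this sharper and small defects distinguishable from noise.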


Semiconductor manufacturers can use intelligent, vision-equipped systems to eliminate the hard automation and tooling that were formerly necessary for semiconductor fabrication. Robots will play a larger role in semiconductor production as they become more intelligent, accurate, and quicker. This shift is valuable since it allows for more output at lower cost.

References and Further Reading


Staubli.com. (2022) Cleanroom robotic solutions for semiconductor industry | Stäubli.
[online] Available at: https://www.staubli.com/gb/en/robotics/industries/more-
industries/semiconductor.html
KUKA AG. (2022) Automation in semiconductor manufacturing | KUKA AG. [online]
Available at: https://www.kuka.com/en-gb/industries/electronics-industry/automation-
in-semiconductor-fabrication
Mitsubishi Electric Americas (2022) Factory Automation Solutions | Mitsubishi Electric
Americas. Semiconductor Robot, Semiconductor Automation. [online] Available at:
https://mitsubishisolutions.com/semiconductor/
Robotics.kawasaki.com. (2022) Clean Robots for Semiconductor and Electronics |
Kawasaki Robotics. [online] Available at:
https://robotics.kawasaki.com/en1/products/robots/clean/
Weiss, A., Buchner, R., Tscheligi, M., and Fischer, H. (2011). Exploring human-robot
cooperation possibilities for semiconductor manufacturing. In the 2011 international
conference on collaboration technologies and systems (CTS) (pp. 173-177). IEEE. DOI:
http://doi.org/10.1109/CTS.2011.5928683


Written by

Taha Khan
Taha graduated from HITEC University Taxila with a Bachelors in Mechanical
Engineering. During his studies, he worked on several research projects related
to Mechanics of Materials, Machine Design, Heat and Mass Transfer, and
Robotics. After graduating, Taha worked as a Research Executive for 2 years at
an IT company (Immentia). He has also worked as a freelance content creator at
Lancerhop. In the meantime, Taha did his NEBOSH IGC certification and
expanded his career opportunities.



Thought Leaders

Robotic Microswimmers: The Next Biomedical Breakthrough
Interview conducted by Megan Craig, M.Sc.

We speak to Dr. Daniel Ahmed from the Acoustic Robotics Systems Lab about the
development of robotic microswimmers modeled from starfish larvae. These
acoustically controlled microrobots are manipulated by ultrasound and are paving the
way for innovative collaborations between biomedical and microrobotic systems
research.

Please could you introduce yourself and tell us what inspired your career in robotics?
I am Daniel Ahmed, an Assistant Professor of Acoustic Robotics for Life Sciences and
Healthcare at the Department of Mechanical and Process Engineering at Swiss Federal
Institute of Technology Zurich (ETH Zurich), Switzerland. As the head of the Acoustic
Robotics Systems Lab, I am developing micro/nanorobots based on ultrasound
technology that we expect to be game-changers for applications in medicine.

In particular, you work closely on developing microswimmers. How did you become involved in this?
This is an interesting question. I have always been fascinated by the way aquatic
animals swim in water, but have never had the opportunity to study their swimming
behavior.

In my academic career, I was always drawn to the study of acoustics. During my doctoral studies in 2011, I was developing lab-on-a-chip devices using acoustically-activated microbubbles. In the microfluidic channel, a microbubble was trapped in a horseshoe structure. Ultrasound interacts with these microbubbles to create vortices.

It struck me that if I could fabricate standalone horseshoe structures containing microbubbles, they would be propelled by ultrasound. This was indeed the case, and I was excited. Thus began the journey of artificial microswimmers.

The possibilities are endless, particularly for medical applications, which motivates me
even more. Acoustically-activated microswimmers represent a relatively new
development with many advantages, and I hope to be at the forefront of this field.

© Cornel Dillinger/ETH Zurich

Why do microrobots hold so much potential in the biomedical sector?
I believe our innovative microrobotics systems will support novel strategies for targeted drug delivery, investigating glioblastoma and various brain diseases.

Our technology will have a significant impact as a foundation for advancements in brain research and vasculature biology, as well as in furthering our understanding of diseases and the development of relevant treatments.


The combination of manipulation and drug delivery capabilities means it can also
potentially see a use for conditions such as stroke and heart attack.

Real-time imaging of micron-sized robots is a challenge. Ultrasound can generate real-time images, but visualizing things on the micron scale is still difficult. Another challenge is that the vasculature network of a mouse (or human) brain is super complicated.

Could you give a brief overview of your research objectives to our readers?
My lab, Acoustic Robotics Systems Lab (ARSL), focuses on developing new wireless
micro/nanorobots using acoustic technology. The objectives of our research include the
following:
1. Address the fundamental challenges associated with using micro/nanorobots within
living systems.
2. Develop ultrasound technology to manipulate drugs inside the body of animals to
treat diseases.
3. Develop visualization, tracking, and reinforcement learning methods to enable our
microrobots to manipulate the various vascular structures of mice.


This microrobot mimics a starfish larva. Video credit: ETH Zurich/YouTube.com


The flow patterns between the starfish larva and microrobots are strikingly similar. Did you expect such a high level of mimicry, and what aspects of the microrobot design aided this development?
Initially, Cornel Dillinger (my doctoral student and first author) and I did not expect such high levels of agreement. Natural cilia and acoustic cilia differ in several ways. Cilia on the surface of starfish larvae beat in a non-reciprocal manner at a frequency of 6 Hz, whereas acoustically-activated synthetic cilia oscillate back and forth at frequencies at least three orders of magnitude higher.

Despite the very different dynamics, we still encountered similar microstreaming, which
was unexpected.

Our attention was drawn to the configuration of multiple cilia arranged in ciliary bands. Inspired by the starfish larva, we arranged ciliary arrays in a "+" and "−" configuration. Based on the arrangement of the ciliary arrays, a fluid source or sink can be developed when the soft microstructures are activated with ultrasound.

This discovery initiated the research project leading to the creation of the starfish-
inspired microrobot and the trapping mechanism, which is also derived from starfish
larvae.
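The source/sink behavior of the "+" and "−" ciliary bands can be illustrated with the textbook ideal-flow model of a 2-D point source and sink superposed; the positions and strengths below are illustrative, not measured values from the study:

```python
import math

# Ideal 2-D potential-flow sketch: a "+" band modeled as a point source and
# an adjacent "-" band as a point sink. Superposing the two gives a net flow
# from the source toward the sink, analogous to the trapping transport.

def point_flow(x, y, x0, y0, strength):
    """Velocity (u, v) at (x, y) from a 2-D point source (strength > 0) or sink."""
    dx, dy = x - x0, y - y0
    r2 = dx * dx + dy * dy
    coeff = strength / (2.0 * math.pi * r2)
    return coeff * dx, coeff * dy

def band_pair_velocity(x, y, source=(-1.0, 0.0), sink=(1.0, 0.0), q=1.0):
    """Superpose the + band (source) and - band (sink) flow fields."""
    u1, v1 = point_flow(x, y, source[0], source[1], q)
    u2, v2 = point_flow(x, y, sink[0], sink[1], -q)
    return u1 + u2, v1 + v2

# At the midpoint, both contributions point from the source toward the sink:
u, v = band_pair_velocity(0.0, 0.0)
print(round(u, 3), round(v, 3))  # 0.318 0.0
```

A particle released between the bands is carried along this flow and collects near the sink, which is the essence of the trapping mechanism described above.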

The use of ultrasound to control, propel, and view the movement of these microrobots is particularly fascinating. Are there any applications beyond the medical sector where this technology could be applied?
Microfluidics or lab-on-a-chip systems are appealing because they can control liquids precisely, use small quantities of expensive chemicals and biospecimens, and separate particles with high resolution and sensitivity, and because they are portable and cost-effective.

The manipulation and separation of particles and cells are required in many applications
in biology and medicine for drug screening and therapeutics. Presently, there are many
technologies available to separate particles based on filtration, centrifugation, acoustics,
optics, and electrophoresis.


Using acoustic waves for particle manipulation is attractive because it facilitates the
manipulation of particles independent of their optical, magnetic, or electrical properties
in a simple, precise, and biocompatible manner.

Our goal is to develop a new design for the lab-on-chip system for automatic
pumping and label-free manipulation and separation of cells and particles for
portable diagnostics.

Schematic of a bioinspired trapping mechanism consisting of a + ciliary band adjacent to a − ciliary band (left). Acoustic power-dependent transport and trapping: transport and trapping of 10 µm particles (colored trajectories), with maximum transport efficiencies of microparticles traveling from the + to the − ciliary band achieved at (a) 12 and (b) 18 volts peak-to-peak (VPP) (right). © Dillinger, C., Nama, N., and Ahmed, D. (2021)

What are the benefits of using acoustically activated cilia over conventional external fields to control their movement?
There are numerous advantages compared to conventional external fields in controlling their movement. We can activate the cilia at high frequency, resulting in faster particle transport in liquid. Furthermore, the fabrication is very straightforward and does not require magnetic particles to activate the cilia magnetically, i.e., these systems are biocompatible.

By using a 2-D photopolymerization process to print the microstructures, thousands of
ciliated systems can be fabricated quickly and efficiently. Eventually, the inexpensive
actuation principle using ultrasound makes these microrobotic systems accessible to
almost everybody and, therefore, can lead to more innovation.

How might the external environment alter how these microrobots are controlled? For example, would aspects such as viscosity alter their movement or control?
This is an excellent question that requires some fundamental investigation. The acoustic
streaming produced by the ciliary bands of starfish in simple Newtonian fluids, such as
water, scales quadratically with input voltage. We need to examine the scaling
relationship of swimming speed in various non-Newtonian fluids, such as blood and
synovial fluid.
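The quadratic scaling mentioned above can be written compactly. The symbols below are illustrative, not taken from the paper: in a Newtonian fluid, the streaming speed grows with the square of the acoustic velocity amplitude, which is itself proportional to the drive voltage.

```latex
% v_s : streaming speed, u_a : acoustic velocity amplitude,
% V_pp : peak-to-peak input voltage, k : a device-dependent constant
v_s \;\propto\; u_a^{2} \;\propto\; V_{\mathrm{pp}}^{2}
\quad\Longrightarrow\quad
v_s \;=\; k\, V_{\mathrm{pp}}^{2}
```

Under this simple power law, doubling the drive voltage would quadruple the streaming speed; in non-Newtonian fluids such as blood or synovial fluid, this relationship is not expected to hold, which is why it requires investigation.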

© Prakash Lab/Stanford University


The design of the microrobots is particularly suited to their applications. Could any components of this larva-inspired technology be applied to robots of a larger size?
Manipulating objects at the microscale presents a fundamental challenge since inertial
effects are negligible. In the case of large-scale robots, this will not be an issue. Nevertheless,
some of the components of our study can be adapted for use with large-scale robots.
We could, for example, integrate the trapping mechanism into large-scale aquatic
robots.


How Bioinspired Ultrasound Microrobots revolutionize Medicine | Daniel Ahmed. Video Credit:
Falling Walls Foundation/YouTube.com

Bioinspired robots have become a prominent research and development area over the past few years. How do you think the next 30 years of this field will progress?
Microrobots will have a significant impact on the health sector, particularly in the
treatment of diseases.

The ARSL is exploring many of the fundamental challenges associated with microrobots,
especially when exposed to flow conditions similar to those in the vasculature. With the
advent of artificial intelligence, I anticipate that navigation of these microrobots will
become much more sophisticated, allowing them to potentially target diseases
effectively and promptly.


Upon reaching a site, these swimmers can release drugs very locally. My belief is that
microrobots could be used to treat patients suffering from heart disease or stroke. It is
likely that we will witness a revolution in medicine regarding the treatment of disease,
and we would not need to wait 30 years.

What aspect of this research do you find particularly exciting?
I have so many reasons to be excited about my lab and research. In general, I would say
that we conduct both fundamental and application-oriented research. We find it very
exciting to explore new physical phenomena and to adapt technology to a final
application. Moreover, it is our aim that our research is also applied and, therefore, we
can give something back to society. ETH and my lab house an incredibly talented group
of students and researchers, and I enjoy interacting with and mentoring the next
generation of leaders.

Where can our readers go to stay up to date with this research?
The homepage for my lab can be accessed here. A list of related research is shown
below with our most recent paper highlighted at the top:

Ultrasound-activated ciliary bands for microrobotic systems inspired by starfish
Neutrophil-inspired propulsion in a combined acoustic and magnetic field
Selectively manipulable acoustic-powered microswimmers

For more information on the project as a whole please see here.

About Dr. Daniel Ahmed


Daniel Ahmed is an Assistant Professor of Acoustic Robotics
for Life Sciences and Healthcare in the Department of
Mechanical and Process Engineering at ETH Zurich
(Switzerland).

Currently, he leads the newly-built Acoustic Robotics
Systems Lab. Daniel Ahmed holds Bachelor's, Master's, and
Doctoral degrees in Engineering Science and Mechanics
from Pennsylvania State University (U.S.).


He was a postdoctoral researcher and a senior scientist in the Department of Mechanical
and Process Engineering (D-MAVT) at ETH Zurich. Daniel Ahmed's research focuses on
utilizing acoustics in combination with micro/nanorobotic and microfluidic systems with
the aim of developing technologies at the interface of biotechnology, biomedical
engineering, and medicine. During his doctoral studies at Pennsylvania State University,
Daniel Ahmed initiated and led a microbubble-based acoustofluidics group to develop
new microfluidic and microrobotic systems.

His dissertation established a new concept in micro-actuation and soft microrobots,
showing that such robots could be controlled wirelessly by an acoustic field. During his
research at ETH Zurich, Daniel combined multiple actuation modalities to develop new
micro/nanorobotics systems. He was awarded a European Research Council (ERC)
Starting Grant on Acousto-Magnetic Micro/Nanorobots for Biomedical Applications
(SONOBOTS) in 2019.

Daniel Ahmed has published 31 peer-reviewed journal articles and 14 conference
proceedings. He has been selected as one of the ten winners in the Falling Walls
Science Breakthroughs of the Year 2021 in Engineering and Technology.

Disclaimer: The views expressed here are those of the interviewee and do not
necessarily represent the views of AZoM.com Limited (T/A) AZoNetwork, the owner and
operator of this website. This disclaimer forms part of the Terms and Conditions of use
of this website.

Written by
Megan Craig
Megan graduated from The University of Manchester with a B.Sc. in Genetics,
and decided to pursue an M.Sc. in Science and Health Communication due to
her passion for combining science with content creation. As part of her studies,
Megan partnered with Jodrell Bank Discovery Centre as a Digital Marketing
Assistant, producing content and updating sections of their website. In her
spare time, she loves to travel, exploring each location's culture and history -
including the local cuisine. Her other interests include embroidery, reading
fiction, and practicing her Japanese language skills.



Thought Leaders

Social distancing surveillance: how robots will keep you in line
Interview conducted by Emily Henderson, B.Sc.

Could a COVID-19 robot help in social distancing surveillance? We spoke to Professor
Dinesh Manocha and Adarsh Jagan Sathyamoorthy about how their new robot may be
able to do just that.

Please could you introduce yourself and tell us what inspired your latest research into COVID-19 and social distancing?
Our group has done a lot of work on robot navigation in the presence of multiple walking
pedestrians and in crowded scenarios. When the Covid-19 pandemic hit, our goal was to
use this robotic navigation technology to deal with multiple challenges due to the
pandemic.

We explored the literature, studied the precautions that are required during a pandemic,
and identified social distancing as a strong recommendation in social or public settings.
This inspired us to develop a robot system that could monitor and encourage social
distancing in public places using AI technologies. This robot can be deployed without any
human supervision.

Social distancing has been encouraged to help reduce the spread of COVID-19. Why is it important that people adhere to social distancing rules?
Research in medicine, simulations of pandemics, and the CDC strongly suggest that
social distancing is one of the highest-impact ways to control the spread of a disease.
It is effective and easy to follow, and the scientific community has observed during this
pandemic that social distancing has made a difference.

Image Credit: fatmawati achmad zaenuri/Shutterstock.com

There have been technology-based methods employed to help detect social-distancing breaches, including WiFi and Bluetooth. What are some of the problems associated with these methods, and how does your new robot overcome them?
WiFi and Bluetooth-based methods are accurate in detecting social distancing breaches,
and our approach complements them. However, WiFi and Bluetooth-based methods need
appropriate sensing technologies and cannot be easily deployed in all kinds of
environments (e.g., public places or isolated locations). These methods also need
additional infrastructure to be in place for detection.

Our method uses the visual feed from a depth camera onboard a mobile robot and
existing CCTV infrastructure (if available) to detect social distancing breaches. In
addition, the robot can autonomously navigate and interact with people and encourage
them to maintain social distancing. The robot also has a thermal camera, whose thermal
image feed can possibly be transmitted to security/healthcare professionals.

If an individual with a much higher temperature than average (a symptom of fever) is
detected in a certain location, the healthcare/security professionals in the building can
discreetly follow protocols for COVID testing and contact tracing. We do not record
people’s faces or identities. Therefore, an individual’s identity or symptoms (or any
personal data) are not stored or linked to them.

Can you describe how you designed and made your new social distancing robot? How does it work?
Our robot is designed to track pedestrians or humans standing at public locations by
evaluating the proximities between them. We use a depth camera for this application. If
a certain set or group of people are closer than 6 feet from each other for a certain
period, our robot system automatically classifies them as a group.

If multiple groups are detected, the robot prioritizes groups depending on whether a
group is static or mobile, the number of members in a group, social interaction between
the members, etc., and autonomously navigates to the pedestrians in a top-priority
group to encourage them to move apart. For autonomous navigation, our robot uses a
Lidar sensor and our machine learning-based navigation algorithm.
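The grouping rule described above — pedestrians who stay closer than 6 feet from each other for a certain period are classified as a group — can be sketched as a connected-components pass over persistent pairwise breaches. This is an illustrative reconstruction, not the authors' implementation: the threshold constants and the helper names `pairwise_breaches` and `classify_groups` are assumptions.

```python
import itertools
import math

SOCIAL_DISTANCE_FT = 6.0   # breach threshold from the interview
MIN_BREACH_FRAMES = 30     # illustrative: ~1 s of sustained proximity at 30 fps

def pairwise_breaches(tracks, threshold=SOCIAL_DISTANCE_FT):
    """tracks: {person_id: (x, y)} estimated positions for one frame, in feet.
    Returns the set of id pairs standing closer than the threshold."""
    breaches = set()
    for (a, pa), (b, pb) in itertools.combinations(tracks.items(), 2):
        if math.dist(pa, pb) < threshold:
            breaches.add(frozenset((a, b)))
    return breaches

def classify_groups(frames, min_frames=MIN_BREACH_FRAMES):
    """frames: list of per-frame track dicts. Pairs that breach in at least
    `min_frames` frames are merged into groups (connected components)."""
    counts = {}
    for tracks in frames:
        for pair in pairwise_breaches(tracks):
            counts[pair] = counts.get(pair, 0) + 1

    # union-find over the persistent pairs only
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for pair, n in counts.items():
        if n >= min_frames:
            a, b = tuple(pair)
            parent[find(a)] = find(b)

    groups = {}
    for x in list(parent):
        groups.setdefault(find(x), set()).add(x)
    return [g for g in groups.values() if len(g) > 1]
```

The prioritization step could then be a simple sort of the returned groups (e.g., static before mobile, larger before smaller) before the robot picks a navigation target.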

Our robot uses a screen to display messages to the groups, and a thermal camera,
whose feed can be monitored by healthcare professionals for symptom monitoring.

Image Credit: Sathyamoorthy et al., 2021, PLOS ONE, CC-BY 4.0


Your COVID-19 robot is also able to sort people breaching rules into different groups depending on whether they are mobile or stationary. How is it able to do this?
Using the images from the depth camera and existing state-of-the-art pedestrian
tracking algorithms, we can estimate the positions and velocities of pedestrians
around the robot.

Our technique to classify groups is based on observing the proximities between
pedestrians over a certain period. These proximities can be efficiently calculated
from the estimated positions and velocities. The robot is used to provide warnings
about social distance constraints to the relevant humans.

How could your robot help to keep people safe and potentially limit the spread of SARS-CoV-2?
A mobile robot that can make use of existing CCTV infrastructure in a building is used
to warn people who do not follow the social distance constraints. Furthermore, it
encourages people to maintain social distances. Our approach differs from other
methods of monitoring social distancing in that the robot interacts with people explicitly.

As mentioned before, our robot also aids in contact tracing since it has an onboard
thermal camera. All these features would help curb the spread of the disease. We
believe that our robot could be another tool for healthcare/security professionals to
simplify their workloads.


What further research needs to be carried out before your robot could be used to help control social distancing within crowds?
We believe the robot needs the capability to be socially intelligent and to automatically
interact with large crowds. The social distancing guidelines are relaxed for groups
consisting of family members, colleagues, or people from the same household.

Our current system lacks the capability to identify such social groups in all scenarios.
We are trying to devise algorithms that will help the robot to identify such groups and
make it more socially intelligent. We also need to develop better techniques for human-
robot interaction.

Image Credit: deepadesigns/Shutterstock.com

Deep learning within science has become more widespread over the last few years. How important is deep learning to scientific advancements, and do you see deep learning becoming more prominent as technology continues to advance?
Deep learning is a popular machine learning technique that has had a significant impact
on many applications, including computer vision, speech understanding, natural
language processing, and robotics. Deep learning methods are frequently used in
autonomous driving and driver assistance systems, smartphones, IoT devices, etc.
However, deep learning systems need good, labeled datasets.

Moreover, they use high computational power for training, and we need to develop
power-efficient methods that can be used with small and cheap devices or mobile
robots.

What does the future look like for autonomous robots within science?
The development of many autonomous robots, like self-driving cars, surveillance drones,
and personal robots, is being greatly accelerated by recent developments in sensor
technologies, machine learning, computer vision, and motion planning.

Many companies are developing personal robots that can be used for home monitoring
and as social companions. It is one of the most promising technologies with wide
applications in science and daily activities.

What are the next steps for you and your research?
Our upcoming research will focus on improving human-robot interaction and
improving the social intelligence capabilities of such robots. We are also developing
next-generation robot navigation technologies that can enable us to operate in complex
indoor and outdoor scenes. Ultimately, these robots have to be safe and friendly, so that
they can be widely accepted by society. They could be used as delivery robots or social
robots.

Where can readers find more information?


More information can be found here:
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0259713
https://gamma.umd.edu/researchdirections/crowdmultiagent/cm

Written by
Emily Henderson
During her time at AZoNetwork, Emily has interviewed over 300 leading experts
in all areas of science and healthcare including the World Health Organization
and the United Nations. She loves being at the forefront of exciting new
research and sharing science stories with thought leaders all over the world.



Thought Leaders

Pediatric Robotic Surgery: past, present and potential

Written by Keynote Contributor, Dr. Joseph G. Barone.


The orderly is shocked because a child is undergoing surgery, but no surgeon is at the
bedside. Unseen in a quiet corner of the operating room sits the surgeon, at a comfortable
console performing the surgery remotely using robotics. To be clear, robotic surgery is not
surgery performed by a robot. Rather, it is surgery performed by a surgeon who controls
the daVinci® Robotic Surgical System. Think of the robotic console as a video game
controller, but the game is very real.


da Vinci® Robotic Surgical System

Background
Usually, the surgeon places three or four laparoscopic ports into the child's abdomen,
much like standard laparoscopic surgery. The robot allows the surgeon to operate by
controlling those laparoscopic instruments using a console or control pad located near
the operating room. For the most part, the console is in the operating room, but it can
be someplace else, like an adjoining room or office. The console is not sterile, and
surgeons could operate without scrubbing; sterility is not an issue when using the
console.

Pediatric robotic surgery is still in its infancy compared to adult robotic surgery.
Pediatric robotic surgery is difficult because children have small abdomens compared to
adults. The small working areas pose a challenge for robotic surgery because there is
not much room for robotic arms. There are no special robots or robotic instruments for
pediatrics; the standard adult instruments are used. The robotic surgeon will use the
same instruments and robot for a prostatectomy on a 57-year-old and a nephrectomy
on a six-month-old.

Pediatric robotic surgery is more difficult than adult robotic surgery due to the
small size of the child's abdomen."

As surgeons became more comfortable with adult robotics, pediatric robotics emerged
as a better way to perform surgery on children. Before 2010, there was very little
pediatric robotic surgery being performed worldwide. This was because surgeons were
still on the learning curve for adult robotics and pediatrics represented a technical
challenge because of the child's small size.

But that changed quickly. At first, only the most skilled robotic surgeons would be
willing to perform robotic surgery on children. However, as surgeons became more
skilled in adult robotic surgery, more and more surgeons grew comfortable performing
robotic surgery on children. Remember, the robotic skills and instruments were the
same; the surgery just needed to be performed in a much smaller space. Adopting
robotics in children was a natural progression as adult surgeons became more
competent in robotics.

Image Credit: MAD.vertise/Shutterstock.com

Read this article online 32


Thought Leaders

Around 2012, the medical community was hopeful that the internet would allow surgeons
in one location to perform surgery on a child in another location, for example, across
state lines. At the start, this was seen as a potential breakthrough and created the
possibility that robotic surgery could be offered to children in remote areas where
robotics was not an option.

Unfortunately, the speed of the internet was a limiting factor as the surgeon's
movements were delayed by a few seconds during the operation, a major drawback and
nonstarter for surgery. In the future, advances in internet speed, including 5G and
beyond, may reignite the discussion to perform robotic surgery remotely around the
globe.

Faster internet speeds will allow robotic surgery to be performed in remote locations where robots are not available."

Pediatric robotic surgery in urology


Pediatric robotic surgery is largely limited to urological procedures; it has not caught on
in the other pediatric surgical specialties that are more comfortable with traditional
laparoscopic or open surgery. This could be because the robot was originally approved
for use only by gynecologists and urologists. Robotic surgery is slowly catching on in
other pediatric surgical specialties. For example, pediatric general
surgical procedures are being performed regularly at some centers.

The future of pediatric robotic surgery will include a wider application of robotic
surgery outside of urology."

The adoption of robotics in urology has been so fast that new pediatric urologists need
to be trained in robotics to be marketable; it is a skill expected by most new employers.
Procedures such as nephrectomy, bladder reconstruction, and removal of large kidney
stones are now performed robotically. As recently as ten years ago, no such training was
needed or offered at some fellowships. Now, robotics has taken over the world of
pediatric urological surgery and set it on a new course.

At our institution, Rutgers, Robert Wood Johnson Medical School, most major urological
cases are performed robotically. Two- or three-day hospital admissions are now cut
down to outpatient surgery as we push the envelope. This is a better use of our scarce
financial resources and is likely to become more common in the future. Patients can be
sent home on the same day because there is less pain and less recovery in robotic
surgery compared to open surgery.

Several major surgical procedures can now be performed on an outpatient basis, reducing health care costs."

Image Credit: MAD.vertise/Shutterstock.com

At our center, surgeons are now using the single-port daVinci® Surgical System to
perform surgery. Standard robotic surgery requires three or four laparoscopic ports to
be placed by the surgeon; for single-port surgery, only one port is needed, resulting in
less pain, faster recovery, and a better cosmetic outcome, which is important for
children from a psychological standpoint (Figure 1). This advance was made possible
by new technological innovations.

Figure 1: Patient's scar.


Single-port robotic surgery is being used at a few centers worldwide and is likely to
be adopted more widely in the next five years."

As mentioned above, single-port robotic surgery is currently uncommon in children, just
as standard robotic surgery was ten years ago. Unlike ten years ago, however, the
single-port robot was adopted by some centers almost immediately after it was
introduced. This should create a faster and wider adoption of single-port robotics in
children.

Conclusion
The future of robotic surgery is limitless. Newer technologies will emerge in the form of
small robots that can move around and perform surgery, and faster internet speeds will
allow remote robotic surgery to be performed around the globe. There may be a day
when robotic surgery is just that: surgery performed by a robot.

References
Johnson KC, Cha DY, DaJusta DG, Barone JG, Ankem MK. Pediatric single-port-access
nephrectomy for a multicystic, dysplastic kidney. J Pediatr Urol. 2009
Oct;5(5):402-4. doi: 10.1016/j.jpurol.2009.03.011. Epub 2009 Apr 29.
Blanco FC, Kane TD. Single-Port Laparoscopic Surgery in Children: Concept and
Controversies of the New Technique. Minim Invasive Surg. 2012;2012:232347.

About Dr. Joseph G. Barone


Dr. Joseph G. Barone is Chief of Pediatric Urology at
Rutgers, Robert Wood Johnson Medical School, located in
New Brunswick, New Jersey, USA. He previously served as
Senior Associate Dean for Clinical Affairs and Chief
Operating Officer for the medical school, overseeing an
academic practice of over 550 adult and pediatric
providers. Dr. Barone is a nationally recognized leader in
pediatric urology, has published over 150 peer-reviewed
articles and book chapters, and is the author of the
parenting book "It's Not Your Fault, Strategies for Solving
Toilet Training and Bedwetting Problems."


Dr. Barone's clinical research has appeared in scientific and non-scientific publications,
including Yahoo, the New York Times, and The Wall Street Journal, to name a few. His
focus is on performing research that can be useful for parents and children today. For
example, Dr. Barone demonstrated an association between breastfeeding and early
toilet training and documented the harmful effects of secondhand smoke on the
developing pediatric bladder.

Dr. Barone is part of the pediatric robotics team at Rutgers, Robert Wood Johnson
Medical School that also includes Dr. Sammy Elsamra, Chief of Robotics, Dr. Haris
Ahmed, Pediatric Urologist, and a dedicated group of anesthesiologists and nurses
whose collective vision is to innovate and advance robotic surgery for children.

Disclaimer: This article has not been subjected to peer review and is presented as the
personal views of a qualified expert in the subject in accordance with the general terms
and conditions of use of the News-Medical.Net website.



News

SEA-KIT Unveils New H-class USV for Ocean Survey
From SEA-KIT International
Reviewed by Megan Craig, M.Sc.

Essex-based SEA-KIT International, a leading provider of low-carbon uncrewed
surface vessel (USV) solutions in active operation across the globe, has revealed a
new USV design that focuses on hydrography and environmental data collection.

SEA-KIT H-Class USV 12m & 15m variants. Image Credit: SEA-KIT International

The SEA-KIT H-class USV, with its retractable gondola and dual sensor deployment
options, is a highly configurable design based on a wealth of operational data and
feedback collected from the company’s established X-class USVs. Several of these 12m
vessels are currently operational in the Indian Ocean, North Sea, Red Sea and the
Pacific.

The H-class features a composite hull for higher transit speeds, giving it greater range
and endurance, as well as active stabilisers to minimise roll. The new design has 12m
and 15m variants, with the 12m version transportable in a standard shipping container for
rapid, low-cost deployment. Both variants can be davit launched.


Ben Simpson, SEA-KIT CEO, said:

“Although many of the H-class USV’s features directly benefit hydrographic survey
missions, this is a design that can perform many different tasks due to its large gondola
and ability to dip cages and tow sensors. The fuel-saving, speed and endurance benefits
of the composite hull add to the value of these USVs as low-carbon, cost-efficient
solutions for a wide range of maritime operations. This design is the next step towards
our goal of zero emission vessels.”

The H-class USV can accommodate a range of sensors as well as deploy a tow cage,
SVP, MAPR, CTD and side scan sonar for deep-water and nearshore bathymetric and
hydrographic survey missions. The vessel includes a Multibeam Echo Sounder (MBES),
station holding and winch-deployed sensor payloads for versatile ocean survey
capability.

SEA-KIT’s H-class USV is designed to MCA Category 0 for extended, over-the-horizon
capability and will hold Unmanned Marine Systems (UMS) certification from Lloyd’s
Register as well as Lloyd’s Register approval for design and hull construction.

The SEA-KIT team will officially launch the new design at Seawork, Europe’s largest
commercial marine exhibition, which takes place in Southampton, UK, later this month
(21-23 June 2022).

Source: https://www.sea-kit.com/



Article

3D Printing Polymers with Graphene
By David J. Cross
Reviewed by Bethan Davies

The idea of self-healing skin wrapped around a robotic skeleton conjures up all kinds
of sci-fi imagery, yet a group of Japanese researchers has managed to develop a
controllable robotic finger covered in “living” skin which they say could bring truly
human-like robots a step closer to reality.

Image Credit: The University of Tokyo. (2022) Robot skin heals | The University of Tokyo. [online]
Available at: https://www.u-tokyo.ac.jp/focus/en/press/z0508_00225.html

The team from the University of Tokyo has combined their expert knowledge in tissue
culturing and robotics to make this latest innovation which could have a significant
impact on how robots are developed in the near future.

In a paper published in the journal Matter, the researchers detail the method they have
developed for generating seamless coverage of 3D objects with a “living” skin
equivalent. Led by Professor Shoji Takeuchi, a leader in the field of biohybrid robots,
the research is a clear indicator of the intersection of robotics and bioengineering.

We have created a working robotic finger that articulates just as ours does, and is
covered by a kind of artificial skin that can heal itself.

Professor Shoji Takeuchi, A Leader in the Field of Biohybrid Robots

“Our skin model is a complex three-dimensional matrix that is grown in situ on the finger
itself. It is not grown separately then cut to size and adhered to the device; our method
provides a more complete covering and is more strongly anchored too,” Professor
Takeuchi continued.

Robotic Wound Repair

The robotic finger has living cells and supporting organic material wrapped around it
for optimal strength and shaping. Because the skin is soft and has self-healing abilities,
it could find application in scenarios that require a gentle but strong touch.

Takeuchi works with a team of researchers that bring in different areas of expertise
from across the University of Tokyo in order to explore concepts such as lab-grown
meat, artificial muscles, and synthetic odor receptors.

The researchers hope the robotic finger covered in self-healing skin will support
medical research on skin damage, including burns and severe wounds, as well as
advance various manufacturing applications.

Takeuchi and his team were inspired by the medical treatment of deeply burned skin
using grafted hydrogels.


We demonstrate wound repair of a robotic finger by culturing the wounded tissue grafted with a collagen sheet.

Professor Shoji Takeuchi, A Leader in the Field of Biohybrid Robots

Cultivating ‘Living’ Skin


For some time now, 3D-skin models have been under the lens of research and testing in
cosmetics and drug-based applications, yet this is the first time the technology has been
directly applied to robotics. Composed of a lightweight collagen matrix known as a
hydrogel, the synthetic skin contains several different types of living skin cells, known
as fibroblasts and keratinocytes.

The living skin can be cultivated directly onto the robotic finger, which the researchers
found to be one of the more complex aspects of this research, necessitating specially
engineered structures to which the collagen matrix can be anchored.

Our creation is not only soft like real skin but can repair itself if cut or damaged in
some way. So we imagine it could be useful in industries where in situ repairability
is important as are human-like qualities, such as dexterity and a light touch.

Professor Shoji Takeuchi, A Leader in the Field of Biohybrid Robots

When developing the technology further, the researchers also aim to add other kinds of
cells, giving robotic devices the capability to sense as humans do.

In the future, we will develop more advanced versions by reproducing some of the
organs found in skin, such as sensory cells, hair follicles and sweat glands. Also,
we would like to try to coat larger structures.

Professor Shoji Takeuchi, A Leader in the Field of Biohybrid Robots

Human-like robotics could thus pave the way for new possibilities in advanced
manufacturing, and while the movement of the robotic finger is purely mechanical, it
could open up the door to advanced automation.


Furthermore, the self-healing skin technology could provide hope for patients who
require life-changing procedures for burns or other kinds of wounds, reducing the
complexity of future research.

References and Further Reading


M. Kawai, et al. (2022) Living skin on a robot. Matter. [Online] Available at: https://www.cell.com/matter/fulltext/S2590-2385(22)00239-9
The University of Tokyo. (2022) Robot skin heals | The University of Tokyo. [Online] Available at: https://www.u-tokyo.ac.jp/focus/en/press/z0508_00225.html

Disclaimer: The views expressed here are those of the author expressed in their private
capacity and do not necessarily represent the views of AZoM.com Limited T/A
AZoNetwork the owner and operator of this website. This disclaimer forms part of the
Terms and conditions of use of this website.

Written by
David J. Cross
David is an academic researcher and interdisciplinary artist. David's current
research explores how science and technology, particularly the internet and
artificial intelligence, can be put into practice to influence a new shift towards
utopianism and the reemergent theory of the commons.

Read this article online 42


News

Fully Autonomous Inspection Robot Systems for Sewer Pipe Networks
Reviewed by Bethan Davies

A maze of pipes and conduits for water, sewage, and gas lies beneath the streets. Regular monitoring of these pipes for leaks or maintenance usually requires digging them up. That is not only laborious and costly (an estimated yearly cost of £5.5 billion in the UK alone), but it also disrupts traffic and annoys nearby residents, not to mention causing environmental damage.

Layout of the autonomous Joey mini-robot. Image Credit: TL Nguyen, A Blight, A Pickering,
A Barber, GH Jackson-Mills, JH Boyle, R Richardson, M Dogar, N Cohen

Imagine a robot that can navigate the tightest of pipe networks and send images of
damage or blockages to human operators. This is no longer just a dream, according to a
new study published in Frontiers in Robotics and AI by a team of scientists from the
University of Leeds.

Here we present Joey—a new miniature robot—and show that Joeys can explore
real pipe networks completely on their own, without even needing a camera to
navigate.

Dr. Netta Cohen, Study Final Author and Professor, University of Leeds

Read this article online 43


Joey is the first robot that is capable of navigating mazes of pipes as narrow as 7.5 cm
on its own. It weighs only 70 g and is small enough to nestle in the palm of the hand.

Pipebots Project
The current effort is part of the “Pipebots” project, a cooperation between the
universities of Sheffield, Bristol, Birmingham, and Leeds, as well as UK utility
corporations and other international academic and industrial partners.

Underground water and sewer networks are some of the least hospitable environments, not only for humans, but also for robots. Sat Nav is not accessible underground. And Joeys are tiny, so have to function with very simple motors, sensors, and computers that take little space, while the small batteries must be able to operate for long enough.

Dr. Thanh Luan Nguyen, Study First Author and Postdoctoral Scientist, University of Leeds

Dr. Thanh Luan Nguyen developed Joey’s control algorithms (or “brain”). Joey moves around on 3D-printed “wheel-legs,” which roll over straight parts and walk over tiny obstacles. It has a variety of energy-efficient sensors that detect its distance to walls, junctions, and corners, navigational tools, a microphone, and a camera with “spotlights” to record and save defects in the pipe network. The prototype only costs approximately £300 to make.

Mud and Slippery Slopes

Joey was capable of navigating an experimental network of pipes that included a T-junction, a left and right corner, a dead-end, an obstacle, and three straight segments without the assistance of human operators. Joey explored nearly 1 m of pipe network in an average of just 45 seconds.
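Camera-free navigation of this kind ultimately comes down to making a turn decision at each junction from a few distance readings. As a purely illustrative sketch (the study’s actual control algorithm is not described in this article), a classic strategy that explores every branch of a loop-free pipe network is right-hand wall following; the grid layout, names, and rule below are assumptions for illustration only:

```python
# Hypothetical sketch: junction-by-junction exploration of a pipe network
# using only "is the neighboring cell open?" checks, as a distance sensor
# might provide. Not the published Joey controller.

DIRS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # headings: N, E, S, W

def turn_right(d):
    return (d + 1) % 4

def turn_left(d):
    return (d - 1) % 4

def explore(pipes, start, heading, max_steps=50):
    """Right-hand wall following through a pipe network.

    `pipes` is a set of open (x, y) cells. At each step the robot prefers
    to turn right, then go straight, then turn left, then reverse - a
    rule that visits every branch of a loop-free network.
    """
    x, y = start
    visited = [(x, y)]
    for _ in range(max_steps):
        for d in (turn_right(heading), heading, turn_left(heading),
                  turn_right(turn_right(heading))):
            dx, dy = DIRS[d]
            if (x + dx, y + dy) in pipes:
                heading = d
                x, y = x + dx, y + dy
                visited.append((x, y))
                break
        else:
            break  # fully blocked in all four directions
    return visited
```

Running `explore` on a small T-junction network (a vertical segment with two branches at the top) shows the robot reaching both arms of the T before backtracking, without any map or camera input.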

To make the robot’s life more challenging, the researchers demonstrated that the robot can navigate up and down inclined pipes with realistic slopes. They also introduced sand and gooey gel (really dishwashing liquid) to the pipes to test Joey’s ability to maneuver through muddy or slippery tubes, with success.

Significantly, the sensors help
Joey navigate the pipes without the assistance of a camera or power-hungry computer
vision. This saves energy and extends the life of Joey’s current battery. When the
battery runs out, Joey will return to its starting point to “feed” on power.

Joey, however, has one flaw: it cannot right itself if it accidentally flips onto its back, like an upside-down tortoise. According to the authors, the next prototype will overcome this issue. In the future, the researchers hope to make Joey waterproof so that it can function underwater in liquid-filled pipelines.

Joey’s Future is Collaborative

The Pipebots scientists hope to create a swarm of Joeys that communicate and collaborate, supported by a larger “mother” robot dubbed Kanga. Kanga, which is now being developed and tested by some of the same authors at the Leeds School of Computing, will be endowed with more advanced sensors and repair tools such as robot arms, as well as the ability to carry numerous Joeys.

Cohen says, “Ultimately we hope to design a system that can inspect and map the
condition of extensive pipe networks, monitor the pipes over time, and even execute
some maintenance and repair tasks.”

We envision the technology to scale up and diversify, creating an ecology of multi-species of robots that collaborate underground. In this scenario, groups of Joeys would be deployed by larger robots that have more power and capabilities but are restricted to the larger pipes. Meeting this challenge will require more research, development, and testing over 10 to 20 years. It may start to come into play around 2040 or 2050.

Dr. Netta Cohen, Study Final Author and Professor, University of Leeds

Journal Reference:
Nguyen, T. L., et al. (2022) Autonomous control for miniaturized mobile robots in
unknown pipe networks. Frontiers in Robotics and AI.
doi.org/10.3389/frobt.2022.997415.
Source: https://www.frontiersin.org/

Read this article online 45

