
Review


Elastomeric Haptic Devices for Virtual and Augmented Reality


Hedan Bai, Shuo Li, and Robert F. Shepherd*

Since the modern concepts for virtual and augmented reality were first introduced in the 1960s, the field has strived to develop technologies for an immersive user experience in a fully or partially virtual environment. Despite the great progress in visual and auditory technologies, haptics has seen much slower technological advances. The challenge arises because skin has densely packed mechanoreceptors distributed over a very large area with complex topography; devising an apparatus as targeted as an audio speaker or television for the localized sensory input of an ear canal or iris is more difficult. Furthermore, the soft and sensitive nature of the skin makes it difficult to apply solid state electronic solutions that can address large areas without causing discomfort. The maturing field of soft robotics offers potential solutions toward this challenge. In this article, the definition and history of virtual (VR) and augmented reality (AR) are first reviewed. Then an overview of haptic output and input technologies is presented, opportunities for soft robotics are identified, and mechanisms of intrinsically soft actuators and sensors are introduced. Finally, soft haptic output and input devices are reviewed with categorization by device form, and examples of soft haptic devices in VR/AR environments are presented.

H. Bai, Prof. R. F. Shepherd
Sibley School of Mechanical and Aerospace Engineering
Cornell University
124 Hoy Road, Ithaca, NY 14853, USA
E-mail: rfs247@cornell.edu

Dr. S. Li, Prof. R. F. Shepherd
Department of Materials Science and Engineering
Cornell University
4 Central Avenue, Ithaca, NY 14853, USA

The ORCID identification number(s) for the author(s) of this article can be found under https://doi.org/10.1002/adfm.202009364.

DOI: 10.1002/adfm.202009364

1. Introduction

Virtual and augmented reality (VR and AR) have experienced a revival in the past decade since the modern concepts were first introduced in the 1960s. With the goal of creating an interactive medium where the user experiences presence in a fully or partially virtual environment, the field has initially focused on, and made great advances in, providing realistic visual and auditory feedback. Haptics, on the other hand, is a less developed technology; however, in the real world, the sensation of touch is essential for human perception, both functionally and emotionally. This gap between touch in the real and virtual world is primarily due to the challenge that skin is the largest organ of the human body (areal coverage of ≈2 m² on an adult, with some locations having up to 500 mechanoreceptors per cm²), with very complex topography (positive, negative, and zero Gaussian curvatures).[1,2] In order to wrap ourselves in an envisioned haptic device and convert the environmental mechanical stresses it encounters into neural signals, the device needs to conform to these topographies, remain comfortable to the wearer's delicate skin, and still provide input to the multitude of mechanoreceptors (Figure 1). The present, ubiquitous solution to haptics relies on solid state vibrotactile actuators that present a single touch sensation over a small area; scaling this solution over a larger area would prove uncomfortable, expensive, and intractable due to the inability to wrap them in a mode that allows freedom of movement of the user. Another example is that of voice coils, which have recently been used for "haptic illusions" that trick the mind into thinking a low resolution haptic input is a continuous stream of touch along the skin.[3] Outside of this "haptic illusion" regime, however, the skin is still addressed by large and discrete rigid islands. These examples clarify the twofold problem experienced by our present technology: conventional devices are built with rigid materials and create stress concentrations where they are located and, simultaneously, it is difficult to wrap them onto our skin's expansive and complex surface (Figure 2).[4]

Soft robotics is a maturing subfield of robotics where actuators and sensors are created via unconventional materials and structures for mechanisms that undergo large amplitude deformation modes upon stress loadings induced from interaction with the objects of interest. In many cases, these objects of interest are organisms[5,6] and, for our special case of haptics, humans. To program this mechanical compliance, extrinsically soft robots use components with thin geometries (e.g., wires,[7] films,[8] etc.).[9] This approach can employ a wide range of materials, including those with high Young's moduli (e.g., steel E ≈ 205 GPa,[10] shape memory alloy E ≈ 40 GPa),[7] but the resulting system has constrained geometry due to the thin structural components. Intrinsically soft robots, on the other hand, are constructed from organic elastomers (carbon based polymers above their glass transition temperature, Tg) that have storage moduli similar to human tissues, ranging from 1 kPa < E < 1 MPa.[11] The matching of intrinsic mechanical properties with human tissue allows organic robots to be built with various constructions and thicknesses, and for different functions, while maintaining safe human-robot interaction in a natural manner (Figure 2). In this review, we focus our discussions on this class of robotic systems.


Figure 1. Mechanoreceptor distributions in the expansive and complex surface area of human skin and soft haptic devices with conformal skin contact.
Left: Mechanoreceptor distributions. Reproduced with permission under the terms of the Creative Commons License.[2] Copyright 2017, the authors.
Published by Frontiers. Right: Soft haptic output and input devices. Output devices from top to bottom: Tactile display on a fingertip. Reproduced with
permission.[149] Copyright 2008, IEEE. Kinesthetic feedback glove. Reproduced with permission.[114] Copyright 2016, IEEE. Vibrotactile feedback wear-
able. Reproduced with permission.[86] Copyright 2019, Springer Nature. Input devices from top to bottom: Pressure sensor on a fingertip. Reproduced
with permission.[166] Copyright 2020, American Association for the Advancement of Science (AAAS). Motion and force sensing glove. Reproduced
with permission.[174] Copyright 2020, AAAS. Sensing sleeve for gesture detection. Reproduced with permission.[165] Copyright 2020. Springer Nature.

Actuators and sensors composed of elastomers offer the potential for conformal haptic devices that provide inputs and outputs addressing the large area and varying mechanoreceptor densities of the skin, as well as the large range of motion of the joints. Low-cost fabrication techniques (e.g., molding of fluidic elastomer actuators (FEAs)[12] or direct 3D printing[13,14]) promise the next generation of cost-effective haptic devices for VR and AR. In this review, we focus on recent advances in soft robotics that have been, or have the potential to be, used for haptic applications in VR and AR. We first provide an overview of the definition and progress of virtual and augmented reality to present technical goals specific to VR and AR applications. We then briefly review the field of haptics, with a focus on output and input technologies, to identify opportunities for soft robotic contributions. Next, we introduce advances in intrinsically soft actuators and sensors in the context of haptic feedback and sensing. Then we review soft haptic output and input devices categorized by the device forms of wearable, graspable, and touchable interfaces[15] and show examples of soft haptic devices demonstrated for VR and AR applications. Finally, we conclude with challenges in the field and discuss potential future research directions.

2. Virtual Reality and Augmented Reality

2.1. Definition of Virtual Reality and Augmented Reality

Although virtual reality is widely known by its hardware technologies of head mounted displays, headsets, controllers, wearables, and optical tracking systems, it is instructive to formulate the definition of virtual reality from the perspective of human experience. Jonathan Steuer defined virtual reality as "… a real or simulated environment in which a perceiver experiences telepresence," where telepresence refers to the experience of presence, generated by perception factors of the perceiver's sensory inputs and mental processes, in a mediated environment.[16] Based on this experience-centric definition, Steuer identified two determinants of virtual reality technologies to generate telepresence: vividness and interactivity (Figure 3a).


Figure 2. Material composition of emerging soft haptic devices in comparison with existing rigid haptic devices. Soft devices from left to right: Vibro-
tactile actuator (silicone elastomer). Reproduced with permission.[134] Copyright 2020, Mary Ann Liebert, Inc. Soft controller (polyurethane elastomer).
Reproduced with permission.[191] Copyright 2018, IEEE. Soft haptic interface (hydrogel and silicone elastomer). Reproduced with permission.[182] Copy-
right 2019, Mary Ann Liebert, Inc. Rigid devices from left to right: Piezoelectric bimorphs (piezoelectric ceramics). Reproduced with permission.[4]
Copyright Fuji Ceramics. Valve index controller (plastics and metals). Reproduced with permission.[48] Copyright Valve Corporation. Shape morphing
surface (plastics and metals). Reproduced with permission.[28] Copyright 2019, ACM.

Vividness refers to the "ability of a technology to produce a sensorily rich mediated environment" and is characterized by i) sensory breadth: the number of perception channels stimulated among the five perception systems (orientation, auditory, haptic, taste-smell, and visual); and ii) sensory depth: the quality of each perception channel presentation. While there have been great improvements in the quality of visual and auditory feedback technologies in recent years, far less effort has been spent on the remaining perception channels. Steuer speculated that the breadth and depth of sensory perceptions are multiplicatively related to the experience of presence. Therefore, simultaneous engagement of multiple perception channels could be extremely effective in generating the sense of presence even if the quality of individual sensory feedback is low.[16]

Interactivity describes the extent to which the user can modify the form and content of the mediated environment and is characterized by speed, range, and mapping. Modern VR controllers have achieved real-time translation of the user's input to actions in the virtual environment. The range, defined by the number of possible actions at any given time, is limited by the functions built into the controller. Mapping refers to the way users' actions impact functions in the mediated environment. While mapping can be arbitrary with respect to the function being performed, in cases where the function has a direct real-world counterpart, it is ideal to adopt a natural action mapping, since our perceptual system has been optimized to interact with the real world.[16] Wearable technologies have demonstrated human-computer interactions in an instinctive way.

While virtual reality immerses the user in an entirely artificial world, augmented reality supplements the real world with additional information generated virtually. Milgram and Kishino introduced a representation of the virtuality continuum (Figure 3b) that defined environments based on the proportions of real and virtual information involved as real environment (RE), augmented reality (AR), augmented virtuality (AV), and virtual environment (VE).[17] They termed environments with both real and virtual components mixed reality (MR); today, this is more commonly referred to as augmented reality (AR). Azuma defined AR with three characteristics: "1) combines real and virtual; 2) is interactive in real-time; 3) is registered in three dimensions."[18] These pose additional requirements on the input and output devices of AR systems. The input devices need to sense both the user's actions and objects in the real environment in real time and register them as 3D information in the system; the output devices would ideally create reactions of the real objects subject to virtual influences, in addition to sensory stimuli for the user to feel the virtual portion of AR.

Figure 3. Definition of virtual and augmented reality. a) Technological variables of virtual reality based on the experience-centric definition. Reproduced with permission.[16] Copyright 2006, Oxford University Press. b) Milgram's diagram of the virtuality continuum. Reproduced with permission.[17] Copyright 1994, IEICE.


2.2. History of Virtual Reality and Augmented Reality

The second half of the twentieth century saw the emergence and rapid development of modern VR and AR technologies, enabled by technological advances in electronics. In 1962, Morton Heilig built the Sensorama, an "Experience Theater" considered one of the earliest VR systems, which engaged the senses of sight, sound, smell, and touch. In 1968, Ivan Sutherland invented the first VR/AR head-mounted display (HMD), creating the sense of immersion through interactive 3D moving perspectives via a mechanical tracking setup and computer-generated graphics. The head-tracking setup was dubbed "The Sword of Damocles," as the large and heavy structure connecting the HMD needed to be suspended from the ceiling while using linkages to track head movements. The Sensorama and The Sword of Damocles are considered first examples of modern VR technologies in that they provide various sensory feedbacks from the virtual environment to the user, creating the sense of immersion. In 1969, Myron Krueger devised a series of computer-generated environments that could respond to the users' actions, which he termed "artificial reality." His work, the Videoplace built in 1975, is considered the first illustration of an interactive augmented reality, where users can interact with virtual objects, such as moving or drawing on them, in a room equipped with video cameras and pressure plates as sensors.[19,20]

In the 1980s and 1990s, the emerging developments in VR hardware technologies could be categorized as input devices, which allow users to interact with the virtual environment through an interface, and output devices, which create sensory stimuli for the sense of immersion. Jaron Lanier, founder of VPL Research, coined the term "virtual reality" in 1987 and contributed to the concept's popularization. The company developed the first commercially available wearable gesture tracking glove, the VPL DataGlove, and a full-body tracking garment, the DataSuit, which use fiber optic bundles connecting sensors on the garment to the computer to track the user's motion as the input. Developments in output devices focused on visual and audio feedback in the form of HMDs. During this time, VR found its early applications in military and medical training, education, and entertainment. Even though the goal was to provide an immersive experience through 3D interactive vision and audio construction, technologies at the time, such as the speed of PC-based graphics accelerators, could not satisfy the requirements to stimulate the sense of presence. The high cost of building such systems halted development of VR technologies in the mid-1990s.[19,20]

In the past decade, accelerated advances in integrated circuits and computational power described by Moore's law led to the revival of VR and AR technologies, with a focus on visual and audio feedback. VR and AR headsets, along with dynamic binaural audio, presently provide sufficient performance (i.e., visual resolution, field of view, distant focus, etc.) to create an improved sense of audiovisual immersion.

3. Haptics

Haptics is the first sensation humans acquire in our prenatal development, allowing us to perceive our sense of self.[21] For developing children and adults, haptics further plays an essential role in our perception of the physical environment, as well as the social function of interpersonal bonding.[21] As visual and auditory technology advancements expand our access beyond the immediate physical environment toward a virtual or a remote one, the critical haptic interaction that humans have historically relied upon is being lost in the translation. Haptic interaction involves delivering human actions (i.e., motion and force) to the virtual or remote environment (haptic input) and reconstructing haptic feedback sensations from that environment (haptic output). Since the ideal haptic inputs and outputs would happen in a natural and realistic manner, challenges arise in the quest for technologies that could deliver realistic haptic interactions in these environments.

3.1. Haptics Background

In the past 30 years, the field of haptics has seen steady development, where useful haptic technologies have demonstrated benefits for specialized applications such as medical training, minimally invasive surgery, and teleoperation of robots.[22–24] The field employs well-established actuation mechanisms (electric motors, pneumatics, and piezoelectric actuators, and, more recently, ultrasound, electrical stimulation, voice coils, and thermal modulation) to create force feedback or tactile feedback sensations.[15] Despite a rich body of literature describing a variety of haptic technologies, the wide real-world adoption of haptics, especially for consumer electronics, is still limited to vibrotactile cues for notification functions. At issue for the wide adoption of more advanced haptics is the performance versus cost tradeoff. Traditional haptic technologies constructed with rigid linkages and motors are expensive, and interaction with such devices feels unnatural.

At the time of a global pandemic (e.g., the COVID-19 outbreak), a surging need for haptics emerges, not only for the specialized applications, but even more so for daily routines and activities. The luxury of performing work remotely becomes a necessity, and the need to cope with increasing stress and anxiety from both the virus and the social isolation becomes more essential. All these tasks, whether for doctors to physically examine their patients remotely, for baseball players to feel the ball in the hand during a virtual pitching practice, or for self-quarantined individuals to virtually enjoy a comforting hug from their family, require haptic technologies that deliver more natural, realistic, and richer haptic sensations at a lower cost compatible with these generic applications. This challenge brings opportunities to explore haptic technologies from the fundamental materials chemistry, to device level actuation and sensing, and to higher level feedback controls, toward the goal of improving device functionalities as well as reducing the cost of prototyping and manufacturing.


Figure 4. Haptic output devices. a) The Phantom Premium for direct force feedback. Reproduced with permission.[25] Copyright 3D systems. b) Skin
deformation device as force substitution for kinesthetic perception. Reproduced with permission.[26] Copyright 2017, IEEE. c) Shape morphing with
pin arrays. Reproduced with permission.[28] Copyright 2019, ACM. d) Lateral force feedback device with in-plane ultrasonic oscillation and out-of-plane
electroadhesion. Reproduced with permission.[30] Copyright 2019, IEEE.

3.2. Haptic Output

Haptic perception can be categorized as kinesthetic or cutaneous perception. The former allows us to feel our limbs' position, movement, and force to gain knowledge of an object's shape, size, and stiffness; the latter helps us gain knowledge of an object's texture, humidity, and temperature. Haptic feedback devices create some of these sensations through either varying some of the properties directly, or creating synthetic haptic effects (e.g., haptic illusions) that lead to the perception of certain haptic properties. Traditional kinesthetic haptic devices for direct force feedback, benchmarked by the Phantom Premium (Figure 4a),[25] typically rely on motors to provide high force and high bandwidth performance with multiple degrees of freedom (DOFs), and are therefore usually associated with higher cost and larger form factor. To address this issue, researchers have investigated schemes using skin deformation (i.e., lateral skin stretch, skin indentation) to induce indirect kinesthetic force feedback sensations with a smaller device form factor and have demonstrated perception of stiffness or force variations (Figure 4b).[26] Direct shape variations have been demonstrated with pin array haptic surfaces, but they usually require a large structure to actuate the pins (Figure 4c),[27,28] and indirect perception of shape variation has been created with controllable lateral force on a flat surface induced by ultrasonic oscillation and electrostatic modulation (Figure 4d).[29,30] Tactile feedback is generally delivered through haptic surfaces and wearables. Friction variation has been achieved through electrostatic force modulation,[31,32] and vibrotactile signals employed on wearables and haptic surfaces have shown great promise in generating indirect tactile sensations through spatiotemporal control of the vibrations.[33] Electrical stimulation could create various tactile sensations with a more lightweight and compact device form, yet this mechanism is more invasive than other mechanical schemes and requires careful control of the electrical stimuli to ensure safe operation.[34] For a comprehensive review of haptic technologies, we refer the readers to ref. [15].

While indirect haptic stimulation with existing technologies has led to the perception of various haptic sensations, the level of realism and the number of addressed haptic properties are still far from satisfactory. We believe that direct haptic stimulation (i.e., direct application of force for kinesthetic sensation and direct variation of object haptic properties) could become an avenue toward more realistic haptic feedback. This approach, however, is limited by: i) the lack of haptic actuators that could conformably integrate onto our skin and joints while delivering the large range of force and bandwidth of frequency; ii) the incompatibility of the mechanical properties of materials used in existing haptic technologies. To bridge the gap in (i), intrinsically soft actuators could provide large and high-bandwidth force feedback while being conformally wrapped over a large area of the skin, as well as over the joints, and thus allow kinesthetic and tactile feedback devices with small form factors. The obstacle in (ii) is more challenging, as the variation of these properties requires the introduction of a portfolio of tools from materials science to the field of haptics. Here, we briefly discuss variable stiffness and morphing surfaces as parts of soft robotic technologies. For a more comprehensive review of material chemistry for tactile feedback, ref. [35] offers an inspiring outlook for modulating tactile perception with molecular control of materials.

3.3. Haptic Input

Haptic input, capturing users' movements and forces as they haptically interact with the environment, is another important aspect that largely determines the quality of realistic interaction between a user and a virtual or remote environment. An ideal haptic interaction would allow the user to intuitively interact with virtual objects in the same way as manipulating real physical objects when the virtual simulation has a direct real-world counterpart. Toward this end, tracking the user's movements and forces in a manner that is undisruptive to the user's intuitive actions and generalizable to different tasks represents the fundamental requirement for ideal haptic input devices. The evolution of commercialized haptic input technologies can be captured by game controllers, one of the earliest and most widely adopted input devices for performing various actions in a virtual environment. Early game controllers employed switches and buttons, where each switch and button is a 1D binary control that triggers an action in the game; directional pads are designed with four cross-shaped buttons allowing four-way digital direction control; analog joysticks provide continuous analog directional input with potentiometers; trackpads achieve 2D finger movement tracking with capacitive coupling and further allow 2D gesture inputs for more intuitive control.[36] Modern game controllers (Figure 5a) and VR controllers (Figure 5b) generally employ a combination of these haptic inputs with various controller designs to achieve more intuitive and functional haptic control.[37,38]
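To make the distinction between binary and continuous controller inputs concrete, the short sketch below converts a raw potentiometer reading from an analog stick into a normalized 2D direction alongside the 1D on/off reading of a button. It is an illustration only; the 10-bit ADC range, deadzone value, and function names are assumptions made for this example rather than details of any specific controller.

```python
def read_button(raw_level: int, threshold: int = 512) -> bool:
    """1D binary control: a switch is either pressed or not."""
    return raw_level >= threshold

def read_stick(raw_x: int, raw_y: int, adc_max: int = 1023, deadzone: float = 0.08):
    """Map raw potentiometer ADC counts to a continuous direction in [-1, 1]^2.

    A small deadzone suppresses drift around the mechanical center.
    """
    def normalize(raw: int) -> float:
        value = 2.0 * raw / adc_max - 1.0   # 0..adc_max -> -1..1
        return 0.0 if abs(value) < deadzone else value
    return normalize(raw_x), normalize(raw_y)

if __name__ == "__main__":
    print(read_button(800))       # True: action triggered
    print(read_stick(760, 512))   # (~0.49, 0.0): half-right, no vertical input
```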


Figure 5. Haptic input devices. Controller input devices: a) Gamepad (PlayStation5). Reproduced with permission.[37] Copyright Sony Interactive
Entertainment Inc. b) VR controller (Quest 2 Controller). Reproduced with permission.[38] Copyright Oculus. Gesture input devices: c) DataGlove.
Reproduced with permission.[40] Copyright 1986, ACM. d) Power Glove. Reproduced with permission.[42] Copyright Evan-Amos. e) Attachable VR tracker
(VIVE tracker). Reproduced with permission.[45] Copyright HTC Corporation.

Parallel to the development of haptic control with physical tools, there has been a long-standing interest in gesture control for direct haptic input. Early commercial exploration can be traced back to the DataGlove (Figure 5c), first released by VPL Research in 1987. The glove allows hand position tracking in 3D space (three positional directions and three orientations), enabled by scratched optical fiber sensors that measure finger flex (with a resolution of 256 positions per finger) and ultrasonic or magnetic sensors that detect hand position and orientation (yaw, pitch, roll).[39] Piezoelectric cantilevers are also integrated in the glove to indicate virtual contact with vibrations.[40] Direct manipulation of virtual objects in 3D space has been demonstrated with the DataGlove, and its applications in teleoperation of robots, sign language interpretation, and virtual musical performance have also been explored.[41] Despite these functionalities, the DataGlove did not gain much commercial popularity due to its high cost.

Nintendo's Power Glove (Figure 5d) was developed following the DataGlove to reduce the price by a factor of 100, through reducing the features of ultrasonic tracking (orientation tracking reduced to roll) and replacing finger tracking with low-cost carbon-based resistive sensors (resolution reduced to 4 positions per finger).[41] Released in 1989 as a controller for the Nintendo Entertainment System, the Power Glove successfully gained market attention initially, but quickly failed due to the difficult glove control arising from the lack of games compatible with the glove gesture input, as well as insufficient hardware resolution, accuracy, and bandwidth.[42]

In recent years, motion tracking technologies have seen rapid development, where real-time measurements of the position and orientation of the body have been demonstrated with optical, ultrasonic, radiofrequency, electromagnetic, and inertial measurements.[19] Optical tracking with infrared sensors or video cameras connected to computer vision algorithms (with variations in marker or marker-less tracking and placement of the cameras on the moving object (inside-out) or stationary (outside-in)) is most widely adopted in today's motion-sensing game controllers and VR systems. Since this method requires a direct line of sight between the camera and the user being tracked, visual occlusion interrupts proper tracking, and increasing the number of cameras increases cost.[43]

An inertial measurement unit (IMU) is another popular motion tracking sensor, consisting of an accelerometer, a gyroscope, and a magnetometer in a small form factor MEMS device that is free of location limitations and is easily integrated into controllers, smartphones, and wearables. IMUs have low latency but provide relative rather than absolute measurements and are therefore prone to drift from accumulated error.[43] Further, these IMUs have a sampling rate of ≈1 kHz and exhibit large measurement errors for motions faster than ≈100 Hz, rendering them inadequate for tracking many human motions, especially in sports.[44] Incorporating inertial and optical sensing, Nintendo's Wii Remote and, later, Joy-Con controllers popularized motion-based game controls. Also combining IMUs with optical sensing, most of today's VR and AR systems are equipped with motion tracking controllers, and some systems are able to track full-body motion with trackers attached to joints of the body (Figure 5e).[45] Recently, Oculus demonstrated 3D hand tracking (position, orientation, and finger configuration) with only the inside-out camera on the headset and no wearable elements.[38] High speed motion tracking (e.g., joint angle movements of 1000s of degrees per second) in environments without the aid of external vision remains an open challenge.[46]

In spite of this technical gap, motion and gesture tracking have been effectively captured by optical tracking and IMU sensors in VR systems. Force sensing, another important haptic input, is on the other hand less explored. Although force sensors have been integrated into game props (e.g., Ring-Con[47]) or VR controllers (e.g., Valve Index[48]) to detect squeeze, generalized force sensing that captures the force interaction with a tool or environment not specifically integrated with sensors for VR/AR has not yet been achieved. Toward this end, soft sensors offer a great platform for capturing the generalized interaction force, as they can be wrapped conformably onto the user's skin over a large area and measure force in a distributed manner as the user haptically interacts with passive tools.[49] Furthermore, when wrapped around joints, soft sensors could provide relative position tracking and complement the existing tracking technologies to achieve higher resolution. Finally, the versatility of soft sensors to track both motion and force, combined with soft actuators that could provide kinesthetic and tactile feedback with the same soft material construction, shows great promise for a compact, integrated, and immersive haptic experience.
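The drift problem noted above, and the reason commercial systems fuse IMU data with an absolute reference such as optical tracking, can be illustrated with a minimal one-axis complementary filter. This is a generic sketch under assumed parameters (1 kHz sample rate, blending gain of 0.98), not the algorithm of any particular headset or controller.

```python
def fuse_orientation(gyro_rates, optical_angles, dt=0.001, alpha=0.98):
    """One-axis complementary filter.

    Integrating the gyroscope rate alone accumulates bias into drift;
    blending in a low-rate absolute measurement (e.g., from optical
    tracking) bounds the error. An alpha close to 1 trusts the gyro for
    fast motion and the absolute reference for the long-term average.
    """
    angle = optical_angles[0]
    for rate, absolute in zip(gyro_rates, optical_angles):
        integrated = angle + rate * dt              # high-rate, drifting estimate
        angle = alpha * integrated + (1 - alpha) * absolute
    return angle

if __name__ == "__main__":
    # Stationary joint, but the gyro reports a constant 0.5 deg/s bias.
    n = 5000                                        # 5 s at 1 kHz
    biased_gyro = [0.5] * n
    optical = [0.0] * n
    print(fuse_orientation(biased_gyro, optical))   # stays near ~0.025 deg
    print(sum(r * 0.001 for r in biased_gyro))      # pure integration drifts to 2.5 deg
```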


Figure 6. Haptic perception. a) Mechanoreceptors in the skin categorized by adaptation rate of rapidly adapting (RA) and slowly adapting (SA) and
receptive field. Reproduced with permission.[56] Copyright 2016, Elsevier. b) Two-point discrimination thresholds on different parts of the body from
four studies. Reproduced under terms of the Creative Commons CC BY License.[57] Copyright 2014, the authors. Published by American Neurological
Association.

4. Intrinsic Soft Actuators for Haptic Feedback

4.1. Haptic Perception

Humans gain haptic perception through two afferent subsystems, the cutaneous and the kinesthetic systems. Cutaneous perception is achieved through mechanoreceptors distributed in the skin that detect an object's surface texture, temperature, and humidity;[50] kinesthesia comes from the mechanoreceptors in joints, tendons, and muscles for perception of limb position and movement and detection of force and weight.[51] To provide realistic haptic perception of a virtual environment, the device would ideally stimulate the user to feel all variations of the aforementioned object properties. While it is extremely challenging to create a ubiquitous device that can be actuated to exhibit any combination of these properties, it is possible to stimulate sensations that lead to some of the same perception outcomes through spatiotemporal control of distributed force, shape, and temperature variations. For example, texture perception is mediated through vibrational cues for fine textures (spatial periods of less than ≈200 µm) and spatial cues for coarse textures.[50,52] Skin wetness is perceived through the combination of thermal receptors and mechanoreceptors.[53] Perception of compliance comes from cues of displacement, as well as surface shape deformation.[54] Human haptic perception is a complex process that involves sensation, perception, and cognition.[50] While there is still much to be learned about the neurophysiological and psychophysical mechanisms of human haptic perception, advances in our understanding of the somatosensory system have provided the basis for haptic feedback device designs.

Haptic feedback devices can be categorized into tactile and kinesthetic devices. Tactile feedback relies on skin deformations and requires distributed actuators with a wide range of actuation frequencies and amplitudes to stimulate perception of the object's material and geometric properties. Kinesthetic feedback is typically applied to the joints and requires the actuator to generate large force and induce shape change. The spatial and temporal sensitivities of the skin provide a framework for tactile device design requirements. There are four populations of mechanoreceptors (SA I, RA I, SA II, RA II) in the skin with various sensitivities for different primary functions (Figure 6a). SA I and SA II are sensitive to low frequencies (SA I: <5 Hz) and are responsible for detecting sustained pressure and spatial deformation (e.g., detection of object pattern, form, coarse texture, and movement); RA I and RA II are sensitive to higher frequencies (RA I: 5–40 Hz, RA II: 40–400 Hz) and are used to detect temporal change of the skin (e.g., fine texture).[50,55,56] For static tactile stimuli, psychophysical studies using the two-point threshold test show the variation of spatial acuity across different body parts (Figure 6b), where the fingertip has the highest spatial sensitivity (2–4 mm) and the back and limbs have the lowest spatial sensitivity (≈20 mm).[57] The perception sensitivity for fingertip skin deformation is 10 µm and 0.8 mN. The maximum sensitivity for dynamic tactile stimuli at the fingertip and palm is at 200–300 Hz, with a perceivable threshold of normal stimulation of 0.1–0.2 µm.[58] Psychophysical studies of the kinesthetic system show that the perceivable movement threshold at the joints falls in the range of 1°–8° and is more sensitive for faster movements. The perceivable change in force is proportional to the current level of force, with an average differential threshold of 7–10% over a force range of 0.5–200 N.[51] The whole body kinesthetic dynamic threshold is 20–30 Hz.[58]
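As a rough design aid, the psychophysical numbers above can be collected into a small lookup: which mechanoreceptor population a given vibration frequency primarily addresses, whether two neighboring actuators at a given pitch are spatially resolvable, and the just-noticeable change in a kinesthetic force. The band edges, two-point thresholds, and 10% Weber fraction below are the representative values quoted in this section (real values vary across studies and subjects), and the helper functions are only an illustrative sketch.

```python
# Representative values quoted in this section; they vary between studies and subjects.
FREQ_BANDS = [("SA I", 0, 5), ("RA I", 5, 40), ("RA II", 40, 400)]   # Hz
TWO_POINT_MM = {"fingertip": 3, "palm": 10, "limb": 20, "back": 20}  # two-point thresholds
FORCE_WEBER_FRACTION = 0.10                                          # ~7-10% differential threshold

def primary_channel(freq_hz: float) -> str:
    """Return the mechanoreceptor population most associated with a vibration frequency."""
    for name, lo, hi in FREQ_BANDS:
        if lo <= freq_hz <= hi:
            return name
    return "outside the useful tactile range"

def actuators_resolvable(pitch_mm: float, site: str) -> bool:
    """Two taxels are individually perceivable if their pitch exceeds the two-point threshold."""
    return pitch_mm >= TWO_POINT_MM[site]

def force_jnd_n(current_force_n: float) -> float:
    """Smallest perceivable change in force at the current force level (Weber's law)."""
    return FORCE_WEBER_FRACTION * current_force_n

if __name__ == "__main__":
    print(primary_channel(250))                  # 'RA II': fine-texture / vibration range
    print(actuators_resolvable(2, "fingertip"))  # False: finer than the 2-4 mm fingertip acuity
    print(force_jnd_n(5.0))                      # 0.5 N change needed to be felt at a 5 N grip
```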


Haptic feedback devices should be able to address the spatial and temporal sensitivities of the skin and the large force and shape change requirements of the joints. Here, we review the actuation mechanisms of elastomeric actuators based on the performance metrics of actuating force, frequency, and spatial resolution.

4.2. Fluidic Elastomer Actuator

FEAs are elastomeric composites with embedded channels for actuation generated by fluidic (pneumatic or hydraulic) pressure. Upon pressurization, the elastomeric membrane is stretched, storing energy as the entropy of the polymer chains is reduced; the membrane passively returns to the unstretched state as the polymer chains reconfigure to the entropically favored state when the elastic energy is released. FEAs employ elastomers with different elastic moduli or strain limiting structures (e.g., inextensible fabric) to induce anisotropic deformations (rather than inflating a balloon).[1] Fabrication methods of replica molding and 3D printing produce FEAs with more complex structures for actuators that can reversibly extend, bend, or twist, mimicking functions of muscular hydrostats such as the elephant trunk.[9,59] By exploiting elastomers with relatively high elastic modulus (e.g., ELASTOSIL M4601 a/b, Wacker Chemical Corp.) and a monolithic fabrication method called rotational casting, Zhao et al. achieved a high force soft bending actuator that generates 27.4 N at 234 kPa (measured via the block force method) for use in exoskeletons (Figure 7a).[60]

The ability to generate large shape change and high force with compliant soft material properties makes the FEA an ideal candidate for kinesthetic feedback. For wearable applications, weight also needs to be taken into consideration in the choice of fluid: while liquids are more beneficial than air for generating large forces, they increase the system's weight and decrease the actuation bandwidth of FEAs due to the mass transport of viscous liquids through narrow fluidic pathways.[9]

Tactile feedback devices require actuators with high actuation frequency (up to ≈400 Hz) and high spatial resolution (≈1 mm) based on the acuity of the skin.[50] Since FEAs rely on mass transport of fluids for actuation, high actuation frequency is a challenge compared to electrically powered systems. The pressure generation mechanism plays an important role in determining the actuation frequency of a FEA system. Since pressurized, inviscid air generates actuation faster than viscous liquids when forced through the narrow pathways typical in soft robotic actuators, we focus our discussion on pneumatic energy sources. For a more comprehensive and insightful review of the existing pneumatic energy sources for soft robotics, we refer the readers to ref. [61]. To summarize, most pneumatic soft robots rely on the pressure shift from battery-powered microcompressors or cylinders of compressed fluids, where the former provide high capacity but limited pressure and flow rate, and the latter provide limited capacity but high pressure and flow rate. Through modifying the actuator design, Mosadegh et al. have shown a rapidly actuated FEA that bends from linear to a quasi-circular shape in 50 ms (20 Hz) when pressurized to 345 kPa.[62]

High spatial resolution is challenging for compressed air powered FEAs, since each actuator needs an individually addressable pressure input regulated by a valve and a compressed air source, leading to an overall system with large form factor. Combustion generates high pressure rapidly with less mass and has been shown to power soft robots with fast actuation.[63,64] The speed of pressure generation makes combustion a promising approach toward high actuation frequency, but the requirement for precise stoichiometry control warrants extensive system-level study for repeatable actuation with high frequency and high spatial density. Peroxide decomposition provides an alternative method to rapidly generate high pressure with few additional parts and therefore less overall system mass. Wehner et al. have demonstrated one such soft robot (i.e., the Octobot) powered by peroxide decomposition, which is controlled by microfluidics with embedded catalysts.[65] For tactile applications, individually addressable actuators at high spatial density still require more system level studies on the controllability and operation time of this actuation mechanism.
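As a back-of-the-envelope illustration of why valve-regulated compressed air limits actuation frequency, the time to pressurize a chamber scales with the free-air volume that must be delivered divided by the supply flow rate. The sketch below uses assumed, hypothetical numbers (chamber volume, supply flow, target pressure) and an idealized isothermal fill; it is not a model of any specific FEA.

```python
def max_cycle_rate_hz(chamber_volume_ml: float,
                      supply_flow_slpm: float,
                      gauge_pressure_kpa: float,
                      atm_kpa: float = 101.0) -> float:
    """Crude isothermal estimate of the maximum inflate/deflate cycle rate.

    The free-air volume needed to reach the target gauge pressure is the
    chamber volume scaled by the absolute pressure ratio; fill time is that
    volume divided by the supply flow, and a full cycle is assumed to take
    twice the fill time (equal time to vent).
    """
    free_air_ml = chamber_volume_ml * (gauge_pressure_kpa + atm_kpa) / atm_kpa
    flow_ml_per_s = supply_flow_slpm * 1000.0 / 60.0
    fill_time_s = free_air_ml / flow_ml_per_s
    return 1.0 / (2.0 * fill_time_s)

if __name__ == "__main__":
    # A hypothetical 1 mL haptic chamber driven to 100 kPa from a 10 SLPM supply:
    print(round(max_cycle_rate_hz(1.0, 10.0, 100.0), 1))   # ~41.9 Hz
```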

Figure 7. Actuation mechanisms with soft materials. a) High force fluidic elastomer actuator. Reproduced with permission.[60] Copyright 2015, Elsevier.
b) Rolled dielectric elastomer actuator with broad bandwidth. Reproduced with permission.[152] Copyright 2020, Mary Ann Liebert, Inc. c) Electromag-
netic actuator with soft coil composed of liquid metal alloy encapsulated in silicone filament. Reproduced with permission.[94] Copyright 2018, John
Wiley and Sons.


While tubing is fine for grounded graspable or touchable haptic devices, being completely untethered may be more desirable for wearable devices. The ability of a mobile or manipulative soft robot to be untethered usually depends on the tradeoff between payload capacity and the weight of the system components. The payload capacity loosely scales with the mass of the power source, and untethered mobile robots have been demonstrated with batteries and microcompressors.[66] For wearables, however, the user has to bear all the weight of the robotic device. Thus, untethering has a stricter requirement for the actuation system's weight, especially when a large number of actuators are included. Pressure generation in a more volume- and weight-efficient manner is critical for untethered wearable FEAs. Embodied energy, where the energy source is directly integrated in the constructing materials of the robot, is an emerging research topic that could provide a solution. The Octobot shows an example of embodied energy enabling a self-powered autonomous robot. Inspired by the multifunctional circulatory system in animals, an untethered soft robotic fish has more recently demonstrated embodied energy through a synthetic circulatory system modeled on the redox flow battery, combining force actuation, transmission, and energy storage all in the same circulating hydraulic fluids. The increased energy density allows long operation of the robotic fish (up to 36 h).[67]

4.3. Dielectric Elastomer Actuator

Dielectric elastomer actuators (DEAs) comprise a thin elastomeric membrane sandwiched between two compliant electrodes in the shape of a parallel plate capacitor. As voltage is applied, the electrostatic force generated between the two electrodes squeezes the dielectric elastomer, causing compression and stretching of the film.[68] The compression thickness strain from the electrostatic force actuation is shown in Equation (1):[69]

s_z = −εV²/(Yz²)    (1)

where ε is the absolute permittivity, Y is the Young's modulus of the dielectric elastomer, V is the applied voltage, and z is the film thickness. In order to actuate a 50 µm thick DEA made of typical elastomeric materials (Young's moduli from hundreds of kPa to 1 MPa) to 10–30% compression strain or higher, a driving voltage of a few kilovolts is required.[70] Recent progress in the field has managed to reduce the actuation voltage to a few hundred volts.[71–75] Although the high voltage issue is yet to be solved, electrical actuation allows DEAs to yield much faster response than FEAs with a small, millimeter-scale form factor, which is promising for tactile feedback. Furthermore, circumferential constraints translate the in-plane biaxial strain of a DEA to out-of-plane actuation.

The actuation force of a thin film DEA is low (<50 mN) for useful tactile feedback. Increasing the number of DEA layers through stacking or folding has demonstrated large contractile forces (Figure 7b).[73,76,77] Another force amplification approach exploits the combination of hydraulic and electrostatic forces. The hydraulically amplified self-healing electrostatic (HASEL) actuators employ liquid as the dielectric layer instead of elastomers; as voltage is applied to the electrodes, the electrostatic pressure displaces the locally encapsulated liquid to the surrounding volume, thus enabling high speed electrohydraulic actuation while being robust against potential dielectric breakdown.[78] Coupled with an electrostatic zipping mechanism, Peano-HASEL actuators demonstrated fast linear contraction (up to 10% contraction with 900% s−1 strain rate at 50 Hz) with high force (200 times the actuator's weight).[79] Improvements in manufacturing methods, such as 3D printing, have demonstrated rapid prototyping of a HASEL antagonistic actuator inspired by artificial hydrostats.[80]
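To give a feel for the voltage scaling in Equation (1), the short calculation below evaluates the thickness strain for a representative silicone DEA. The material constants are assumed, typical values rather than measurements of a specific device, and the linear estimate loses accuracy at large strains.

```python
EPS_0 = 8.854e-12   # vacuum permittivity, F/m

def dea_thickness_strain(voltage_v: float, thickness_m: float,
                         rel_permittivity: float, youngs_modulus_pa: float) -> float:
    """Equation (1): s_z = -eps * V^2 / (Y * z^2); negative values mean compression.

    Small-strain, linear estimate only; it overpredicts beyond roughly 30% strain.
    """
    eps = rel_permittivity * EPS_0      # absolute permittivity
    return -eps * voltage_v ** 2 / (youngs_modulus_pa * thickness_m ** 2)

if __name__ == "__main__":
    # Assumed 50 um silicone film, relative permittivity ~3, Y ~ 500 kPa
    for v in (1e3, 3e3, 5e3):
        s = dea_thickness_strain(v, 50e-6, 3.0, 5e5)
        print(f"{v / 1e3:.0f} kV -> {100 * s:.1f}% thickness strain")
    # ~ -2.1% at 1 kV and ~ -19% at 3 kV, consistent with the few-kilovolt
    # requirement for 10-30% strain quoted above.
```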


4.4. Stimulus Responsive Materials

Actuation with stimulus responsive materials is a well-explored field, where materials that respond to various external stimuli (heat, light, magnetic field, etc.) are fabricated with pre-programmed patterns and experience reversible shape changes upon exposure to and removal of the stimulus. Building intelligence into the material itself (the material is the actuator) allows wireless actuation with minimal components, which is advantageous for wearable haptic devices. Many of the stimuli-responsive mechanisms, however, are diffusion limited, which limits the actuation speed and the maximum force that can be generated. An important class of thermally responsive materials is liquid-crystalline elastomers (LCEs), where the liquid crystal (LC) mesogens can be covalently incorporated into an elastomeric network in an ordered manner via mechanical alignment. Heating above a phase-transition temperature, induced by either direct thermal radiation, Joule heating, or the photothermal effect, reduces the order of the LC mesogens and causes macroscopic contraction in the LCE.[81] Other alignment methods via surface patterning, external electric or magnetic fields, or shear-induced flow can lead to more complex deformations. Tactile displays constructed with LCEs have demonstrated an actuation speed of 6 s,[82] 40 mN force, and 0.8 mm displacement.[83] For another major class of thermally driven actuator materials, shape memory alloys and polymers, we refer the readers to the review article by Sun et al.[84] Light-driven soft actuators typically incorporate photochromic molecules, such as azobenzene, that undergo molecular configuration changes upon absorption of specific wavelengths of light. In a photomechanical elastomer, UV irradiation triggers the trans-cis isomerization of azobenzene molecules, which, through collective action, leads to macroscopic contractile actuation in the crosslinked network. The light-absorption based mechanism leads to bending motion (over 10 s) of a free-standing film, since light intensity is attenuated as it penetrates through the thickness of the film and thus forms an intensity gradient with gradually decreasing contraction.[85]

Magnetically responsive materials offer the advantage of fast actuation speed (up to 300 Hz).[86,87] Soft magnetic actuators are fabricated by embedding magnetic particles or fillers in fluids or elastomers. As the fillers align with the magnetic field (which penetrates many media), various macroscopic deformations and motions are achieved with different filler patterns in a time-varying magnetic field. In recent years, small-scale (mm to cm) magnetically driven robots have achieved a myriad of rapid and untethered locomotion modes (swim, climb, jump, roll, etc.) in various environments (wet and dry).[88–92] In haptic applications, conventional electromagnetic actuators, such as linear resonant actuators, are widely adopted to generate high-frequency vibrations for tactile perception.[15] A densely packed electromagnetic actuator array (4 × 4 taxels on an 8 mm pitch) has been achieved with each taxel composed of a stationary planar microcoil and a moving permanent magnet, where a magnetic shielding structure is used to reduce interaction between neighboring taxels.[93] In recent years, there has been growing interest in developing soft electromagnetic actuators to allow safe interaction with humans while harvesting the desirable performance of this actuation mechanism. A popular approach is to replace the rigid metal coil in a linear resonant actuator with one that has liquid metal encapsulated in an elastomeric matrix. By controlling the current in the soft coil, vibration of the permanent magnet or of the elastomeric coil can be achieved via the Lorentz force (Figure 7c).[94–96] Recently, Mao et al. developed soft electromagnetic actuators that utilize a large plate magnet to introduce a spatially extended, large magnetic field, thus separating the soft coil actuators from the magnet.[96] Finally, tactile displays have also demonstrated stiffness modulation based on magnetorheological fluids with 5 mm spatial resolution.[97] Although external stimuli allow wireless actuation, designing a system that could deliver the stimulus for individual control of each tactile actuator is still challenging. Furthermore, in cases where the actuation stimulus, such as heat, is also part of the haptic modality of human skin, there might be decoupling issues for stimulating the desired haptic perception.

5. Intrinsic Soft Sensors for Haptic Sensing

5.1. Haptic Sensing

Sensing in haptic devices has two objectives: i) human activity detection as input for interaction with computers; ii) sensing the extent of actuation from haptic feedback actuators for precise control. The field of human-machine interaction has made great advances in computer vision based detection of the user's body gesture, gaze, and affective interaction.[98] In the past decade, there has been accelerated development in stretchable electronics for use as bio-integrated or wearable sensors, which has impressively expanded the detectable measurands (both biomechanics and biometrics) of the human body and enabled measurements that are not spatially restricted by cameras.[99,100] In this section, we limit our discussion to the sensing of mechanical deformations only.

For wearables used in human activity detection, a minimum requirement, and yet a challenging engineering problem, is that the mechanical properties of the device should be similar to those of the skin (Young's modulus E ≈ 0.4–0.9 MPa and strain to failure ε ≈ 60–75%).[101] Sensors need to withstand the complex deformations of the skin (e.g., bend, stretch, twist, etc.) while maintaining reliable performance. Two conceptually different approaches, structurally compliant devices (i.e., e-skin) and those built with intrinsically soft materials (i.e., epidermal electronics), have both demonstrated innovative wearable sensors for a broad range of measurands.[102]

For soft robotic applications, requirements for the mechanical properties of the sensors become more extreme, where a Young's modulus E < 1 MPa and strain to failure ε > 200% are generally required for the detection of soft robotic actuations with minimal impedance.[9] There have been exciting developments in e-skins and epidermal electronics that could detect various aspects of human activities; we refer the readers to refs. [100,103,104] for more details. In this section, we summarize sensing mechanisms that have been widely adopted in soft robotics applications, specifically, mechanical strain sensors built with intrinsically soft materials, to address both objectives for haptic sensing in VR and AR. For a comprehensive and detailed review on this topic, we refer the reader to ref. [99] by Amjadi et al.

5.2. Resistive and Capacitive Sensing

Resistive sensors (Figure 8a) respond to geometric variations or changes in the resistivity of a conductive medium. Capacitive sensors (Figure 8b) detect and measure changes in electrode area and dielectric thickness. One of the greatest challenges with both of these sensors is the lack of elastomeric conductors with high conductivity and low elastic modulus. Typical strategies to endow soft materials with conductivity at high strains (>100%) can be generally categorized into synthesizing compliant, solid-state conductors and encapsulating conductive liquids in an elastomeric shell.
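The geometric origin of both signal types can be made explicit with two textbook relations, assuming uniaxial stretch of an incompressible sensing element: the resistance of a constant-resistivity conductor grows as (1 + ε)², and a parallel-plate capacitance scales with electrode area over dielectric thickness. The snippet below is a generic sketch, not a model of the specific devices shown in Figure 8.

```python
def resistance_under_stretch(r0_ohm: float, strain: float) -> float:
    """Incompressible conductor, constant resistivity: R = rho*L/A with
    L -> L(1+e) and A -> A/(1+e), so R = R0 * (1+e)^2."""
    return r0_ohm * (1.0 + strain) ** 2

def capacitance_parallel_plate(area_m2: float, gap_m: float,
                               rel_permittivity: float) -> float:
    """C = eps0 * eps_r * A / d; deformation that widens the electrodes and
    thins the dielectric raises C."""
    return 8.854e-12 * rel_permittivity * area_m2 / gap_m

if __name__ == "__main__":
    print(resistance_under_stretch(100.0, 0.5))                  # 225 ohm at 50% strain
    print(capacitance_parallel_plate(1e-4, 100e-6, 3.0) * 1e12)  # ~26.6 pF
```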


Figure 8. Sensing mechanisms with intrinsically soft materials. a) Liquid metal resistive sensor integrated with a pneumatic vibrotactile actuator for
closed-loop control. Reproduced with permission.[134] Copyright 2020, Mary Ann Liebert, Inc. b) Hyperelastic light-emitting capacitor as a capacitive
sensor. Reproduced with permission.[178] Copyright 2016, AAAS. c) Stretchable optical waveguide detects deformation with intensity variation. Repro-
duced with permission. Copyright 2016, AAAS.

As another scheme of imparting compliance to conductors, ionic liquids (e.g., sodium chloride dissolved in water, potassium iodide dissolved in glycerol) and liquid metal alloys (e.g., eutectic gallium indium [EGaIn], gallium-indium-tin [Galinstan]) can be infused into elastomer matrices to accommodate very large mechanical strains (theoretically limited only by the encasing elastomer) while still maintaining stable, bulk conductivity.[109,110] Despite the reconfigurability of a conductive path enabled by the superior flowability of fluids, both methods involve costly fabrication of encapsulating microchannels with subsequent imbibing, risk of leakage, and rigorous design requirements for robust interfaces with conventional metal interconnects.[111,112]

5.3. Optoelectronic Sensing

Optoelectronic sensors detect changes in primary properties of light (e.g., intensity, wavelength spectrum, phase, polarization) by means of electrical signals.[81] When a mechanical deformation (e.g., strain, curvature, or pressure) is applied to a light-transmitting medium, this medium or sensing element (usually in the form of an optical fiber) can be modified along its length to let the anticipated measurand modulate one of the properties of light, forming an excellent platform for multifunctional sensing with high spatial resolution. Compared to resistive and capacitive sensing, which exploit electrical properties of materials for signal transduction, optoelectronic sensing provides an alternative, nonelectrical method of operation that is immune to electromagnetic interference, chemically inert, and interfaces easily with data collection systems.[113] Although commercial plastic optical fibers have already demonstrated promise for integrated sensing in soft robotics,[114] there is still a need for stretchable waveguide sensors for broader human-machine interface applications, including prosthetics, haptics, and virtual reality. The earliest introduction of elastomeric waveguides for mechanical deformation sensing can be traced back to the 1980s.[115] Since the 1990s, new manufacturing methods such as soft lithography have permitted easier fabrication of the core-cladding structure of waveguides, where the optical properties (i.e., index of refraction, absorbance) of elastomers can be tailored to achieve desired guiding properties (e.g., total internal reflection, selective wavelength absorption, and number of modes).[116–120] Based on the same principle, in 2016, Zhao and co-workers developed a highly compliant and stretchable optical waveguide (elastic modulus < 400 kPa at 100% strain; strain to failure > 700%) for feedback sensation in a soft prosthetic hand (Figure 8c).[121] Using a high refractive index polyurethane core and a low refractive index silicone cladding, the waveguide is intentionally designed to be lossy to improve sensitivity during elongation, while still allowing a detectable amount of light intensity over the length scale of a human hand. Such intensity-based measurement is well suited for measuring a single characteristic such as strain, curvature, or pressure, but fails to detect and spatially differentiate combined, volumetric deformations. To address this issue, Van Meerbeek et al. applied machine learning (a k-nearest neighbors [kNN] classifier with a multi-output regression model) to diffusely reflected light, successfully differentiating twist and bend and predicting their magnitudes.[122] Xu et al. created distributed optical waveguide arrays throughout the volume of a 3D printed soft lattice and achieved localized proprioception and exteroception with submillimeter accuracy via frustrated total internal reflection.[49]
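In practice, recovering elongation from a lossy waveguide such as the one described above amounts to inverting a monotonic calibration curve that relates the received optical power to strain. The short sketch below illustrates this idea; the calibration values and function names are hypothetical placeholders, not data from the cited work.

```python
import numpy as np

# Hypothetical calibration: output power (normalized to the unstretched state)
# recorded at known elongations. A real device would be calibrated on a
# tensile stage; these numbers are illustrative only.
calib_strain = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])       # engineering strain
calib_power  = np.array([1.00, 0.82, 0.66, 0.52, 0.41, 0.32])  # I / I_0

def strain_from_power(p_norm):
    """Invert the monotonically decreasing power-vs-strain calibration.

    np.interp expects increasing x, so interpolate on the reversed arrays.
    """
    p = np.clip(p_norm, calib_power[-1], calib_power[0])
    return np.interp(p, calib_power[::-1], calib_strain[::-1])

# Example: a photodiode reading 55% of the unstretched output power
print(f"estimated strain: {strain_from_power(0.55):.2f}")
```

Because a single intensity value cannot reveal which deformation produced the loss, the diffuse-reflection and distributed-waveguide approaches described above add more optical channels, or a learned model, to disambiguate deformation modes.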
6. Control

Feedback control, as a basis for automation in robotic systems, involves designing a controller that changes the inputs of a dynamical system to achieve performance objectives such as disturbance rejection, parameter sensitivity control, and command tracking, based on the difference between the measured process variables and the desired output references.


Designing closed-loop feedback control is particularly challenging for soft devices, as the effectively infinite passive degrees of freedom (DOFs) and nonlinear mechanical properties of soft materials make it difficult to develop an analytical dynamic model for controller design, and the integration of soft sensors and actuators is still at an early stage. We refer the readers to refs. [9,123] for reviews of state-of-the-art soft robotic control strategies. To summarize, the most commonly used strategy is open-loop feedforward control with a static model, in which sensory feedback is not incorporated and the steady-state variables are controlled based on simplified models (e.g., the constant curvature approximation) or empirical data.[124–129] Model-based and model-free dynamic control with state sensory feedback is still in its infancy.[123,130–133] For haptic devices, feedback control is necessary to deliver accurate kinesthetic and tactile sensations to the user. In a recent work by Sonar et al., active closed-loop control of a completely soft vibrotactile haptic platform at high frequencies (up to 100 Hz) was demonstrated to reject external loading disturbances. Comprising a soft pneumatic actuator integrated with a stretchable liquid metal sensor, the haptic device adopted a proportional-integral-derivative (PID) controller derived from an analytical model, demonstrating disturbance rejection in which actuation is kept consistent under different external loading conditions (Figure 8a).[134] Very recently, Aguasvivas et al. employed parsimonious recursive neural networks and Newton-Raphson algorithms for online optimization of models that provided accurate and updated predictions for a stretchable fiber-optic network providing feedback to a DC-motor-and-tendon-driven soft actuator system, in a feedback loop operating in excess of 100 Hz.[135]
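As a concrete illustration of the closed-loop strategy described above, the sketch below runs a discrete-time PID loop that regulates the displacement of a vibrotactile actuator against a step loading disturbance. The first-order plant model, gains, and time step are illustrative assumptions and are not parameters of the device reported by Sonar et al.

```python
import numpy as np

dt, T = 1e-3, 0.5                      # 1 ms control step, 0.5 s simulation
kp, ki, kd = 8.0, 40.0, 0.02           # illustrative PID gains
tau, gain = 0.02, 1.0                  # assumed first-order actuator dynamics

t = np.arange(0.0, T, dt)
reference = 0.5 * np.sin(2 * np.pi * 100 * t)   # 100 Hz vibrotactile setpoint (mm)
disturbance = np.where(t > 0.25, -0.3, 0.0)     # external load applied at t = 0.25 s

y, integral, prev_err = 0.0, 0.0, 0.0
output = np.zeros_like(t)
for i, r in enumerate(reference):
    err = r - y
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * derivative   # commanded pressure (a.u.)
    prev_err = err
    # first-order plant with additive disturbance: tau * dy/dt = -y + gain*u + d
    y += dt / tau * (-y + gain * u + disturbance[i])
    output[i] = y

late = t > 0.3
print(f"peak tracking error after disturbance: {np.max(np.abs(reference[late] - output[late])):.3f} mm")
```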
7. Soft Haptic Devices

In the previous sections, we reviewed working principles of elastomeric actuators and sensors that could be applied to haptic applications and discussed their pros and cons in the haptics context. In this section, we review soft robotic systems that have been demonstrated as haptic input or output devices but have not yet been integrated into VR/AR systems, and we categorize the devices by their modalities as wearable, graspable, and touchable.[15]

7.1. Wearable

7.1.1. Exoskeleton

Exoskeletons are body-grounded devices in which the actuators are mounted onto the body and exert forces on joints relative to body-anchoring points, providing kinesthetic feedback to control the movement of certain parts of the body so that the users can feel the force of a virtual object. Compared to conventional rigid exoskeletons, soft exoskeletons made of elastomers or textiles are lightweight and do not interfere with natural human movements. Here, we limit our discussion to hand exoskeletons and haptic feedback applications. Cable-driven actuation is a widely adopted actuation mechanism for exoskeletons, in which cables flexibly transmit forces from a remotely located actuation unit of motors, gears, and linkages. A benchmark commercialized exoskeleton glove is the CyberGrasp, where each finger has cables routed through rigid links that provide resistive force, enabling the user to feel a virtual object's shape and size (Figure 9a).[136] We refer the readers to refs. [137–140] for more cable-driven exoskeleton designs; here we review soft actuators that can be directly placed over the joints to apply forces. Taking advantage of the large deformations and forces enabled by soft actuators, soft exoskeletons have demonstrated more natural human-device interaction with lighter, less expensive, and less complex systems compared to their rigid counterparts.[9,141,142] In a soft orthotic glove demonstrated by Zhao et al., the FEA is able to generate large force (25 N at the fingertip) with conformal finger contact and safe human-robot interaction (Figure 6a).[60] Integrating optical sensors for feedback control, Zhao et al. further demonstrated accurate curvature control of the soft orthotic glove with a state machine controller and a PID controller (Figure 9b).[114,143] An alternative to soft actuators for controlling joint movements in exoskeletons is soft clutches, in which two elements can actively lock or unlock to block or allow movement of a joint.[144,145]

Figure 9. Exoskeleton gloves. a) CyberGrasp, a force-reflecting exoskeleton fitted over a CyberGlove for sensing. Reproduced with permission.[136] Copyright CyberGlove Systems. b) Pneumatically driven exoskeleton glove integrated with fiber optic sensors, demonstrating curvature feedback control. Reproduced with permission.[114] Copyright 2016, IEEE. c) Electrostatic clutch for a passive locking mechanism to indicate virtual contact. Reproduced with permission.[146] Copyright 2019, John Wiley and Sons.


Hinchet et al. report electrostatic clutches with high force density (frictional shear stresses of 21 N cm−2 at 300 V) demonstrated in a kinesthetic glove, where the electrostatic clutch is programmed to block finger motion as the finger contacts a virtual solid object (Figure 9c).[146] The passive locking mechanism offers the advantages of low energy consumption and a lightweight, compact, low-profile device form, at the cost of the continuous force control needed for more complex force feedback.

7.1.2. Tactile Display

In 2000, Moy et al. demonstrated a compliant pneumatic tactile display molded from silicone rubber that ensures conformal contact between the finger and the device.[147] The display is composed of 5 × 5 pressurized chambers with 2.5 mm spacing, each regulated by a solenoid valve with pulse-width modulation control. Each pneumatic actuator is capable of applying <200 mN of force over <0.6 mm of displacement with a 5 Hz bandwidth.
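The pulse-width modulation used to regulate each chamber of such a display can be sketched as mapping a commanded pressure to a valve duty cycle and the corresponding on/off schedule. The supply pressure, switching period, and assumed linear pressure-duty relation below are illustrative simplifications; a practical display would close the loop on a measured chamber pressure.

```python
# Minimal sketch: pulse-width modulation of a solenoid valve to approximate a
# target average chamber pressure. Assumes (for illustration) that the mean
# chamber pressure scales linearly with the duty cycle of the supply valve.

SUPPLY_KPA = 50.0      # assumed supply pressure
PWM_PERIOD_S = 0.02    # 50 Hz valve switching period

def duty_cycle(target_kpa: float) -> float:
    """Fraction of each PWM period the valve is held open (0..1)."""
    return max(0.0, min(1.0, target_kpa / SUPPLY_KPA))

def valve_schedule(target_kpa: float, n_periods: int = 3):
    """Yield (time_s, valve_open) switching events for a few PWM periods."""
    d = duty_cycle(target_kpa)
    for k in range(n_periods):
        t0 = k * PWM_PERIOD_S
        yield (t0, True)                      # open valve at start of period
        yield (t0 + d * PWM_PERIOD_S, False)  # close after the on-time

for event in valve_schedule(target_kpa=20.0):
    print(event)
```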
In more recent years, tactile displays have often been built with DEAs and their variants for the large frequency bandwidth required for vibrotactile feedback. Soft tactile displays that can be conformally wrapped around the fingertip have achieved vibrotactile feedback with addressable millimeter-scale DEA arrays that actuate at <100 Hz with <50 µm amplitude (Figure 10a).[148–151] As DEAs are generally actuated by voltages over 1 kV, high-voltage control circuits are necessary to address DEA arrays, and the bulky high-voltage switches pose a challenge for haptic applications. Marette et al. developed a compact control circuit based on flexible high-voltage thin-film transistors that independently switches each DEA in a 4 × 4 tactile display using a single 1.4 kV supply (Figure 10b).[150] Another challenge of using DEAs for tactile feedback is the actuation force, since the force generated by a thin-film DEA is small (<50 mN). Two approaches for force amplification in tactile devices are multilayer stacking and hydraulic amplification.
A ≈1 cm3 rolled DEA by Zhao et al. has demonstrated about 1 N of blocking force and 1 mm of displacement at frequencies up to 200 Hz.[73] Four of these actuators were integrated into a haptic device on the forearm, and psychophysical testing showed human haptic perception of touch position and direction over a broad bandwidth (10–200 Hz, Figure 10c).[152] Electrohydraulic actuators with a zipping mechanism have been miniaturized to the millimeter scale, enabling lightweight and compact tactile displays with improved function.[153,154] With the dielectric liquid stored in a pouch perpendicularly beneath the display surface, Han et al. demonstrated a tactile display with 2.5 mm spatial resolution that actuates to 1.45 mm and 13 mN at 5 Hz and 200 µm at 200 Hz (Figure 10d).[153] Leroy et al. created the hydraulically amplified taxel (HAXEL), capable of multimodal out-of-plane normal displacements and in-plane shear displacements. With the liquid stored beneath the non-stretchable electrodes and on the periphery of the elastomeric cavity, HAXEL modulates displacement modes through segmentation of the electrodes, creating 500 µm normal displacement and 760 µm lateral displacement and a specific power of 100 W kg−1 with a response time below 5 ms (Figure 10e).[154] Recently, the same group achieved integrated high-density electrostatic zipping actuator arrays with multimaterial inkjet printing.[155]

Individually addressing each actuator in a large tactile display with a small form factor is a challenging engineering problem that requires innovations in both the fundamental actuation mechanism and system-level integration. Combining different actuation mechanisms based on their merits could lead to a hybrid system that satisfies the many requirements of a tactile display (frequency, force and displacement, spatial density, and small form factor).

Figure 10. Tactile displays. a) DEA tactile display. Reproduced with permission.[149] Copyright 2008, IEEE. b) Compact DEA control circuit with high-
voltage thin-film transistors. Reproduced with permission.[150] Copyright 2017, John Wiley and Sons. c) Broad bandwidth tactile feedback device for the
arm based on rolled DEA. Reproduced with permission.[152] Copyright 2020, Mary Ann Liebert, Inc. d) Tactile display with dielectric fluidic transducers.
Reproduced with permission.[153] Copyright 2020, IEEE. e) Multimodal tactile display with hydraulically amplified taxel (HAXEL) for both normal and
shear displacements. Reproduced with permission.[154] Copyright 2020, John Wiley and Sons.


Figure 11. Compact and individually addressable large actuator array. a) Large array of thermal responsive hydrogels actuated by light projection.
Reproduced with permission.[157] Copyright 2009, John Wiley and Sons. b) Large active tactile array combining actuations of shape memory polymer
and single pneumatic supply. Reproduced with permission.[158] Copyright 2017, John Wiley and Sons. c) Wirelessly powered and controlled VR interface
with near-field communication. Reproduced with permission.[86] Copyright 2019, Springer Nature.

Large-scale microfluidic circuits with fluidic multiplexers made of arrays of binary valves could provide a route to compact FEA arrays, but the actuation speed is still limited by mass transport of fluids over long distances.[156] In a large array (65 × 65 actuators within an area of 37.7 × 37.7 mm2) of thermally responsive hydrogel actuators, an external projector is used to actuate individual pixels through a photothermal effect that controls the swelling of each hydrogel actuator (Figure 11a).[157] This method provides a way to wirelessly control a large array of actuators with high spatial density and a small form factor, but is limited in actuation speed (400 ms with active cooling) due to the diffusion-driven swelling of the hydrogels. Besse et al. demonstrated a large active array (32 × 24 actuators with 4 mm pitch) optimized for tactile feedback, with independently controllable actuators capable of large forces and displacements (over 1 N and 400 µm with a 3 mm diameter actuator; Figure 11b).[158] The system employs Joule heating of shape memory polymer pixels to modulate local stiffness and combines this with pneumatic actuation for large displacements, allowing a more compact system integration requiring only one air inlet and wires for thermal control. An actuation cycle, however, takes 5 s due to the sequential processes of Joule heating, pressure transfer, and passive cooling.

Yu et al. demonstrated a wirelessly controlled and wirelessly powered epidermal VR interface, capable of 100–300 Hz vibrations with individually controllable millimeter-scale vibratory actuators (Figure 11c).[86] A permanent magnet, encapsulated in a cavity on top of a magnetic radio frequency (RF) loop antenna, serves as an individual actuator that vibrates with the time-dependent Lorentz force induced by the time-varying current in the coil. A control circuit with system-on-a-chip (SoC) integrated circuits (ICs) allows wireless communication through near-field communication (NFC) Data Exchange Format messages that program the output frequency of each of the 32 actuators (4 SoCs with 8 GPIOs each) in a single four-byte communication. The system elegantly integrates NFC-enabled wireless power delivery and control, demonstrating a completely wireless wearable haptic interface for the first time. Currently, the actuators have a spatial pitch of ≈1 cm, close to the spatial acuity of human skin on the limbs and the trunk, and this could be further improved for the hand; the NFC working range has been improved to 80 cm, sufficient for indoor VR applications, but longer-range applications might require further improvement.
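A single four-byte command is sufficient to address all 32 actuators because one bit can be allocated per actuator (4 SoCs × 8 GPIOs). The sketch below shows such a bit-packing scheme; the exact message layout inside the NFC Data Exchange Format record used by Yu et al. is not specified here, so this layout is an assumption for illustration.

```python
def pack_actuators(active: set) -> bytes:
    """Pack the on/off state of 32 actuators (indices 0-31) into 4 bytes.

    Bit i of the 32-bit word corresponds to actuator i; byte k would be routed
    to the 8 GPIOs of SoC k. This layout is assumed for illustration.
    """
    word = 0
    for idx in active:
        if not 0 <= idx < 32:
            raise ValueError(f"actuator index out of range: {idx}")
        word |= 1 << idx
    return word.to_bytes(4, byteorder="little")

def unpack_actuators(payload: bytes) -> set:
    """Recover the set of active actuator indices from a 4-byte payload."""
    word = int.from_bytes(payload, byteorder="little")
    return {i for i in range(32) if word & (1 << i)}

msg = pack_actuators({0, 5, 17, 31})
print(msg.hex(), unpack_actuators(msg))
```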
7.1.3. Wearable Sensors


Despite the prevalent adoption of optical tracking and IMU technologies to capture the user's movement in commercial haptic input devices, stretchable wearable sensors, in the form of skin-mountable or textile-based devices,[103,159–163] offer unique advantages of robust (e.g., unaffected by optical occlusion or lighting variation) and highly sensitive movement tracking, as well as tactile sensing of contact force, temperature, and humidity. A comprehensive review of tactile sensors can be found in ref. [159]. Here, we highlight recent examples of stretchable sensors demonstrated as haptic input devices. Combining visual data and somatosensory data (finger bending) from stretchable sensors, Wang et al. reported a data fusion architecture that achieved 100% hand gesture recognition accuracy even when visual data are not ideal (noisy, under- or overexposed), and showed robotic navigation with 3.3% error using hand-gesture control in the dark.[164] Based on strain-mediated contact in anisotropically resistive structures, Araromi et al. created a strain sensor with high sensitivity (gauge factor ≈ 85 000) at low strains (<5%), allowing the sensor to detect small muscle movements. A sleeve integrated with four sensors detects muscle contraction at four locations on the arm; by training a neural network model with the sleeve output and ground truth from a motion capture facility, hand gestures and motions could be tracked (Figure 12a).[165] Contact force detection has been achieved by a myriad of tactile pressure sensors, as summarized in ref. [104]. Recently, Lee et al. reported a nanomesh pressure sensor that monitors finger contact pressure without detectable human sensation, quantified by grip forces with and without the sensor attached (Figure 12b).[166] Direction-sensitive pressure and shear detection has also been achieved by 3D structures or hierarchical microstructure arrays that produce different output patterns in response to force applied in different directions.[159,167–170]

Differentiation and decoupling of multimodal tactile measurands with high spatial resolution pose great challenges in tactile sensors, since most transduction mechanisms rely on a single output (e.g., capacitance, resistance, or optical intensity) in response to various stimuli. Recent research progress has addressed this challenge through different approaches and demonstrated advanced functionalities in haptic input devices.

Figure 12. Wearable sensors. a) Ultrasensitive sensors detect muscle movements on the arm as inputs for gesture control. Reproduced with permission.[165] Copyright 2020, Springer Nature. b) Nanomesh pressure sensor allows contact pressure measurement without detectable human sensation. Reproduced with permission.[166] Copyright 2020, AAAS. c) Multimodal wearable sensor enabled by heterogeneous sensing mechanisms. Reproduced with permission.[171] Copyright 2020, AAAS. d) Decoupled strain and temperature sensing via ion relaxation dynamics. Reproduced with permission.[172] Copyright 2020, AAAS. e) Large electronic textile for distributed sensing with one fiber sensor via electrical time-domain reflectometry. Reproduced with permission.[173] Copyright 2020, Springer Nature. f) Stretchable distributed fiber optic sensor capable of distributed multimodal sensing. A wireless glove demonstrates simultaneous measurements of force and multi-location bending with one sensor. Reproduced with permission.[174] Copyright 2020, AAAS.


Kim et al. achieved differentiated and decoupled sensing of stretch, bend, and compression by employing heterogeneous sensing mechanisms of optical, resistive, and capacitive sensing with data interpretation through machine learning. The sensor is composed of an ionic liquid core encapsulated in silicone with conductive fabric attached on the outside. The ionic core and silicone cladding, acting as an optical waveguide sensor, respond to stretch, bend, and compression; the ionic liquid, acting as a resistive sensor, responds to stretch and compression; the conductive fabric is most sensitive to stretch. With an ANN machine learning model, multimodal deformations are classified and decoupled. In a demonstration of a wrist wearable, different combinations of wrist stretch, bend, and compression are used as inputs to control a robotic gripper in a manipulation task (Figure 12c).[171] You et al. achieved simultaneous strain and temperature measurement in a single ion conductor with no signal interference by studying ionic relaxation dynamics: the charge relaxation time is insensitive to strain and measures absolute temperature, while the normalized capacitance measures strain and is insensitive to temperature variation. A tactile sensor array fabricated with the ion conductor sandwiched between patterned electrodes demonstrates independent temperature and pressure mapping at the same time. Interpolation of the temperature and pressure data could also enable shear sensing (Figure 12d).[172]
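The reason the charge relaxation time can report temperature independently of strain is that it is the ratio of two intensive material properties, τ = ε/σ (permittivity over conductivity), and therefore does not depend on the deformed electrode geometry, whereas capacitance does. A minimal sketch of this decoupling is given below; the calibration numbers and the simple capacitance-stretch relation are illustrative assumptions, not values from the cited work.

```python
import numpy as np

# Hypothetical calibration of charge relaxation time vs. temperature for an
# ion conductor (tau = eps / sigma, a purely material quantity).
calib_T   = np.array([20.0, 30.0, 40.0, 50.0])          # degC
calib_tau = np.array([3.2e-3, 2.4e-3, 1.8e-3, 1.4e-3])  # s, shrinks as ions speed up

def temperature_from_tau(tau_meas):
    # tau is geometry-independent, so this lookup is unaffected by stretch.
    return np.interp(tau_meas, calib_tau[::-1], calib_T[::-1])

def strain_from_capacitance(c_norm):
    """Estimate uniaxial stretch from normalized capacitance C/C0.

    For an incompressible dielectric stretched uniaxially between compliant
    electrodes, C/C0 is approximately the stretch ratio, so the engineering
    strain is roughly C/C0 - 1 (a simplifying assumption).
    """
    return c_norm - 1.0

print(temperature_from_tau(2.0e-3))   # temperature estimate, independent of strain
print(strain_from_capacitance(1.25))  # ~25% strain, independent of temperature
```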
An alternative approach to distributed sensing, other than sensor pixels and arrays, is the fiber-based sensor system, which requires only single input and output contact points and thus reduces the risk of contact point failure. Leber et al. reported a fiber-based stretchable sensor that performs distributed sensing of pressing and stretching via electrical time-domain reflectometry, a time-of-flight technique that sends high-frequency pulses through an impedance-controlled transmission line, where discontinuities reflect pulses with distinctive features of the disturbance. The fiber is thermally drawn with liquid metal conductors embedded in an elastomeric dielectric as transmission lines and has a 0.2 N force resolution, 6 cm spatial resolution, and 0.25% strain resolution. Integrating a single 10 m long sensor on a 50 cm × 50 cm electronic textile, detection of both pressure and stretch has been achieved with one input and output connection (Figure 12e).[173]
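Time-domain reflectometry locates a disturbance from the round-trip delay of the reflected pulse and the propagation velocity of the line; the sketch below shows that calculation. The effective permittivity and the example delay are illustrative assumptions rather than values from the cited fiber.

```python
C0 = 2.998e8  # speed of light in vacuum, m/s

def distance_to_fault(delay_s: float, eps_r_eff: float = 2.6) -> float:
    """Distance along the line to a reflecting discontinuity.

    The pulse travels out and back, so the one-way distance is v * t / 2,
    with v = c0 / sqrt(eps_r_eff) for the (assumed) effective permittivity
    of the elastomeric dielectric.
    """
    v = C0 / eps_r_eff ** 0.5
    return v * delay_s / 2.0

# Example: a reflection arriving 65 ns after the launched pulse
print(f"{distance_to_fault(65e-9):.2f} m from the input end")
```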
Optics provides a good platform for distributed and multimodal sensing, since a broad-wavelength-spectrum input offers a wide range of wavelengths that can be modulated without signal interference in a single transmission line. Bai et al. reported a stretchable distributed fiber-optic elastomeric sensor capable of measuring the mode, location, and magnitude of press, stretch, and bend along a single fiber without the assistance of machine learning. The multifunctionality is achieved by coupling a white LED to a dual-core fiber partially doped with absorbing dye patterns that enable wavelength-specific absorption and frustrated total internal reflection. Distinctive intensity and chromaticity combinations at the two cores' outputs differentiate the mode of the deformation, and chromatic patterns (discrete or gradient) color-code the spatial information. The sensor has resolutions of 0.14 N in compression, 1% in strain, and 0.5° in bend, and a spatial resolution of 1 mm with the gradient chromatic pattern. By thresholding and solving a vector sum model in the color space for the outputs, decoupled multilocation and multimodal deformation sensing has been achieved. In a demonstration of a wireless glove wearable, one sensor with three dyes can accurately reconstruct the angles of the three finger joints and at the same time detect presses at different locations (Figure 12f).[174]

7.2. Graspable

Another way to create the haptic experience is to construct a real-world representation of the virtual object, often in the form of a graspable prop or a touchable surface, which could exhibit all the haptic features (i.e., shape, stiffness, texture, etc.) with programmable control. While this is a challenging engineering problem that requires solutions across a wide range of scales, the field of soft robotics has demonstrated programmable control of variable stiffness and shape morphing in both graspable and touchable device forms.

7.2.1. Variable Stiffness

Variable stiffness has been achieved with pressure-based stiffness control of particle jamming, temperature-based stiffness control of low-melting-point materials (e.g., metal alloys, waxes, thermoplastic polymers, etc.), and magnetic or electric field-based stiffness control of magnetorheological or electrorheological materials. We refer the readers to ref. [175] for a comprehensive review of these mechanisms. Here we show an example of a variable stiffness and shape morphing object that could potentially be used as a graspable haptic device. Van Meerbeek et al. introduced a metal-elastomer-foam composite in which the stiffness is controlled reversibly by temperature: the composite is stiff below the melting temperature of the metal alloy (62 °C); above the melting temperature, the composite exhibits the compliant rubbery properties of the elastomeric foam. The composite can be stretched to over 200% strain when heated, and the deformed shape can be locked by decreasing the temperature below 62 °C. Shape memory morphing has also been demonstrated, where a rectangular cuboid is locked into a cylinder shape and recovers the cuboid shape upon elevated temperature (Figure 13a).[176] With further system integration, the foam composite could potentially be used as a haptic graspable prop with programmable stiffness and shape.

7.2.2. Volumetric Sensing

An important step toward an interactive graspable prop is volumetric sensing, which could provide rich haptic input as the user deforms the prop. Volumetric sensing of arbitrary deformation of a soft body is challenging, since it would require decouplable multimodal sensors distributed in a 3D volume. Van Meerbeek et al. approach this challenge with an optical foam sensor system whose outputs are interpreted by a machine learning algorithm. A 2D array of optical fibers is embedded at the bottom of a block of foam; the fibers both illuminate the foam and receive the reflected light. Foam pores distributed throughout the volume with high density scatter the incoming light, and as the pores deform with the macroscopic foam deformation, the different scattering patterns are interpreted with machine learning. A kNN algorithm predicts the type of deformation with 100% accuracy (bend up, bend down, twist left, twist right) and the magnitude of the deformation with 0.06° error (Figure 13b).[122]


Figure 13. Graspable devices. a) Variable stiffness and shape morphing metal-elastomer-foam composite. Reproduced with permission.[176] Copyright 2016,
John Wiley and Sons. b) Internally illuminated elastomer foam differentiates bending and twisting with machine learning. Reproduced with permission.[122]
Copyright 2018, AAAS. c) 3D arrays of optical lace for spatially continuous deformation sensing. Reproduced with permission.[49] Copyright 2019, AAAS.

Xu et al. report an arbitrary 3D grid of stretchable lightguides (optical lace) distributed in a soft scaffold to provide spatially continuous deformation sensing throughout the volume. The 3D sensor network operates based on frustrated total internal reflection and light coupling between neighboring lightguides upon contact. One lightguide with a light input is threaded through the volume, and several output lightguides are distributed to different locations, each connected to a photodetector. As the soft scaffold is deformed, the output lightguides contact the input lightguide at various locations, with output intensities and patterns indicating the 3D shape change of the soft scaffold. Volumetric sensing is demonstrated by reconstructing the non-uniform deformation of a soft cylinder with a stiffness gradient under uniaxial strain (Figure 13c).[49]
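The learning pipeline used for the internally illuminated foam described above is conceptually simple: a k-nearest-neighbors classifier labels the deformation mode from the vector of photodetector readings, and a companion kNN regressor estimates its magnitude. The sketch below reproduces that pipeline on synthetic stand-in data; the channel count, labels, and training values are placeholders, not data from the original study.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

rng = np.random.default_rng(0)
n_fibers, n_samples = 30, 400   # 30 photodetector channels, synthetic dataset

# Synthetic training data standing in for measured reflected-light intensities.
X = rng.normal(size=(n_samples, n_fibers))
modes = rng.integers(0, 4, size=n_samples)            # 0: bend up, 1: bend down,
                                                      # 2: twist left, 3: twist right
magnitudes = rng.uniform(0, 40, size=(n_samples, 2))  # e.g., bend and twist angles (deg)

clf = KNeighborsClassifier(n_neighbors=5).fit(X, modes)
reg = KNeighborsRegressor(n_neighbors=5).fit(X, magnitudes)   # multi-output regression

x_new = rng.normal(size=(1, n_fibers))                # one new frame of readings
print("predicted mode:", clf.predict(x_new)[0])
print("predicted magnitudes (deg):", reg.predict(x_new)[0])
```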
7.3. Touchable

7.3.1. Morphing Surface

Inspired by the muscular morphology cephalopods use for textural camouflage, Pikul et al. developed a 2D stretchable surface that can be programmed to transform into a target 3D shape. The mechanism, a circumferentially constrained and radially stretched elastomer, uses inextensible textile mesh programmed into an elastomeric membrane to allow one-to-one mapping of the radial stretch of the elastomer into a target 3D displacement when pneumatically actuated. Stretchable surfaces have been demonstrated to morph into shapes of plants and rocks that camouflage into the background environment (Figure 14a).[1] Stanley et al. created a haptic surface capable of shape morphing and variable stiffness control, where an array of coffee-ground cells is laid on top of pressure-regulated air chambers. Pneumatic control changes the surface shape, and vacuum control tunes the stiffness of the coffee-ground layer through particle jamming (Figure 14b).[177]

7.3.2. Interactive Surface

Larson et al. created a hyperelastic light-emitting capacitor (HLEC) that achieved light emission at large strains (≈500%) and haptic sensing at the same time. The HLEC is constructed with a ZnS-phosphor-doped dielectric elastomer sandwiched between transparent hydrogel electrodes and changes its illuminance and capacitance under deformation (Figure 8b).[178] Based on the HLEC, Li et al. created a stretchable display that also functioned as a touch interface, with heterogeneous patterning of phosphors and interdigitated patterning of the electrodes, and demonstrated the ability of multi-touch sensing (Figure 14c).[179] Kim et al. reported a stretchable ionic touch panel that can track a single touch point even when the panel is highly deformed (1000% areal strain). The panel employs a surface-capacitive system comprising a rectangular hydrogel film connected to four electrodes at the corners: finger contact leads to current flow from the electrodes to the touch point, and the induced current increases as the touch point comes closer to the corresponding electrode. The soft ionic touch panel is attached to the arm and demonstrated for human-computer interaction, writing words and playing games with haptic inputs from the touch panel (Figure 14d).[180] Larson et al. explored using shape changes of a soft touch surface as gesture inputs for human-computer interaction. The soft haptic interface is composed of an array of stretchable carbon nanotube capacitors forming a dome-shaped inflated rubbery membrane that the user can deform to transmit information. Convolutional neural networks are used to classify and localize touch on the interface, and supervised learning is used to relate gesture inputs to computer commands. The haptic interface is demonstrated in the video game Tetris in real time, where the tetromino can be moved, rotated, and dropped with the more intuitive gesture inputs of poking, twisting, and pinching the soft dome (Figure 14e).[181] Combining the HLEC with FEAs in a soft haptic interface, Peele et al. built an untethered stretchable display for tactile interaction, where multimodal feedback of both visual and haptic stimuli was demonstrated in an interactive memory game similar to Simon (Figure 15a).[182] In the work by Robinson et al., a haptic interface serving as a synthetic sensory-motor analog is composed of an integrated system of pneumatic actuators and stretchable capacitive sensors capable of feedback control with exteroceptive and proprioceptive sensing (Figure 15b).[183]
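For the four-corner surface-capacitive panel described above, a first-order estimate of the touch coordinate follows from how the injected current divides among the corner electrodes: corners closer to the touch draw proportionally more current. The sketch below uses the normalized-current mapping familiar from rigid surface-capacitive touchscreens; a real stretchable panel would add a calibration map on top of this, and the current values shown are illustrative.

```python
def locate_touch(i_ul: float, i_ur: float, i_ll: float, i_lr: float,
                 width: float = 1.0, height: float = 1.0):
    """Estimate the touch position from the four corner currents.

    Normalized first-order model: the fraction of the total current drawn by
    the right-hand (or top) pair of electrodes gives the normalized x (or y)
    coordinate. Corner naming: ul = upper-left, lr = lower-right, etc.
    """
    total = i_ul + i_ur + i_ll + i_lr
    x = width * (i_ur + i_lr) / total
    y = height * (i_ul + i_ur) / total
    return x, y

# Example: a touch pulling more current through the upper-right corner
print(locate_touch(i_ul=0.9, i_ur=1.6, i_ll=0.5, i_lr=0.8))
```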


Figure 14. Touchable devices. a) Morphing surfaces actuated from 2D to a targeted 3D shape and camouflage into the background. Reproduced with
permission.[1] Copyright 2017, AAAS. b) Haptic surface for shape morphing and variable stiffness with pneumatic actuation and particle jamming.
Reproduced with permission.[177] Copyright 2018, AAAS. c) Stretchable multicolor display with multi-touch sensing. Reproduced with permission.[179]
Copyright 2016, John Wiley and Sons. d) Stretchable ionic touch panel for human-computer interaction. Reproduced with permission.[180] Copyright
2016, AAAS. e) A deformable haptic interface for human touch recognition allowing natural haptic input. Reproduced with permission.[181] Copyright
2019, Mary Ann Liebert, Inc.

7.4. Soft Haptic Devices in Virtual Reality and Augmented Reality

Decades after the Power Glove, commercial haptic wearables now offer greatly improved functionality, although the cost is still high for consumer electronics. A VR haptic glove, HaptX, integrates the functions of motion tracking, force feedback, and tactile feedback in one embodiment (Figure 16a).[184] The glove employs electromagnetic sensors for finger motion tracking, a passive locking mechanism for resistive force feedback, and elastomer-encapsulated pneumatic microfluidic arrays for tactile feedback. The glove demonstrates haptic feedback with improved realism in VR (e.g., simulating the sensation of raindrops) but requires a bulky pneumatic control unit to program over one hundred tactile actuators on a glove. The TactSuit (bHaptics) has 40 eccentric rotating mass motors on a vest to provide vibrotactile feedback (Figure 16b).[185] A full-body wearable, the TeslaSuit, provides haptic feedback with electrostimulation and IMU-based motion tracking (Figure 16c).[186]

Many of the soft haptic devices reviewed earlier in this section could be integrated into VR/AR environments to provide haptic inputs or outputs that improve the experience of immersion. In this section, we show examples of soft haptic devices from research labs that have been demonstrated in a VR/AR environment. Note that there are fewer examples, since these demonstrations require access to, and integration with, commercial VR/AR software. The maturity of a haptic technology is not necessarily reflected by its integration in a VR/AR system. Jadhav et al. created a soft exoskeleton haptic glove to simulate the force of a button press by applying forces to the finger joints with McKibben muscle actuators (Figure 17a).[187] A pilot user study tested the glove in a virtual reality environment that incorporated visual, audio, and haptic feedback in a virtual piano-playing program and demonstrated a more immersive user experience. Song et al. created a soft VR glove with an electrostatically driven pneumatic actuator for tactile feedback that indicates contact with a virtual object (Figure 17b).[188] In another VR glove, Kim et al. integrated flexible thermoelectric devices to provide thermal feedback, and a user test involving both visual and thermal stimuli in a virtual environment showed improved realism with thermal feedback (Figure 17b).[189] Barreiros et al. presented a haptic controller sleeve (the Omnipulse) wrapped around a VR controller that provides dynamic kinesthetic feedback (12 patterns) with twelve FEA actuators exhibiting large force (5–45 N) and displacement (3–10 mm) (Figure 17c). A user study showed an improved experience playing VR games with the Omnipulse sleeve compared to a conventional controller.[190] Mac Murray et al. demonstrated stiffness and shape control in a compliant VR controller handle through pneumatic actuation. The handle is compliant with a small diameter when not pressurized and becomes stiff with a larger diameter when pressurized.


Figure 15. Interactive surfaces. a) Untethered stretchable display for tactile interaction with haptic and visual feedbacks. Reproduced with permis-
sion.[182] Copyright 2019, Mary Ann Liebert, Inc. b) 3D printed haptic interface with capacitive sensors and pneumatic actuations for interactive haptic
feedback. Reproduced with permission.[183] Copyright 2015, Elsevier.

The compliant controller has a 3D printed elastomeric lattice on the inside to provide structural support and restoring force. The controller is integrated into a VR game to simulate the different stiffnesses of virtual objects and is further programmed as an input to the game (Figure 17d).[191]

8. Conclusion

Soft organic robotics, constructing intelligent soft machines with intrinsically soft actuators and sensors, has the potential to enable the next generation of haptic devices for a more realistic VR/AR experience. Soft actuators improve haptic feedback functionalities by enabling low-profile wearables with conformal contact to the skin and over the joints, while delivering large-amplitude and persistent deformation for kinesthetic feedback, as well as large-frequency-bandwidth vibrotactile feedback. Objects or surfaces that can morph their shape and stiffness with a small form factor introduce the potential for direct modulation of haptic properties in haptic feedback devices. Soft sensors improve haptic input of gesture and movement tracking with higher sensitivity and robustness, and greatly enrich haptic input types toward multimodal tactile sensing for more natural and advanced haptic interaction. Apart from functionality, cost poses another major challenge for the wide adoption of existing advanced haptic technologies. In this aspect, soft haptic devices could be cost-effective due to their low-cost construction materials (e.g., silicone and other synthetic elastomers) and fabrication methods (e.g., replica molding). Although there has not yet been scaled production of soft haptic devices, balloons demonstrate this potential, as trillions of units are produced annually.

Power and untethering are major challenges for soft actuators. FEAs powered with pumps and valves have large form factors, and DEAs require high-voltage electronics. As reviewed in Section 3, these are open research topics actively explored by the field. The actuation voltage of DEAs has been reduced to a few hundred volts, and compact high-voltage control circuits have been demonstrated.

Figure 16. Commercial haptic wearables for VR and AR. a) Haptic glove (HaptX) with tactile feedback, kinesthetic feedback and gesture sensing. Repro-
duced with permission.[184] Copyright HaptX, Inc. b) Vest with vibrotactile feedback (TacSuit). Reproduced with permission.[185] Copyright bHaptics.
c) Full body wearable with electrostimulation haptic feedback and motion tracking (TeslaSuit). Reproduced with permission.[186] Copyright TESLASUIT.


Figure 17. Soft devices demonstrated in VR/AR. a) Soft exoskeleton glove simulating force of a button press playing virtual piano. Reproduced with
permission.[187] Copyright 2017, UC San Diego Jacobs School of Engineering. b) VR glove with gesture sensing and pneumatic force feedback or tem-
perature feedback. Reproduced with permission.[188,189] Copyright 2019, 2020, the authors. c) VR controller sleeve with pneumatic actuators producing
dynamic kinesthetic feedback. Reproduced with permission.[190] Copyright 2019, IEEE. d) Soft VR controller with variable stiffness and shape for haptic
feedback. Reproduced with permission.[191] Copyright 2018, IEEE.

Alternative pressure generation mechanisms via combustion, peroxide decomposition, and electrolytic hydraulics have enabled compact untethered powering of FEAs and thus show promise for pumpless and valveless FEAs for portable VR/AR systems. Soft sensors with higher spatial resolution, bandwidth, multifunctionality, and selectivity could lead to more powerful haptic inputs.

Finally, haptic perception is a subjective and complex process, and the relationship between measurable parameters and haptic perception is not yet fully understood. By providing direct variation and programmable control of haptic properties, soft robotics built with functional organic elastomers could facilitate psychophysical experiments to understand human tactile perception. Advances in our understanding of how haptic perception translates into addressable engineering parameters could enable future haptic devices to produce more realistic haptic stimuli.

Acknowledgements
The authors acknowledge funding from the Office of Naval Research (grant no. N00014-20-1-2438), the National Science Foundation (grant no. EFMA-1830924), and the Air Force Office of Scientific Research (grant no. FA9550-20-1-0254).

Conflict of Interest
The authors declare no conflict of interest.

Keywords
augmented reality, haptics, soft robotics, virtual reality

Received: November 2, 2020
Revised: April 20, 2021
Published online: June 2, 2021

[1] J. H. Pikul, S. Li, H. Bai, R. T. Hanlon, I. Cohen, R. F. Shepherd, Science 2017, 358, 210.
[2] W. T. Navaraj, C. G. Núñez, D. Shakthivel, V. Vinciguerra, F. Labeau, D. H. Gregory, R. Dahiya, Front. Neurosci. 2017, 11, 501.
[3] H. Culbertson, C. M. Nunez, A. Israr, F. Lau, F. Abnousi, A. M. Okamura, presented at 2018 IEEE Haptics Symposium (HAPTICS), IEEE, New York 2018, pp. 32–39.
[4] "Bimorph type vibrators", http://www.fujicera.co.jp/en/product/bimorph/ 2021.
[5] K. C. Galloway, K. P. Becker, B. Phillips, J. Kirby, S. Licht, D. Tchernov, R. J. Wood, D. F. Gruber, Soft Robot. 2016, 3, 23.
[6] N. R. Sinatra, C. B. Teeple, D. M. Vogt, K. K. Parker, D. F. Gruber, R. J. Wood, Sci. Robot. 2019, 4, eaax5425.
[7] W. Wang, J.-Y. Lee, H. Rodrigue, Bioinspir. Biomim. 2011, 6, 026007.
[8] K. Sappati, S. Bhadra, Sensors 2018, 18, 3605.
[9] P. Polygerinos, N. Correll, S. A. Morin, B. Mosadegh, C. D. Onal, K. Petersen, M. Cianchetti, M. T. Tolley, R. F. Shepherd, Adv. Eng. Mater. 2017, 19, 1700016.
[10] S. Hirose, Y. Umetani, Mech. Mach. Theory 1978, 13, 351.
[11] R. N. Palchesko, L. Zhang, Y. Sun, A. W. Feinberg, PLoS One 2012, 7, e51499.
[12] R. F. Shepherd, F. Ilievski, W. Choi, S. A. Morin, A. A. Stokes, A. D. Mazzeo, X. Chen, M. Wang, G. M. Whitesides, Proc. Natl. Acad. Sci. U. S. A. 2011, 108, 20400.
[13] B. N. Peele, T. J. Wallin, H. Zhao, R. F. Shepherd, Bioinspir. Biomim. 2015, 10, 055003.
[14] T. J. Wallin, J. Pikul, R. F. Shepherd, Nat. Rev. Mater. 2018, 3, 84.
[15] H. Culbertson, S. B. Schorr, A. M. Okamura, Annu. Rev. Control, Rob., Auton. Syst. 2018, 1, 385.
[16] J. Steuer, J. Commun. 1992, 42, 73.
[17] P. Milgram, F. Kishino, IEICE Trans. Inf. Syst. 1994, E77-D, 1321.
[18] R. T. Azuma, Presence Teleoperators Virtual Environ. 1997, 6, 355.
[19] M. Mihelj, D. Novak, S. Begus, Virtual Reality Technology and Applications, Vol. 68, Springer, Netherlands 2014, pp. 1–16.
[20] P. King, IEEE Eng. Med. Biol. Mag. 2004, 23, 205.
[21] A. J. Bremner, C. Spence, Adv. Child Dev. Behav. 2017, 52, 227.


[22] S. Kapoor, P. Arora, V. Kapoor, M. Jayachandran, M. Tiwari, J. Clin. [60] H. Zhao, Y. Li, A. Elsamadisi, R. Shepherd, Extreme Mech. Lett.
Diagn. Res. 2014, 8, 294. 2015, 3, 89.
[23] A. Bolopion, S. Regnier, IEEE Trans. Autom. Sci. Eng. 2013, 10, 496. [61] M. Wehner, M. T. Tolley, Y. Mengüç, Y.-L. Park, A. Mozeika, Y. Ding,
[24] E. P. W. - v. d. Putten, R. H. M. Goossens, J. J. Jakimowicz, C. Onal, R. F. Shepherd, G. M. Whitesides, R. J. Wood, Soft Robot.
J. Dankelman, Minimally Invasive Ther. Allied Technol. 2008, 17, 3. 2014, 1, 263.
[25] “Phantom Premium”, https://www.3dsystems.com/haptics-devices/ [62] B. Mosadegh, P. Polygerinos, C. Keplinger, S. Wennstedt,
3d-systems-phantom-premium/ 2016. R. F. Shepherd, U. Gupta, J. Shim, K. Bertoldi, C. J. Walsh,
[26] S. B. Schorr, A. M. Okamura, IEEE Trans. Haptics 2017, 10, 418. G. M. Whitesides, Adv. Funct. Mater. 2014, 24, 2163.
[27] H. Iwata, H. Yano, F. Nakaizumi, R. Kawamura, in Proc. 28th [63] R. F. Shepherd, A. A. Stokes, J. Freake, J. Barber, P. W. Snyder,
Annual Conf. on Computer Graphics and Interactive Techniques – A. D. Mazzeo, L. Cademartiri, S. A. Morin, G. M. Whitesides, Angew.
SIGGRAPH ‘01, ACM, New York 2001, pp. 469–476. Chem., Int. Ed. 2013, 52, 2892.
[28] K. Nakagaki, D. Fitzgerald, Z. J. Ma, L. Vink, D. Levine, H. Ishii, [64] N. W. Bartlett, M. T. Tolley, J. T. B. Overvelde, J. C. Weaver,
in Proc. 13th Int. Conf. on Tangible, Embedded, and Embodied B. Mosadegh, K. Bertoldi, G. M. Whitesides, R. J. Wood, Science
Interaction, ACM, New York 2019, pp. 615–623. 2015, 349, 161.
[29] D. J. Meyer, M. Wiertlewski, M. A. Peshkin, J. E. Colgate, presented [65] M. Wehner, R. L. Truby, D. J. Fitzgerald, B. Mosadegh,
at 2014 IEEE Haptics Symposium (HAPTICS), IEEE, Houston, TX G. M. Whitesides, J. A. Lewis, R. J. Wood, Nature 2016, 536, 451.
2014, pp. 63–67. [66] M. T. Tolley, R. F. Shepherd, B. Mosadegh, K. C. Galloway,
[30] H. Xu, M. A. Peshkin, J. E. Colgate, IEEE Trans. Haptics 2019, 12, 497. M. Wehner, M. Karpelson, R. J. Wood, G. M. Whitesides, Soft
[31] D. J. Meyer, M. A. Peshkin, J. E. Colgate, presented at 2013 World Robot. 2014, 1, 213.
Haptics Conference (WHC), IEEE, Daejeon, Korea 2013, pp. 43–48. [67] C. A. Aubin, S. Choudhury, R. Jerch, L. A. Archer, J. H. Pikul,
[32] A. Yamamoto, S. Nagasawa, H. Yamamoto, T. Higuchi, IEEE Trans. R. F. Shepherd, Nature 2019, 571, 51.
Visualization Comput. Graphics 2006, 12, 168. [68] R. Pelrine, R. Kornbluh, Q. Pei, J. Joseph, Science 2000, 287, 836.
[33] Y. H. Jung, J. Kim, J. A. Rogers, Adv. Funct. Mater. 2020, 2008805. [69] R. E. Pelrine, R. D. Kornbluh, J. P. Joseph, Sens. Actuators, A 1998,
[34] J. Tirado, V. Panov, V. Yem, D. Tsetserukou, H. Kajimoto, 64, 77.
in Haptics: Science, Technology, Applications, Vol. 12272, (Eds: [70] S. Rosset, H. R. Shea, Appl. Phys. A 2013, 110, 281.
I. Nisky, J. Hartcher-O’Brien, M. Wiertlewski, J. Smeets), Springer, [71] A. Poulin, S. Rosset, H. R. Shea, Appl. Phys. Lett. 2015, 107, 244104.
Cham 2020, pp. 442–450. [72] M. Duduta, R. J. Wood, D. R. Clarke, Adv. Mater. 2016, 28, 8058.
[35] D. J. Lipomi, C. Dhong, C. W. Carpenter, N. B. Root, [73] H. Zhao, A. M. Hussain, M. Duduta, D. M. Vogt, R. J. Wood,
V. S. Ramachandran, Adv. Funct. Mater. 2020, 30, 1906850. D. R. Clarke, Adv. Funct. Mater. 2018, 28, 1804328.
[36] G. Gerpheide, US5305017, 1994. [74] Y. Sheima, P. Caspari, D. M. Opris, Macromol. Rapid Commun.
[37] “DuelSence Wireless Controller”, https://www.playstation.com/ 2019, 40, 1900205.
en-us/accessories/dualsense-wireless-controller/ 2021. [75] X. Ji, X. Liu, V. Cacucciolo, M. Imboden, Y. Civet, A. El Haitami,
[38] “Oculus Rift S”, https://www.oculus.com/rift-s/ 2021. S. Cantin, Y. Perriard, H. Shea, Sci. Robot. 2019, 4, eaaz6451.
[39] T. G. Zimmerman, US4542291, 1982. [76] A. O’Halloran, F. O’Malley, P. McHugh, J. Appl. Phys. 2008, 104, 071101.
[40] T. G. Zimmerman, J. Lanier, C. Blanchard, S. Bryson, Y. Harvill, [77] G. Kovacs, L. Düring, S. Michel, G. Terrasi, Sens. Actuators, A 2009,
ACM SIGCHI Bull. 1986, 17, 189. 155, 299.
[41] D. J. Sturman, D. Zeltzer, IEEE Comput. Graphics Appl. 1994, 14, 30. [78] E. Acome, S. K. Mitchell, T. G. Morrissey, M. B. Emmett,
[42] “Power Glove”, https://en.wikipedia.org/wiki/Power_Glove/ 2021. C. Benjamin, M. King, M. Radakovitz, C. Keplinger, Science 2018,
[43] W. R. Sherman, A. B. Craig, Understanding Virtual Reality, Elsevier, 359, 61.
Amsterdam 2018, pp. 190–256. [79] N. Kellaris, V. G. Venkata, G. M. Smith, S. K. Mitchell, C. Keplinger,
[44] T. Provot, X. Chiementin, E. Oudin, F. Bolaers, S. Murer, Sensors Sci. Robot. 2018, 3, eaar3276.
2017, 17, 1958. [80] M. R. O’Neill, E. Acome, S. Bakarich, S. K. Mitchell, J. Timko,
[45] “Vive Tracker,” https://www.vive.com/us/accessory/vive-tracker/ C. Keplinger, R. F. Shepherd, Adv. Funct. Mater. 2020, 30, 2005244.
2021. [81] S. Li, H. Bai, R. F. Shepherd, H. Zhao, Angew. Chem., Int. Ed. 2019,
[46] E. van der Kruk, M. M. Reijne, Eur. J. Sport Sci. 2018, 18, 806. 58, 11182.
[47] “Ring Fit Adventure for Nintendo Switch”, https://ringfitadventure. [82] C. J. Camargo, H. Campanella, J. E. Marshall, N. Torras,
nintendo.com/ 2021. K. Zinoviev, E. M. Terentjev, J. Esteve, J. Micromech. Microeng.
[48] “Valve Index Controllers”, https://www.valvesoftware.com/en/ 2012, 22, 075009.
index/controllers/ 2021. [83] N. Torras, K. E. Zinoviev, C. J. Camargo, E. M. Campo,
[49] P. A. Xu, A. K. Mishra, H. Bai, C. A. Aubin, L. Zullo, R. F. Shepherd, H. Campanella, J. Esteve, J. E. Marshall, E. M. Terentjev,
Sci. Robot. 2019, 4, eaaw6304. M. Omastová, I. Krupa, P. Teplický, B. Mamojka, P. Bruns,
[50] S. J. Lederman, R. L. Klatzky, Atten., Percept., Psychophys. 2009, 71, 1439. B. Roeder, M. Vallribera, R. Malet, S. Zuffanelli, V. Soler, J. Roig,
[51] L. A. Jones, Human and Machine Haptics, MIT Press, Cambridge, N. Walker, D. Wenn, F. Vossen, F. M. H. Crompvoets, Sens. Actua-
MA 2000. tors, A 2014, 208, 104.
[52] M. Hollins, S. R. Risner, Percept. Psychophys. 2000, 62, 695. [84] L. Sun, W. M. Huang, Z. Ding, Y. Zhao, C. C. Wang, H. Purnawali,
[53] D. Filingeri, G. Havenith, Temperature 2015, 2, 86. C. Tang, Mater. Des. 2012, 33, 577.
[54] W. M. B. Tiest, A. Kappers, IEEE Trans. Haptics 2009, 2, 189. [85] S. Li, Y. Tu, H. Bai, Y. Hibi, L. W. Wiesner, W. Pan, K. Wang,
[55] R. S. Johansson, J. R. Flanagan, Nat. Rev. Neurosci. 2009, 10, 345. E. P. Giannelis, R. F. Shepherd, Macromol. Rapid Commun. 2019,
[56] S. Ding, B. Bhushan, J. Colloid Interface Sci. 2016, 481, 131. 40, 1800815.
[57] F. Mancini, A. Bauleo, J. Cole, F. Lui, C. A. Porro, P. Haggard, [86] X. Yu, Z. Xie, Y. Yu, J. Lee, A. Vazquez-Guardado, H. Luan,
G. D. Iannetti, Ann. Neurol. 2014, 75, 917. J. Ruban, X. Ning, A. Akhtar, D. Li, B. Ji, Y. Liu, R. Sun, J. Cao,
[58] C. Hatzfeld, in Engineering Haptic Devices (Eds: C. Hatzfeld, Q. Huo, Y. Zhong, C. Lee, S. Kim, P. Gutruf, C. Zhang, Y. Xue,
T. A. Kern), Springer, London 2014, Ch. 2. Q. Guo, A. Chempakasseril, P. Tian, W. Lu, J. Jeong, Y. Yu,
[59] M. Schaffner, J. A. Faber, L. Pianegonda, P. A. Rühs, F. Coulter, J. Cornman, C. Tan, B. Kim, K. Lee, X. Feng, Y. Huang, J. A. Rogers,
A. R. Studart, Nat. Commun. 2018, 9, 878. Nature 2019, 575, 473.


[87] P. Garstecki, P. Tierno, D. B. Weibel, F. Sagués, G. M. Whitesides, [125] C. Duriez, in Proc. 2013 IEEE Int. Conf. on Robotics and Automation,
J. Phys.: Condens. Matter 2009, 21, 204110. IEEE, Karlsruhe, Germany 2013, pp. 3982–3987.
[88] H. Lu, M. Zhang, Y. Yang, Q. Huang, T. Fukuda, Z. Wang, Y. Shen, [126] L. Hines, K. Petersen, M. Sitti, Adv. Mater. 2016, 28, 3690.
Nat. Commun. 2018, 9, 3944. [127] N. Farrow, N. Correll, presented at 2015 IEEE/RSJ Int. Conf. on
[89] W. Hu, G. Z. Lum, M. Mastrangeli, M. Sitti, Nature 2018, 554, 81. Intelligent Robots and Systems (IROS), Vol. 2015, IEEE, Hamburg,
[90] Y. Kim, G. A. Parada, S. Liu, X. Zhao, Sci. Robot. 2019, 4, eaax7329. Germany 2015, pp. 2317–2323.
[91] S. Tasoglu, E. Diller, S. Guven, M. Sitti, U. Demirci, Nat. Commun. [128] M. Giorelli, F. Renda, M. Calisti, A. Arienti, G. Ferri, C. Laschi, IEEE
2014, 5, 3124. Trans. Robot. 2015, 31, 823.
[92] Y. Kim, H. Yuk, R. Zhao, S. A. Chester, X. Zhao, Nature 2018, 558, [129] R. Reinhart, Z. Shareef, J. Steil, Sensors 2017, 17, 311.
274. [130] T. G. Thuruthel, E. Falotico, M. Manti, A. Pratesi, M. Cianchetti,
[93] J. J. Zarate, H. Shea, IEEE Trans. Haptics 2017, 10, 106. C. Laschi, Soft Robot. 2017, 4, 285.
[94] T. N. Do, H. Phan, T.-Q. Nguyen, Y. Visell, Adv. Funct. Mater. 2018, [131] E. H. Skorina, W. Tao, F. Chen, M. Luo, C. D. Onal, presented at
28, 1800244. 2016 IEEE International Conference on Robotics and Automation
[95] R. Guo, L. Sheng, H. Gong, J. Liu, Sci. China: Technol. Sci. 2018, 61, 516. (ICRA), Vol. 2016, IEEE, Stockholm, Sweden 2016, pp. 4997–5002.
[96] G. Mao, M. Drack, M. Karami-Mosammam, D. Wirthl, [132] E. H. Skorina, M. Luo, C. D. Onal, Front. Robot. AI 2018, 5, 83.
T. Stockinger, R. Schwödiauer, M. Kaltenbrunner, Sci. Adv. 2020, 6, [133] V. Falkenhahn, A. Hildebrandt, O. Sawodny, presented at 2014
eabc0251. American Control Conf., IEEE, Portland, OR 2014, pp. 4008–4013.
[97] H. Ishizuka, N. Miki, Jpn. J. Appl. Phys. 2017, 56, 06GN19. [134] H. A. Sonar, A. P. Gerratt, S. P. Lacour, J. Paik, Soft Robot. 2020, 7,
[98] A. Jaimes, N. Sebe, Comput. Vision Image Understanding 2007, 108, 22.
116. [135] S. A. Manzano, P. Xu, K. Ly, R. Shepherd, N. Correll,
[99] M. Amjadi, K. U. Kyung, I. Park, M. Sitti, Adv. Funct. Mater. 2016, arXiv:2101.01139 2021.
26, 1678. [136] “CyberGlove Systems,” http://www.cyberglovesystems.com/
[100] T. R. Ray, J. Choi, A. J. Bandodkar, S. Krishnan, P. Gutruf, L. Tian, cybergrasp/ 2017.
R. Ghaffari, J. A. Rogers, Chem. Rev. 2019, 119, 5461. [137] V. Sanchez, C. J. Walsh, R. J. Wood, Adv. Funct. Mater. 2021, 31,
[101] M. Pawlaczyk, M. Lelonkiewicz, M. Wieczorowski, 2008278.
Adv. Dermatology Allergol. 2013, 5, 302. [138] C. Walsh, Nat. Rev. Mater. 2018, 3, 78.
[102] J. A. Rogers, T. Someya, Y. Huang, Science 2010, 327, 1603. [139] H. Xiong, X. Diao, Disabil. Rehabil. Assistive Technol. 2020, 15, 885.
[103] J. C. Yang, J. Mun, S. Y. Kwon, S. Park, Z. Bao, S. Park, Adv. Mater. [140] J. D. Sanjuan, A. D. Castillo, M. A. Padilla, M. C. Quintero,
2019, 31, 1904765. E. E. Gutierrez, I. P. Sampayo, J. R. Hernandez, M. H. Rahman,
[104] M. Zhu, T. He, C. Lee, Appl. Phys. Rev. 2020, 7, 031305. Robot. Autom. Syst. 2020, 126, 103445.
[105] D. J. Lipomi, J. A. Lee, M. Vosgueritchian, B. C. K. Tee, [141] “A glove powered by soft robotics to interact with virtual reality
J. A. Bolander, Z. Bao, Chem. Mater. 2012, 24, 373. environments”, https://ucsdnews.ucsd.edu/pressrelease/a_glove_
[106] X. Liu, J. Liu, S. Lin, X. Zhao, Mater. Today 2020, 36, 102. powered_by_soft_robotics_to_interact_with_virtual_reality_
[107] Y. Bai, B. Chen, F. Xiang, J. Zhou, H. Wang, Z. Suo, Appl. Phys. environme/ 2017, 1.
Lett. 2014, 105, 151903. [142] C.-Y. Chu, R. M. Patterson, J. Neuroeng. Rehabil. 2018, 15, 9.
[108] Z. Lei, Q. Wang, S. Sun, W. Zhu, P. Wu, Adv. Mater. 2017, 29, [143] H. Zhao, R. Huang, R. F. Shepherd, in, Proc. 2016 IEEE Int. Conf.
1700321. on Robotics and Automation (ICRA), Vol. 2016, IEEE, Stockholm,
[109] S. Xu, D. M. Vogt, W.-H. Hsu, J. Osborne, T. Walsh, J. R. Foster, Sweden 2016, pp. 4008–4013.
S. K. Sullivan, V. C. Smith, A. W. Rousing, E. C. Goldfield, [144] S. H. Collins, M. B. Wiggin, G. S. Sawicki, Nature 2015, 522, 212.
R. J. Wood, Adv. Funct. Mater. 2019, 29, 1807058. [145] M. Plooij, G. Mathijssen, P. Cherelle, D. Lefeber, B. Vanderborght,
[110] M. D. Dickey, Adv. Mater. 2017, 29, 1606425. IEEE Robot. Autom. Mag. 2015, 22, 106.
[111] H. Wang, M. Totaro, L. Beccai, Adv. Sci. 2018, 5, 1800541. [146] R. Hinchet, H. Shea, Adv. Mater. Technol. 2020, 5, 1900895.
[112] E. B. Secor, A. B. Cook, C. E. Tabor, M. C. Hersam, Adv. Electron. [147] G. Moy, C. Wagner, R. S. Fearing, in Proc. IEEE Int. Conf. on
Mater. 2018, 4, 1700483. Robotics and Automation (ICRA), Vol. 4, IEEE, San Francisco, CA
[113] S. Li, H. Zhao, R. F. Shepherd, MRS Bull. 2017, 42, 138. 2000, pp. 3409–3415.
[114] H. Zhao, J. Jalving, R. Huang, R. Knepper, A. Ruina, R. Shepherd, [148] H. S. Lee, H. Phung, D.-H. Lee, U. K. Kim, C. T. Nguyen, H. Moon,
IEEE Robot. Autom. Mag. 2016, 23, 55. J. C. Koo, J. Nam, H. R. Choi, Sens. Actuators, A 2014, 205, 191.
[115] S. Begej, IEEE J. Robot. Autom. 1988, 4, 472. [149] I. M. Koo, K. Jung, J. C. Koo, J.-D. Nam, Y. K. Lee, H. R. Choi, IEEE
[116] O. J. A. Schueller, X. M. Zhao, G. M. Whitesides, S. P. Smith, Trans. Robot. 2008, 24, 549.
M. Prentiss, Adv. Mater. 1999, 11, 37. [150] A. Marette, A. Poulin, N. Besse, S. Rosset, D. Briand, H. Shea,
[117] J. Missinne, S. Kalathimekkad, B. Van Hoe, E. Bosman, Adv. Mater. 2017, 29, 1700880.
J. Vanfleteren, G. Van Steenberge, Opt. Express 2014, 22, 4168. [151] S. Mun, S. Yun, S. Nam, S. K. Park, S. Park, B. J. Park, J. M. Lim,
[118] M. Kolle, A. Lethbridge, M. Kreysing, J. J. Baumberg, J. Aizenberg, K.-U. Kyung, IEEE Trans. Haptics 2018, 11, 15.
P. Vukusic, Adv. Mater. 2013, 25, 2239. [152] H. Zhao, A. M. Hussain, A. Israr, D. M. Vogt, M. Duduta,
[119] A. Leber, B. Cholst, J. Sandt, N. Vogel, M. Kolle, Adv. Funct. Mater. D. R. Clarke, R. J. Wood, Soft Robot. 2020, 7, 451.
2018, 5, 1802629. [153] A. K. Han, S. Ji, D. Wang, M. R. Cutkosky, IEEE Robot. Autom. Lett.
[120] J. D. Sandt, M. Moudio, J. K. Clark, J. Hardin, C. Argenti, M. Carty, 2020, 5, 4021.
J. A. Lewis, M. Kolle, Adv. Healthcare Mater. 2018, 7, 1800293. [154] E. Leroy, R. Hinchet, H. Shea, Adv. Mater. 2020, 32, 2002564.
[121] H. Zhao, K. O’Brien, S. Li, R. F. Shepherd, Sci. Robot. 2016, 1, [155] S. Schlatter, G. Grasso, S. Rosset, H. Shea, Adv. Intell. Syst. 2020,
eaai7529. 2, 2000136.
[122] I. M. Van Meerbeek, C. M. De Sa, R. F. Shepherd, Sci. Robot. 2018, [156] T. Thorsen, S. J. Maerkl, S. R.. Quake, Science 2002, 298, 580.
3, eaau2489. [157] A. Richter, G. Paschew, Adv. Mater. 2009, 21, 979.
[123] T. G. Thuruthel, Y. Ansari, E. Falotico, C. Laschi, Sci. Robot. 2018, [158] N. Besse, S. Rosset, J. J. Zarate, H. Shea, Adv. Mater. Technol. 2017,
5, 149. 2, 1700102.
[124] A. D. Marchese, D. Rus, Int. J. Robot. Res. 2016, 35, 840. [159] T. Yang, D. Xie, Z. Li, H. Zhu, Mater. Sci. Eng., R 2017, 115, 1.


[160] J. S. Heo, J. Eom, Y.-H. Kim, S. K. Park, Small 2018, 14, 1703034.
[161] V. Sanchez, C. J. Payne, D. J. Preston, J. T. Alvarez, J. C. Weaver, A. T. Atalay, M. Boyvat, D. M. Vogt, R. J. Wood, G. M. Whitesides, C. J. Walsh, Adv. Mater. Technol. 2020, 5, 2000383.
[162] J. Hu, H. Meng, G. Li, S. I. Ibekwe, Smart Mater. Struct. 2012, 21, 053001.
[163] J. Shi, S. Liu, L. Zhang, B. Yang, L. Shu, Y. Yang, M. Ren, Y. Wang, J. Chen, W. Chen, Y. Chai, X. Tao, Adv. Mater. 2020, 32, 1901958.
[164] M. Wang, Z. Yan, T. Wang, P. Cai, S. Gao, Y. Zeng, C. Wan, H. Wang, L. Pan, J. Yu, S. Pan, K. He, J. Lu, X. Chen, Nat. Electron. 2020, 3, 563.
[165] O. A. Araromi, M. A. Graule, K. L. Dorsey, S. Castellanos, J. R. Foster, W.-H. Hsu, A. E. Passy, J. J. Vlassak, J. C. Weaver, C. J. Walsh, R. J. Wood, Nature 2020, 587, 219.
[166] S. Lee, S. Franklin, F. A. Hassani, T. Yokota, M. O. G. Nayeem, Y. Wang, R. Leib, G. Cheng, D. W. Franklin, T. Someya, Science 2020, 370, 966.
[167] S. M. Won, H. Wang, B. H. Kim, K. Lee, H. Jang, K. Kwon, M. Han, K. E. Crawford, H. Li, Y. Lee, X. Yuan, S. B. Kim, Y. S. Oh, W. J. Jang, J. Y. Lee, S. Han, J. Kim, X. Wang, Z. Xie, Y. Zhang, Y. Huang, J. A. Rogers, ACS Nano 2019, 13, 10972.
[168] C. M. Boutry, M. Negre, M. Jorda, O. Vardoulis, A. Chortos, O. Khatib, Z. Bao, Sci. Robot. 2018, 3, eaau6914.
[169] H. Guo, Y. J. Tan, G. Chen, Z. Wang, G. J. Susanto, H. H. See, Z. Yang, Z. W. Lim, L. Yang, B. C. K. Tee, Nat. Commun. 2020, 11, 5747.
[170] H. Oh, G.-C. Yi, M. Yip, S. A. Dayeh, Sci. Adv. 2020, 6, eabd7795.
[171] T. Kim, S. Lee, T. Hong, G. Shin, T. Kim, Y.-L. Park, Sci. Robot. 2020, 5, eabc6878.
[172] I. You, D. G. Mackanic, N. Matsuhisa, J. Kang, J. Kwon, L. Beker, J. Mun, W. Suh, T. Y. Kim, J. B. H. Tok, Z. Bao, U. Jeong, Science 2020, 370, 961.
[173] A. Leber, C. Dong, R. Chandran, T. D. Gupta, N. Bartolomei, F. Sorin, Nat. Electron. 2020, 3, 316.
[174] H. Bai, S. Li, J. Barreiros, Y. Tu, C. R. Pollock, R. F. Shepherd, Science 2020, 370, 848.
[175] M. Manti, V. Cacucciolo, M. Cianchetti, IEEE Robot. Autom. Mag. 2016, 23, 93.
[176] I. M. Van Meerbeek, B. C. M. Murray, J. W. Kim, S. S. Robinson, P. X. Zou, M. N. Silberstein, R. F. Shepherd, Adv. Mater. 2016, 28, 2801.
[177] A. A. Stanley, A. M. Okamura, IEEE Trans. Haptics 2015, 8, 20.
[178] C. Larson, B. Peele, S. Li, S. Robinson, M. Totaro, L. Beccai, B. Mazzolai, R. Shepherd, Science 2016, 351, 1071.
[179] S. Li, B. N. Peele, C. M. Larson, H. Zhao, R. F. Shepherd, Adv. Mater. 2016, 28, 9770.
[180] C.-C. Kim, H.-H. Lee, K. H. Oh, J.-Y. Sun, Science 2016, 353, 682.
[181] C. Larson, J. Spjut, R. Knepper, R. Shepherd, Soft Robot. 2019, 6, 611.
[182] B. Peele, S. Li, C. Larson, J. Cortell, E. Habtour, R. Shepherd, Soft Robot. 2019, 6, 142.
[183] S. S. Robinson, K. W. O’Brien, H. Zhao, B. N. Peele, C. M. Larson, B. C. M. Murray, I. M. Van Meerbeek, S. N. Dunham, R. F. Shepherd, Extreme Mech. Lett. 2015, 5, 47.
[184] “Make virtual reality feel real with true-contact haptics”, https://haptx.com/virtual-reality/ 2021.
[185] “Most Advanced Full Body Haptic Suit – bHaptics TactSuit”, https://www.bhaptics.com/tactsuit/tactsuit-x40/ 2021.
[186] “Teslasuit”, https://teslasuit.io/ 2021.
[187] S. Jadhav, V. Kannanda, B. Kang, M. T. Tolley, J. P. Schulze, Electron. Imaging 2017, 2017, 19.
[188] K. Song, S. H. Kim, S. Jin, S. Kim, S. Lee, J.-S. Kim, J.-M. Park, Y. Cha, Sci. Rep. 2019, 9, 8988.
[189] S.-W. Kim, S. H. Kim, C. S. Kim, K. Yi, J.-S. Kim, B. J. Cho, Y. Cha, Sci. Rep. 2020, 10, 11403.
[190] J. Barreiros, H. Claure, B. Peele, O. Shapira, J. Spjut, D. Luebke, M. Jung, R. Shepherd, IEEE Robot. Autom. Lett. 2019, 4, 277.
[191] B. C. M. Murray, B. N. Peele, P. Xu, J. Spjut, O. Shapira, D. Luebke, R. F. Shepherd, in Proc. 2018 IEEE Int. Conf. on Soft Robotics (RoboSoft), IEEE, Livorno, Italy 2018, pp. 264–269.

Hedan Bai is a Ph.D. candidate in the Sibley School of Mechanical and Aerospace Engineering at Cornell University. She received her B.S. degree in Mechanical and Aerospace Engineering and in Operations Research and Information Engineering from Cornell University in 2016. Her research focuses on the design and fabrication of stretchable fiber-optic sensors.


Shuo Li received his B.S. degree (2014) from the University of Illinois at Urbana-Champaign and his M.S. (2016) and Ph.D. (2020) degrees from Cornell University, all in Materials Science and Engineering. He is currently a postdoctoral fellow in the Querrey Simpson Institute for Bioelectronics at Northwestern University.

Robert F. Shepherd is an associate professor in the Sibley School of Mechanical and Aerospace Engineering at Cornell University. He received his Ph.D. from the University of Illinois at Urbana-Champaign in 2010 and conducted postdoctoral research at Harvard University from 2010 to 2012. He is the director of the Organic Robotics Lab (ORL), where the primary research areas are bioinspired robotics, biomedical robots, haptic interfaces, soft sensors and displays, embodied energy, and advanced manufacturing.

