Wearable Drone Controller: Machine Learning-Based Hand
Gesture Recognition and Vibrotactile Feedback
Ji-Won Lee 1 and Kee-Ho Yu 2,3, *

1 KEPCO Research Institute, Daejeon 34056, Republic of Korea


2 Department of Aerospace Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
3 Future Air Mobility Research Center, Jeonbuk National University, Jeonju 54896, Republic of Korea
* Correspondence: yu@jbnu.ac.kr

Abstract: We propose a wearable drone controller with hand gesture recognition and vibrotactile
feedback. The intended hand motions of the user are sensed by an inertial measurement unit (IMU)
placed on the back of the hand, and the signals are analyzed and classified using machine learning
models. The recognized hand gestures control the drone, and the obstacle information in the heading
direction of the drone is fed back to the user by activating the vibration motor attached to the
wrist. Simulation experiments for drone operation were performed, and the participants’ subjective
evaluations regarding the controller’s convenience and effectiveness were investigated. Finally,
experiments with a real drone were conducted and discussed to validate the proposed controller.

Keywords: human–drone interface; wearable device; hand gesture recognition; machine learning;
vibrotactile feedback

1. Introduction
Nowadays, multicopter drones have been widely used because of their simple mechanism, control convenience, and hovering feature [1]. Drones are important in surveillance and reconnaissance, aerial photography and measurement, search and rescue missions, communication relay, and environmental monitoring [2,3]. To complete such applications and missions, highly sophisticated and dexterous drone control is required. Autonomous control has been partly used in their applications, as in waypoint following and programmed flight and mission, because of limited autonomy [4–6]. However, in the autonomous flight of drones, sometimes the autopilot is switched to manual control by a human operator according to the flight phase, such as landing, and in unexpected circumstances. The human role is necessary in the control loop when the system cannot fully reach an autonomous state.
Therefore, natural user interfaces for human operators have been studied extensively. An early study reported a novel user interface for manual control based on gestures, haptics, and PDA [7]. Moreover, multimodal natural user interfaces, such as speech, gestures, and vision for human–drone interaction, were introduced [8,9]. Recently, hand gesture-based interfaces and interactions using machine learning models have been proposed [10–15]. Hand gestures are a natural way to express human intent, and their effectiveness for applications of human–machine/computer interface/interaction has been reported in previous works. Some of the applications were focused on the control of drones based on deep learning models [16–19]. They used vision, optical sensors with infrared lights, and an inertial measurement unit (IMU) to capture the motion of the hand. The IMU attached to the user's hand senses the motion of the hand robustly compared to conventional vision systems, which are easily affected by light conditions and require tedious calibrations.
In contrast, to interact with a machine/computer, tactile stimulation has been adopted as a feedback means to humans for a long time [20–22]. Tactile stimulation is an additional channel to visual information for providing necessary information to humans. In

particular, tactile feedback is important for visually impaired persons in terms of rep-
resenting surrounding circumstances and the environment instead of having to rely on
visual information [23,24]. Even for sighted people, tactile feedback helps improve their
understanding of the environment when visual information is insufficient or blocked. Some
previous work related to tactile and force feedback for drone control was reported [25–27].
Tsykunov et al. [25] studied a tactile pattern-based control glove for the cluster flight of nano-drones. The cluster flight was intuitive and safe, although the user had to move continuously to guide the drones. Rognon et al. [26] presented the flyjacket, which controls a drone with torso movements measured by an IMU. A cable-driven haptic feedback system mounted on the user's elbows guided the user toward the direction of waypoints. The flyjacket was demonstrated to control drones as well as joysticks and tablets do, although the user's torso motion was restricted. Duan et al. [27] developed a human–computer interaction system for improving realistic VR/AR, in which a tactile feedback device was used to perceive environments and objects. The gesture-based control was implemented with a Leap Motion sensor and a neural network, which tend to be affected by light conditions.
We studied and implemented a wearable drone controller with hand gesture recognition and vibrotactile feedback; the basic idea of the interface was presented in a preliminary version of this paper [28]. The hand gestures are sensed by an IMU placed on the back of the hand, which provides more robust motion acquisition than conventional vision systems. The dominant parameters of the motion are extracted through sensitivity analysis, and machine learning-based classification is then performed. The classified and recognized hand gestures are sent as commands to the drone, and the distance to obstacles in the heading direction of the drone, measured by an ultrasonic sensor, is represented by activating the vibration motor attached to the user's wrist. The IMU-based hand motion capture is largely independent of the distance between the drone and the user, and the drone does not need to adopt a particular pose to acquire the hand motion visually. Vibrotactile feedback is an effective way to convey obstacle information around the drone, especially when visual information is limited or blocked during operation.
The remainder of this paper is organized as follows. The system configuration of the
wearable drone controller is presented in Section 2, and hand gesture recognition based
on machine learning models is analyzed in Section 3. The simulation experiment with
subjective evaluation of participants is performed in Section 4, and a real drone experiment
for validation is presented in Section 5. Finally, Section 6 presents the conclusion.

2. System Configuration
The controller developed here comprises vibrotactile feedback and hand gesture-based drone control. An ultrasonic sensor was mounted on the drone head to detect obstacles. A vibrator in the controller is activated according to the distance between the drone and the obstacle, as shown in Figure 1a. When the operator perceives the vibration, the operator commands a gesture to avoid the obstacle (Figure 1b). The gestures are sensed by an IMU attached to the back of the hand and are categorized into two groups to control the drone appropriately. One control method, called the direct mode, was defined to control the drone directly for movements such as roll, pitch, and up/down. The cruise velocity of the drone was determined from the inclination of the IMU attached to the hand. The other control method, called the gesture mode, was used to control the drone with hand gestures more easily, although it cannot command the drone as quantitatively as the direct mode. The patterns of hand gestures used in the gesture mode were defined by imitating hand motions for helicopter guidance. Gesture pattern analysis and classification were conducted through machine learning to recognize these gestures, whose motions are larger and more varied than those of the direct mode.
Figure 1. Concept of the proposed controller with vibrotactile feedback and gesture-based drone control. (a) Obstacle is detected in front of the drone, and the vibrator is stimulated. (b) Gesture of right direction is performed to avoid obstacle using gesture control.

All sensors were connected to the Raspberry Pi 3b (Raspberry Pi Foundation, Cambridge, UK), as shown in Figure 2. The signal of the IMU was processed at 50 Hz and transmitted to the PC for gesture classification using MATLAB/Simulink. The classified gestures were also delivered to the AR drone control module of LabVIEW to command the drone. The ultrasonic sensor measured the distance, with the signal at 40 kHz. The distance information is transmitted wirelessly to the Raspberry Pi of the controller and converted to vibration intensity. The overall shape of the controller was fabricated using 3D printing.

Figure 2. System configuration of wearable drone interface comprises a drone controller, hand gesture recognition and drone control module in PC, and drone.
2.1. Vibrotactile Feedback
In general, manual drone control is based on visual feedback. However, obstacles in the operating environment might interfere with visual feedback, which occasionally causes drone accidents. Hence, other sensory feedback, such as tactile and/or audio, is required to transfer obstacle information effectively.
In this study, vibrotactile feedback was adopted to represent obstacle information. An ultrasonic sensor (HC-SR04, SparkFun Electronics, Colorado, USA) was mounted on the drone head with a Raspberry Pi to operate the sensor so as to detect obstacles in front. The reliable detection region of the sensor was determined to be 0.03 to 2 m within ±15°. The measured distance is transmitted to the Raspberry Pi of the controller to generate vibrotactile stimulation.
A coin-type vibration motor (MB-1004V, Motorbank Inc., Seoul, Korea) was used as the actuator for tactile stimulation. The motor's size (diameter × height) was 9.0 × 3.4 mm, and its small size is suitable for wearable devices. The motor is attached to the wrist, which allows free hand motion for the intended gesture without any movement restriction. To effectively deliver vibration to the user, a tactile stimulator was designed as a bracelet, which comprised a spring and hemispherical plastic under the vibration motor, as shown in Figure 3.

Figure 3. Structure of vibrotactile stimulator.
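As a rough illustration of the controller-side logic only, the following Python sketch maps a received obstacle distance to the two-level PWM duty cycle described in the next paragraph. It is not the authors' implementation (the paper uses a Raspberry Pi together with MATLAB/Simulink and LabVIEW on the PC side); the GPIO pin number, the PWM carrier frequency, and the distance_source() generator are assumptions for the example.

```python
# Minimal sketch: convert received obstacle distance to vibration intensity (PWM duty).
# The two duty levels (50% within 1.0 m, 100% within 0.5 m) follow the scheme in the text;
# pin number and carrier frequency are assumed values.
import RPi.GPIO as GPIO

VIB_PIN = 18           # hypothetical GPIO pin driving the coin vibration motor
PWM_FREQ_HZ = 96       # assumed carrier frequency, near the 96.2 Hz stimulation frequency

def duty_from_distance(distance_m: float) -> float:
    """Map obstacle distance (m) to PWM duty cycle (%)."""
    if distance_m <= 0.5:
        return 100.0   # strong vibration: obstacle very close
    if distance_m <= 1.0:
        return 50.0    # initial vibration: obstacle within 1.0 m
    return 0.0         # no vibration otherwise

def run(distance_source):
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(VIB_PIN, GPIO.OUT)
    pwm = GPIO.PWM(VIB_PIN, PWM_FREQ_HZ)
    pwm.start(0.0)
    try:
        for distance_m in distance_source:   # e.g., distances received over the wireless link
            pwm.ChangeDutyCycle(duty_from_distance(distance_m))
    finally:
        pwm.stop()
        GPIO.cleanup()
```

The step change between the two levels, rather than a continuous mapping, reflects the minimum stimulation difference required for tactile perception, as discussed next.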
Based on previous related studies [29–31], an amplitude of 100 µm and a frequency of 96.2 Hz were fixed as the stimulation conditions to satisfy the vibrotactile threshold of detection. The vibration intensity changes according to the distance between the drone and the obstacle. A preliminary experiment was conducted to determine appropriate stimulation conditions for the vibration motor. The vibration intensity was controlled by modulating the pulse width (PWM) applied to the vibration motor. The duty rate was changed in two levels with a step change because a minimum stimulation difference is required for human tactile perception [20]. A duty rate of 50% is given as the initial vibration intensity when the measured distance to an obstacle is within 1.0 m, and the duty rate is then raised to 100% for strong vibration intensity when the measured distance is within 0.5 m and the drone is getting closer to the obstacle.

2.2. Gesture-Based Drone Control
Gestures facilitate the delivery of intentions, emotions, thoughts, and nonverbal communication. In particular, hand gestures have been widely used as effective and easy communication methods [32,33]. Hand gesture-based control provides intuitive and portable drone control. There have been studies on hand gesture recognition using vision-based sensors, such as RGB and depth cameras, and Kinect sensors [34,35]. However, these systems have limitations in terms of light, angle, and position of the environment. To overcome this problem, we propose a wearable sensor-based gesture recognition system. Wearable sensor-based methods include electromyography (EMG), IMUs, etc. [36,37]. We used an IMU (SEN-14001, SparkFun Electronics, Colorado, USA) which measures the 3-dimensional angle, angular velocity, and linear acceleration.
As mentioned above, gestures were categorized into two groups corresponding to control purposes. The direct mode was defined to match drone movements such as roll, pitch, and up/down, as shown in Figure 4. The roll, pitch, and z-axis acceleration of the user's hand were calculated using the inertial sensor and mapped to the drone's posture and speed. The available hand motion range and its rate were considered to avoid the recognition of unintended gestures. A roll or pitch motion of less than ±30° and a linear acceleration of less than ±10 m/s² are disregarded for the stable recognition of hand gestures.

Figure 4. Definition of direct mode. It can directly control the drone's posture and speed according to inclination and up/down of hand.
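To make the direct-mode mapping concrete, the sketch below applies the dead-band thresholds stated above (hand roll/pitch below ±30° and vertical acceleration below ±10 m/s² are ignored) and scales the remaining hand inclination to drone attitude and vertical-rate commands. This is an illustrative reconstruction rather than the authors' MATLAB/Simulink implementation, and the scaling gains are assumptions.

```python
# Illustrative direct-mode mapping from hand IMU readings to drone commands.
# Dead-band thresholds follow the text; the output gains are assumed values.
from dataclasses import dataclass

ROLL_PITCH_DEADBAND_DEG = 30.0   # hand tilts below +/-30 deg are disregarded
ACC_DEADBAND_MS2 = 10.0          # vertical accelerations below +/-10 m/s^2 are disregarded

@dataclass
class DirectCommand:
    roll_deg: float      # commanded drone roll attitude
    pitch_deg: float     # commanded drone pitch attitude
    vertical: float      # commanded up/down rate (normalized to -1..1)

def deadband(value: float, limit: float) -> float:
    """Return 0 inside the dead band, otherwise pass the value through."""
    return 0.0 if abs(value) < limit else value

def direct_mode(hand_roll_deg: float, hand_pitch_deg: float, hand_acc_z_ms2: float,
                tilt_gain: float = 0.5, vz_gain: float = 0.05) -> DirectCommand:
    roll = deadband(hand_roll_deg, ROLL_PITCH_DEADBAND_DEG) * tilt_gain
    pitch = deadband(hand_pitch_deg, ROLL_PITCH_DEADBAND_DEG) * tilt_gain
    vertical = deadband(hand_acc_z_ms2, ACC_DEADBAND_MS2) * vz_gain
    vertical = max(-1.0, min(1.0, vertical))   # clamp to a safe range
    return DirectCommand(roll, pitch, vertical)
```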

The gesture mode shown in Figure 5 is defined by imitating the hand signal used for helicopter guidance from the ground operator [38,39]. The gesture mode has difficulty controlling the drone quantitatively, as in the direct mode, but can be used more easily to control flight direction. These hand gestures are generated naturally with individual deviations; therefore, the patterns should be analyzed and classified for accurate recognition. This study adopted machine learning to learn and classify hand gestures.

Figure 5. Definition of gesture mode. Gesture mode can control the direction of drone with natural hand motion.

3. Hand Gesture Recognition
The signal processing scheme in hand gesture mode is illustrated in Figure 6.

Figure 6. Scheme of signal processing for hand gesture recognition.

In Figure 6, the signals of the hand movements are obtained from the inertial sensor of the controller. Subsequently, the key factors of gestures were determined using sensitivity analysis to reduce the calculation time of signal processing. Based on the analysis, key signals were segmented using sliding windows and filtered using a low-pass filter to eliminate the gravity component of the accelerometer. The features were extracted from the
processed data, such as the mean, RMS, autocorrelation, power spectral density (PSD), and
spectral peak. These features were input into the machine learning algorithm. Classification
performance was evaluated based on accuracy, precision, recall, and the F-1 score.

3.1. Dataset
Large-scale motion data are required to classify hand gestures using a machine learning algorithm. Hence, the participants (male = 10, female = 2; right-hand dominant = 12; age 26.6 ± 6, 24–30 years old) were asked to perform the motions of the gesture mode (forward, backward, right, left, stop, up, and down). The signal was recorded at 50 Hz to obtain the quaternion, acceleration, and angular velocity along the three axes. To prevent imbalanced data, the number of samples for each gesture was 3300, and the overall number of samples was 23,100.

3.2. Sensitivity Analysis and Preprocess


A sensitivity analysis was conducted to identify the key parameters that dominantly influence hand gestures and thereby reduce the processing time. The results of the sensitivity analysis are shown in Figure 7. Based on the results, the acceleration elements of x, y, and z and the angular velocities of y and z were commonly identified as key parameters of hand gestures.

Figure 7. Result of sensitivity analysis on gesture mode movements. In particular, gestures such as forward, backward, up, and down were found to perform numerous pitch motions. Left and right gestures were known to perform roll actions primarily. According to the results, acceleration of x, y, z and angular velocity of y and z were determined to be key parameters in common.

Raw signals were also processed to remove the gravity component forced on the accelerometer using a low-pass filter. These signals were segmented with fixed sliding windows of 2.56 s and a 50% overlap (128 readings/window) [40–43].
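The windowing step can be sketched as follows with NumPy, using the stated parameters (50 Hz sampling, 128-sample windows of 2.56 s, 50% overlap). The first-order low-pass gravity estimate is an assumption about the filter type; the paper only states that a low-pass filter was used.

```python
# Sketch of the preprocessing stage: gravity removal and fixed sliding windows.
# Window length and overlap follow the text; the filter coefficient is an assumed value.
import numpy as np

FS_HZ = 50             # IMU sampling rate
WINDOW = 128           # 2.56 s x 50 Hz
STEP = WINDOW // 2     # 50% overlap

def remove_gravity(acc: np.ndarray, alpha: float = 0.9) -> np.ndarray:
    """Subtract a low-pass estimate of gravity from each acceleration channel."""
    gravity = np.zeros_like(acc)
    gravity[0] = acc[0]
    for i in range(1, len(acc)):
        gravity[i] = alpha * gravity[i - 1] + (1.0 - alpha) * acc[i]
    return acc - gravity

def sliding_windows(signal: np.ndarray) -> np.ndarray:
    """Segment an (n_samples, n_channels) array into (n_windows, WINDOW, n_channels)."""
    starts = range(0, len(signal) - WINDOW + 1, STEP)
    return np.stack([signal[s:s + WINDOW] for s in starts])
```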
3.3. Feature Extraction
Feature extraction is generally conducted to improve the efficiency of the algorithm computation. The features are determined as time-domain and frequency-domain features [40–45].
We computed features such as the mean, RMS, autocorrelation, power spectral density
(PSD), and spectral peak. The mean, RMS, and autocorrelation rk are included in the
time-domain features. The characteristics are as follows:
$$\mathrm{mean} = \frac{1}{N}\sum_{i=1}^{N} a_i \tag{1}$$

$$\mathrm{RMS} = \sqrt{\frac{1}{N}\left(a_1^2 + a_2^2 + \cdots + a_N^2\right)} \tag{2}$$

$$r_k = \frac{\sum_{i=k+1}^{N}\left(a_i - \bar{a}\right)\left(a_{i-k} - \bar{a}\right)}{\sum_{i=1}^{N}\left(a_i - \bar{a}\right)^2} \tag{3}$$
where $a_i$ denotes the components of the acceleration and angular velocity, and $N$ denotes the window length. Equation (3) computes the autocorrelation, which is the correlation between $a_i$ and the delayed value $a_{i+k}$, where $k = 0, \cdots, N$, and $\bar{a}$ denotes the mean of $a_i$. PSD and
spectral peaks were included in the frequency-domain features. PSD was calculated as
the squared sum of the spectral coefficients normalized by sliding window length. PSD is
described as follows [44,45]:
$$\mathrm{PSD} = \frac{1}{N}\sum_{i=0}^{N-1}\left(x_i^2 + y_i^2\right) \tag{4}$$

with $x_i = z_i \cos\left(\frac{2\pi f i}{N}\right)$ and $y_i = z_i \sin\left(\frac{2\pi f i}{N}\right)$, where $z$ denotes the discrete data for the
frequency spectrum, and f represents the Fourier coefficient in the frequency domain. The
spectral peak was calculated as the height and position of PSD [44,45].
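For a single 128-sample channel, the features above can be computed directly with NumPy, as in the sketch below. The spectral peak is taken as the height and index of the largest PSD component (excluding the DC term), which is one way to realize the "height and position of PSD" described in the text.

```python
# Feature computation for one windowed channel, following Equations (1)-(4).
import numpy as np

def time_domain_features(a: np.ndarray, k: int = 1):
    n = len(a)
    mean = a.sum() / n                                    # Equation (1)
    rms = np.sqrt((a ** 2).sum() / n)                     # Equation (2)
    centered = a - a.mean()
    r_k = (centered[k:] * centered[:-k]).sum() / (centered ** 2).sum()  # Equation (3)
    return mean, rms, r_k

def frequency_domain_features(z: np.ndarray):
    n = len(z)
    spectrum = np.fft.fft(z)
    # Squared spectral coefficients normalized by window length (cf. Equation (4)).
    psd = (spectrum.real ** 2 + spectrum.imag ** 2) / n
    peak_idx = int(np.argmax(psd[1:n // 2])) + 1          # skip the DC component
    return psd, psd[peak_idx], peak_idx                   # PSD, peak height, peak position
```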

3.4. Classification Model


To select an appropriate classification algorithm, we compared the classification results of ensemble, SVM, KNN, naive Bayes, and decision tree classifiers. The classifiers were implemented in MATLAB, and the data were split into training and test sets at a ratio of 9:1.
The performance of the learned model was evaluated based on the following
expressions:
$$\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN} \tag{5}$$

$$\mathrm{Precision} = \frac{TP}{TP + FP} \tag{6}$$

$$\mathrm{Recall} = \frac{TP}{TP + FN} \tag{7}$$

$$\mathrm{F1\text{-}score} = 2 \times \frac{\mathrm{Recall} \times \mathrm{Precision}}{\mathrm{Recall} + \mathrm{Precision}} \tag{8}$$
where TP is a true positive, TN is a true negative, FP is a false positive, and FN is a false
negative. Figure 8a illustrates the comparison results of performance. Figure 8b shows the
detailed results of each classifier’s accuracy. According to Table 1, the ensemble method
exhibited adequate performance in terms of classification accuracy.
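The comparison itself was performed with MATLAB's classifiers; for readers who prefer Python, an equivalent workflow can be sketched with scikit-learn as below. The specific estimators and hyperparameters are assumptions chosen to mirror the five classifier families named in the text, while the 9:1 split and the four metrics follow the description above.

```python
# Sketch of the classifier comparison in scikit-learn (the paper used MATLAB).
# X: (n_windows, n_features) feature matrix; y: gesture labels (7 classes).
from sklearn.model_selection import train_test_split
from sklearn.ensemble import BaggingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def compare_classifiers(X, y):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.1, stratify=y, random_state=0)
    models = {
        "ensemble": BaggingClassifier(n_estimators=50),   # bagged decision trees
        "SVM": SVC(),
        "KNN": KNeighborsClassifier(),
        "naive Bayes": GaussianNB(),
        "trees": DecisionTreeClassifier(),
    }
    results = {}
    for name, model in models.items():
        y_pred = model.fit(X_tr, y_tr).predict(X_te)
        results[name] = {
            "accuracy": accuracy_score(y_te, y_pred),
            "precision": precision_score(y_te, y_pred, average="macro"),
            "recall": recall_score(y_te, y_pred, average="macro"),
            "F1-score": f1_score(y_te, y_pred, average="macro"),
        }
    return results
```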

Table 1. Comparison result of classifiers (%).

              Ensemble    SVM     KNN     Naive Bayes    Trees
accuracy      97.9        95.1    92.8    84.0           83.6
precision     97.9        95.1    93.0    84.1           83.8
recall        97.9        95.1    92.9    84.1           83.8
F1-score      97.9        95.1    92.9    84.1           83.8

Figure 8. Result of gesture mode classification: (a) Comparison accuracy of each classification model. (b) Confusion matrix of ensemble, SVM, KNN, naive Bayes, and trees. From top left, the hand motions of forward, backward, right, left, stop (hover), up, and down are presented in order.
4. Simulation Experiment
4.1. Gesture-Based Control
4.1.1. Simulation Setup
To assess the effectiveness of the proposed gesture control, the participants (male = 10, female = 2; right-hand dominant = 12; 26.6 ± 6, 24–30 years old) performed a virtual flight simulation with hand gestures while equipped with the wearable device. The participants did not have experience flying drones or were beginners. A schematic of the simulation mission is shown in Figure 9. The participants executed the appropriate hand gesture to maneuver the drone through the 12 waypoints described by the passing windows. These waypoints were placed according to the yellow guideline to pass, and one waypoint was required to adjust the flight altitude of the drone. The position of the drone was calculated at 50 Hz through quadrotor dynamics from MATLAB/Simulink. The simulation experiment comprised three preliminary sessions and the main experiment with a given mission. All participants were given sufficient time to complete a flight mission with direct and gesture modes.

Figure 9. Mission schematic for drone flight simulation with gesture-based drone control.

Average trial duration, gesture accuracy, and number of gesture repetitions were calculated to evaluate the learning effect and adaptability of gesture-based drone control. Average trial duration, meaning the required average time for task completion, was used to evaluate the learning effect in the subjects. The average gesture accuracy was used to evaluate the performance of the classification of hand gestures. The average number of gesture repetitions, counting the frequency of motion changes, was used to evaluate adaptability. Participants were also asked to respond to a questionnaire regarding convenience of the control operation and physical discomfort after the experiment. Questions were answered in the form of a 7-point Likert scale.
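These three trial measures can be computed from a simple per-trial log of timestamps, classified gestures, and instructed gestures, as in the following sketch; the log format is an assumption, and "accuracy" here compares the classified gesture with the gesture the participant was asked to perform.

```python
# Sketch of the per-trial evaluation measures used in the simulation experiment.
# log: list of (time_s, classified_gesture, intended_gesture) tuples for one trial.
def trial_measures(log):
    times = [t for t, _, _ in log]
    duration = times[-1] - times[0]                       # trial duration (s)
    correct = sum(1 for _, got, want in log if got == want)
    accuracy = correct / len(log)                         # gesture accuracy
    repetitions = sum(1 for i in range(1, len(log))       # number of gesture changes
                      if log[i][1] != log[i - 1][1])
    return duration, accuracy, repetitions
```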

4.1.2. Results
The results of the first and last trials were analyzed to investigate the controller’s
efficiency and the learning effect. Figures 10 and 11 indicate the drone trajectory and the
frequency of motion changes during the first and last attempts using direct mode control.
Compared with the first trial, the trajectory stabilized in the last trial. In addition, the
frequency of motion changes decreased because of the adaptation of direct mode control.
Average accuracy also increased as the experiment progressed, and there was a considerable reduction in trial duration and repetition, as shown in Figure 12.

Figure 10. Example of drone flight trajectory with direct mode. Gray line is guideline, dotted line is trajectory of first trial, and black line is trajectory of last trial. Comparing the trajectories of the first and last trials, the last one has a more stable shape. (a) Isometric view of the drone trajectory, in which the adjustment of the drone altitude can be seen. (b) Top view of the drone trajectory, showing the behavior of the drone commanded by the user in the x–y dimension.

Figure 11. Frequency of gesture changes during trial duration with direct mode. (a) Classified gesture changes at first trial. (b) Classified gesture changes at last trial, which shows less change than first trial.
Figure 12. Results of direct mode simulation (mean ± SD). (a) Total average accuracy of each trial. First trial is 91.5 ± 7.4%, and last trial is 99 ± 2.4%. (b) Average trial duration. First trial is 227.7 ± 48.7 s, and last trial is 153.0 ± 18.1 s. (c) Total repetitions of gestures. First trial is 77.8 ± 32.8, and last trial is 47.2 ± 14.0.
Similarly, Figures 13 and 14 show the drone trajectory and the frequency of motion changes in the first and last trials with gesture mode control. In the first trial, the drone trajectory was unstable, especially when the user executed the gesture in the left direction. The frequency of motion changes also decreased compared with the first trial. Across all trials, average accuracy tended to increase, and there was a considerable reduction in trial duration and repetition, as shown in Figure 15. These results indicate that gesture-based control has learning effects and adaptability.

Figure 13. Example of drone flight trajectory with gesture mode. Gray line is guideline, dotted line represents drone trajectory of first trial, and black line is last trial. (a) Isometric view, in which altitude control using gestures can be seen. (b) Top view of drone trajectory, showing drone movements in the x–y dimension. While the first trial is low-accuracy and the trajectory is unstable, the last trial is high-accuracy and confirms that the trajectory appeared similar to the guideline.

Figure 14. Frequency of gesture changes during trial duration with gesture mode. (a) Classified gesture changes at first mission. (b) Classified gesture changes at last trial, which shows less change than first trial.

Figure 15. Results of gesture mode simulation (mean ± SD). (a) Average accuracy of first trial was 85.9 ± 8.4%, and final trial was 96.8 ± 3.9%. (b) First trial duration was 264 ± 73.3 s, and last duration was 155.7 ± 28.9 s, which shows decreased duration. (c) Difference of total repetitions between first and last trial is 138.9 ± 59.1.
4.2. Vibrotactile Feedback
4.2.1. Simulation Setup
An obstacle avoidance simulation experiment was conducted to demonstrate the performance of vibrotactile feedback. The participants, who were the same as mentioned above, were asked to avoid obstacles distributed in front of the drone using gesture control. All the participants performed the simulation experiments three times with a fixed field of view, as shown in Figure 16a. In the first trial, participants controlled the drone without vibrotactile conditions. In the other trials, the drone was controlled using vibrotactile feedback. To assess the performance of the vibrotactile stimulation, the evaluator measured the collision status through the top view of the flight environment, as shown in Figure 16b.

Figure 16. Mission schematic for vibrotactile feedback simulation experiment. (a) Fixed view from the participants who were asked to control the drone flight avoiding obstacles. (b) Top view of drone flight environment to evaluate the control performance with vibrotactile feedback.

4.2.2. Result
Without vibrotactile feedback, the participants could not manipulate the drone appropriately because of a lack of visual feedback, as shown in Figure 16a. The drone successfully avoided all obstacles with vibrotactile feedback, as illustrated in Figure 17.

Figure 17. Example of drone flight trajectories with and without vibrotactile conditions. When the vibrotactile feedback did not work, the drone crashed into boxes because of lack of visual feedback (dashed line). With vibrotactile feedback, the drone achieved the mission of avoiding obstacles (black line).
The results of the vibrotactile feedback simulation are presented in Table 2. All
the participants experienced crashes with obstacles at least once. When the vibrotactile
stimulator was not used, the participants did not recognize the obstacles well with limited
visibility. Therefore, the drone crashed an average of 2.2 times. However, with vibrotactile
feedback, each participant detected obstacles via vibrotactile stimulation in the wrist and
effectively commanded the drone to avoid obstacles.

Table 2. Results of obstacle avoidance using vibrotactile feedback. Without vibrotactile feedback, the
drone failed to avoid obstacles; meanwhile, collision avoidance was completed successfully with
vibrotactile feedback.

Participant No.    First Trial (without Feedback)    Second Trial (with Feedback)    Third Trial (with Feedback)
Participant 1 2/4 * 4/4 4/4
Participant 2 1/4 4/4 4/4
Participant 3 1/4 4/4 4/4
Participant 4 2/4 4/4 4/4
Participant 5 1/4 4/4 4/4
Participant 6 2/4 4/4 4/4
Participant 7 2/4 4/4 4/4
Participant 8 1/4 4/4 4/4
Participant 9 2/4 4/4 4/4
Participant 10 2/4 4/4 4/4
Participant 11 2/4 4/4 4/4
Participant 12 2/4 4/4 4/4
* Successful/avoidance trials.

4.3. Subjective Evaluation of Participants


Subjective evaluations were conducted to assess the user-friendliness and effectiveness
of the controller after the experiment. Participants were asked to respond to the experience
of use based on a 7-point Likert scale. The participants gave assessment points for the
user-friendliness and effectiveness of the direct and gesture modes. Similarly, participants
were asked to respond to the control with and without vibrotactile feedback. Table 3
shows the subjective evaluation of gesture-based drone control. The participants rated the convenience and effectiveness of the controller at more than 6 points and rated discomfort and fatigue at less than 3 points. There are no significant differences between the direct and gesture modes, confirming that the proposed controller is easy to use and effective.

Table 3. Results of subjective evaluation by participants in gesture-based drone control.

Question Direct Mode Gesture Mode


1. The proposed gesture was natural for me. 6.4 ± 0.6 6.0 ± 0.7
2. I felt physical discomfort while controlling. 1.6 ± 0.6 2.0 ± 0.9
3. My hand and arm were tired while controlling. 2.0 ± 0.6 2.4 ± 0.8
4. The proposed gesture was user-friendly. 6.5 ± 0.9 6.3 ± 1.4
5. I felt the convenience of controlling a drone with one hand. 6.5 ± 0.6 6.6 ± 0.5
6. It was interesting to fly a drone with a gesture. 6.5 ±1.0 6.9 ± 0.3

Table 4 shows the subjective evaluations of the vibrotactile feedback. All the partic-
ipants responded that there were substantial differences between the control with vibro-
tactile feedback and the control relying on only limited visual information. They expected
that vibrotactile feedback would be helpful for drone control in the field.

Table 4. Results of subjective evaluation by participants in vibrotactile feedback.

Question Mean ± SD
1. The vibrotactile feedback was helpful for obstacle avoidance. 6.9 ± 0.3
2. The vibration intensity was appropriate. 6.6 ± 0.6
3. My hand and wrist were tired by vibrotactile feedback. 1.4 ± 0.8
4. The obstacle avoidance was difficult without the vibrotactile feedback condition. 6.5 ± 0.7
5. If I flew a drone in real life, vibrotactile feedback would be helpful. 6.5 ± 0.3

5. Experimental Validation
5.1. Gesture-Based Drone Control
5.1.1. Setup
The implemented controller was tested to validate its performance. The flight test
was conducted using a quadrotor (AR.Drone 2.0, Parrot, Paris, France). In the mission
scenario, the user flew the drone through four gates using gesture control, as shown in
Figure 18. The drone cruised at 0.4 m/s and hovered at a height of about 0.8 m. The user
was positioned within gray dotted lines (Figure 18a) to ensure visibility for drone control.
Three gates were placed on the guideline, and one was installed on the table to control
up/down movements. An appropriate gesture was assigned for each trajectory, and the
user’s gesture was recognized and compared with the assigned gesture. To evaluate the
performance, gesture accuracy was evaluated using drone states, classified gestures, and aerial videos during the experiment.

Figure 18. Experiment scenario with gesture-based drone control. (a) Schematic diagram for experiment. The yellow guideline was installed for reference of drone trajectory. The user walked within gray dotted lines to secure the view. (b) Experiment environment based on schematic was set indoors for safe operation.
5.1.2. Result
The entire recognition rate in the direct mode is shown in Figure 19a. The average accuracy of the direct mode was 96.4%, and the mission duration was 103 s. The classification rate in the gesture mode condition is shown in Figure 19b. The average accuracy of the gesture mode was 98.2%, and the mission duration was 119 s.
Figure 19. Accuracy of real drone control. (a) Average accuracy of direct mode is about 96%, which is about 2.6% lower than the simulation. (b) Average accuracy of gesture mode is about 98%, which is about 1.4% higher than the simulation.

5.2. Vibrotactile Feedback
5.2.1. Setup
Vibrotactile feedback was tested for efficiency in a real environment, as shown in Figure 20. The user controls the drone using gesture mode through gates to avoid obstacles. To demonstrate the effectiveness of vibrotactile feedback, the user's visibility was limited by the blinding of the panel, which blocked part of the obstacle and the gate to pass. The first trial was performed without a vibrotactile condition. The second trial was conducted with vibrotactile feedback, representing the distance between the drone and the obstacle.

Figure 20. Flight experiment for vibrotactile feedback with gesture mode. (a) Schematic diagram comprising visual obstruction, drone, passing gates, obstacle, and guideline. (b) Based on the scheme, flight experiment was set up.
5.2.2. Result
The drone failed to avoid collisions without vibrotactile feedback, as shown in Figure 21a. The user could not estimate the distance between the drone and the obstacle owing to limited visual information. However, collision avoidance was achieved successfully with vibrotactile feedback, as shown in Figure 21b. In addition, the distance to the obstacle in front of the drone was inferred through the intensity of the vibrotactile stimulation, and the mission was carried out by maintaining a certain distance from the obstacle.

Figure 21. Result of vibrotactile feedback. (a) Drone could not avoid the collision in front. Owing to the lack of visual information for the obstacle, the mission failed. (b) Drone flew successfully through gates, and distance to the obstacle was maintained using vibrotactile feedback.

5.3. Discussions
The presented drone controller exhibited user-friendly and intuitive features, which
can be used more easily by an inexperienced user for drone maneuvering. Through
simulation experiments on drones, an effective interface for drone control was revealed
by estimating control accuracy and mission duration. An experiment on a real drone was
carried out to validate the effectiveness of the proposed controller. Indeed, vibrotactile
feedback was helpful in detecting obstacles compared to using only visual information.
The participants also felt more comfortable with and interested in the proposed controller.
In a further study, the addition of yawing control of the drone was considered to
implement a supplementary sensor to measure the hand motion’s orientation accurately.
Furthermore, implementing additional ultrasonic sensors on a drone enables the omnidi-
rectional distance measurement of obstacles. The aforementioned approaches are expected
to complete a wearable drone control system with a natural user interface.
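As a rough illustration of the omnidirectional sensing idea mentioned above, the sketch below fuses hypothetical range readings from several ultrasonic sensors mounted at known bearings and selects the nearest valid echo. The sensor layout, maximum range, and selection rule are assumptions and are not part of the implemented system.

```python
# Sketch of the proposed omnidirectional sensing extension (not implemented in this
# work): several ultrasonic sensors at known bearings report ranges, and the closest
# valid echo is selected to drive a direction-dependent vibrotactile cue.
from math import inf
from typing import Dict, Optional, Tuple

SensorReadings = Dict[int, float]  # bearing in degrees (0 = heading) -> range in metres

def nearest_obstacle(readings: SensorReadings,
                     max_range_m: float = 4.0) -> Optional[Tuple[int, float]]:
    """Return (bearing_deg, distance_m) of the closest valid echo, or None."""
    best_bearing, best_dist = None, inf
    for bearing, dist in readings.items():
        if 0.0 < dist < min(best_dist, max_range_m):
            best_bearing, best_dist = bearing, dist
    return None if best_bearing is None else (best_bearing, best_dist)

if __name__ == "__main__":
    sample = {0: 2.1, 90: 3.8, 180: 4.5, 270: 1.2}  # four sensors, one per side
    print(nearest_obstacle(sample))                  # -> (270, 1.2)
```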

6. Conclusions
This study proposed a wearable drone controller with hand gesture recognition and vibrotactile feedback. The hand motions for drone control were sensed by an IMU attached to the back of the hand, and the measured motions were classified by machine learning using the ensemble method, with a classification accuracy of 97.9%. Furthermore, the distance to obstacles in the heading direction of the drone was fed back to the user by stimulating the vibration motor attached to the wrist, i.e., vibrotactile feedback. In the simulation experiments with the participant group, the hand gesture control showed good performance, and the vibrotactile feedback helped the user stay aware of the drone's operating environment, especially when only limited visual information was available. A subjective evaluation by the participants was performed to assess the convenience and effectiveness of the proposed controller. Finally, an experiment with a real drone was conducted, confirming that the proposed controller is applicable to drone operation as a natural interface.
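For illustration only, the following sketch shows one way such an IMU-based gesture classifier could be organized: simple statistical features computed over a motion window feed a tree-ensemble model. The feature set, window format, and the scikit-learn random forest used here are stand-ins; they do not reproduce the exact pipeline or the reported 97.9% accuracy.

```python
# Illustrative sketch of an IMU gesture-classification pipeline: statistical features
# per motion window feed a tree-ensemble classifier. The features, window format, and
# scikit-learn model are assumptions for illustration, not the exact pipeline used.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def imu_features(window: np.ndarray) -> np.ndarray:
    """window: (n_samples, 6) array of accelerometer (x, y, z) and gyroscope (x, y, z)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

def train_gesture_classifier(windows, labels):
    """windows: iterable of (n_samples, 6) arrays; labels: gesture class per window."""
    X = np.stack([imu_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, np.asarray(labels))
    return clf

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic demo data: 40 windows of 50 IMU samples each, two gesture classes.
    windows = [rng.normal(loc=c, size=(50, 6)) for c in (0, 1) for _ in range(20)]
    labels = [c for c in (0, 1) for _ in range(20)]
    clf = train_gesture_classifier(windows, labels)
    print(clf.predict(np.stack([imu_features(w) for w in windows[:3]])))
```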

Author Contributions: Conceptualization and methodology, K.-H.Y. and J.-W.L.; simulation and
experiment, J.-W.L.; analysis and validation, K.-H.Y. and J.-W.L.; writing, K.-H.Y. and J.-W.L. All
authors have read and agreed to the published version of the manuscript.
Funding: This research has been supported by the National Research Foundation of Korea (NRF)
under Grant 2021R1A2C1009127.
Acknowledgments: The authors would like to thank Byeong-Seop Sim for his technical support in
fabricating the device, Kun-Jung Kim for his technical advice about the machine learning approach,
and Min-Chang Kim for his support in performing the experiment.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Bouabdallah, S.; Becker, M.; Siegwart, R. Autonomous miniature flying robots: Coming soon!—Research, Development, and
Results. IEEE Robot. Autom. Mag. 2007, 14, 88–98. [CrossRef]
2. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M.
Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7,
48572–48634. [CrossRef]
3. Zeng, Y.; Zhang, R.; Lim, T.J. Wireless Communications with Unmanned Aerial Vehicles: Opportunities and Challenges. IEEE
Commun. Mag. 2016, 54, 36–42. [CrossRef]
4. Chen, J.Y.C.; Barnes, M.J.; Harper-Sciarini, M. Supervisory Control of Multiple Robots: Human-Performance Issues and User-
Interface Design. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2011, 41, 435–454. [CrossRef]
5. Zhou, J.; Zhi, H.; Kim, M.; Cummings, M.L. The Impact of Different Levels of Autonomy and Training on Operators’ Drone
Control Strategies. ACM Trans. Hum.-Robot Interact. 2019, 8, 22. [CrossRef]
6. Smolyanskiy, N.; Gonzalez-Franco, M. Stereoscopic First Person View System for Drone Navigation. Front. Robot. AI 2017, 4, 11.
[CrossRef]
7. Fong, T.; Conti, F.; Grange, S.; Baur, C. Novel interfaces for remote driving: Gestures, haptic and PDA. In Proceedings of the
Society of Photo-Optical Instrumentation Engineers, San Diego, CA, USA, 29 July–3 August 2001; Volume 4195, pp. 300–311.
8. Fernandez, R.A.S.; Sanchez-Lopez, J.S.; Sampedro, C.; Bavle, H.; Molina, M.; Campoy, P. Natural User Interface for Human-Drone
Multi-Modal Interaction. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems, Arlington, VA,
USA, 7–10 June 2016; pp. 1013–1022.
9. Tezza, D.; Andujar, M. The State-of-the-Art of Human-Drone Interaction: A Survey. IEEE Access 2019, 7, 167438–167454. [CrossRef]
10. Hakim, N.L.; Shih, T.K.; Arachchi, S.P.K.; Aditya, W.; Chen, Y.-C.; Lin, C.-Y. Dynamic Hand Gesture Recognition Using 3DCNN
and LSTM with FSM Context-Aware Model. Sensors 2019, 19, 5429. [CrossRef]
11. D’Eusanio, A.; Simoni, A.; Pini, S.; Borghi, G.; Vezzani, R.; Cucchiara, R. Multimodal Hand Gesture Classification for the
Human-Car Interaction. Informatics 2020, 7, 31. [CrossRef]
12. Choi, J.-W.; Ryu, S.-J.; Kim, J.-W. Short-Range Radar Based Real-Time Hand Gesture Recognition Using LSTM Encoder. IEEE
Access 2019, 7, 33610–33618. [CrossRef]
13. Han, H.; Yoon, S.W. Gyroscope-Based Continuous Human Hand Gesture Recognition for Multi-Modal Wearable Input Device for
Human Machine Interaction. Sensors 2019, 19, 2562. [CrossRef] [PubMed]
14. Cifuentes, J.; Boulanger, P.; Pham, M.T.; Prieto, F.; Moreau, R. Gesture Classification Using LSTM Recurrent Neural Networks.
In Proceedings of the 2019 International Conference on IEEE Engineering in Medicine and Biology Society, Berlin, Germany,
23–27 July 2019; pp. 6864–6867.
15. Zhao, H.; Ma, Y.; Wang, S.; Watson, A.; Zhou, G. MobiGesture: Mobility-aware hand gesture recognition for healthcare. Smart
Health 2018, 9–10, 129–143. [CrossRef]
16. Liu, C.; Sziranyi, T. Real-Time Human Detection and Gesture Recognition for On-Board UAV Rescue. Sensors 2021, 21, 2180.
[CrossRef]
17. Begum, T.; Haque, I.; Keselj, V. Deep Learning Models for Gesture-controlled Drone Operation. In Proceedings of the 2020
International Conference on Network and Service Management, Izmir, Turkey, 2–6 November 2020.
18. Hu, B.; Wang, J. Deep Learning Based Hand Gesture Recognition and UAV Flight Controls. Int. J. Autom. Comput. 2020, 17, 17–29.
[CrossRef]
19. Shin, S.-Y.; Kang, Y.-W.; Kim, Y.-G. Hand Gesture-based Wearable Human-Drone Interface for Intuitive Movement Control. In
Proceedings of the 2019 IEEE International Conference on Consumer Electronics, Las Vegas, NV, USA, 11–13 January 2019.
20. Burdea, G. Force and Touch Feedback for Virtual Reality; John Wiley & Sons, Inc.: New York, NY, USA, 2000.
21. Bach-y-Rita, P.; Kercel, S.W. Sensory substitution and the human-machine interface. Trends Cogn. Sci. 2003, 7, 541–546. [CrossRef]
[PubMed]
22. Erp, J.B.F.V. Guidelines for the use of vibro-tactile displays in human computer interaction. In Proceedings of the Eurohaptics
Conference, Edinburgh, UK, 8–10 July 2002; pp. 18–22.
23. Dakopoulos, D.; Bourbakis, N.G. Wearable Obstacle Avoidance Electronic Travel Aids for Blind: A Survey. IEEE Trans. Syst. Man
Cybern. C Appl. Rev. 2010, 40, 25–35. [CrossRef]
24. Kim, D.; Kim, K.; Lee, S. Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind. Sensors
2014, 14, 10412–10431. [CrossRef]
25. Tsykunov, E.; Labazanova, L.; Tleugazy, A.; Tsetserukou, D. SwarmTouch: Tactile Interaction of Human with Impedance Controlled
Swarm of Nano-Quadrotors. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems,
Madrid, Spain, 1–5 October 2018.
26. Rognon, C.; Ramachandran, V.; Wu, A.R.; Ijspeert, A.J.; Floreano, D. Haptic Feedback Perception and Learning with Cable-Driven
Guidance in Exosuit Teleoperation of a Simulated Drone. IEEE Trans. Haptics 2019, 12, 375–385. [CrossRef]
27. Duan, T.; Punpongsanon, P.; Iwaki, D.; Sato, K. FlyingHand: Extending the range of haptic feedback on virtual hand using
drone-based object recognition. In Proceedings of the SIGGRAPH Asia 2018 Technical Briefs, Tokyo, Japan, 4–7 December 2018.
28. Lee, J.W.; Kim, K.J.; Yu, K.H. Implementation of a User-Friendly Drone Control Interface Using Hand Gestures and Vibrotactile
Feedback. J. Inst. Control Robot. Syst. 2022, 28, 349–352. [CrossRef]
29. Lim, S.C.; Kim, S.C.; Kyung, K.U.; Kwon, D.S. Quantitative analysis of vibrotactile threshold and the effect of vibration frequency
difference on tactile perception. In Proceedings of the SICE-ICASE International Joint Conference, Busan, Korea, 18–21 October
2006; pp. 1927–1932.
30. Yoon, M.J.; Yu, K.H. Psychophysical experiment of vibrotactile pattern perception by human fingertip. IEEE Trans. Neural Syst.
Rehabil. Eng. 2008, 16, 171–177. [CrossRef]
31. Jeong, G.-Y.; Yu, K.-H. Multi-Section Sensing and Vibrotactile Perception for Walking Guide of Visually Impaired Person. Sensors
2016, 16, 1070. [CrossRef] [PubMed]
32. Ahmed, M.A.; Zaidan, B.B.; Zaidan, A.A.; Salih, M.M.; Bin Lakulu, M.M. A Review on System-Based Sensory Gloves for Sign
Language Recognition State of the Art between 2007 and 2017. Sensors 2018, 18, 2208. [CrossRef] [PubMed]
33. Avola, D.; Bernardi, M.; Cinque, L.; Foresti, G.L.; Massaroni, C. Exploiting Recurrent Neural Networks and Leap Motion
Controller for the Recognition of Sign Language and Semaphoric Hand Gesture. IEEE Trans. Multimed. 2019, 21, 234–245.
[CrossRef]
34. Nagi, J.; Giusti, A.; Di Caro, G.A.; Gambardella, L.M. Human Control of UAVs using Face Pose Estimates and Hand Gestures. In
Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, Bielefeld, Germany, 3–6 March 2014.
35. Vadakkepat, P.; Chong, T.C.; Arokiasami, W.A.; Xu, W.N. Fuzzy Logic Controllers for Navigation and Control of AR. Drone
using Microsoft Kinect. In Proceedings of the 2016 IEEE International Conference on Fuzzy Systems, Vancouver, BC, Canada,
24–29 July 2016.
36. DelPreto, J.; Rus, D. Plug-and-Play Gesture Control Using Muscle and Motion Sensors. In Proceedings of the 2020 ACM/IEEE
International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; pp. 439–448.
37. Gromov, B.; Guzzi, J.; Gambardella, L.M.; Alessandro, G. Intuitive 3D Control of a Quadrotor in User Proximity with Pointing
Gestures. In Proceedings of the International Conference on Robotics and Automation, Paris, France, 31 May–31 August 2020;
pp. 5964–5971.
38. Visual Signals; Army Publishing Directorate: Washington, DC, USA, 2017.
39. Visual Aids Handbook; Civil Aviation Authority: West Sussex, UK, 1997.
40. Anguita, D.; Ghio, A.; Oneto, L.; Parra, X.; Reyes-Ortiz, J.L. Human Activity Recognition on Smartphones Using a Multiclass
Hardware-Friendly Support Vector Machine. In Ambient Assisted Living and Home Care; Bravo, J., Hervás, R., Rodríguez, M., Eds.;
IWAAL 2012; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2012; Volume 7657.
41. Anguita, D.; Ghio, A.; Oneto, L.; Parra, X.; Reyes-Ortiz, J.L. A Public Domain Dataset for Human Activity Recognition Using
Smartphones. In Proceedings of the European Symposium on Artificial Neural Networks, Bruges, Belgium, 24–26 April 2013;
pp. 437–442.
42. Helou, A.E. Sensor HAR Recognition App. 2015. Available online: https://www.mathworks.com/matlabcentral/fileexchange/
54138-sensor-har-recognition-app (accessed on 20 November 2021).
43. Helou, A.E. Sensor Data Analytics. 2015. Available online: https://www.mathworks.com/matlabcentral/fileexchange/54139-
sensor-data-analytics-french-webinar-code (accessed on 20 November 2021).
44. Attal, F.; Mohammed, S.; Dedabrishvili, M.; Chamroukhi, F.; Oukhellou, L.; Amirat, Y. Physical Human Activity Recognition
Using Wearable Sensors. Sensors 2015, 15, 31314–31338. [CrossRef] [PubMed]
45. Dehzangi, O.; Sahu, V. IMU-Based Robust Human Activity Recognition using Feature Analysis, Extraction, and Reduction. In
Proceedings of the 2018 International Conference on Pattern Recognition, Beijing, China, 20–24 August 2018.

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.
