
Comparison of Manual, Automatic, and Voice Control in Wheelchair Navigation Simulation in Virtual Environments: Performance Evaluation of User and Motion Sickness
Enrique A. Pedroza-Santiago
Universidad Iberoamericana, CDMX
J. Emilio Quiroz-Ibarra (jose.quiroz@ibero.mx)
Universidad Iberoamericana, CDMX
Erik R. Bojorges-Valdez
Universidad Iberoamericana, CDMX
Miguel A. Padilla-Castañeda
Universidad Nacional Autónoma de México

Research Article

Keywords: Virtual Reality, Virtual Environments, Wheelchair, Autonomous Control, Voice Control, Manual Control

Posted Date: November 7th, 2023

DOI: https://doi.org/10.21203/rs.3.rs-3537753/v1

License:   This work is licensed under a Creative Commons Attribution 4.0 International License.
Read Full License

Additional Declarations: No competing interests reported.


Springer Nature 2021 LATEX template

”Comparison of Manual, Automatic, and Voice Control in Wheelchair Navigation Simulation in Virtual Environments: Performance Evaluation of User and Motion Sickness”

Pedroza-Santiago Enrique A.1, Quiroz-Ibarra J. Emilio2*†, Bojorges-Valdez Erik R.3† and Padilla-Castañeda Miguel A.4†
1* Instituto de Investigación Aplicada y Tecnología InIAT, Universidad Iberoamericana, Prol. Paseo de la Reforma, Santa Fe, Zedec Sta Fé, Álvaro Obregón, 01376, Mexico City, Mexico.
2 Instituto de Investigación Aplicada y Tecnología InIAT, Universidad Iberoamericana, Prol. Paseo de la Reforma, Santa Fe, Zedec Sta Fé, Álvaro Obregón, 01376, Mexico City, Mexico.
3 Departamento de Estudios en Ingeniería para la Innovación DEII, Universidad Iberoamericana, Prol. Paseo de la Reforma, Santa Fe, Zedec Sta Fé, Álvaro Obregón, 01376, Mexico City, Mexico.
4 Instituto de Ciencias Aplicadas y Tecnología ICAT, Universidad Nacional Autónoma de México, Cto. Exterior S/N, C.U., Coyoacán, 04510, Mexico City, Mexico.

*Corresponding author(s). E-mail(s): jose.quiroz@ibero.mx;


Contributing authors: enriquepedroza2012@gmail.com;
erik.bojorges@ibero.mx; miguel.padilla@icat.unam.mx;
† These authors contributed equally to this work.

Abstract
Mobility is a fundamental necessity for individuals with physical disabilities, and wheelchairs play a vital role in improving their quality of life. In recent decades, there has been growing interest in designing and developing more advanced wheelchairs, particularly regarding control systems that enable users to interact effectively and efficiently with their chairs. Implementing different control systems in wheelchairs has been the subject of research and development to enhance user experience and overcome limitations associated with conventional controls. Virtual reality (VR) is a valuable tool for testing new technologies, ensuring user safety, and reducing development time. This article presents a comparative evaluation of various controls implemented in a simulated wheelchair, analyzing their effectiveness and ease of use through navigation in three different daily-life scenarios. Twenty participants (11 males and 9 females) received training in VR operation and in the controls. Each participant completed three tests per control in each scenario. The results showed that participants performed better while navigating with the autonomous and manual controls than with voice control; however, participants experienced associated motion sickness during the interaction. In contrast, although voice control caused less motion sickness, the participants' performance shows that a high voice-command recognition rate and reliability are mandatory for it to be a viable option. Our system represents a useful experimental setup for future research and improvements in developing wheelchair control systems in virtual environments to cater to the individual needs and abilities of end-users in reality.

Keywords: Virtual Reality, Virtual Environments, Wheelchair, Autonomous Control, Voice Control, Manual Control

1 Introduction
Wheelchair controls are devices designed to allow people with mobility dis-
abilities to operate the wheelchair independently and comfortably [1][2]. These
controls vary according to the individual needs of each person and can cover
a wide range of options [3]. Several types of controls are used in wheelchairs,
and selecting the appropriate control depends on the user's abilities and preferences [4]. Currently, there is interest in improving wheelchair controls, from joysticks to autonomous control schemes, to provide greater comfort and safety to users. The joystick is an intuitive control widely used in electric wheelchairs; it allows the user to control direction and speed through simple movements of the stick in different directions [5]. On the autonomous side, some wheelchairs are equipped with remote controls that enable users to operate them without directly touching the controls mounted on the chair [5][6]. These controls are particularly useful for individuals with limited hand
mobility. Similarly, other technologies, such as voice command controls, are
being tested for implementation in these devices [7]. Still, to ensure the safety
of users, they are tested and evaluated first through simulations, especially
in virtual reality environments [8][9]. This implementation enables individuals
with physical disabilities to practice and improve their wheelchair-driving skills
in a safe and controlled manner [10][11]. They can face different virtual chal-
lenges and scenarios, such as overcoming obstacles, navigating narrow spaces,
or performing complex maneuvers. At the same time, users can receive real-time feedback on their performance and learn new movement strategies [12].
In addition to being a training tool, virtual reality offers therapeutic benefits
[13]. Providing an immersive visual experience can help improve confidence,
coordination, and physical rehabilitation. Users can strengthen their motor
skills and work on their balance and stability.
Virtual reality environments consist of computer-generated scenes and
objects that allow users to feel immersed in them [14]. The visualization of this
environment is achieved through virtual reality headsets or similar devices.
These simulations can recreate situations involving risk or requiring testing
before implementation, such as medical procedures or the behavior of certain
devices [15]. Another advantage is the possibility of connecting devices such as
joysticks, motion-sensing gloves, microphones, or vests to achieve more effec-
tive interaction between the user and the machine [16] in a safer and more
economically simulated environment before its implementation in real envi-
ronments. Virtual objects can be manipulated in various ways, such as using
joysticks, voice commands, or video-tracking, as seen in Kinect [17]. Each
alternative requires user learning, so the user’s experience is a key factor in
choosing the control method [18]. A user can control a character using various
devices, depending on the virtual scenario or the control method that suits
them best. Manual controls are commonly used to manipulate physical and
virtual devices due to their ease of use. This type of control has been used since
the mid-20th century for military purposes [19] and has been adapted for other
uses, such as entertainment in video games, robot control, and even motorized
wheelchair operation due to its functional design [20]. Using voice commands
has become popular as an alternative to operating video games. Users can
interact with games more naturally and seamlessly by speaking directly to
them, known as Natural Language Processing (NLP) [21]. In this work, the
speech recognition tool is used [22]. Speech recognition is a technology that
allows computers to understand and process the language and voice of individ-
uals. Models transcribe speech into text or actions, allowing users to interact
with technology through voice instead of typing or touch interfaces [23]. Cur-
rently, this option is being implemented in devices such as smartphones with
voice assistants, tablets, smart speakers (Alexa, Siri, Google) [24], and even
virtual games like ”Scream Go Hero” [25], where the main character moves
based on the player’s verbal commands. This type of control is advantageous
as it allows users to interact with devices remotely without physical contact,
although a quiet environment is required.
Within a virtual environment, some objects have programmed movements
and make decisions as needed, meaning they change their behavior depend-
ing on the situation they encounter, which is known as Machine Learning [26].
This type of implementation can be observed in games such as ”PacMan”
[27] or ”Ping Pong” [28], where enemies change direction as the character
advances through the field. More complex systems can also be seen utiliz-
ing this machine learning discipline, such as autonomous vehicles, which are
given a destination, plot a route, and avoid obstacles they may encounter [29].
Autonomous controls are primarily used in navigation and movement in both
robots and commercial vehicles [30]. This control is constantly improving and
can become a useful tool, especially for individuals with mobility issues. How-
ever, there are advantages and disadvantages to using these controls, which
may be more comfortable or feasible for some individuals compared to oth-
ers, depending on their operation or task. Manual controls, operated through
a joystick, button controls, or other means that require user intervention, are
easy to use and provide physical feedback, such as vibration, which produces
a more natural sensation [31]. On the other hand, autonomous systems have
the advantage of performing tasks and making decisions independently, taking
various alternatives depending on the situation, making them convenient as
they do not depend on user intervention [32]. However, autonomous systems
can be challenging for individuals, especially those not used to technology. Moreover, repairing them in case of failure can be costly and complicated. Voice
controls are reliable and fast, particularly when it comes to tasks related to
text generation, and they have high accuracy [33]. They are used for tasks in
intelligent assistants or remote operations in education, healthcare, security,
military, or entertainment industries [34]. However, being a voice-interacting
system, they need to be in quiet environments or have a high signal-to-noise
ratio to accurately identify voice, which can be a disadvantage in crowded or
noisy places [35].
Each control system has specific characteristics that facilitate task exe-
cution compared to others. For example, in the study by Astuti [36], voice
commands are used to start the engine and control a car. Similarly, in
autonomous control systems, obstacles can be identified along the way, as
observed in Chatziparaschis [37], where a rescue robot makes decisions instead
of an expert who would operate the system manually.
In another study, Jayakodi [38] implemented a voice command module in
a physical wheelchair in a controlled environment, enabling smooth movement
and quick response to user requests. On the other hand, Alim [39] incorpo-
rated voice commands through a microphone and an Arduino controller in a
wheelchair, which responds to the user’s movement instructions until it detects
an object in its path. Infrared sensors were installed around the wheelchair for
this purpose. It is interesting to see how these controls are being implemented
in various fields and applications, whether to improve the mobility of people
with physical disabilities, operate vehicles, or explore and rescue in dangerous
areas, among others. The use of voice recognition technologies and autonomous
systems enables a higher degree of interaction and control with devices and
systems, which can positively impact people’s quality of life and the efficiency
and safety of various industries. It is important to note that although these
controls offer many advantages, evaluating their safety and reliability is also
necessary. Autonomous systems can have failures and errors, and appropriate
safety measures must be implemented to prevent accidents and operational
problems. Additionally, it is important that these controls are accessible and
easy to use for all individuals and that exclusion or discrimination due to lack
of access or technical knowledge is avoided.
The use of these control systems may be more convenient or practical for
users, depending on the purpose or conditions of the individual. However, to
ensure their safety, it is necessary to conduct tests in virtual environments
before implementing them on a physical device.
In summary, to evaluate the best possible behavior of a wheelchair control
system and minimize risks, simulation presents itself as an adequate research
method. For example, in Faria [40], a virtual wheelchair controlled by a physical
device (joystick) was implemented. In contrast, in the research carried out by
Harkishan [41], an autonomous control system is shown where the wheelchair
follows a trajectory until reaching its destination.
With these motivations, this paper presents an experimental study com-
paring three types of driving control systems (manual, voice, and autonomous)
implemented in a wheelchair in a virtual reality environment to determine their
effectiveness, usability, and ease of use in everyday scenarios. These control
systems were evaluated in three virtual scenarios designed to represent typi-
cal daily life environments, such as a supermarket, a museum, and a city. The
structure of this document is divided into five sections: introduction (Section
1), methodology and materials used (Section 2), results obtained (Section 3),
discussion of the results (Section 4), and conclusions (Section 5).

2 Methods
The simulator consists of two functional modules. Firstly, there is a physical
workstation that serves as a starting point for users. Secondly, the simulation
module based on virtual reality technology is responsible for recreating immer-
sive scenarios. The physical workstation is equipped with specialized software
that allows the recreation of these virtual environments. In addition to the
virtual reality goggles used by users to visualize these environments, an addi-
tional screen is available to observe the participants’ movements and provide
assistance if necessary.

Fig. 1 Experimental setup with a participant interacting with one of the virtual environ-
ments.

2.1 Workstation
The virtual environment testing was conducted in a specialized computer lab-
oratory equipped with the following components: an HP 280 G3 computer
running the Windows 10 operating system, an Intel Xeon processor, 256 GB SSD storage capacity, and 64 GB of RAM. The computer was also used to collect user data during the experimental interactive sessions.

2.2 Software and Virtual Tools


The Unity platform was chosen as the main development engine due to its easy
integration with virtual environments and virtual reality headsets [42]. The
virtual simulation and the computer connection were implemented using the C# programming language [43]. For visualization of the virtual environment, Oculus Rift [44] virtual reality headsets with 128 GB storage capacity were used,
along with an additional screen that allowed the technical team to observe
the application in operation and provide timely instructions to the user (see
Figure 1).

2.3 Setup Integration


Once the virtual scenario was implemented, it was necessary to visualize it
using the Oculus virtual reality headsets and an additional screen. The virtual
reality headsets allowed for observation of the scenario in a 360° environment,
enabling exploration of every designed corner. The extra screen allowed the
team to observe the movements of the user wearing the virtual reality headsets.
The connection between the Unity platform and the virtual reality headsets is
illustrated in the diagram shown in Figure 2.

Fig. 2 Connection Diagram between Unity Platform and Oculus Headset.

2.4 Implemented Controls


Three types of controls were implemented to control and navigate the virtual
wheelchair: manual control, autonomous control, and voice command control.

2.4.1 Manual Control


The joystick is one of the most commonly used controls to interact with objects
in virtual environments. This control consists of a lever rotating around its
axis, and its movement is projected onto the controlled object. In this case,
the user can operate the wheelchair using this control, which is implemented
in the controls of the Oculus Rift headset. Due to its rotational nature, it
allows free movement in any direction within the virtual scenarios, simulating
the movement of electric wheelchairs.
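A minimal sketch of such an axis-to-motion mapping is shown below in illustrative Python (the actual control code is Unity C#, which is not reproduced here; the dead-zone value and function name are assumptions for illustration):

```python
def joystick_to_motion(x, y, max_speed=1.0, max_turn=1.0):
    """Map joystick deflection to wheelchair motion.

    x, y are stick deflections in [-1, 1]; y drives forward/backward
    speed and x the turning rate, mimicking an electric wheelchair.
    A small dead zone ignores unintended micro-movements.
    """
    dead_zone = 0.1
    if abs(x) < dead_zone:
        x = 0.0
    if abs(y) < dead_zone:
        y = 0.0
    speed = max(-1.0, min(1.0, y)) * max_speed
    turn = max(-1.0, min(1.0, x)) * max_turn
    return speed, turn
```

Because the stick rotates freely around its axis, both axes can be non-zero at once, which yields combined forward-and-turn motion in any direction.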

2.4.2 Autonomous Control


Autonomous control refers to a system installed in a machine that utilizes com-
puter systems capable of emulating human-like driving and control behavior
[45]. In the case of the virtual wheelchair, autonomous control enables navi-
gation within the virtual environment while avoiding obstacles along the way.
Autonomous navigation of the wheelchair is based on the A* (A-star) algo-
rithm [46], which is a graph search algorithm used to find the shortest path
between a starting node and a destination node. The A* algorithm employs
a heuristic function to estimate the remaining cost of the path from the cur-
rent node to the destination node. This heuristic function is combined with
the accumulated cost of the already traveled path to determine which node
should be explored next. The algorithm selects the node with the lowest sum
of these two values, i.e., the shortest ”distance” from the current node to the
destination node. This algorithm is widely used in video games, robotics, planning systems, and general navigation applications [47]. The implementation of this algorithm consists of route finding and area limitation. To achieve this, a simple object, such as a cube, was first driven manually toward a destination point, indicating how the system should move. Subsequently, once the object could move autonomously, it was tuned to follow the found route and was assigned a score each time it successfully reached the destination. To detect objects, the technique of bounding
circles [48] was implemented, which places a circumference around objects to establish a tolerance space that triggers contact detection before physically colliding. This circumference defines the navigable areas and obstacles that game
objects must avoid. In this way, each object in the virtual scenario was assigned
a circumference or boundary so that the virtual wheelchair could maintain a
safe distance and avoid collisions by taking alternative routes. Next, to allow
the user to navigate with the virtual wheelchair and reach scattered objects
in the scenario, a map of the place is presented to the user, who can choose
a destination, and the wheelchair then autonomously calculates a route and
navigates through the virtual environment, avoiding obstacles and following
the path to the destination.
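As an illustrative sketch of the two techniques just described (not the Unity C# implementation used in the simulator), grid-based A* with a Manhattan-distance heuristic and a bounding-circle contact test can be written as follows; the grid layout and all names are hypothetical:

```python
import heapq

def circles_collide(c1, r1, c2, r2):
    # Bounding-circle test: contact is detected when the distance between
    # centres falls below the sum of the radii (the tolerance space).
    return (c1[0] - c2[0]) ** 2 + (c1[1] - c2[1]) ** 2 <= (r1 + r2) ** 2

def a_star(grid, start, goal):
    """Shortest path on a 4-connected grid (0 = free cell, 1 = obstacle).

    f(n) = g(n) + h(n): the accumulated cost of the travelled path plus a
    heuristic estimate (here the Manhattan distance) of the remaining cost,
    as described above; the node with the lowest f is explored next.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda n: abs(n[0] - goal[0]) + abs(n[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]  # (f, g, node, path)
    visited = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(
                    frontier, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None  # no route found

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
route = a_star(grid, (0, 0), (2, 0))  # detours around the blocked cells
```

In the simulator, the same two ingredients cooperate: A* plots the route to the user-chosen destination, while the bounding circles keep the wheelchair a safe distance from obstacles along that route.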

2.4.3 Voice Control


Voice control or voice commands are natural instructions or communication
between a user and a computer, where the computer understands and executes
the requested instructions [49]. This technology offers significant advantages,
such as the possibility of not needing hands or a screen, a generally simpler
user interface, and the ability to control the computer remotely. In this case,
the ’Speech Recognition API’ [50] was used, a package defined in Unity capable
of recognizing speech in multiple languages. This package works through logical grammar models [51], which allow the system to understand the structure of a language, giving coherence to a series of words or sentences and extracting their meaning. To use this package, it was necessary to implement a dictionary and
create a list of words linked to specific movement instructions. These words
are listed in Table 1.
Instructions
Movement: Forward; Drive; Advance; Back; Return; Reverse (direction); Move to (forward, back, right, left); Go to (forward, back, right, left); Address (forward, back, right, left)
Turning: Right; Left; Turn; Turn right; Turn left
Stopping: Stop; Cease; Discontinue; Halt; Break
Table 1 Voice Control Instructions
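A minimal sketch of such a command dictionary is given below in illustrative Python (the actual package is a Unity C# speech-recognition API; the mapping covers only a subset of Table 1, and the instruction names are hypothetical):

```python
# Hypothetical dictionary mapping recognized phrases to movement
# instructions, in the spirit of Table 1.
COMMANDS = {
    "forward": "MOVE_FORWARD",
    "advance": "MOVE_FORWARD",
    "back": "MOVE_BACK",
    "reverse": "MOVE_BACK",
    "turn right": "TURN_RIGHT",
    "turn left": "TURN_LEFT",
    "stop": "STOP",
    "halt": "STOP",
    "cease": "STOP",
}

def interpret(transcript):
    """Return the instruction for a recognized phrase, or None if
    the phrase is not in the dictionary (command rejected)."""
    return COMMANDS.get(transcript.strip().lower())
```

Unrecognized phrases map to no instruction, which is one reason a high recognition rate matters: every misheard word becomes a rejected or wrong command.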

2.5 Experimental Virtual Reality Scenarios


Three virtual scenarios of everyday use were used to observe and compare the
implemented controls. These scenarios included a supermarket, a museum, and
a city, as explained below.

Fig. 3 Example of a Path Tour on the Supermarket ”Scene Floor Plan”.

Fig. 4 Interior view of the Supermarket Scene.


2.5.1 Supermarket
This scenario represents the enclosed space of a typical supermarket in a daily-life situation. It was designed with ten aisles, fourteen shelves, two refrigerators, five checkout counters, one ATM, and a door that functions as an entrance and exit, as shown in the floor plan in Figure 3. Due to its narrow
space, caution must be taken with surrounding objects to avoid collisions, as
would happen in a real supermarket. The task to complete in this scenario
is to collect five products distributed throughout the environment to gather
data related to the different types of implemented control, avoiding random
obstacles such as moving people or shopping carts. An example of a traversal
through the scenario is shown in Figure 3.

2.5.2 Museum
This scenario has wider corridors and more distant objects than the previous one. It has a total of three exhibition rooms: the Egyptian room, the Oriental room, and the Indian room, connected through an access way, as can be seen in the floor plan in Figure 5. The rooms contain four exhibits (Oriental room), six objects (Egyptian room), and eight objects (Indian room). In this scenario, five objects related to the museum were distributed, which the users had to retrieve while avoiding obstacles.

Fig. 5 Example of a Path Tour on the Museum Scene ”Floor Plan” and ”Interior View”.

2.5.3 City
This scenario represents a part of a city with streets, buildings, shops, side-
walks, parks, and people, among other elements. It is classified as an open place
with no clear edges or boundaries, as objects are connected through streets.
The scenario features 22 buildings, four primary and three secondary streets
(Figure 6). In this scenario, five objects are distributed throughout the space,
but people move randomly and can obstruct the user’s path, requiring the user
to choose an alternative route.

Fig. 6 Example of a Path Tour on the City Scene ”Floor Plan” and ”Air View”.

3 Experiments
3.1 Objective of the Study
The main objective of the present study was to evaluate the effectiveness of
the controls used, as well as their ease of use in each implemented scenario and
the acceptance by the users. Additionally, the study aims to investigate how
these tools can improve the user experience and navigation for wheelchairs.
Furthermore, the influence of motion sickness on utilizing these controls was
investigated, as well as the users’ adaptability to operating a wheelchair in a
virtual or physical environment.

3.2 Participants
Twenty participants (9 females and 11 males) between the ages of 18 and 65
took part in this experiment. All participants were right-handed and did not
have visual perception problems, mobility difficulties, or similar conditions. All
volunteers provided written informed consent, and the study was approved by
the ethics committee of Universidad Iberoamericana, in accordance with the
Helsinki Declaration.

3.3 Experimental procedure


First, the participants were positioned in front of the computer wearing the virtual reality head-mounted display. They were seated in a comfortable
chair one meter away from the computer and a screen that allowed them to
visualize the simulation from an adequate point of view. Then, each partici-
pant completed three attempts in each scenario, for a total of nine attempts,
using a different control in each one, with a 5-minute rest between attempts,
in a random counterbalanced order. A time limit of 5 minutes per scenario was
established, although the participants could finish the tests in less time. Participants navigated each scenario in search of various target objects scattered
throughout the place, picking them up and trying to avoid other obstacles to
prevent collisions. Collisions occurred when the circumference assigned to the
wheelchair encountered an object or wall in the scenario. Participants were
instructed to adopt a comfortable posture and to avoid sudden movements or standing up during the session. They were informed that an assistant would always be
available in case they needed to stop the game.

3.4 Measurements
Three ordinal scale metrics (time, distance, and number of collisions) and one
nominal type metric (motion sickness) were measured from the simulation of
the scenarios. These metrics helped analyze the participants’ movements and
understand the relationship between their performance and the obtained data.
To obtain the total time for each simulation, a stopwatch was used, which
started when the wheelchair started moving and stopped when it reached the
endpoint or exit. To measure the distance traveled, coordinates (x, y) were
tracked from the origin to the final point where the simulation was stopped.
Then the mean velocity and acceleration were calculated from the recorded
path.
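The measurement just described can be sketched as follows in illustrative Python; the paper does not give the exact formulas, so this assumes the distance traveled is the sum of segment lengths between tracked (x, y) samples and the mean velocity is distance over elapsed time:

```python
import math

def path_metrics(samples):
    """Distance and mean velocity from timestamped (t, x, y) samples.

    Sums the straight-line length of each segment between consecutive
    tracked positions, then divides by the elapsed time.
    """
    distance = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        distance += math.hypot(x1 - x0, y1 - y0)
    elapsed = samples[-1][0] - samples[0][0]
    mean_velocity = distance / elapsed if elapsed > 0 else 0.0
    return distance, mean_velocity

# e.g. a straight 4 m path covered in 2 s
d, v = path_metrics([(0.0, 0.0, 0.0), (1.0, 2.0, 0.0), (2.0, 4.0, 0.0)])
```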

3.5 System Assessment


The questionnaire used to assess the experience comprises three questions that scored the perceptual experience: how tired the participants felt during the navigation (Q1), how easy it was to control the wheelchair movements (Q2), and how much the participants liked the control used for wheelchair movements (Q3). These questions were scored using
a five-point Likert scale (1 for ”strongly disagree,” 5 for ”strongly agree,” and
3 for ”neutral”).
In addition, two binary questions assess the presence or absence of discomfort and dizziness during the navigation, capturing possible undesired side effects during the immersive experience.
Table 2 shows the complete question statements applied after each trial for
specific scenarios and control combinations for nine applied questionnaires per
participant.
Question Criterion
Q1 On a scale of 1 to 5, how tired did you feel after wearing the VR glasses? Tiredness
Q2 On a scale of 1 to 5, how easy was it for you to control the wheelchair movements? Easiness
Q3 On a scale of 1 to 5, how much did you like the control used for wheelchair movements? Pleasantness
Q4 Did you feel any discomfort? Discomfort
Q5 Did you feel dizzy during the session? Dizziness

Table 2 Perception Questionnaire

3.6 Statistical Analysis


The system was evaluated in three different scenarios. The city scenario is
characterized by being more open, without obstacles in the path, although it
involves covering longer distances. On the other hand, the museum is a semi-
enclosed space with objects scattered in the rooms. Lastly, the supermarket
scenario stands out for having objects closer to each other and narrower aisles.
All data were collected, and descriptive statistics such as the mean and standard deviation were calculated, as shown in Table 3. Subsequently, a series of
multivariate analyses of variance (MANOVA) [52] following an experimental
design of 3 Scenarios (supermarket, museum, city) X 3 Controls (manual,
autonomous, voice) was conducted for the variables of Time, Distance, Velocity, and Acceleration, and the number of Collisions. Bonferroni post hoc tests [53] were subsequently applied for each MANOVA. Finally, a series
of non-parametric Kruskal-Wallis tests [54] were applied to each perception
questionnaire item, comparing the control navigation condition. All analyses
were performed using SPSS software version 27.
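The study ran these tests in SPSS; as an illustrative sketch of the non-parametric test involved, the Kruskal-Wallis H statistic (without the tie-correction term that statistical packages apply) can be computed in plain Python as:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction).

    Ranks the pooled samples (ties get the average rank), accumulates
    per-group rank sums R_i, and applies
        H = 12 / (N (N + 1)) * sum(R_i^2 / n_i) - 3 (N + 1).
    Illustrative only; the study itself used SPSS 27.
    """
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(pooled)
    values = [v for v, _ in pooled]
    ranks = {}
    i = 0
    while i < n:  # assign average ranks to runs of tied values
        j = i
        while j < n and values[j] == values[i]:
            j += 1
        avg = (i + 1 + j) / 2  # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    rank_sums = [0.0] * len(groups)
    for idx, (_, gi) in enumerate(pooled):
        rank_sums[gi] += ranks[idx]
    counts = [len(g) for g in groups]
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / ni for rs, ni in zip(rank_sums, counts)) - 3 * (n + 1)
```

Identical groups give H near zero, while well-separated groups (e.g. the per-control questionnaire ratings) give a large H, which is then compared against a chi-squared distribution for the p-value.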

4 Results
4.1 Navigation performance
The series of MANOVA revealed a main factor effect for the sce-
nario for the five variables Time (F(2,177)=547.33, p<0.0001), Distance
(F(2,177)=5030.42, p<0.0001), Velocity (F(2,177)=13839.04, p<0.0001),
Acceleration (F(2,177)=2546.53, p<0.0001), and Collisions (F(2,177)=319.76,
p<0.0001). Moreover, the consequent Bonferroni post hoc tests confirmed sig-
nificant differences among the three scenarios for all the variables, indicating a
different motion performance of participants during the navigation, depending
on the virtual scenario. Table 3 displays the observed descriptive statistics for
the five variables, compared by scenario. As can be observed, in general, partic-
ipants traveled longer paths in the city outdoor scenario with faster movements
and fewer collisions than in the other indoor scenarios. On the other hand,
they covered the shortest distances at a slower pace but collided more often in the supermarket indoor scenario.
Scenario
Variable Supermarket Museum City P
Time 239.19 ±53.87 173.82 ±28.75 232.66 ±47.11 <0.0001
Distance 178.02 ±28.98 203.59 ±22.23 449.89 ±58.21 <0.0001
Velocity 0.75 ±0.06 1.18 ±0.08 1.93 ±0.11 <0.0001
Acceleration 0.003 ±0.001 0.007 ±0.002 0.008 ±0.0001 <0.0001
Collisions 60 ±39 38±28 30 ±20 <0.0001

Table 3 Observed performance of participants according to scenario.

The same MANOVA tests also revealed a main factor effect for the type of control for most of the variables: Time
(F(2,177)=587.31, p<0.0001), Distance (F(2,177)=357.166, p<0.0001), Accel-
eration (F(2,177)=371.14, p<0.0001), and Collisions (F(2,177)=1500.54,
p<0.0001), with the exception of Velocity. Again, the successive Bonferroni post hoc tests confirmed significant differences among the three control types for all the variables except Velocity, meaning a different motion performance of participants depending on the type of control for the navigation. Table 4 shows
the observed descriptive statistics for the five variables, compared by control
type. In this case, participants generally exhibited the best performance with
the autonomous control, completing the tasks in shorter periods and distances
with remarkably fewer collisions than with the other controls. Conversely, the poorest performance was exhibited when navigating under voice control in terms of execution time and covered distance, which, together with manual control, resulted in many more collisions than in the autonomous mode.

Control
Variable Manual Autonomous Voice P
Time 206.79 ±25.96 182.86 ±32.02 256.02 ±46.95 <0.0001
Distance 273.03 ±125.93 239.39 ±105.79 319.08 ±142.33 <0.0001
Velocity 1.29 ±0.46 1.29 ±0.43 1.28 ±0.58 NS
Acceleration 0.006 ±0.002 0.007 ±0.002 0.005 ±0.003 <0.0001
Collisions 44 ±17 8±4 43 ±33 <0.0001

Table 4 Observed performance of participants according to type of control.

Finally, as shown in Table 5, the MANOVA tests also revealed a statistical interaction among Scenario X Control combinations, meaning a different
motion performance if navigating in different scenarios and depending on
the navigation control scheme, but generally better using the autonomous
navigation.
                                 Supermarket
Variable       Manual           Autonomous       Voice            P
Time           209.06 ±11.76    196.90 ±3.82     311.62 ±22.33    <0.0001
Distance       169.78 ±7.74     151.88 ±4.99     212.4 ±22.08     NS
Velocity       0.81 ±0.02       0.77 ±0.01       0.68 ±0.03       <0.0001
Acceleration   0.003 ±0.002     0.003 ±0.006     0.002 ±0.001     <0.0001
Collisions     65 ±9.49         12 ±4.88         104 ±9.33        <0.0001
                                 Museum
Time           177.23 ±14.05    139.56 ±3.82     204.67 ±22.33    <0.0001
Distance       203.06 ±7.74     179.89 ±4.99     227.83 ±22.08    NS
Velocity       1.15 ±0.0171     1.29 ±0.0130     1.11 ±0.030      <0.0001
Acceleration   0.006 ±0.007     0.009 ±0.001     0.005 ±0.003     <0.0001
Collisions     36 ±9.49         7 ±4.873         74 ±9.326        <0.0001
                                 City
Time           234.09 ±11.76    212.13 ±3.82     251.77 ±22.33    <0.0001
Distance       446.26 ±7.74     386.40 ±4.99     517.01 ±22.08    NS
Velocity       1.90 ±0.0171     1.82 ±0.0130     2.05 ±0.0304     <0.0001
Acceleration   0.008 ±0.002     0.008 ±0.006     0.008 ±0.001     <0.0001
Collisions     31 ±9.49         6 ±4.873         52 ±9.326        <0.0001

Table 5 Descriptive statistics of the measured motion variables of the participants' trajectories in relation to scenario and control conditions.

The boxplots in Figure 7 compare the time taken for each scenario and control type, those in Figure 8 the distance covered, and those in Figure 9 the number of collisions.
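Boxplot panels of this kind can be reproduced with matplotlib; the sketch below uses simulated times and the scenario/control labels from the study, not the actual trial data:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, suitable for batch figure export
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(3)
controls = ["Manual", "Autonomous", "Voice"]
scenarios = ["Supermarket", "Museum", "City"]

# One panel per scenario, one box per control type.
fig, axes = plt.subplots(1, 3, figsize=(12, 4), sharey=True)
for ax, scen in zip(axes, scenarios):
    # Illustrative completion times (s); not the study data.
    data = [rng.normal(200 + 30 * i, 25, 20) for i in range(len(controls))]
    ax.boxplot(data)
    ax.set_xticklabels(controls)
    ax.set_title(scen)
axes[0].set_ylabel("Time (s)")
fig.tight_layout()
fig.savefig("time_by_scenario_control.png")
```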

Fig. 7 Graph of the time variable by scenario and by control.


Fig. 8 Graph of the distance variable by scenario and by control.

Fig. 9 Graph of the collisions variable by scenario and by control.

4.2 Questionnaire Ratings and System Assessment


After administering the perception questionnaire, two series of non-parametric Kruskal-Wallis tests were performed: one comparing the types of control and one comparing the scenarios, evaluating differences in the perceived tiredness, easiness, and pleasantness of navigating with each wheelchair control and within each scenario.
The Kruskal-Wallis tests revealed significant differences among the types of control for the three perceptual variables: tiredness (p<0.013), easiness (p<0.0001), and pleasantness (p<0.031). The subsequent Bonferroni tests showed, more specifically, a significantly higher sensation of tiredness with the manual control (mean score 3.33±1.42, median 4) than with the voice control (mean score 2.93±1.53, median 2), as shown in Fig. 10. For easiness, participants found the voice control significantly more difficult to use (mean score 3.23±1.14, median 2) than the manual (mean score 4.18±0.62, median 5) and the automatic controls (mean score 4.43±0.67, median 5). For pleasantness, the perceived experience was positive for all three controls, with a significantly higher level for voice (mean score 4.27±0.71, median 5) than for manual control (mean score 3.95±0.56, median 5).
For the evaluation of perceived differences among scenarios, the Kruskal-Wallis tests revealed no significant differences in the perceived sensation of tiredness or easiness, with a low level of tiredness (median score of 2) and a low level of difficulty (median easiness score of 4). Participants generally enjoyed navigating all three scenarios, but significant differences were observed for pleasantness (p<0.0001): it was rated lowest within the supermarket (mean score 3.77±0.62, median 4), higher within the museum (mean score 4.14±0.63, median 4), and highest in the city scenario (mean score 4.41±0.67, median 5).
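A minimal sketch of this procedure: a Kruskal-Wallis omnibus test over hypothetical 1-5 Likert ratings, followed by Bonferroni-corrected pairwise Mann-Whitney U tests. The simulated ratings below are illustrative, not the questionnaire data:

```python
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical 1-5 Likert "easiness" ratings per control; illustrative only.
scores = {
    "manual":     rng.integers(3, 6, 60),   # ratings drawn from 3-5
    "autonomous": rng.integers(4, 6, 60),   # ratings drawn from 4-5
    "voice":      rng.integers(1, 5, 60),   # ratings drawn from 1-4
}

# Omnibus non-parametric test across the three controls.
H, p = stats.kruskal(*scores.values())
print(f"H = {H:.2f}, p = {p:.4g}")

# Bonferroni-corrected pairwise Mann-Whitney U follow-ups.
pairs = list(combinations(scores, 2))
alpha = 0.05 / len(pairs)
for a, b in pairs:
    U, p_ab = stats.mannwhitneyu(scores[a], scores[b])
    print(f"{a} vs {b}: U = {U:.1f}, p = {p_ab:.4g} (alpha = {alpha:.4f})")
```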

Fig. 10 Mean difference of the statistical analysis of the perception questionnaire.

Regarding possible perceived side effects, dizziness occurred most frequently while navigating with the manual control, followed by the autonomous control, and much less often with the voice control (Fig. 11), consistently across the three scenarios.
With respect to the discomfort perceived by users (Fig. 12), the manual control generated discomfort most frequently, especially in the museum scenario; the autonomous control came second, also peaking in the museum; and the voice control generated the least discomfort.
Thus, although users considered the manual and autonomous controls easier to use, these were also associated with a greater sense of dizziness or discomfort. Conversely, voice control, although more challenging to handle, generated less dizziness than the other two control types.
Fig. 11 Incidence of Individuals who experienced dizziness.

Fig. 12 Incidence of Individuals who experienced discomfort.


5 Discussion
Autonomous navigation technology has been maturing into real-life applications, for example in the automotive industry, and it can also improve assistive devices for people with motor disabilities.
A relevant application is the incorporation of autonomous or semi-autonomous navigation schemes into wheelchairs for patients with motor disabilities. Wheelchair controls are devices designed to allow people with mobility impairments to operate the wheelchair independently and comfortably. Ideally, these controls could be configured to the needs of each user, with behavior adaptable to cover a wide range of options.
However, designing autonomous control strategies for wheelchairs presents
important challenges. An alternative is establishing virtual simulation environ-
ments, which allow exhaustive experimentation on the effectiveness of various
control strategies economically and without putting patients at risk.
In this work, we presented a virtual reality platform to simulate wheelchair navigation with three types of control: manual, automatic, and voice. We studied the navigation performance of a group of healthy participants, comparing the three control types in three typical daily-life scenarios. Additionally, we investigated the adverse effects of dizziness and discomfort experienced by the participants during their immersive navigation sessions. Our objective was to evaluate the potential of autonomous and voice control relative to manual control, and to assess the proposed virtual environment as an experimentation tool for the design and use of wheelchairs.
Regarding performance, both autonomous and manual controls proved
effective in maneuvering the wheelchair in the virtual environment. These
methods allowed participants to navigate the wheelchair precisely and quickly,
suggesting a satisfactory response in terms of motion control. This can be
attributed to participants’ familiarity with manual controls and the careful
implementation of autonomous control, which could adapt to users’ commands
and requirements.
However, despite their better performance, autonomous and manual con-
trols generated dizziness among participants. This may result from the need
to perform quick and precise movements in a virtual environment, which can
overload participants’ sensory and visual systems. Additionally, the lack of
haptic feedback in autonomous control may have contributed to the dizziness,
as participants did not receive direct physical sensations when interacting with
the virtual wheelchair.
On the other hand, voice control exhibited the poorest performance in
terms of control accuracy and speed. Participants struggled to communi-
cate the appropriate commands accurately and experienced delays in the
wheelchair’s response. Although this control method did not generate as much
dizziness as the others, its unsatisfactory performance suggests that voice
recognition accuracy and capability improvements are necessary to make it viable in virtual environments.
The results obtained through statistical analysis indicate significant differ-
ences in average times among the scenarios, regardless of the type of control
used. Specifically, it was observed that the average times in scenarios 1 (Super-
market) and 3 (City) are similar, while scenario 2 (Museum) has shorter times.
This suggests that the scenario influences participants’ performance, with sce-
nario 2 being more efficient regarding time, regardless of the type of control
used. Additionally, significant differences were found in other variables such
as distance, speed, acceleration, and number of collisions among the control
groups and scenarios.
These results are consistent with the work of Rebsamen et al. [55], who noted that the context or environment in which a task is performed can significantly affect people's performance. For example, the study by Lu et al. [56] showed that familiarity with the environment, scenario complexity, and the presence of distractions can influence performance in navigation or driving tasks.
Overall, these findings highlight the importance of considering both per-
formance and user perception when selecting a control method for wheelchairs
in virtual environments. While autonomous and manual controls offer better
performance, the dizziness associated with these methods must be addressed
by reducing sensory and visual load. Voice control, although causing less dizzi-
ness, requires improvements in accuracy and recognition to be a viable option.
These results provide a solid foundation for future research and improve-
ments in designing and implementing wheelchair control systems in virtual
environments.
The study conducted in this work provided useful information about the efficiency and perception of controls in virtual environments simulating wheelchair navigation across different realistic scenarios. It aims to promote experimental setups for assessing the effects of autonomous and voice command-based controls in virtual environments and for identifying configurations that increase efficiency and reduce motion sickness. In general, these results can help improve the implementation of virtual reality in various fields and the design of better controls for a more effective and comfortable user experience. Thus, the experiences reported in this study may serve as the basis for novel studies of wheelchair navigation under different experimental conditions, in a risk-free and economical fashion.
However, it is important to consider the limitations of this study, such as the sample size and the generalizability of the results to other contexts; further research with a broader and more rigorous focus, including enhanced autonomous control and a better voice recognition algorithm, is recommended to confirm and extend these findings. In addition, the experience suggests that strategies to reduce sensory and visual load should be considered
when implementing these control methods in virtual environments, such as providing haptic feedback to enhance perception and alleviate motion sickness.

6 Conclusions
This paper showed that virtual reality environments are a valuable tool for designing and evaluating wheelchair navigation experiences under different control strategies. In summary, it emphasizes the importance of considering both performance and user perception when choosing a control method for virtual wheelchairs.
Although our experimental results show that autonomous and manual control offer better performance, the associated motion sickness issues must be addressed. Voice control, despite causing less motion sickness, requires improvements in accuracy and recognition to be a viable option. These findings lay the groundwork for future research and improvements in the development of wheelchair control systems in simulated virtual environments.

7 Funding
We would like to express our gratitude to Universidad Iberoamericana for the financial support and for granting access to the facilities and equipment necessary to conduct this research.

Data availability: The data that support the findings of this study are openly available on GitHub at https://github.com/Enrique1636/Virtual-Wheelchair-Files

8 Acknowledgments
E Pedroza-Santiago would like to thank CONAHCYT, Mexico, for the schol-
arship received.

9 Declarations
Conflict of interest: The authors declare no conflict of interest.
Informed consent: Informed consent was obtained from all subjects involved in the study.
References
[1] Choi, J.H., Chung, Y., Oh, S.: Motion control of joystick interfaced electric
wheelchair for improvement of safety and riding comfort. Mechatronics
59, 104–114 (2019). https://doi.org/10.1016/j.mechatronics.2019.03.005

[2] Kundu, A.S., Mazumder, O., Lenka, P.K., Bhaumik, S.: Design and per-
formance evaluation of 4 wheeled omni wheelchair with reduced slip
and vibration. Procedia Computer Science 105, 289–295 (2017). https:
//doi.org/10.1016/j.procs.2017.01.224

[3] Mahmud, S., Lin, X., Kim, J.-H., Iqbal, H., Rahat-Uz-Zaman, M., Reza,
S., Rahman, M.A.: A multi-modal human machine interface for con-
trolling a smart wheelchair. In: 2019 IEEE 7th Conference on Systems,
Process and Control (ICSPC), pp. 10–13 (2019). https://doi.org/10.1109/
ICSPC47137.2019.9068027. IEEE

[4] Abdulghani, M.M., Al-Aubidy, K.M., Ali, M.M., Hamarsheh, Q.J.:


Wheelchair neuro fuzzy control and tracking system based on voice recog-
nition. Sensors 20(10), 2872 (2020). https://doi.org/10.3390/s20102872

[5] Budiharto, W.: Intelligent controller of electric wheelchair from manual


wheelchair for disabled person using 3-axis joystick. ICIC Express Letters,
Part B: Applications 12(2), 201–206 (2021). https://doi.org/10.24507/
icicelb.12.02.201

[6] Letaief, M., Rezzoug, N., Gorce, P.: Comparison between joystick-and
gaze-controlled electric wheelchair during narrow doorway crossing: Fea-
sibility study and movement analysis. Assistive Technology 33(1), 26–37
(2021)

[7] Venkatesan, M., Mohan, L., Manika, N., Mathapati, A.: Voice-controlled
intelligent wheelchair for quadriplegics. In: Computer Communication,
Networking and IoT: Proceedings of ICICC 2020, pp. 41–51 (2021).
Springer

[8] Genova, C., Biffi, E., Arlati, S., Redaelli, D.F., Prini, A., Malosio, M., Cor-
betta, C., Davalli, A., Sacco, M., Reni, G.: A simulator for both manual
and powered wheelchairs in immersive virtual reality cave. Virtual Reality
26(1), 187–203 (2022). https://doi.org/10.1007/s10055-021-00547-w

[9] Hoter, E., Nagar, I.: The effects of a wheelchair simulation in a virtual
world. Virtual Reality 27(1), 407–419 (2023)

[10] Archambault, P.S., Bigras, C.: Improving wheelchair driving performance


in a virtual reality simulator. In: 2019 International Conference on Vir-
tual Rehabilitation (ICVR), pp. 1–2 (2019). https://doi.org/10.1109/
ICVR46560.2019.8994644. IEEE

[11] Li, W., Talavera, J., Samayoa, A.G., Lien, J.-M., Yu, L.-F.: Automatic
synthesis of virtual wheelchair training scenarios. In: 2020 IEEE Confer-
ence on Virtual Reality and 3D User Interfaces (VR), pp. 539–547 (2020).
https://doi.org/10.1109/VR46266.2020.00075. IEEE

[12] Yan, H., Archambault, P.S.: Augmented feedback for manual wheelchair
propulsion technique training in a virtual reality simulator. Journal of
neuroengineering and rehabilitation 18(1), 1–19 (2021). https://doi.org/
10.1186/s12984-021-00936-x

[13] Montalbán, M.A., Arrogante, O.: Rehabilitation through virtual real-


ity therapy after a stroke: A literature review. Revista Cientı́fica de
la Sociedad de Enfermerı́a Neurológica (English ed.) 52, 19–27 (2020).
https://doi.org/10.1016/j.sedeng.2020.01.001

[14] Jessica, P., Salim, S., Syahputra, M.E., Suri, P.A., et al.: A systematic lit-
erature review on implementation of virtual reality for learning. Procedia
Computer Science 216, 260–265 (2023). https://doi.org/10.1016/j.procs.
2022.12.135

[15] Vaughan, N., Dubey, V.N., Wainwright, T.W., Middleton, R.G.: A review
of virtual reality based training simulators for orthopaedic surgery. Med-
ical engineering & physics 38(2), 59–71 (2016). https://doi.org/10.1016/
j.medengphy.2015.11.021

[16] Nithva, B., Asha, V., Kumar, K., Giri, J.: Efficacious opportunities and
implications of virtual reality features and techniques. In: 2022 Fourth
International Conference on Cognitive Computing and Information Pro-
cessing (CCIP), pp. 1–8 (2022). https://doi.org/10.1109/CCIP57447.
2022.10058642. IEEE

[17] Aşkın, A., Atar, E., Koçyiğit, H., Tosun, A.: Effects of kinect-based virtual
reality game training on upper extremity motor recovery in chronic stroke.
Somatosensory & motor research 35(1), 25–32 (2018). https://doi.org/
10.1080/08990220.2018.1444599

[18] Hufnal, D., Osborne, E., Johnson, T., Yildirim, C.: The impact of con-
troller type on video game user experience in virtual reality. In: 2019
IEEE Games, Entertainment, Media Conference (GEM), pp. 1–9 (2019).
https://doi.org/10.1109/GEM.2019.8811543. IEEE

[19] Khurshid, J., Bing-Rong, H.: Military robots-a glimpse from today and
tomorrow. In: ICARCV 2004 8th Control, Automation, Robotics and
Vision Conference, 2004., vol. 1, pp. 771–777 (2004). https://doi.org/10.
1109/ICARCV.2004.1468925. IEEE
[20] Cooper, R.A., Spaeth, D.M., Jones, D.K., Boninger, M.L., Fitzgerald,
S.G., Guo, S.: Comparison of virtual and real electric powered wheelchair
driving using a position sensing joystick and an isometric joystick. Medical
engineering & physics 24(10), 703–708 (2002). https://doi.org/10.1016/
S1350-4533(02)00111-X

[21] Zagal, J.P., Tomuro, N., Shepitsen, A.: Natural language processing for
games studies research. Journal of Simulation & Gaming (S&G), Special
Issue on Games Research Methods 43(3), 353–370 (2011)

[22] Krishnamurthy, V., Rosary, B.J., Joel, G.O., Balasubramanian, S.,


Kumari, S.: Voice command-integrated ar-based e-commerce application
for automobiles. In: 2023 International Conference on Signal Process-
ing, Computation, Electronics, Power and Telecommunication (ICon-
SCEPT), pp. 1–5 (2023). https://doi.org/10.1109/IConSCEPT57958.
2023.10170152. IEEE

[23] Chaudhry, A., Batra, M., Gupta, P., Lamba, S., Gupta, S.: Arduino based
voice controlled robot. In: 2019 International Conference on Computing,
Communication, and Intelligent Systems (ICCCIS), pp. 415–417 (2019).
https://doi.org/10.1109/ICCCIS48478.2019.8974532. IEEE

[24] Subhash, S., Srivatsa, P.N., Siddesh, S., Ullas, A., Santhosh, B.: Artificial
intelligence-based voice assistant. In: 2020 Fourth World Conference on
Smart Trends in Systems, Security and Sustainability (WorldS4), pp. 593–
596 (2020). https://doi.org/10.1109/WorldS450073.2020.9210344. IEEE

[25] Cheung, V., Girouard, A.: Tangible around-device interaction using rota-
tory gestures with a magnetic ring. In: Proceedings of the 21st Interna-
tional Conference on Human-Computer Interaction with Mobile Devices
and Services, pp. 1–8 (2019). https://doi.org/10.1145/3338286.3340137

[26] Justesen, N., Bontrager, P., Togelius, J., Risi, S.: Deep learning for video
game playing. IEEE Transactions on Games 12(1), 1–20 (2019). https:
//doi.org/10.1109/TG.2019.2896986

[27] dos Santos Junior, E.S., de Oliveira Machado, A., Macedo, M.C., dos
Santos Souza, A.C.: Reinforcement learning para treino do pac-man em
speedrun/reinforcement learning for pac-man speedrun training. Brazilian
Journal of Development 5(11), 25927–25957 (2019). https://doi.org/10.
34117/bjdv5n11-242

[28] Waterreus, A.J.: Utilization of supervised and reinforcement learning in


the automation of the classical atari game “pong”. PhD thesis, The
University of Texas at San Antonio (2019)

[29] Shi, X., Wong, Y.D., Chai, C., Li, M.Z.-F.: An automated machine
learning (automl) method of risk prediction for decision-making of


autonomous vehicles. IEEE Transactions on Intelligent Transportation
Systems 22(11), 7145–7154 (2020). https://doi.org/10.1109/TITS.2020.
3002419

[30] Surmann, H., Jestel, C., Marchel, R., Musberg, F., Elhadj, H., Ardani, M.:
Deep reinforcement learning for real autonomous mobile robot navigation
in indoor environments. arXiv preprint arXiv:2005.13857 (2020). https:
//doi.org/10.48550/arXiv.2005.13857

[31] Nguyen, V.T., Sentouh, C., Pudlo, P., Popieul, J.-C.: Joystick haptic force
feedback for powered wheelchair-a model-based shared control approach.
In: 2020 IEEE International Conference on Systems, Man, and Cyber-
netics (SMC), pp. 4453–4459 (2020). https://doi.org/10.1109/SMC42975.
2020.9283235. IEEE

[32] Li, G., Yang, Y., Zhang, T., Qu, X., Cao, D., Cheng, B., Li, K.: Risk assess-
ment based collision avoidance decision-making for autonomous vehicles
in multi-scenarios. Transportation research part C: emerging technologies
122, 102820 (2021). https://doi.org/10.1016/j.trc.2020.102820

[33] Simon, C., Rajeswari, M.: Voice-based virtual assistant with security.
In: 2023 Second International Conference on Electronics and Renew-
able Systems (ICEARS), pp. 822–827 (2023). https://doi.org/10.1109/
ICEARS56392.2023.10085043. IEEE

[34] Yuvaraj, S., Badholia, A., William, P., Vengatesan, K., Bibave, R.: Speech
recognition based robotic arm writing. In: Proceedings of International
Conference on Communication and Artificial Intelligence: ICCAI 2021,
pp. 23–33 (2022). Springer

[35] Javed, A., Malik, K.M., Irtaza, A., Malik, H.: Towards protecting cyber-
physical and iot systems from single-and multi-order voice spoofing
attacks. Applied Acoustics 183, 108283 (2021). https://doi.org/10.1016/
j.apacoust.2021.108283

[36] Astuti, W., Riyandwita, E.B.W.: Intelligent automatic starting engine


based on voice recognition system. In: 2016 IEEE Student Conference on
Research and Development (SCOReD), pp. 1–5 (2016). https://doi.org/
10.1109/SCORED.2016.7810061. IEEE

[37] Chatziparaschis, D., Lagoudakis, M.G., Partsinevelos, P.: Aerial and


ground robot collaboration for autonomous mapping in search and rescue
missions. Drones 4(4), 79 (2020). https://doi.org/10.3390/drones4040079

[38] Jayakody, A., Nawarathna, A., Wijesinghe, I., Liyanage, S., Dissanayake,
J.: Smart wheelchair to facilitate disabled individuals. In: 2019 Interna-


tional Conference on Advancements in Computing (ICAC), pp. 249–254
(2019). https://doi.org/10.1109/ICAC49085.2019.9103409. IEEE

[39] Alim, M.A., Setumin, S., Rosli, A.D., Ani, A.I.C.: Development of a voice-
controlled intelligent wheelchair system using raspberry pi. In: 2021 IEEE
11th IEEE Symposium on Computer Applications & Industrial Electron-
ics (ISCAIE), pp. 274–278 (2021). https://doi.org/10.1109/ISCAIE51753.
2021.9431815. IEEE

[40] Faria, B.M., Ferreira, L., Reis, L.P., Lau, N., Petry, M., Couto, J.: Man-
ual control for driving an intelligent wheelchair: A comparative study of
joystick mapping methods. environment 17, 18 (2012)

[41] Grewal, H., Matthews, A., Tea, R., George, K.: Lidar-based autonomous
wheelchair. In: 2017 IEEE Sensors Applications Symposium (SAS), pp.
1–6 (2017). https://doi.org/10.1109/SAS.2017.7894082. IEEE

[42] Jerald, J., Giokaris, P., Woodall, D., Hartholt, A., Chandak, A., Kuntz, S.:
Developing virtual reality applications with unity. In: 2014 IEEE Virtual
Reality (VR), pp. 1–3 (2014). https://doi.org/10.1109/VR.2014.6802117.
IEEE

[43] Ferrone, H.: Learning C# by Developing Games with Unity 2020: An


Enjoyable and Intuitive Approach to Getting Started with C# Program-
ming and Unity. Packt Publishing Ltd (2020)

[44] Woodard, W., Sukittanon, S.: Interactive virtual building walkthrough


using oculus rift and microsoft kinect. In: SoutheastCon 2015, pp. 1–3
(2015). https://doi.org/10.1109/VR.2014.6802117. IEEE

[45] Chiang, H.-H., Chen, Y.-L., Wu, B.-F., Lee, T.-T.: Embedded driver-
assistance system using multiple sensors for safe overtaking maneuver.
IEEE Systems Journal 8(3), 681–698 (2012). https://doi.org/10.1109/
JSYST.2012.2212636

[46] Liu, Q., Zhao, L., Tan, Z., Chen, W.: Global path planning for autonomous
vehicles in off-road environment via an a-star algorithm. International
Journal of Vehicle Autonomous Systems 13(4), 330–339 (2017)

[47] Graham, R., McCabe, H., Sheridan, S.: Pathfinding in computer games.
The ITB Journal 4(2), 6 (2003)

[48] Wang, C., Savkin, A.V., Garratt, M.: A strategy for safe 3d navigation of
non-holonomic robots among moving obstacles. Robotica 36(2), 275–297
(2018). https://doi.org/10.1017/S026357471700039X
[49] Dalal, P., Sharma, T., Garg, Y., Gambhir, P., Khandelwal, Y.: “jarvis”-
ai voice assistant. In: 2023 1st International Conference on Innovations in
High Speed Communication and Signal Processing (IHCSP), pp. 273–280
(2023). https://doi.org/10.1109/IHCSP56702.2023.10127134. IEEE

[50] Matarneh, R., Maksymova, S., Lyashenko, V., Belova, N.: Speech recog-
nition systems: A comparative review (2017)

[51] Villar Miguélez, C.: Diseño de un videojuego orientado a mejorar el proceso de enseñanza-aprendizaje de la lengua inglesa [Design of a video game aimed at improving the teaching-learning process of the English language] (2014)

[52] Salimi, Z., Ferguson-Pell, M.W.: Motion sickness and sense of presence
in a virtual reality environment developed for manual wheelchair users,
with three different approaches. PloS one 16(8), e0255898 (2021). https:
//doi.org/10.1371/journal.pone.0255898

[53] Shams, A., Nobari, H., Afonso, J., Abbasi, H., Mainer-Pardos, E., Pérez-
Gómez, J., Bayati, M., Bahrami, A., Carneiro, L.: Effect of aerobic-based
exercise on psychological well-being and quality of life among older people:
a middle east study. Frontiers in Public Health 9, 764044 (2021). https:
//doi.org/10.3389/fpubh.2021.764044

[54] Lee, S.W., et al.: Methods for testing statistical differences between
groups in medical research: statistical standard and guideline of life cycle
committee. Life Cycle 2 (2022). https://doi.org/10.54724/lc.2022.e1

[55] Rebsamen, B., Guan, C., Zhang, H., Wang, C., Teo, C., Ang, M.H.,
Burdet, E.: A brain controlled wheelchair to navigate in familiar environ-
ments. IEEE Transactions on Neural Systems and Rehabilitation Engi-
neering 18(6), 590–598 (2010). https://doi.org/10.1109/TNSRE.2010.
2049862

[56] Lu, Y., Fu, X., Lu, C., Guo, E., Tang, F., Zhu, J., Li, H.: Effects of
route familiarity on drivers’ psychological conditions: Based on driving
behaviour and driving environment. Transportation research part F: traf-
fic psychology and behaviour 75, 37–54 (2020). https://doi.org/10.1016/
j.trf.2020.09.005
