Introduction to Unmanned Systems: Air, Ground, Sea & Space

Chapter 1: Introduction to Unmanned Systems

[mh]Historical Evolution

The historical evolution of unmanned systems is a fascinating journey that spans centuries and
encompasses a wide array of technological innovations, military applications, and societal impacts. From
the earliest experiments with unmanned aerial vehicles to the cutting-edge developments in autonomous
robotics, the evolution of unmanned systems reflects humanity's quest for exploration, efficiency, and
innovation.

The roots of unmanned systems can be traced back to ancient times, with the earliest recorded use of
unmanned devices dating back to ancient China. In the 13th century, the Chinese military employed
primitive forms of unmanned aerial vehicles known as "fire arrows" or "fire dragons" to launch
incendiary projectiles at enemy forces during battles. These early unmanned devices laid the groundwork
for the development of more sophisticated unmanned systems in the centuries that followed.

During the Renaissance period, Leonardo da Vinci conceptualized various designs for unmanned flying
machines, including the ornithopter, a mechanical device designed to mimic the flight of birds. While da
Vinci's designs were never realized during his lifetime, they laid the foundation for future advancements
in aviation and unmanned flight.

The modern era of unmanned systems began to take shape during the 20th century, with the emergence of
radio-controlled aircraft and unmanned aerial vehicles (UAVs) for military reconnaissance and
surveillance purposes. In the early 20th century, radio-controlled aircraft were used by the military for
target practice and aerial photography, laying the groundwork for the development of more advanced
UAVs in the decades that followed.

During World War I, both the Allies and the Central Powers experimented with unmanned aerial vehicles
for reconnaissance and bombing missions. One notable example is the Kettering Bug, an early form of
cruise missile developed by the United States military for use against enemy targets. While the Kettering
Bug was never deployed in combat, it laid the groundwork for the development of more sophisticated
unmanned aerial vehicles in the years that followed.

The interwar period saw continued advancements in unmanned systems, with the development of radio-
controlled aircraft for military reconnaissance and target practice. These early unmanned aircraft were
used by the military for training purposes and as target drones for anti-aircraft gunnery practice.

During World War II, unmanned aerial vehicles played a significant role in military operations, with both
the Allies and the Axis powers employing them for reconnaissance, surveillance, and bombing missions.
One notable example is the German V-1 flying bomb, a pilotless aircraft equipped with an explosive
payload that was used by the Luftwaffe to attack targets in Britain during the later stages of the war.

Following World War II, unmanned systems continued to evolve rapidly, with advancements in
technology enabling the development of more sophisticated unmanned aerial vehicles, unmanned ground
vehicles, and unmanned maritime vehicles. The Cold War era saw significant investments in unmanned
systems by military powers, with the United States and the Soviet Union both developing a wide array of
unmanned platforms for reconnaissance, surveillance, and combat operations.

The late 20th century saw the emergence of unmanned aerial vehicles (UAVs) as a transformative
technology with a wide range of military and civilian applications. The Gulf War in 1991 marked the first
large-scale deployment of UAVs in combat, with the United States military using them for
reconnaissance, surveillance, and targeting missions.

In the early 21st century, advancements in technology, miniaturization, and artificial intelligence led to a
proliferation of unmanned systems across a wide range of domains, including military, civilian, and
commercial applications. The use of unmanned aerial vehicles (UAVs) expanded beyond the military to
include applications such as aerial photography, surveying, agriculture, and disaster response.

Today, unmanned systems are increasingly integrated into everyday life, with drones being used for a
wide range of applications, from aerial photography and videography to package delivery and
infrastructure inspection. Advancements in artificial intelligence, sensor technology, and communication
systems continue to drive innovation in unmanned systems, enabling them to perform increasingly
complex tasks with greater autonomy and efficiency.

Looking ahead, the future of unmanned systems holds tremendous promise, with potential applications
ranging from space exploration and colonization to autonomous transportation and environmental
monitoring. As unmanned systems continue to evolve and proliferate, it is essential to address the ethical,
legal, and societal implications of their use, ensuring that they are deployed responsibly and in accordance
with established norms and regulations. By harnessing the potential of unmanned systems while
addressing the challenges they pose, humanity can unlock new opportunities for exploration, discovery,
and innovation in the years to come.

[mh]Types of Unmanned Systems

Micro electro-mechanical systems (MEMS) technology enables small, lightweight, yet accurate inertial measurement units (IMUs) as well as inertial and magneto-resistive sensors. The latest advances in high-density power storage offer high-density battery cells, allowing lightweight unmanned aerial systems (UASs) with flight times long enough for a variety of missions. So far, UASs have been used mainly for military applications, chiefly reconnaissance but also engagement. However, increasing attempts are being made to use such systems in non-military scenarios. Within the ICARUS project, several UAS platforms are being developed to support human teams in disaster scenarios as part of an integrated set of unmanned search and rescue (SAR) systems. While a wide range of aerial platforms with basic functionality exists, the UASs need to be adapted with functions specific to the ICARUS framework and its scenarios.

Autonomous UASs for SAR applications are suited as remote sensors for mapping the disaster area, inspecting the local infrastructure, and efficiently localising people, and as tools for fast interaction with them. As an additional capability, the UAS tools will be complemented with a communication relay functionality to establish their own communication network in case the local infrastructure is broken. To fulfil these missions, the UAS systems have to feature accurate on-board localisation and control while still requiring minimum power. Power consumption plays a crucial role for UASs, since it determines the operation time while carrying the on-board payload. The systems need to work autonomously with minimal human interaction, allowing human forces to focus on other tasks. Furthermore, the project intends to provide solutions for adapting to unknown or even dynamically changing environments: as they operate close to people, the platforms must function robustly without hitting anything or anyone.

All necessary payloads for the SAR missions need to be identified and integrated with the platforms while still meeting the tight specifications on endurance and manoeuvrability. This includes additional sensors and devices for flight autonomy and for integration into the ICARUS framework.

[h]UAS applications for SAR missions

SAR missions can benefit from the use of UASs in many distinct ways. Their unique ability to operate at altitude provides capabilities that robots or humans on the ground can never achieve. Within the ICARUS project, five distinct functionalities of UASs have proven useful:

1. Mapping/sectorisation: A UAS can efficiently provide operators on-site, and even higher-level coordinators, with valuable aerial information over an extended period of time and at a large scale. This information can be used for planning and risk assessment. A direct video stream can give the operator valuable live information about the situation. Furthermore, the aerial data can be processed to build a 3D map of the scanned area or building.
2. Victim search: Searching for victims is one of the key elements of the ICARUS system. Relying largely on the on-board thermal cameras, a large area can be scanned, and potential victim locations are forwarded to the operator automatically for verification.
3. Target observation: The UAS provides the operator/coordinators with a pair of remote eyes in the sky. A specified location can be watched for an extended period of time without the operator having to handle low-level control of the UAS. For the long-endurance UAS, which is intrinsically incapable of keeping the camera view constant over time, circling the target is the default procedure. The observation target can also involve assessing the structural integrity of buildings, from the outside or the inside. In addition, the captured and registered images can also be used for mapping and sectorisation.
4. Delivery: Items worth delivering by the UAS include, for example, bottles of water, medical packages and potentially inflatable buoyancy aids. The UAS must be capable of carrying those items and deploying them at a user-specified location.
5. Communication relay: In a distributed network of robotic and human agents, communication plays a key role. UASs naturally offer good visibility over a large area and are thus suited to acting as communication relays when integrated in a common network. This functionality differs from the previous ones, since the request to act as a communication relay will typically be received while UAS platforms are on other missions (though it is of course not limited to this situation). In this case, the prioritisation of the communication relaying (i.e. possible interruption/relocation) over the current mission is handled by the coordinator or the UAS operator. A flight plan may be suggested automatically, providing the coverage necessary for performing the relaying task.

In order to provide efficient tools for the defined functionalities, a set of heterogeneous UASs has been developed. The goal of all UAS platforms is to efficiently gather information about the disaster area and possible victims, to provide initial life support and interaction with victims, to assist the SAR teams and ground robots with additional information, and to act as a communication relay in case the local infrastructure is broken. During a SAR mission, different levels of precision of the provided information are needed, which led to platforms with different types of area coverage and speed.

The first platform is a long-endurance fixed-wing UAS, as shown in the top image of Figure. This UAS flies at low altitudes, in the airspace 150–200 m above the ground. It can provide aerial images from both visual and thermal cameras. With its long endurance ranging from several hours up to a few days and its autonomous capabilities, the long-endurance UAS is able to cover large areas in a short period of time. Thus, it is most useful in the planning phase of the SAR teams. It can be launched at the base of operation (BOO) before the SAR team reaches the disaster site. Its main purpose is to complement the aerial images gathered from satellites for a first assessment of the disaster site, including pointing at potential positions of victims. It has a fast deployment time of a few hours, and thanks to its hand-launch capability and autonomy it is easy to operate. The (potential fleet of) long-endurance UAS can be given a scan area to cover with aerial images. It will autonomously scan the area and return to the BOO to deliver the image data for further processing. This information can be used as a first overview of the level of destruction at the disaster site and as an assessment of the local infrastructure, for example, the roads leading to the disaster site for moving to and installing the forward base of operation (FBOO). Furthermore, with its high ground clearance this UAS is an ideal candidate to act as a communication relay, connecting the different SAR teams at a later stage while still providing aerial information.

Figure :UAS fleet within ICARUS. Top: AtlantikSolar from ETH. Bottom left: AROT from EURECAT.
Bottom right: Indoor multirotor from Skybotix

The second platform is a large outdoor quadrotor shown in the bottom left image of Figure. This UAS is used for detailed observation and for gathering thermal and visual aerial imagery at low altitudes, up to 100 m above the ground. It packs into a compact box and can be transported to and used at the FBOO, or be used during displacement of the SAR team. It is able to operate autonomously while mapping a given area in detail, or it can be piloted directly over the remote command and control system (RC2) if the operator wants to assess specific details. It is able to map a small area of interest with a much higher resolution than the fixed-wing UAS. Furthermore, due to its lower altitude it is able to confirm the possible locations of victims found by the fixed-wing UAS and to search for and detect additional victims in that area. Since the quadrotor is able to hover, it is suited to mapping the walls of buildings for structural integrity checks. Furthermore, it can interact with victims to assess their health and, with the help of the integrated delivery system, provide water or first aid kits to them.

The last platform is the indoor multirotor, as shown in the bottom right of Figure. This multirotor with a small footprint can be used to inspect buildings from the inside when their structural integrity might be compromised. In comparison to unmanned ground vehicles (UGVs), the aerial multirotor has the advantage of easily accessing buildings without being blocked by possible debris inside, as long as there is aerial clearance. Furthermore, it can easily climb different floors for inspection. This UAS assists the SAR teams in assessing the structural properties and mapping of the building without risking the life of a human rescue team member. As there is in general no map of the inside available, and debris might block the paths, the UAS must be piloted by an operator over the RC2, while a video stream is fed back to enable first-person-view (FPV) flight. Since the operator will not have complete oversight of the path, the UAS needs to be aware of potential obstacles and avoid them if necessary.

All these platforms have a very distinctive purpose and complement each other to form a complete set of aerial robots for different possible SAR scenarios. They operate at different locations and altitudes for airspace separation and collision avoidance between the UASs.

[h]UAS mechanical design

As discussed, the UAS fleet with its sensors can provide a great deal of situational awareness for the SAR teams. A great variety of drones is available on the market. However, they still have only limited autonomy due to limited sensor capabilities and thus need constant supervision by a trained pilot. The main task within the ICARUS project is thus to extend the autonomy and situational awareness of the systems so that they can help the SAR teams in their missions with limited human interaction.

[h]Visual sensor payload

Since all the platforms have similar requirements for the visual sensor payload, a common visual sensor payload was chosen, reducing overall development and maintenance. This visual inertial (VI) sensor (see Figure) combines visual information from up to four visual and/or thermal cameras with the information from the inertial measurement unit (IMU). This forms a complete measurement set for vision-based mapping and localisation. Since the resolution and weight constraints are not yet met by the thermal camera development in WP 210, an FLIR Tau 2 thermal camera is used on all platforms. The sensors are hardware-synchronised for tight fusion in the image processing algorithms. The unit is suitable for sensing and for on-board processing of simultaneous localisation and mapping (SLAM) as well as application-oriented algorithms (cartography, victim detection). The core consists of a field programmable gate array (FPGA) for visual pre-processing as well as a processing unit (Kontron COM Express module) that can be exchanged and allows for the use of standard tools (e.g. installing a standard Linux operating system and running the vision algorithms). The unit was designed for the low-power and low-weight constraints of use on board small UASs. The mass amounts to 150 g, while the power consumption is around 10–15 W.
Figure :VI sensor hardware with two visual cameras and mounted IMU

The sensor unit can be used for accurate pose estimation and mapping in real time in all six dimensions (position and orientation). The integrated FPGA can provide raw visual data as well as pre-processed visual data such as visual keypoints used for SLAM and mapping. The team of the Swiss Federal Institute of Technology in Zurich (ETHZ) has verified the mapping capabilities of the VI sensor in GPS-denied environments; a thorough analysis may be found in Ref. .

The raw and processed information from the VI sensor can be used for the mapping, victim search, as
well as for the target observation algorithms. Furthermore, increased control performance can be achieved
using the improved pose estimation.

[h]AtlantikSolar

AtlantikSolar from ICARUS partner ETHZ was chosen as the fixed-wing long-endurance UAS. It is a small and lightweight solar-powered UAS. Concerning solar UASs, prototypes of different sizes and wingspans have been successfully operated by NASA, QinetiQ, Airbus, and others. The integrated solar technology extends the overall flight time, up to perpetual flight. This makes such a UAS a perfect candidate for extended information gathering over large-scale areas and for acting as an airborne communication relay.

What distinguishes the AtlantikSolar UAS (as shown in Figure) from others is its capability to fly at low altitudes and its transportation and fast deployment capabilities due to its small size and weight. The UAS has a wingspan of 5.6 m and an overall weight of 7.5 kg. Due to the small weight, it can be launched by hand and thus can be deployed at almost any wide open location without the need of an intact airstrip. The main wing consists of three parts, which can be disassembled for transportation. The ribs of the main wing are built out of balsa wood. These profile-giving structures are interconnected by a spar made out of carbon-fibre tubes, which runs from wingtip to wingtip. This results in a lightweight structure and allows the transportation of the necessary battery packs with a total weight of 3.5 kg and a capacity of 34.5 Ah at a nominal voltage of 21.6 V. The batteries are stored within the carbon-fibre tubes throughout the main wing. The whole top surface of the main wing is used to embed solar modules. Flying at a nominal speed of 9.5 m/s, it is capable of flying for up to 10 days under appropriate weather conditions. The nominal power usage is around 60 W, while the solar panels generate up to 250 W at noon. The complete specifications can be found in Ref. . The long-endurance capability was demonstrated with a successful flight that set a new world record of over 81 hours of continuous flight, travelling 2316 km in July 2015 in Switzerland.
Figure :Specification of the AtlantikSolar UAS

The UAS autopilot is based on the open-source Pixhawk project, which was adapted for autonomous waypoint-based navigation. It is equipped with a sensor suite consisting of an inertial measurement unit (IMU), a magnetometer for determining the heading of the UAS, a global positioning system (GPS) device, a pitot tube measuring the airspeed, and a static pressure sensor. These sensors are fused to estimate the pose of the UAS at all times, as described in Ref. . An additional sensor payload (SensorPod) has been mounted on the wing and incorporates the VI sensor together with the communication system. The sensor payload consists of an Atom motherboard with the Linux operating system running ROS. The ROS interface is used for running the vision algorithms as well as for communicating over the adapted joint architecture for unmanned systems (JAUS) protocol within the ICARUS mesh.

As depicted in Figure, the operator may control the UAS either directly by radio control or via the ground control station (GCS). Manual control is intended for safety reasons only; in normal operation, the UAS is controlled via the GCS by providing a flight path. Two different long-range communication channels are integrated for the data link to ensure connectivity throughout the whole operation: a high-bandwidth link using 5 GHz Wi-Fi technology for the camera transmissions, and a long-range low-bandwidth communication device for control and operation of the UAS.
Figure :Communication structure of the AtlantikSolar UAS

Towards complete autonomy, the AtlantikSolar UAS implements path-planning algorithms that address the problem of area coverage and exploration. The algorithm tackles the problem of covering a large area, taking into account the dynamics of a fixed-wing UAS with a rigidly mounted camera. The implemented algorithm is based on sampling-based methods and searches for the optimal path that ensures full exploration of a given area in minimum time. The implemented framework respects the non-holonomic constraints of the vehicle. Results and detailed explanations can be found in Ref. . The framework has been successfully tested in multiple cases, including the ICARUS public field trials in Barcelona/CTC and Marche-en-Famenne in Belgium. The results can be seen in Figure, where the generated map of the Marche-en-Famenne trial site is shown together with the optimised path for total area coverage. Thus, the operator only needs to provide a desired area to be covered, while the UAS plans the whole mission fully autonomously and returns to deliver the gathered information. A simplified illustration of the coverage idea is sketched after the figure caption below.
Figure :Optimal path planning for the AtlantikSolar UAS. The path is optimal to cover the trial site in
Marche‐en‐Famenne. The images are used to form the shown map
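As a hedged illustration of the coverage-planning idea (not the ICARUS sampling-based planner, which optimises over the vehicle's non-holonomic dynamics), the following minimal Python sketch generates a boustrophedon ("lawnmower") waypoint sweep for a rectangular area; the function name, area dimensions, and camera footprint width are illustrative assumptions.

```python
# Minimal boustrophedon coverage sketch (illustrative only): generate
# parallel sweep lines spaced by the camera footprint so that a
# camera-down platform would image the whole rectangle. The real ICARUS
# planner is sampling-based and honours fixed-wing turn constraints.

def lawnmower_waypoints(x_min, x_max, y_min, y_max, footprint_width):
    """Return an ordered list of (x, y) waypoints sweeping the rectangle."""
    waypoints = []
    y = y_min + footprint_width / 2.0
    left_to_right = True
    while y <= y_max:
        row = [(x_min, y), (x_max, y)]
        if not left_to_right:
            row.reverse()
        waypoints.extend(row)
        left_to_right = not left_to_right
        y += footprint_width  # adjacent sweeps just touch; shrink for overlap
    return waypoints

# Example: cover a 500 m x 300 m area with a 60 m wide camera footprint.
if __name__ == "__main__":
    path = lawnmower_waypoints(0, 500, 0, 300, 60)
    print(len(path), "waypoints, first four:", path[:4])
```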

Finally, the UAS is able to detect possible locations of human victims using both the thermal and visual camera information provided by the VI sensor (see Figure). Since the human body temperature is generally higher than that of the surroundings, the thermal imagery can be used to detect humans. Due to the restricted resolution of the thermal camera, the visual camera can be used to cross-reference the found location. The algorithm works by subtracting the background from the thermal image and comparing the resulting regions of interest with known human features. A complete explanation of the algorithm can be found in Ref. . The victims are tracked over time by the UAS, as long as they stay within the camera coverage, and their location is sent back and displayed at the RC2. A minimal sketch of this pipeline follows the figure caption below.
Figure :Victim detection with the UAS. Possible locations of human bodies are marked in the picture and sent to the RC2
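A minimal sketch of the described thermal pipeline, assuming OpenCV and 8-bit thermal frames; the blur kernel, threshold, and blob-size limits are illustrative assumptions rather than the values used in the ICARUS implementation.

```python
import cv2
import numpy as np

def detect_warm_blobs(thermal, min_area=20, max_area=2000):
    """Return bounding boxes of warm regions in an 8-bit thermal image.

    Sketch of the described pipeline: estimate the background with a
    large median blur, subtract it, threshold what remains, and keep
    blobs whose size is plausible for a human seen from the air.
    """
    background = cv2.medianBlur(thermal, 31)        # coarse background estimate
    foreground = cv2.subtract(thermal, background)  # warm anomalies remain
    _, mask = cv2.threshold(foreground, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if min_area <= cv2.contourArea(c) <= max_area:
            boxes.append(cv2.boundingRect(c))       # (x, y, w, h) candidate
    return boxes
```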

[h]LIFT quadrotor

The lightweight and integrated platform for flight technologies (LIFT) coaxial-quadrotor developed by EURECAT is a hovering UAS with vertical take-off and landing (VTOL) capabilities, able to carry out different tasks in the context of SAR operations. Quadrotors like the LIFT are a popular configuration among VTOL UASs. The design is suitable for robust hovering and omnidirectional motion. Further, it can generate high torques on its main body for highly agile manoeuvres. The hovering capability and payload capacity of the Eurecat quadrotor complement the functionality of the AtlantikSolar UAS. It is able to fly closer to the ground for detailed mapping, to lock onto and approach potential victims, and to deliver goods to them. Thanks to its hover capability, it can be controlled by the operator to give a steady aerial image feed.

The LIFT (see Figure) is a large coaxial-quadrotor with a weight of 4.3 kg. The length from shaft to shaft is 0.86 m, and each end carries two propellers in a coaxial configuration. The propellers can lift a maximum weight of 9.6 kg. The main frame consists of carbon fibre tubes connected at the centre, with the autopilot unit at the top and a variable payload attachment at the bottom between the landing gear. The operational altitude is between 25 and 100 m above the ground. Two lithium polymer batteries with a capacity of 10 Ah at a nominal voltage of 21.6 V power the UAS. The flight time depends on the payload used and can be up to 30 minutes before the batteries must be replaced. The UAS can be safely stored in a box for transport to the destination point.
Figure :The Eurecat coaxial‐quadrotor with the camera payload

The UAS autopilot architecture is similar to that of the AtlantikSolar UAS. A Pixhawk autopilot is combined with a netbook processor running Linux and ROS for communication in the ICARUS mesh and for running the vision algorithms. The UAS is equipped with state-of-the-art proprioceptive sensors (accelerometers, gyroscopes, and altimeter) used to stabilise the attitude of the UAS within the autopilot, as explained in Ref. . Using the GPS measurements, the position of the UAS can be controlled. Additionally, the UAS is equipped with perceptive sensors (VI sensor and range) for enhanced state estimation. The platform can fly autonomously or be piloted remotely. It has an advanced set of sensors to carry out different tasks in the context of SAR operations, such as first aid kit delivery, outdoor victim search, terrain mapping and reconnaissance. It has several communication links, starting with a radio control link to control the UAS directly. This link has an override mechanism to take over the UAS in case of emergency. The primary communication link is used to send basic information between the GCS and the UAS so that it can be monitored constantly. Further, the ICARUS communication link is incorporated inside the payload to connect to the ICARUS mesh using the Wi-Fi connection. The UAS can be operated anywhere from manually up to fully autonomously, following a desired path set within the GCS or the RC2.

Using the VI sensor with the thermal camera as payload, LIFT is able to map small areas and buildings from the outside. Compared to the fixed-wing UAS, it can fly close to the buildings to gather detailed visual information. This information can be used to assess the structural integrity of the buildings. Using the live video stream, the operator can precisely guide the UAS to reach and lock onto specific locations or targets for further investigation. The mapping capability was demonstrated at the Marche-en-Famenne trial site, where two buildings were entirely mapped, as shown in Figure :

Figure :Map of two buildings from the Marche‐en‐Famenne trial site created by the Eurecat quadrotor

The visual and thermal cameras can likewise be used for victim detection. Detected victim positions can be sent to and displayed at the RC2. The aerial capability of the UAS is beneficial for detecting humans within a forest or rubble field, where humans or robots on the ground can only move slowly. Compared to the fixed-wing UAS, LIFT has the advantage of flying at lower altitudes, hovering at a certain point, and moving omnidirectionally. Thus, it can cover an area in much more detail and from different specific views to increase the probability of finding the victims. One such potential scenario is shown in Figure, where the UAS is capable of detecting a human in the forest partially covered by a tree. This also shows the advantage of the thermal camera over the visual camera, in which the victim is barely visible. The people detector takes advantage of histograms of oriented gradients (HOGs) using a sliding-window approach over the images. One of the main reasons to use the HOG human detector is that it uses a 'global' feature to describe a person rather than a collection of 'local' features, that is, the entire person is represented by a single feature vector. Another important property is that this classifier comes already trained. However, the training data set is limited to certain people postures and camera orientations. One drawback of the HOG detector is that people need to have a relatively large size in the image (around 64 × 128 pixels); it fails when a person occupies a small part of the image (a common situation for aerial images). For this reason, a region-growing algorithm based on temperature blobs is implemented to search for victims. A code sketch of the HOG stage follows the figure caption below.
Figure :Victim detection using the Eurecat quadrotor. The victim lying in the forest and partially covered
by a tree is found using the thermal images
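The sliding-window HOG stage described above can be reproduced with OpenCV's stock people detector, which is a pre-trained global HOG + linear SVM with a 64 × 128 pixel window; the stride, padding, and pyramid scale below are illustrative assumptions.

```python
import cv2

# Pre-trained HOG + linear SVM people detector (64 x 128 window), applied
# with a sliding window over an image pyramid, as described in the text.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_people(image_bgr):
    """Return (boxes, scores) for people large enough in the image."""
    boxes, weights = hog.detectMultiScale(
        image_bgr,
        winStride=(8, 8),   # sliding-window step in pixels
        padding=(8, 8),
        scale=1.05,         # pyramid scale between detection passes
    )
    return boxes, weights
```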

Finally, the UAS payload can be exchanged for a delivery mechanism, which uses an electromagnet as the release mechanism for the goods. With a magnet attached to the delivery kit, the kit can be easily connected to the UAS. As soon as the UAS reaches the destination point, the operator can disable the electromagnet and deploy the kit, as shown in Figure. It is used to help injured but responsive victims by providing them with necessary supplies and treatment until the ground rescue teams arrive.
Figure :Eurecat quadrotor delivering a first aid kit to an injured victim

[h]Skybotix multicopter

The Skybotix multicopter is a hexacopter with a small footprint. Like the LIFT, it is capable of VTOL flight and hovering. However, it is mainly used to inspect the inside of buildings. With its small footprint, it is able to enter buildings through small openings such as open windows or doors. The platform was chosen because it can hover robustly without being sensitive to wind and other disturbances, while still being able to fly through narrow passages. Capable of flying indoors, it can navigate and help analyse the structural integrity of the building from the inside and search for humans possibly trapped inside the building.

The Skybotix multicopter, shown in Figure, is a modified version of the AscTec Firefly with a weight of 1.4 kg including the additional sensor payload of 420 g. It has six propellers in a hexagonal configuration with a radius of 0.66 m and a height of 0.17 m. Soft propellers are used so as not to harm people. The UAS has some integrated fault tolerance, since it can fly even with only five propellers. The propellers are connected with carbon fibre tubes to the centre of the main body. A battery with a capacity of 4.9 Ah at a voltage of 12 V powers the UAS. It has a maximum flight time of 15 minutes before the battery needs to be replaced. The upper part of the main body encapsulates the proprioceptive sensors and the microprocessor for control, whereas below, the VI sensor, communications, and an additional CPU are mounted for mapping and integration into the ICARUS mesh.

Figure :Skybotix multicopter flying through a window for inspecting a building from the inside

The UAS can be flown manually over the dedicated remote control link or over the RC2. It has many different modes of operation, from attitude-stabilised mode to a fully position-assist mode. Since the indoor environment is not known a priori and could be cluttered, an operator has to fly the UAS constantly by providing translational velocity commands in the position-assist mode. A constant video stream provides feedback for the operator. The basic idea of the obstacle avoidance is to constrain the commanded velocity vector of the UAS by superimposing a repelling velocity. This is done by using a potential field around the obstacles that generates a repelling velocity, which increases in magnitude as the distance between the UAS and the obstacle decreases. The tracked velocity command is thus the average of the user set point together with the repelling velocities of all obstacles in the vicinity. Although the user commands are given as velocity set points, the UAS itself is controlled in position to reject position drift as long as there is no user input to the system, even in the presence of external disturbances such as wind gusts. A detailed explanation of the control can be found in Ref. .
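A minimal sketch of the described velocity shaping, assuming point obstacles and an inverse-distance potential; the influence radius and gain are illustrative parameters, and the averaging of the user set point with the repelling velocities follows the description above literally.

```python
import numpy as np

def repelling_velocity(uav_pos, obstacle_pos, influence_radius=2.0, gain=1.0):
    """Repelling velocity from one obstacle: grows as the distance shrinks."""
    offset = uav_pos - obstacle_pos
    dist = np.linalg.norm(offset)
    if dist >= influence_radius or dist == 0.0:
        return np.zeros(3)
    # Magnitude increases as the UAS approaches the obstacle.
    magnitude = gain * (1.0 / dist - 1.0 / influence_radius)
    return magnitude * offset / dist

def commanded_velocity(user_setpoint, uav_pos, obstacles):
    """Average the operator set point with all repulsions in the vicinity."""
    contributions = [np.asarray(user_setpoint, dtype=float)]
    for obs in obstacles:
        v = repelling_velocity(uav_pos, np.asarray(obs, dtype=float))
        if np.any(v):
            contributions.append(v)
    return np.mean(contributions, axis=0)
```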

In order to perform obstacle avoidance and UAS navigation, the UAS is required to have a notion of its surroundings. To this end, the VI sensor is used to generate depth images, which are subsequently incorporated into a 3D voxel grid. The corresponding depth image is estimated using a time-synchronised stereo-image pair from the VI sensor. A block-matching algorithm is used to find corresponding objects in both image frames, which can be triangulated knowing the baseline between both cameras. The depth images are only estimated once the UAS has moved too far outside the known terrain and a new pose key frame must be generated; this avoids short-term drifts in the localisation and map generation and reduces the computational burden. Using the pose key frame, the depth image can be transformed into a 3D point-cloud. Since this point-cloud can quickly become intractable for large areas, the information must be compressed into an efficient grid-based map. Within this research, the OctoMap mapping framework is used, which results in an environmental representation as shown in Figure. This representation uses a resizable grid, where each node represents a probability of being occupied. All the node probabilities are updated by incorporating the new point-cloud with its associated noise-model uncertainty. A simplified sketch of this occupancy update follows the figure caption below.

Figure :OctoMap representation of the interior of a building generated by the Skybotix UAS
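The OctoMap-style update can be imitated with a flat (non-octree) voxel grid in which every voxel hit by a measured point has its log-odds occupancy raised. This is a simplified sketch: the resolution, hit increment, and clamping value are assumptions, and the free-space ray update performed by the real framework is omitted for brevity.

```python
import numpy as np
from collections import defaultdict

class VoxelGrid:
    """Flat stand-in for an OctoMap: log-odds occupancy per voxel."""

    def __init__(self, resolution=0.1, logodds_hit=0.85, clamp=3.5):
        self.resolution = resolution
        self.logodds_hit = logodds_hit      # increment per supporting point
        self.clamp = clamp                  # keep the map updatable
        self.logodds = defaultdict(float)   # voxel index -> log-odds

    def _key(self, point):
        return tuple(np.floor(np.asarray(point) / self.resolution).astype(int))

    def integrate_pointcloud(self, points_world):
        """Raise occupancy for every voxel containing a measured point."""
        for p in points_world:
            k = self._key(p)
            self.logodds[k] = min(self.logodds[k] + self.logodds_hit, self.clamp)

    def occupancy(self, point):
        """Probability that the voxel containing `point` is occupied.

        Unknown voxels default to log-odds 0, i.e. probability 0.5.
        """
        l = self.logodds.get(self._key(point), 0.0)
        return 1.0 / (1.0 + np.exp(-l))
```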

[mh]Applications and Use Cases

Applications and use cases for unmanned systems span a wide range of industries and domains, from military and defense to civilian and commercial applications. Unmanned systems, including unmanned aerial vehicles (UAVs), unmanned ground vehicles (UGVs), unmanned maritime vehicles (UMVs), and unmanned space systems (USS), offer numerous benefits, including enhanced safety, increased efficiency, and expanded capabilities in various tasks and operations. This section explores the diverse applications and use cases of unmanned systems across different sectors, highlighting their impact and potential for innovation.

Military and Defense Applications: Unmanned systems have revolutionized military and defense
operations, offering capabilities that enhance situational awareness, intelligence gathering, and combat
effectiveness. UAVs, also known as drones, are extensively used for reconnaissance, surveillance, and
target acquisition (RSTA) missions, providing real-time aerial imagery and data to military commanders
and intelligence analysts. Additionally, UAVs equipped with precision-guided munitions can conduct
precision strikes against enemy targets with minimal risk to human operators, reducing the need for
manned aircraft in combat operations. UGVs are utilized for various tasks, including explosive ordnance disposal (EOD), route clearance, and perimeter security.
These unmanned ground vehicles can navigate challenging terrain and hazardous environments,
performing tasks that would be dangerous or impractical for human operators. UMVs, including
unmanned surface vessels (USVs) and unmanned underwater vehicles (UUVs), are used for maritime
surveillance, mine countermeasures, and anti-submarine warfare, enhancing naval capabilities and
protecting maritime interests.

Civilian and Commercial Applications: Unmanned systems have found numerous applications in civilian
and commercial sectors, ranging from agriculture and environmental monitoring to infrastructure
inspection and disaster response. In agriculture, UAVs equipped with sensors and cameras are used for
crop monitoring, precision agriculture, and yield optimization. These aerial platforms can capture high-
resolution imagery of agricultural fields, allowing farmers to assess crop health, detect pests and diseases,
and optimize irrigation and fertilization practices. In environmental monitoring, UAVs are employed for
wildlife tracking, habitat mapping, and ecological research, providing valuable data for conservation
efforts and environmental management. Additionally, UAVs are utilized for disaster response and
emergency management, conducting aerial surveys of disaster-affected areas, assessing damage, and
identifying hazards such as collapsed buildings and flooded areas. In the energy sector, UAVs are used
for infrastructure inspection and maintenance of oil and gas pipelines, power lines, and wind turbines,
reducing the need for manual inspections and improving safety and efficiency.

Law Enforcement and Public Safety: Unmanned systems play a crucial role in law enforcement and
public safety operations, providing law enforcement agencies and first responders with enhanced
capabilities for surveillance, search and rescue, and crisis management. UAVs equipped with thermal
imaging cameras and other sensors are used for aerial surveillance and reconnaissance, helping law
enforcement agencies monitor criminal activity, track suspects, and gather evidence for investigations.
Additionally, UAVs are utilized for search and rescue operations in remote or hazardous environments,
such as wilderness areas, mountains, and disaster zones, where traditional search methods may be
impractical or ineffective. UGVs, such as robotic ground vehicles, are deployed for bomb
disposal, hostage situations, and hazardous material incidents, reducing the risk to law enforcement
personnel and civilians. UMVs, including unmanned surface vessels (USVs) and unmanned underwater
vehicles (UUVs), are employed for maritime law enforcement, border patrol, and maritime security
operations, enhancing the capabilities of coast guard and maritime law enforcement agencies.

Infrastructure Inspection and Maintenance: Unmanned systems are increasingly used for infrastructure
inspection and maintenance across various sectors, including construction, transportation, and utilities.
UAVs equipped with cameras, LiDAR, and other sensors are employed for aerial inspection of bridges,
roads, buildings, and other infrastructure assets, providing detailed imagery and data to engineers and
inspectors. These aerial platforms can assess the condition of infrastructure assets, identify defects and
structural damage, and prioritize maintenance and repair efforts. Additionally, UAVs are utilized for
inspection and monitoring of utility infrastructure, including power lines, pipelines, and
telecommunications towers, helping utilities companies detect faults, leaks, and other issues before they
escalate into emergencies. In the construction industry, UAVs are used for site surveys, progress
monitoring, and quality control, improving project management and efficiency.

Medical and Healthcare Applications: Unmanned systems have the potential to revolutionize medical and
healthcare delivery, offering capabilities for telemedicine, patient monitoring, and medical logistics.
UAVs equipped with medical supplies, defibrillators, and other lifesaving equipment can be deployed for
emergency medical response in remote or inaccessible areas, providing timely assistance to patients in
need. Additionally, UAVs can transport medical samples, vaccines, and blood products between
healthcare facilities, improving access to essential healthcare services in underserved communities. In
telemedicine, UAVs equipped with high-resolution cameras and communication systems enable remote
consultations between patients and healthcare providers, facilitating diagnosis, treatment, and follow-up
care for patients in remote or rural areas. Furthermore, UAVs are used for medical research and
development, transporting medical equipment, pharmaceuticals, and biological samples for research
studies and clinical trials.

Entertainment and Media Production: Unmanned systems are increasingly utilized in the entertainment
and media production industry, offering capabilities for aerial cinematography, live event coverage, and
immersive storytelling. UAVs equipped with stabilized cameras and gimbals are employed for aerial
filming and photography, capturing breathtaking aerial shots and dynamic perspectives for film,
television, and advertising projects. These aerial platforms can navigate challenging environments and
capture high-quality footage in a wide range of settings, from natural landscapes to urban cityscapes.
Additionally, UAVs are utilized for live event coverage, sports broadcasting, and news gathering,
providing aerial perspectives and real-time footage of events such as concerts, festivals, and sporting
competitions. In virtual reality (VR) and augmented reality (AR) production, UAVs equipped with 360-
degree cameras and VR/AR capture technology enable immersive storytelling and interactive
experiences, transporting viewers to remote locations and virtual worlds.

Research and Scientific Exploration: Unmanned systems play a crucial role in research and scientific
exploration, providing researchers and scientists with platforms for data collection, environmental
monitoring, and remote sensing. UAVs equipped with scientific instruments and sensors are used for
atmospheric research, climate monitoring, and environmental studies, collecting data on air quality,
weather patterns, and ecosystem health. These aerial platforms can fly at various altitudes and locations,
providing researchers with valuable insights into atmospheric processes and environmental changes.
Additionally, UAVs are employed for wildlife monitoring and ecological research, conducting aerial
surveys of animal populations, habitat mapping, and biodiversity assessments. In scientific exploration,
UAVs are utilized for planetary exploration, lunar and Martian missions, and space research, providing
valuable data and imagery for studying celestial bodies and extraterrestrial environments.

Education and Training: Unmanned systems are increasingly used in education and training programs to
provide students with hands-on experience and practical skills in robotics, engineering, and technology.
UAVs, UGVs, UMVs, and USS serve as educational platforms for students of all ages, allowing them to
learn about unmanned systems technology, programming, and operation. In academic institutions,
unmanned systems are used for research projects, student competitions, and STEM (science, technology,
engineering, and mathematics) programs, providing students with opportunities to apply theoretical
knowledge to real-world applications.

[mh]Technological Advancements

In recent years, unmanned aerial vehicles (UAVs) have become a powerful tool for diverse missions, including the transportation of polymerase chain reaction (PCR) samples between hospitals and laboratories , UAV-based healthcare systems to control the COVID-19 pandemic , infectious disease containment and mitigation , traffic condition analysis in co-operation with deep learning approaches , and real-time human behavior understanding via multimedia data analytics , to name a few. Currently, the integration of UAVs with emerging technologies such as blockchain, the internet of things, cloud computing, and artificial intelligence can pave the way to serve mankind more effectively than in the recent past . Further, given their ability to perform operations in 3D (dull, dirty, and dangerous) environments, UAVs can play a vital role in the realization of smart cities. Furthermore, UAVs are an indispensable tool during emergency planning and disaster management due to their ability to perform missions aerially. Besides the UAV applications and uses cited above, they can be highly beneficial for military purposes, including information collection and analysis, border surveillance, and transporting warfare items. The role of UAVs in agriculture from multiple perspectives has already been recognized across the globe. Recently, the world's leading commerce company (i.e., Amazon) has started using UAVs for delivering products to customers. Generally, the use of UAVs is expected to rise in many emerging sectors in the near future. We present actual and innovative uses of UAVs during the ongoing pandemic in Figure. The majority of the applications given in Figure employ multiple UAVs in order to accomplish the desired tasks.
Figure :Innovative applications of the UAVs during the ongoing pandemic

Although UAVs are highly beneficial to mankind through their innovative applications, there exist plenty of challenges that can hinder their use at a wider scale. For example, payload constraints and power issues can limit their carrying abilities. Similarly, in-flight decision making to ensure UAV safety by avoiding obstacles with sufficient accuracy is a non-trivial task, mainly because there is no human on board in control. Furthermore, communication over long distances and co-ordination among multiple UAVs to perform complex tasks jointly are major barriers to the true realization of UAV technology. Besides the challenges and issues given above, many issues concerning software and hardware also exist that need rigorous development and testing. Many solutions have been proposed to address these issues via cross-disciplinary approaches. Meanwhile, extensive testing and analysis of these solutions is yet to be explored, especially in urban environments. In this chapter, we mainly focus on 'navigation', which is one of the core challenges in UAV technology. The navigation problem is classified into three cases: (i) where am I now?, (ii) where do I go?, and (iii) how do I get there?. The first two cases belong to localization and mapping, and the third case is about path planning (PP) . In this work, we cover the third case comprehensively, and provide concepts and developments in this regard. We present a comprehensive overview of the changing dynamics of UAV applications in recent times, the challenges of UAV technology, recent developments in UAV technology, and future research trends in the PP area in Figure. With this concise overview, we aim to help researchers extract the contents enclosed in this chapter conveniently.
Figure :Overview of changing dynamics of the UAV applications, challenges, recent developments, and
future research trends in the PP area.

The rest of this chapter is structured as follows. Section 2 discusses the basic concept of path planning and categorizes path planning approaches based on the information available about the underlying environment and the UAVs used for the aerial mission. Section 3 describes the three essential components of the PP. Section 4 critically analyzes various approaches that were proposed to lower the computing time of the PP for UAVs. The future prospects of research in the PP area are discussed in Section 5. Finally, this chapter is concluded in Section 6.
[h]Path planning and categorization of the path planning approaches

PP is the task of finding a safe (i.e., collision-free) path between two pre-determined locations (e.g., source and destination, denoted s and t, respectively) while optimizing certain performance objectives. The performance objectives can be energy consumption, computing time, distance, path smoothness, turns, etc., depending upon the mission type, operating environment, and type of UAV. The most important part of the PP is to identify the environment where the pathfinding is carried out for UAVs. In this work, we categorize the PP approaches based on the type of information about the environment and on the number of UAVs, respectively.

[h]Categorization of the path planning approaches based on information about environment

Generally, there are three possibilities regarding the availability of information about the environment where UAVs tend to operate. The operating environment can be fully known in advance (e.g., the obstacles' geometry is known), it can be completely unknown, and/or it can be partially known (e.g., a few portions are known, and some portions are explored and modeled during the flight). Based on the degree of information about the environment, PP approaches are mostly classified into two categories: local PP (LPP) and global PP (GPP). In LPP, the environment is not known, and UAVs use sensors or other devices in order to acquire information about the underlying environment. In GPP, PP is performed in a fully known environment, meaning all information about the environment is known in advance. Given the availability of information about the underlying environment, GPP approaches have lower complexity compared to LPP approaches. Recently, some PP approaches have jointly employed LPP and GPP concepts in order to find a path for UAVs . In the literature, GPP and LPP approaches are also classified as offline and online PP approaches, respectively. Based on an extensive review of the literature, we present a categorization of the PP approaches based on information about the environment in Figure. We refer interested readers to previous studies for more insights into the LPP approaches .
Figure :Categorization of the PP approaches based on the availability of information about operating
environment.

Apart from the categorization provided above, environments can be classified into rural and urban. In the past, UAV applications tended to concentrate in non-urban environments. However, due to significant developments in the control domain, UAVs are increasingly employed in urban environments these days. For instance, in urban environments, they can be used to monitor people's compliance with the social guidelines given by the respective governments in order to control the spread of COVID-19.

[h]Categorization of the path planning problems

Based on the mission type, either one or multiple UAVs can be employed. Scenarios in which only one UAV is deployed are referred to as single-agent PP problems. In contrast, scenarios in which multiple UAVs are used are called multi-agent PP problems. PP for multiple agents is relatively complex, since UAVs need to avoid collisions with companion UAVs as well as with obstacles present in the underlying operating environment. In addition, allocating target areas for coverage and optimizing throughput also remain challenging, especially while operating at lower altitudes in urban environments.

[h]Essential components of the path planning for UAVs

Generally, there are three essential components of the PP: (i) modeling of the environment with geometrical shapes by utilizing the obstacle/free-space knowledge provided by a real-environment map, (ii) task modeling with the help of graphs/trees connecting the source and target locations, and (iii) applying a search algorithm, together with a heuristic function, to determine a viable path.
[h]Modeling of the environment with geometrical shapes

In the first step, a raw environment map is converted into a modeled one, in which obstacles are represented with the help of geometrical shapes. For example, poles appearing in a real environment map can be modeled as cylinders in the modeled map. Similarly, buildings can be modeled as rectangles or polyhedra. In some cases, UAVs do not model the whole environment map, and instead utilize sense and avoid (SAA) abilities to operate safely in the airspace. We present an example of environment modeling, and well-known obstacle representation techniques used for the PP, in Figure. Each obstacle representation technique has a different complexity and accuracy in terms of representing real-environment obstacles. In addition, each representation can be adopted considering the UAV operating environment. For example, polygons can be used to model an urban environment populated by various buildings. A minimal code sketch of such a representation follows the figure caption below.

Figure :Overview of environment modeling and obstacles’ representation techniques.
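A hedged sketch of how such modeled obstacles can be represented in code, assuming the two primitives named in the text (cylinders for poles, boxes for buildings); the class names and the point-collision query are illustrative assumptions, as a planner may need richer queries in practice.

```python
from dataclasses import dataclass
import math

@dataclass
class Cylinder:
    """Vertical cylinder, e.g., a pole from the real environment map."""
    cx: float
    cy: float
    radius: float
    height: float

    def contains(self, x, y, z):
        return (0.0 <= z <= self.height and
                math.hypot(x - self.cx, y - self.cy) <= self.radius)

@dataclass
class Box:
    """Axis-aligned box, e.g., a building footprint with a height."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_max: float

    def contains(self, x, y, z):
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and 0.0 <= z <= self.z_max)

def point_in_collision(point, obstacles):
    """True if the point lies inside any modeled obstacle."""
    return any(obs.contains(*point) for obs in obstacles)
```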

[h]Task modeling with graphs/trees

After modeling the environment with geometrical shapes, the next step is task modeling (e.g., generating a network of paths with a graph/tree, or selecting a desired portion to be modeled). For example, the road-map approach is a well-known task modeling approach for the PP, in which a graph is constructed from the starting location to the destination location by capturing the connectivity of free spaces and obstacle corners. Apart from that, cell decomposition and potential fields are promising solutions for task modeling. We present the most widely used task modeling methods in Figure :

Figure :Overview of the famous task modeling methods used in the PP adopted from .

Recently, tree-based methods have been widely used for task modeling due to their quick convergence to a final solution. We present an overview of task modeling with the help of a tree in Figure. Furthermore, in some cases, more than one method is used jointly to model the tasks on a provided map. In addition, some approaches perform task modeling and path searching simultaneously . A minimal tree-growing sketch follows the figure caption below.

Figure :Overview of task modeling with a random tree.
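A minimal rapidly-exploring random tree (RRT) sketch in 2D illustrates the tree-based task modeling idea; the step size, goal tolerance, and pluggable collision predicate are illustrative assumptions.

```python
import math
import random

def rrt(start, goal, is_free, bounds, step=1.0, goal_tol=1.5, max_iter=5000):
    """Grow a random tree from `start` until a node lands near `goal`.

    `is_free(p)` must return True for collision-free points; `bounds`
    is ((x_min, x_max), (y_min, y_max)). Returns the path or None.
    """
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iter):
        sample = (random.uniform(*bounds[0]), random.uniform(*bounds[1]))
        # Find the nearest existing tree node and step towards the sample.
        i = min(range(len(nodes)), key=lambda j: math.dist(nodes[j], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0.0:
            continue
        new = (nx + step * (sample[0] - nx) / d,
               ny + step * (sample[1] - ny) / d)
        if not is_free(new):
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) <= goal_tol:
            # Walk back to the root to recover the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None
```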

[h]Applying a path search algorithm to determine a viable path


In the last step, a search algorithm is employed on the graph/tree to find a viable path. During the path search, a heuristic function usually accompanies the search. For example, in the A* algorithm, low-cost nodes are determined leveraging distance as a heuristic function (a compact sketch of this is given after the figure below). Similarly, the heuristic function can be energy consumption or smoothness, depending upon the scenario. In the literature, many techniques have been suggested to find reliable paths. Path search algorithms such as differential evolution , the firefly algorithm , ant colony optimization , genetic algorithms , artificial bee colony , particle swarm optimization , fuzzy logic , central force optimization , the gravitational search algorithm , simulated annealing , and their advanced variants are used in the PP. Every algorithm has numerous distinguishing factors over others regarding conceptual simplicity, computational complexity, robustness, convergence rate, etc. We categorize the existing path search methods into five categories, and present representative methods of each category in Figure :

Figure :Categorization of path searching methods/algorithms.
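As a concrete instance of the search step, the following compact A* sketch uses the Euclidean distance heuristic mentioned above; the graph representation (adjacency lists with edge costs) and node coordinates are illustrative assumptions.

```python
import heapq
import math

def a_star(graph, coords, source, target):
    """A* over `graph` (node -> [(neighbour, edge_cost), ...]).

    `coords[node]` gives (x, y) positions so the heuristic can be the
    Euclidean distance to the target, as described in the text.
    """
    def h(n):
        return math.dist(coords[n], coords[target])

    open_set = [(h(source), 0.0, source, None)]
    parents, best_g = {}, {source: 0.0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in parents:
            continue                     # already expanded with a better cost
        parents[node] = parent
        if node == target:
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        for nbr, cost in graph.get(node, []):
            ng = g + cost
            if ng < best_g.get(nbr, float("inf")):
                best_g[nbr] = ng
                heapq.heappush(open_set, (ng + h(nbr), ng, nbr, node))
    return None
```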


[h]Performance objectives of the path planning approaches

Every PP approach tends to optimize one or more performance objectives (PO) while finding a viable path for UAVs. The PO can be related to hardware or software. These PO are considered across the three components related to the PP described above (i.e., environment modeling, task modeling, and path searching). For instance, in order to lower the path-search computing time, only some portion of a map can be modeled, and a sparse tree/graph can be constructed/used while finding a path. Similarly, memory can be preserved by exploring some portions of a graph/tree rather than loading and exploring the whole graph/tree at the same time. The selection of PO depends solely on the nature and urgency of the mission. For example, in search and rescue missions, the PO can be path computing time, in order to reach the affected regions quickly. In contrast, in normal circumstances, the PO can be the path length, in order to reach the target location in the most economical way by preserving the UAV's resources. We describe the most commonly used PO in Table.

PO | Concise description
Computing time | Overall time required to find a path using a graph/tree.
Path length | The Euclidean distance between two locations.
Energy | Amount of energy required/consumed while reaching the target from the source.
Turns | Number of turns (infeasible curvature) a path has in total.
Smoothness | Turns in a path with feasible curvatures.
Memory | Amount of memory used while computing a path.
Path nodes | Set of nodes that a UAV follows during flight.
No. of obstacles | Set of obstacles to be processed during path search.
Accuracy | Accuracy of obstacle modeling or path clearance from obstacles.
Problem size | Size of the problem on which the path is determined.
Graph size | Size of the graph (no. of nodes, edges) employed to find a path.
Convergence rate | How quickly a feasible solution can be obtained.
Constraints handling | Effective resolution of constraints the UAV faces during the mission.
Completeness | Availability/non-availability of a solution in finite time.
Flexibility | Effort/time required to make a solution usable for different missions.
Path re-configuration | Effort/time required to regain control of a lost path.
Path following | Ability to keep following a path despite disturbances.
Path safety | Ability to avoid collisions with static/dynamic obstacles.
Hyper parameters | Number and variety of parameters needed to find a path.
Obstacle avoidance | Ability to avoid static/dynamic obstacles at low cost.
Generalization | Ability of a method to be applicable to different types of UAVs.
Application-speciality | Ability of a method to yield superior performance in some context.
Endurance | Ability of a UAV to fly for a long period of time with low-cost planning.

Table :Overview of the PO improved by the PP approaches.

Some PO are positively correlated; for example, finding a path with fewer turns can save energy.

Improving two negatively correlated PO (speed and time) requires optimization of another PO (problem size).

These PO are usually considered during PP irrespective of whether the environment is known or unknown. Furthermore, plenty of techniques have been proposed to improve these PO, either through innovative mechanisms or by employing cross-disciplinary concepts. In addition, many PP approaches target optimizing multiple objectives rather than one or two, for practical UAV applications. These PO can be expressed as a functional model while finding a path P between two locations s and t. Some algorithms tend to optimize more than one PO. An overview of two PO to be optimized by a PP approach can be expressed mathematically as follows.
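
One representative weighted-sum form, assuming f_1 and f_2 denote the two selected PO (e.g., path length and energy) and w_1, w_2 their relative weights, is:

\[
\min_{P \in \mathcal{P}(s,t)} \; F(P) = w_1 f_1(P) + w_2 f_2(P), \qquad w_1, w_2 \ge 0, \; w_1 + w_2 = 1,
\]

where \(\mathcal{P}(s,t)\) denotes the set of collision-free paths from s to t. The weights are mission-dependent choices; other scalarizations and Pareto formulations are equally common in the literature.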

[mh]Path planning algorithms proposed in the past five years

In this section, we discuss various PP algorithms that were proposed to lower the time complexity of the PP process. We selected algorithms proposed in the last five years (i.e., 2016–2021) that share similar concepts in terms of space restriction and problem size reduction. We provide a brief overview and technical evaluation of each algorithm and highlight its deficiencies. This analysis can pave the way toward improved PP algorithms for future UAV applications.

[h]Brief overview of the selected path planning algorithms

We present a brief overview of the selected algorithms in the Table below. These algorithms have become state-of-the-art for many practical applications of UAVs in urban and non-urban environments. They are well known for their novel working mechanisms and conceptual simplicity. In addition, they mainly focus on UAV applications in urban environments, which are a focus of research across the globe, and such applications are likely to increase in the coming years.
Ref. Publication year Environment used PO improved
Maini et al. 2016 3D Computing time and collision-free paths.
Frontera et al. 2017 3D Computing speed and solution quality.
Ahmad et al. 2017 3D Computing speed and energy-optimized paths.
Majeed et al. 2018 3D Computing speed and path quality.
Han et al. 2019 3D Feasible paths with reduced time.
Ghambari et al. 2020 3D Computing time and memory consumption.
Majeed et al. 2021 3D Computing speed and path quality.

Table: Overview of the latest GPP approaches that were proposed to reduce the computing time of the PP process.

All these approaches have used concepts related to search space reduction in order to find time-efficient
paths.

[h]Technical evaluation of the selected path planning algorithms

In this subsection, we provide a concise description of the selected algorithms and highlight their technical problems. We mainly describe the key steps of each proposed algorithm.

 Maini et al.'s algorithm computes a low-cost path using a two-step approach. In the first step, a modified version of Dijkstra's algorithm is used to find an initial path. In the second step, the initial path is further optimized by considering its nodes and a reverse path search.
 Frontera et al.'s algorithm computes a low-cost path using a three-step approach. First, the method reduces the search space by considering only the obstacles that lie on the straight axis between s and t. A visibility graph is then generated solely from the corners of the selected obstacles. In the last step, the A* algorithm is employed to compute a shortest path incrementally (a minimal sketch of this corner-based reduction appears after this list).
 Ahmad et al.'s algorithm computes a low-cost path using a four-step approach. First, the search space is bounded using only the obstacles along the straight line. The bounded space is then extended to the next level using the obstacles that hit the boundary of the first bounded space. In the third step, a relatively dense visibility graph is generated from the bounded spaces. In the final step, the A* algorithm is employed to find an energy-optimized path.
 Majeed et al.'s algorithm computes a low-cost path using a five-step approach. First, the space is reduced into a half-cylinder form with path guarantees between s and t. In the second step, a multi-criteria-based method is employed to check the suitability of the reduced space for low-cost pathfinding. The space is extended if needed, a sparse visibility graph that ensures connectivity between s and t is generated, and a path is computed. Moreover, in some cases, the path is improved by adding more nodes around the initial path's nodes.
 Han et al.'s algorithm computes a low-cost path using a three-step approach. First, critical obstacles are identified along the straight axis between s and t. In the second step, a node set is generated around the corners of the critical obstacles only. In the last step, a feasible path is obtained by exploring the node set. This approach is beneficial because it resolves constraints related to obstacle shapes.
 Ghambari et al. compute a global and local path in four steps. In the first step, the search space is reduced around the straight axis. In the second step, a differential evolution algorithm is applied to construct a graph, and the A* algorithm is then used to find a path over this graph. In the third step, the subspace is divided into small portions with alternate routes in each. In the last step, a mechanism is suggested to avoid collision with dynamic obstacles that may appear unexpectedly during flight.
 Majeed et al. recently proposed a PP method for low-cost pathfinding for UAVs based on a constrained polygonal space and an extremely sparse waypoint graph. In the proposed approach, the search space is restricted to a polygonal form and analyzed from an optimality point of view with the help of six complexity parameters. The space can then be extended to the next level if needed; otherwise, a very sparse graph is generated by exploiting visibility, far-reachability, and direction-guidance concepts. The approach computes time-efficient paths in urban environments without degrading path quality.
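
Several of the steps above share a common core: keep only the obstacles near the straight segment between s and t, take their corners as candidate waypoints, and connect mutually visible corners into a graph on which a shortest-path search then runs. The Python sketch below illustrates that core idea in 2D with axis-aligned rectangular obstacles; the sampling-based visibility test and the rectangle representation are simplifying assumptions for brevity, not the exact data structures of any cited algorithm.

```python
import itertools
import math

# Obstacles as axis-aligned rectangles: (xmin, ymin, xmax, ymax).
def inside(p, rect):
    x, y = p
    return rect[0] < x < rect[2] and rect[1] < y < rect[3]

def blocked(a, b, obstacles, samples=100):
    """Approximate line-of-sight test: sample the segment a-b and reject it
    if any sample falls strictly inside an obstacle (brevity over exactness)."""
    for i in range(samples + 1):
        t = i / samples
        p = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
        if any(inside(p, r) for r in obstacles):
            return True
    return False

def reduced_visibility_graph(s, t, obstacles):
    # Step 1: keep only obstacles that block the straight segment s-t
    # (the "straight axis" reduction used by several approaches above).
    critical = [r for r in obstacles if blocked(s, t, [r])]
    # Step 2: candidate nodes are s, t, and corners of critical obstacles,
    # pushed slightly outward so they do not lie on obstacle boundaries.
    eps = 1e-3
    nodes = [s, t]
    for xmin, ymin, xmax, ymax in critical:
        nodes += [(xmin - eps, ymin - eps), (xmin - eps, ymax + eps),
                  (xmax + eps, ymin - eps), (xmax + eps, ymax + eps)]
    # Step 3: connect every pair of mutually visible nodes.
    edges = {n: [] for n in nodes}
    for a, b in itertools.combinations(nodes, 2):
        if not blocked(a, b, critical):
            w = math.dist(a, b)
            edges[a].append((b, w))
            edges[b].append((a, w))
    return edges  # a shortest-path search (e.g., A*) then runs on this graph

graph = reduced_visibility_graph((0, 0), (10, 10), [(4, 4, 6, 6), (20, 20, 21, 21)])
print(len(graph))  # 6 nodes: s, t, and the four corners of the one critical obstacle
```

The payoff is that the graph grows with the number of critical obstacles rather than with the size of the whole map, which is precisely why these approaches lower computing time.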

Besides lowering computing time, these algorithms can indirectly optimize certain PO listed in the Table above. For example, the Ahmad et al. PP approach also reduces the number of turns in order to lower energy consumption. The Han et al. PP approach can be applied to environments with arbitrarily shaped obstacles (i.e., there is no constraint on the obstacles' geometry), so it can be applied in different settings of the urban environment (e.g., areas with sparse or dense obstacles). Similarly, the Majeed et al. PP approach can significantly reduce the problem size, so memory requirements can be markedly lower. The Ghambari et al. approach can re-configure paths during flight when a UAV encounters an unexpected obstacle; hence, it can be used in both local and global environments. Despite the utility of these approaches in many real-world applications, they often yield poor performance due to local/global constraints. Based on an in-depth review of all studies, we identified potential problems of each approach that may hinder its use in actual deployment. We describe the technical challenges of the existing approaches in the Table below.

Ref. Technical problems in the proposed approach
Maini et al.
 Performance cannot be ensured in every scenario due to heavy reliance on specific maps.
 Overheads can increase exponentially with the problem size.
 It models the whole map, so the path exploration cost is very high.
Frontera et al.
 The path can collide with nearby obstacles.
 In some cases, the approach fails to find a path even though one exists.
 The visibility graph can contain many needless and redundant nodes.
 Memory consumption is high because the whole visibility map is loaded into memory.
Ahmad et al.
 Two bounded spaces are used, which can increase the computing time of the PP.
 The visibility graph is constructed using a layered approach with many redundant nodes and edges.
 The visibility check function is expensive, since visibility is checked in all directions and for all nodes.
Majeed et al.
 The path can contain turns due to the strict boundary of the search space.
 The path optimization cost may increase if the initial path has many nodes.
Han et al.
 Path quality cannot be ensured in all scenarios if obstacle sizes are large.
 Path cost can increase exponentially with the point set.
 Both time and optimality can be impacted if diversely shaped obstacles exist in a map.
 Since this is a grid-based approach, memory consumption is high.
Ghambari et al.
 Path computing time can rise with the distance between s and t.
 Recognizing and avoiding obstacles in real time can be costly.
 The fidelity of the approach was analyzed with limited testing.
 Since path searching is carried out twice, computing time can rise.
Majeed et al.
 Accurate modeling of tiny obstacles is not possible.
 Excessive calculations are performed in space analysis, so complexity can rise.

Table: Overview of the technical problems in the proposed GPP approaches.

All these problems have been highlighted by existing studies or reported by the authors.

These challenges lay the foundation for future research in the UAV area. Furthermore, they can assist researchers in devising better, more practical PP approaches that address these technical problems. Apart from the challenges in the Table above, it is paramount to take into account local constraints while devising PP methods; these constraints have mostly been assumed away in the existing approaches.

[h]Local path planning algorithms

The majority of the approaches discussed above are GPP approaches; LPP approaches have not yet been covered. To fill this gap, we discuss various representative LPP approaches in the Table below, along with their methodological specifics.

Ref. UAV used Technical aspects of the approach


Stecz et al. Multiple Indicated sensors based LPP approach.
Wojciech et al. Single EO/IR systems and SARs based navigation.
Siemiatkowska et al. Multiple MILP based LPP using EO/IR camera and SARs.
Hong et al. Multiple MILP-based multi-layered hierarchical architecture.
Hua et al. Multiple Multi-target intelligent assignment model based LPP.
Cui et al. Single Reinforcement learning (RL)-based LPP approach.
Maw et al. Single Graph and learning based LPP approach.
Wei et al. Single Improved ACO for LPP.
Zhang et al. Single Markov decision process (MDP) based LPP approach.
Zammit et al. Multiple LPP in the presence of uncertainties.
Wu et al. Single Interfered fluid dynamic system (IFDS) based LPP.
Bayerlein et al. Multiple Multi-agent reinforcement learning (MARL) approach for LPP.
Jamshidi et al. Single LPP based on improved version of Gray Wolf Optimization.
Yan et al. Single Sampling based LPP approach in urban environments.
Sangeetha et al. Single Gain-based dynamic green ACO (GDGACO) LPP approach.
Sangeetha et al. Single Fuzzy gain-based dynamic ACO (FGDACO) LPP approach.
Choi et al. Single Improved CNN based LPP approach for UAV.

Table: Overview of the latest LPP approaches used for UAVs.

All these approaches operate in an unknown environment during the PP.

These approaches perform PP in environments that are mostly unknown and more complex than those addressed by the GPP approaches. They enable UAVs to perform tasks in complex environments in real time by leveraging low-cost sensors and robust artificial intelligence (AI) techniques. In addition, they can work alongside emerging technologies, including cloud, edge, and fog computing, for a variety of applications. The role of UAVs was prominent during the COVID-19 pandemic in countries across the globe, and LPP approaches contributed significantly to curbing the spread of the pandemic via online missions. Barnawi et al. proposed an IoT-based platform for COVID-19 scanning in which UAVs were used as the main source of temperature data collection in outdoor environments. Apart from COVID-19 scanning, UAVs were extensively used for spraying and disinfecting multi-use facilities and contaminated places. In some countries, they were used for alerting people to wear masks properly and stay indoors. The true realization of such innovative applications is possible through LPP approaches.
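
To illustrate what a lightweight LPP loop can look like, the following is a minimal Python sketch of a reactive local planner based on artificial potential fields: an attractive pull toward the goal combined with repulsive pushes away from sensed obstacles. This classic technique is chosen here for brevity and is not the method of any specific approach in the Table; the gains and obstacle positions are illustrative assumptions.

```python
import math

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0,
                         influence=3.0, step=0.1):
    """One reactive LPP step: follow the combined attractive/repulsive force."""
    # Attractive force: proportional to the vector toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Repulsive force from each sensed obstacle inside the influence radius.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < influence:
            mag = k_rep * (1.0 / d - 1.0 / influence) / d ** 2
            fx += mag * dx / d
            fy += mag * dy / d
    norm = math.hypot(fx, fy) or 1.0    # normalize to a fixed step length
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)

# Fly toward (5, 5) while steering around one sensed obstacle near the route.
pos, goal = (0.0, 0.0), (5.0, 5.0)
for _ in range(200):
    pos = potential_field_step(pos, goal, [(2.5, 2.4)])
    if math.dist(pos, goal) < 0.2:
        break
print(f"final position: ({pos[0]:.2f}, {pos[1]:.2f})")
# Note: pure potential fields can stall in local minima; practical LPP
# methods add escape strategies or learning-based decision making.
```

The learning-based approaches in the Table (RL, MDP, MARL) can be viewed as replacing this hand-crafted force rule with a learned policy over the same sense-decide-act loop.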

[h]Coverage path planning: a subtopic of path planning

Besides LPP and GPP, another important subtopic of the PP is coverage path planning (CPP). In CPP, a path is determined that enables a UAV to cover a target area fully with the help of a device/tool mounted on it. The attached tool/device can be a sensor, camera, speaker, and/or a spray tank, depending upon the mission. We present an overview of CPP in Figure: in Figure(a), a rectangular target area is given that needs to be covered by a UAV; in Figure(b), the coverage path the UAV follows to cover the target area is shown.
Figure: Overview of coverage path planning for UAVs in a 3D urban environment.

In CPP, most of the PO are identical to those of PP, but path overlap and coverage guarantee are two additional PO. Moreover, ensuring consistent path quality with respect to the shape of the target area is very challenging; therefore, the shape of the target area is considered while finding a coverage path. CPP can be performed in five steps: modeling the operating environment; locating the target area on the modeled map; decomposing the target area into disjoint sub-parts; task modeling (mainly the traversal order of the sub-parts) with the help of a graph; and covering each sub-part using a motion pattern (e.g., back-and-forth, spiral, or circular). In recent years, UAV coverage applications in urban environments have significantly increased, and a substantial number of CPP approaches have been proposed.
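
For the final step, the back-and-forth (boustrophedon) pattern is the simplest motion pattern to generate. The Python sketch below produces such a coverage path over one rectangular sub-part, with sweep lines spaced by an assumed sensor footprint width; the area bounds and footprint value are illustrative.

```python
def boustrophedon(xmin, ymin, xmax, ymax, footprint):
    """Generate back-and-forth coverage waypoints over a rectangle.

    Sweep lines are spaced one sensor footprint apart so adjacent passes
    abut without gaps; alternating direction minimizes the turn length.
    """
    waypoints, y, left_to_right = [], ymin + footprint / 2, True
    while y <= ymax:
        row = [(xmin, y), (xmax, y)]
        waypoints.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right
        y += footprint
    return waypoints

# Cover a 100 m x 40 m field with a 10 m-wide camera footprint.
for wp in boustrophedon(0, 0, 100, 40, 10):
    print(wp)
```

Path overlap and coverage guarantee, the two CPP-specific PO mentioned above, map directly onto the footprint spacing: a spacing smaller than the footprint adds overlap, while a larger spacing leaves coverage gaps.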

[h]Prospects of near-future research in the PP area

In the near future, UAVs will be regarded as an indispensable tool for various practical missions, especially in urban environments. A substantial number of developments are underway to fully realize smart cities, smart infrastructure, and smart buildings, to name a few; hence, the use and applications of UAVs are expected to grow significantly. Recently, many innovative technologies, such as blockchain, IoT, 5G/6G, and deep/machine learning, have been integrated with UAV technology to serve mankind in effective ways. For example, the BloCoV6 scheme is one notable application of UAVs in the new normal (i.e., the COVID-19 era). Many similar innovative applications are likely to emerge in the near future as replacements for humans in complex tasks. Therefore, refinements of the existing PP approaches in line with the peculiarities of the applications/tasks, and the development of robust approaches leveraging cross-disciplinary concepts (e.g., biologically inspired, AI-powered, and technology-driven), have become necessary. Considering the emerging applications of UAVs, we list the prospects of near-future research in the PP area in Figure. We categorize the avenues of future research in the PP area on four grounds: UAV-application-specific PP approaches; optimization of the existing approaches' PO; integration of emerging technologies and handling of their issues; and development of PP approaches that can cope with the dynamics of the UAV operating environment.
Figure: Categorization of the avenues of future research in the PP/UAVs area.

The most important research avenues from the optimization point of view are devising new environment restriction methods to reduce problem sizes, devising low-cost methods for reducing task modeling overheads (i.e., graph/tree sizes), and accelerating the path search methods so that a UAV can reach the target location safely at a significantly reduced cost. Furthermore, improving the overall cost of the PP process is an important research direction for increasing UAV applications in urban environments. Optimizing multiple objectives rather than one or two is valuable for preserving a UAV's resources during aerial missions. From an applications point of view, low-cost methods are needed that can improve certain PO while satisfying application requirements at the same time. To this end, identifying each application's features/requirements and embedding them into the PP process can significantly enhance the use of UAVs in the coming years. Application-oriented PP methods will therefore be embraced more in the near future, considering the UAV's potential for executing tasks at low cost. From the environment dynamics point of view, PP methods that can effectively respond to the uncertainties and dynamics of the environment are paramount. For example, in LPP, decision making that avoids obstacles at the least possible cost can enhance a UAV's endurance in aerial missions. In this regard, LPP methods that can cope with variations in the underlying operating environment and can consistently ensure UAV safety in practical applications are paramount.

Recently, many emerging technologies have been integrated with UAV technology. For example, blockchain, transfer learning, computer vision, federated learning, 5G and 6G, and cloud computing have revolutionized UAV applications. In this regard, incorporating more emerging technologies into the UAV domain, and extending the use of current emerging technologies to more application areas, is an important research direction for the future. Furthermore, improving the hardware capabilities of UAVs by integrating the latest technologies is an important need from a technical perspective. Beyond these technical aspects, tailoring computer vision applications to the UAV area is among the most promising research avenues, considering the UAV's ability to capture high-resolution images. In addition, identifying niche areas (e.g., water quality analysis, target tracking, covering spatially distributed regions, and detection of wildfire smoke, to name a few) where UAVs can outperform humans, and performing cost-benefit analyses of UAVs versus humans, are important research directions in UAV technology. Finally, exploring the joint use of multiple emerging technologies to serve mankind effectively with UAVs is a vibrant area of research. Apart from PP, devising low-cost CPP methods for UAVs is also an attractive area of research for the near future. Developments from hardware perspectives (e.g., battery power, wing span, payload capability, robust decision-making abilities, and control aspects) are also potential avenues for research.

[mh]Regulatory Landscape

Unmanned systems, encompassing drones, autonomous vehicles, and robotics, have proliferated across
various industries, revolutionizing operations and capabilities. However, alongside their advancement, the
regulatory landscape governing these systems has evolved, balancing innovation with safety, security, and
ethical considerations. Understanding this regulatory framework is crucial for stakeholders, including
manufacturers, operators, policymakers, and the public. This section provides an overview of the regulatory landscape in unmanned systems, examining key aspects, challenges, and future directions.

[h]Evolution of Regulations

The regulatory journey of unmanned systems has been characterized by adaptation to technological
advancements and emerging use cases. Initially, regulations primarily focused on military applications,
driven by concerns over national security and defense. However, with the rapid expansion of civilian
applications, including commercial drones and autonomous vehicles, regulatory bodies worldwide have
been compelled to develop comprehensive frameworks to address safety, privacy, and operational
standards.

[h]Key Regulatory Bodies

Regulatory oversight of unmanned systems is typically the responsibility of governmental agencies tasked
with aviation, transportation, and technology regulation. In the United States, the Federal Aviation
Administration (FAA) plays a central role in regulating drones and other unmanned aerial systems (UAS).
The FAA's regulations encompass registration requirements, operational limitations, and certification
standards for commercial use.

Similarly, in the European Union, the European Aviation Safety Agency (EASA) sets regulatory
standards for drones through the publication of common rules and guidelines. These regulations aim to
harmonize drone operations across EU member states while ensuring safety and interoperability.

Beyond aviation, other regulatory bodies govern autonomous vehicles and robotics. For instance, the
National Highway Traffic Safety Administration (NHTSA) in the U.S. oversees the safety of autonomous
vehicles, establishing guidelines for testing and deployment on public roads.

[h]Key Regulatory Considerations

Several critical considerations shape the regulatory landscape of unmanned systems:


1. Safety: Safety is paramount in regulatory frameworks, aiming to mitigate the risks associated with
unmanned systems' operations. Regulations often mandate compliance with technical standards,
such as collision avoidance systems and fail-safe mechanisms, to prevent accidents and protect
public safety.
2. Privacy and Security: Unmanned systems raise concerns regarding privacy infringement and data
security, especially with drones equipped with cameras and sensors. Regulatory measures address
these concerns by imposing restrictions on data collection, storage, and sharing, as well as
implementing cybersecurity protocols to safeguard against unauthorized access and misuse of
information.
3. Operational Standards: Regulatory bodies define operational standards for unmanned systems,
including flight restrictions, altitude limits, and prohibited areas such as airports, critical
infrastructure, and densely populated areas. These standards aim to ensure responsible and lawful
use of unmanned systems while minimizing interference with manned aircraft and ground
activities.
4. Certification and Licensing: Certification and licensing requirements are essential components of
regulatory frameworks, particularly for commercial operators. Pilots and operators may need to
undergo training, obtain licenses, and adhere to proficiency standards to ensure competency and
compliance with regulations.

[h]Challenges and Future Directions

Despite significant progress in regulatory development, several challenges persist in the unmanned
systems landscape:

1. Rapid Technological Advancements: The pace of technological innovation often outstrips regulatory adaptation, leading to regulatory lag and uncertainty. Regulatory bodies must adopt agile approaches to keep pace with evolving technologies and emerging applications while maintaining safety and compliance standards.
2. Interoperability and Harmonization: Inconsistencies in regulatory frameworks across jurisdictions
pose challenges for multinational operations and technology deployment. Efforts to promote
interoperability and harmonization through international collaboration and standardization
initiatives are essential to facilitate seamless integration and global adoption of unmanned systems.
3. Public Acceptance and Trust: Public perception of unmanned systems influences regulatory
attitudes and policies. Addressing concerns related to safety, privacy, and societal impact through
transparent communication, stakeholder engagement, and ethical guidelines is crucial to building
public acceptance and trust in unmanned systems.
4. Ethical and Legal Considerations: Unmanned systems raise complex ethical and legal dilemmas,
such as liability for accidents, accountability for autonomous decision-making, and equitable
access to technology. Regulatory frameworks need to incorporate ethical principles and legal
safeguards to ensure responsible and equitable deployment of unmanned systems.

The regulatory landscape governing unmanned systems continues to evolve in response to technological
innovation, societal needs, and regulatory challenges. Achieving a balance between fostering innovation
and ensuring safety, security, and ethical standards remains a primary objective for regulatory bodies
worldwide. Collaboration among stakeholders, proactive regulatory approaches, and ongoing dialogue are
essential to navigate the dynamic landscape of unmanned systems regulation effectively.
[mh]Future Trends and Challenges

Unmanned systems, encompassing drones, autonomous vehicles, and robotics, have witnessed
remarkable growth and innovation in recent years, with applications spanning diverse industries such as
agriculture, healthcare, transportation, and defense. As technology continues to advance, the future of
unmanned systems holds immense promise, yet it also presents numerous challenges that must be
addressed. This section explores the future trends and challenges shaping the evolution of unmanned systems,
focusing on key areas of development and the implications for society, economy, and governance.

1. Advancements in Autonomy

One of the most significant trends in unmanned systems is the advancement of autonomy. Future systems
will feature increasingly sophisticated artificial intelligence (AI) and machine learning capabilities,
enabling them to make complex decisions and adapt to dynamic environments autonomously. This trend
is evident in autonomous vehicles, where self-driving cars and trucks are poised to revolutionize
transportation by enhancing safety, efficiency, and mobility for passengers and goods.

However, advancing autonomy poses challenges related to reliability, safety, and ethical decision-making.
Ensuring the robustness and reliability of AI algorithms, addressing ethical dilemmas such as moral
decision-making in critical situations, and establishing regulatory frameworks to govern autonomous
systems are essential considerations for the future of unmanned autonomy.

2. Integration of Sensor Technologies

Another key trend is the integration of advanced sensor technologies in unmanned systems. Sensors such
as LiDAR, radar, and thermal imaging enable drones and robotics to perceive and interact with their
surroundings more effectively, enhancing capabilities in navigation, obstacle avoidance, and
environmental monitoring. In agriculture, for example, drones equipped with multispectral cameras and
soil sensors enable precision farming practices, optimizing crop yields while minimizing resource use and
environmental impact.

As sensor technologies continue to evolve, challenges related to data management, interoperability, and
privacy will arise. Addressing concerns over data ownership, security, and privacy protection while
harnessing the full potential of sensor data for decision-making will be critical in the future development
of unmanned systems.

3. Urban Air Mobility (UAM)

Urban Air Mobility (UAM) represents a transformative trend in unmanned systems, offering the potential
for on-demand, aerial transportation within urban and metropolitan areas. Electric vertical takeoff and
landing (eVTOL) aircraft, commonly known as flying taxis, are poised to revolutionize urban
transportation by alleviating congestion, reducing commuting times, and enhancing mobility for
passengers.

However, the widespread adoption of UAM faces numerous challenges, including regulatory hurdles,
infrastructure requirements, and public acceptance. Establishing air traffic management systems for
densely populated urban airspace, ensuring safety and reliability standards for eVTOL aircraft, and
addressing noise and environmental concerns are among the key challenges that must be overcome to
realize the full potential of UAM.

4. Human-Machine Collaboration
The future of unmanned systems will see increasing collaboration between humans and machines,
leveraging the complementary strengths of human intelligence and machine automation. In fields such as
healthcare, robotic assistive devices enhance patient care and rehabilitation by augmenting the
capabilities of healthcare professionals and improving patient outcomes.

However, ensuring effective human-machine collaboration requires addressing challenges related to trust,
transparency, and user experience. Building trust in autonomous systems, providing transparent
explanations of AI decision-making processes, and designing user-friendly interfaces are essential
considerations for promoting acceptance and adoption of human-machine collaborative systems.

5. Ethical and Societal Implications

As unmanned systems become more integrated into society, ethical and societal implications will become
increasingly prominent. Concerns over job displacement due to automation, algorithmic bias in AI
decision-making, and the misuse of unmanned technologies for surveillance or military purposes raise
complex ethical dilemmas that require careful consideration.

Addressing these ethical and societal implications necessitates interdisciplinary collaboration, stakeholder
engagement, and the development of ethical guidelines and governance frameworks. Balancing
technological advancement with social responsibility, ensuring equitable access to benefits of unmanned
systems, and mitigating potential negative impacts on vulnerable populations are essential for fostering a
future where unmanned systems contribute to societal well-being and prosperity.

6. Regulatory and Legal Challenges

The regulatory and legal landscape governing unmanned systems will continue to evolve in response to
technological innovation and emerging use cases. Regulatory bodies must adapt to address challenges
such as airspace integration, privacy protection, and liability issues associated with autonomous systems.

Harmonizing regulatory frameworks across jurisdictions, establishing standards for safety and reliability,
and promoting responsible use of unmanned technologies are key priorities for policymakers and
regulatory agencies. Additionally, addressing legal challenges related to liability, insurance, and
jurisdictional issues will be crucial for fostering a conducive environment for innovation and investment
in unmanned systems.

The future of unmanned systems holds immense potential to transform industries, enhance quality of life,
and address societal challenges. However, realizing this potential requires addressing a myriad of
technical, ethical, regulatory, and societal challenges. By fostering collaboration, innovation, and
responsible governance, stakeholders can work together to harness the full benefits of unmanned systems
while mitigating risks and ensuring a future that is safe, equitable, and sustainable.

chapter 2: Unmanned Aerial Systems (UAS)

[mh]Overview of UAS

Unmanned Aerial Systems (UAS), commonly known as drones, have emerged as versatile and
transformative tools with applications across various sectors, including agriculture, infrastructure
inspection, disaster response, and cinematography. UAS encompass a diverse range of aerial vehicles,
from small quadcopters to large fixed-wing aircraft, equipped with sensors, cameras, and other payloads
for data collection and analysis. This overview provides insight into the components, capabilities,
applications, and challenges of UAS, highlighting their impact on industries, society, and technology.

[h]Components of UAS

UAS typically consist of three main components:

1. Unmanned Aircraft: The aircraft component of a UAS can vary in size, shape, and configuration,
depending on its intended use and payload requirements. Fixed-wing aircraft offer long-endurance
flights and large coverage areas, suitable for mapping and surveillance missions, while multirotor
drones, such as quadcopters and hexacopters, provide vertical takeoff and landing capabilities and
maneuverability in confined spaces.
2. Ground Control Station (GCS): The GCS serves as the command center for UAS operations,
allowing operators to plan flight missions, monitor aircraft status, and control flight parameters
remotely. GCS typically consist of software interfaces, communication links, and control panels
for real-time interaction with the UAS.
3. Payloads and Sensors: Payloads and sensors are critical components of UAS, enabling them to
collect data and perform specific tasks. Common payloads include cameras for aerial photography
and videography, sensors for environmental monitoring and surveillance, and specialized
equipment for tasks such as crop spraying, package delivery, and search and rescue operations.

[h]Capabilities of UAS

UAS offer several capabilities that distinguish them from manned aircraft and traditional ground-based
systems:

1. Flexibility and Maneuverability: UAS can access remote or hazardous areas that are inaccessible
or unsafe for manned aircraft or ground vehicles, providing flexibility and maneuverability in
diverse environments and conditions.
2. Cost-Effectiveness: UAS offer cost-effective solutions for various tasks, such as aerial mapping,
infrastructure inspection, and crop monitoring, by reducing the need for manned flights,
equipment, and personnel.
3. Data Collection and Analysis: UAS are equipped with sensors and payloads for data collection and
analysis, enabling applications such as aerial surveying, precision agriculture, disaster assessment,
and environmental monitoring.
4. Real-Time Monitoring and Response: UAS provide real-time monitoring capabilities for rapid
situational awareness and response in emergency situations, such as natural disasters, accidents,
and search and rescue operations.
5. Automation and Autonomy: Advances in automation and autonomy enable UAS to perform
complex tasks autonomously, including waypoint navigation, obstacle avoidance, and mission
planning, with minimal human intervention.

[h]Applications of UAS

UAS have diverse applications across numerous industries and sectors:

1. Agriculture: In agriculture, UAS are used for crop monitoring, pest management, and precision
agriculture practices, such as soil analysis, irrigation management, and crop spraying, to optimize
yields, reduce inputs, and enhance farm efficiency.
2. Infrastructure Inspection: UAS provide cost-effective solutions for infrastructure inspection and
maintenance, including bridges, power lines, pipelines, and buildings, by conducting aerial
surveys, identifying structural defects, and assessing maintenance needs.
3. Environmental Monitoring: UAS enable environmental monitoring and conservation efforts by
collecting data on wildlife habitats, ecosystems, and natural resources, facilitating research,
conservation planning, and environmental management initiatives.
4. Public Safety and Emergency Response: UAS support public safety and emergency response
efforts by providing aerial surveillance, situational awareness, and communication relays in
disaster situations, such as wildfires, floods, earthquakes, and search and rescue operations.
5. Media and Entertainment: UAS are utilized in the media and entertainment industry for aerial
photography, cinematography, and filming applications, offering unique perspectives and creative
possibilities for filmmakers, photographers, and content creators.
6. Transportation and Logistics: UAS are explored for transportation and logistics applications,
including last-mile delivery, medical supply transport, and urban air mobility initiatives, leveraging
their agility, speed, and accessibility for efficient and sustainable transportation solutions.

[h]Challenges and Considerations

Despite their numerous capabilities and applications, UAS face several challenges and considerations:

1. Regulatory Compliance: UAS operations are subject to regulatory frameworks and airspace
restrictions imposed by aviation authorities, which govern flight permissions, operational
limitations, and safety standards to ensure airspace safety and public security.
2. Safety and Risk Management: Safety is a primary concern in UAS operations, requiring risk
management measures such as pre-flight checks, emergency procedures, and collision avoidance
systems to mitigate hazards and prevent accidents.
3. Privacy and Data Security: UAS raise privacy concerns related to aerial surveillance, data
collection, and information sharing, necessitating privacy safeguards, data encryption, and
compliance with data protection regulations to safeguard personal privacy and data security.
4. Technological Limitations: UAS technology faces limitations in areas such as endurance, payload
capacity, and communication range, which impact their operational capabilities and suitability for
specific tasks and environments.
5. Public Perception and Acceptance: Public perception of UAS varies, with concerns over privacy
invasion, noise pollution, and safety risks, requiring education, outreach, and community
engagement efforts to foster understanding and acceptance of UAS technology.
6. Integration and Interoperability: Integrating UAS into existing airspace and infrastructure systems
poses challenges related to airspace management, traffic coordination, and communication
interoperability, requiring collaboration among stakeholders and regulatory bodies to address.

Unmanned Aerial Systems (UAS) represent a disruptive and transformative technology with diverse
applications and capabilities across various industries and sectors. While UAS offer numerous benefits,
including flexibility, cost-effectiveness, and data-driven insights, they also pose challenges related to
regulatory compliance, safety, privacy, and public acceptance. Addressing these challenges requires
collaborative efforts among stakeholders, policymakers, and regulatory bodies to harness the full potential
of UAS technology while ensuring responsible and sustainable integration into society and the economy.

[mh]Design and Components


Unmanned Aerial Systems (UAS), commonly referred to as drones, are complex technological platforms
designed for various applications, ranging from aerial photography and surveillance to agriculture and
infrastructure inspection. The design and components of a UAS play a crucial role in its performance,
capabilities, and suitability for specific tasks and environments. This overview provides insight into the
design principles and key components of UAS, highlighting their functionality and interdependence in
enabling unmanned flight operations.

[h]Design Principles of UAS

The design of a UAS is guided by several principles aimed at achieving optimal performance, reliability,
and safety:

1. Aerodynamics: The aerodynamic design of a UAS determines its flight characteristics, stability,
and efficiency. Factors such as airframe shape, wing configuration, and control surfaces influence
aerodynamic performance, allowing the UAS to achieve desired flight maneuvers and payload
capacity.
2. Structural Integrity: Structural integrity is critical for ensuring the durability and resilience of a
UAS during flight operations. The airframe must withstand aerodynamic forces, environmental
conditions, and mechanical stresses while supporting the weight of payloads and components.
3. Weight and Balance: Weight and balance considerations are essential in UAS design to optimize flight performance and stability. Proper weight distribution and center of gravity management ensure that the UAS maintains stable flight attitudes and maneuverability, minimizing the risk of instability or control issues (a worked center-of-gravity example appears after this list).
4. Power and Propulsion: Power and propulsion systems provide the necessary thrust and energy to
propel the UAS through the air. Propulsion systems, such as electric motors, combustion engines,
or jet turbines, are selected based on factors such as payload requirements, flight duration, and
energy efficiency.
5. Control and Navigation: Control and navigation systems enable pilots or autonomous algorithms to
steer and maneuver the UAS during flight. Flight control surfaces, such as ailerons, elevators, and
rudders, translate pilot inputs or autopilot commands into changes in altitude, direction, and speed,
ensuring precise control and stability.
6. Payload Integration: Payload integration involves incorporating sensors, cameras, communication
equipment, and other specialized devices into the UAS platform to fulfill specific mission
objectives. Payload placement, mounting mechanisms, and integration with onboard systems are
optimized to minimize aerodynamic drag, maintain balance, and ensure functionality.
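
The weight-and-balance principle above reduces to a simple moment calculation: the center of gravity (CG) is the mass-weighted average of the component positions measured from a common datum, and it must fall within the airframe's allowable CG envelope. A minimal Python sketch, using made-up component masses and positions, is shown below.

```python
def center_of_gravity(components):
    """CG location as the mass-weighted mean of component positions.

    components: list of (mass_kg, position_m) pairs, with positions
    measured from a common datum (here, the airframe nose).
    """
    total_mass = sum(m for m, _ in components)
    moment = sum(m * x for m, x in components)
    return total_mass, moment / total_mass

# Hypothetical quadcopter build (masses in kg, positions in m from the nose).
build = [
    (0.90, 0.25),   # airframe
    (0.35, 0.22),   # battery
    (0.20, 0.10),   # camera payload, mounted forward
    (0.15, 0.24),   # flight controller and avionics
]
mass, cg = center_of_gravity(build)
print(f"takeoff mass {mass:.2f} kg, CG at {cg:.3f} m from nose")
# Shifting the battery aft moves the CG rearward; designers iterate this
# placement until the CG sits inside the allowable envelope.
```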

[h]Key Components of UAS

UAS consist of several interconnected components that work together to enable unmanned flight
operations:

1. Airframe: The airframe serves as the structural framework of the UAS, providing support for other
components and payloads. Airframes can vary in size, shape, and material composition, ranging
from lightweight carbon fiber for small drones to durable composites or metals for larger
unmanned aircraft.
2. Propulsion System: The propulsion system generates thrust to propel the UAS through the air.
Electric motors, combustion engines, or jet turbines power propellers or rotors, producing lift and
forward motion. Propulsion systems are selected based on factors such as payload capacity,
endurance, and operational requirements.
3. Flight Control System: The flight control system consists of onboard electronics, sensors, and
actuators that regulate the UAS's flight attitude, stability, and maneuverability. Gyroscopes,
accelerometers, and GPS receivers provide feedback to the flight controller, which adjusts control
surfaces or motor speeds to maintain desired flight parameters.
4. Navigation System: The navigation system enables the UAS to determine its position, altitude, and
orientation relative to the Earth's surface. Global Navigation Satellite Systems (GNSS), such as
GPS or GLONASS, provide precise location data, while inertial measurement units (IMUs) and
barometric altimeters complement GPS for accurate navigation and altitude control.
5. Communication System: The communication system facilitates data exchange between the UAS
and ground control station (GCS) or remote operators. Radio frequency (RF) transceivers,
telemetry links, and wireless networks enable real-time telemetry, command and control, and video
streaming, allowing operators to monitor and command the UAS during flight.
6. Power Supply: The power supply provides electrical energy to the UAS's onboard systems,
including propulsion, avionics, and payloads. Lithium-ion batteries, fuel cells, or combustion
engines power the UAS, with battery capacity and energy density influencing flight duration and
operational range.
7. Payloads and Sensors: Payloads and sensors are specialized equipment integrated into the UAS for
data collection, imaging, and mission-specific tasks. Cameras, LiDAR scanners, multispectral
sensors, and thermal imagers capture aerial imagery, survey terrain, monitor environmental
conditions, and perform other applications.
8. Ground Control Station (GCS): The ground control station serves as the command center for UAS
operations, allowing operators to plan missions, monitor flight status, and control the UAS
remotely. GCS typically consist of software interfaces, communication links, and control panels
for real-time interaction with the UAS.

[h]Interdependence of Components

The components of a UAS are interconnected and interdependent, working together to enable safe and
efficient unmanned flight operations:

 The airframe provides structural support for other components, housing propulsion systems,
avionics, and payloads while maintaining aerodynamic stability and integrity.
 The propulsion system generates thrust to propel the UAS through the air, with motor speed and
propeller pitch controlled by the flight control system to maintain desired flight attitudes and
maneuvers.
 The flight control system regulates the UAS's flight parameters, receiving inputs from sensors and
navigation systems to adjust control surfaces, motor speeds, and flight trajectories.
 The navigation system determines the UAS's position and orientation, providing accurate location
data for waypoint navigation, autonomous flight, and mission planning.
 The communication system enables real-time telemetry, command and control, and data
transmission between the UAS and ground control station, facilitating remote operation and
monitoring.
 The power supply supplies electrical energy to onboard systems, with battery capacity and energy
efficiency influencing flight duration, endurance, and operational range.
 Payloads and sensors collect data and perform specific tasks, with payload integration and
placement optimized to minimize aerodynamic drag and maintain balance.
 The ground control station provides operators with situational awareness, mission planning tools,
and remote control capabilities, enabling safe and effective UAS operations.

Unmanned Aerial Systems (UAS) are sophisticated technological platforms designed for diverse
applications, ranging from aerial photography and surveillance to agriculture and infrastructure
inspection. The design and components of a UAS are carefully engineered to achieve optimal
performance, reliability, and safety, with each component playing a crucial role in enabling unmanned
flight operations. By understanding the principles and functionalities of UAS design, stakeholders can
develop and deploy UAS solutions that meet the needs of specific missions and environments while
advancing the capabilities and potential of unmanned aviation.

[mh]Navigation and Control Systems

Navigation and control systems are essential components of Unmanned Aerial Systems (UAS), enabling
autonomous or remote-controlled flight operations with precision, stability, and safety. These systems
encompass a range of sensors, avionics, algorithms, and actuators that work together to determine the
UAS's position, orientation, and trajectory, as well as regulate its flight parameters in real-time. This
overview provides insight into the key elements and functionalities of navigation and control systems in
UAS, highlighting their importance in enabling effective and reliable unmanned flight operations.

[h]Navigation Systems

Navigation systems in UAS enable the determination of the UAS's position, altitude, and orientation
relative to the Earth's surface and other reference points. These systems utilize a combination of sensors,
satellite signals, and onboard algorithms to provide accurate and reliable navigation capabilities. Common
components of navigation systems include:

1. Global Navigation Satellite Systems (GNSS): GNSS, such as the Global Positioning System
(GPS), GLONASS, and Galileo, provide precise positioning and timing information by receiving
signals from satellites orbiting the Earth. GNSS receivers onboard the UAS calculate the UAS's
coordinates based on satellite signals, enabling accurate navigation and waypoint tracking.
2. Inertial Measurement Units (IMUs): IMUs consist of gyroscopes, accelerometers, and sometimes magnetometers that measure the UAS's acceleration, rotation rates, and magnetic field orientation. IMUs provide continuous updates on the UAS's attitude (roll, pitch, and yaw angles) and motion, serving as a complementary navigation sensor to GNSS, especially in GPS-denied or degraded environments (a simplified fusion sketch appears after this list).
3. Barometric Altimeters: Barometric altimeters measure atmospheric pressure changes to estimate
the UAS's altitude above sea level. Barometric altitude measurements complement GNSS altitude
data, providing redundancy and accuracy in altitude estimation, particularly during vertical
maneuvers or low-altitude flight.
4. Vision-Based Systems: Vision-based navigation systems use onboard cameras and computer
vision algorithms to analyze visual features in the environment and determine the UAS's position
and motion relative to its surroundings. Visual odometry, simultaneous localization and mapping
(SLAM), and feature tracking techniques enable autonomous navigation and obstacle avoidance in
GPS-denied or indoor environments.
5. Radio Frequency (RF) Localization: RF localization systems, such as radar or radio beacons,
provide additional localization and positioning capabilities, especially in scenarios where GNSS
signals are unavailable or unreliable, such as urban canyons or indoor environments.
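
A common lightweight way to combine these sources is a complementary filter: the IMU propagates the state at a high rate, and each GNSS fix pulls the accumulated drift back toward an absolute reference. The one-dimensional Python sketch below is a deliberately simplified illustration of that structure; real autopilots typically use Kalman filters over full 3D states, and all values here are illustrative.

```python
class ComplementaryFilter1D:
    """Toy 1D fusion: high-rate IMU dead reckoning, low-rate GNSS correction.

    A real autopilot fuses full 3D states with a Kalman filter (and would
    also correct velocity), but the predict/correct structure is the same.
    """
    def __init__(self, alpha=0.8):
        self.alpha = alpha            # trust placed in the IMU prediction (0..1)
        self.pos = 0.0
        self.vel = 0.0

    def predict(self, accel, dt):
        # High-rate step: integrate the (biased, noisy) accelerometer.
        self.vel += accel * dt
        self.pos += self.vel * dt

    def correct(self, gnss_pos):
        # Low-rate step: blend the estimate toward the absolute GNSS fix.
        self.pos = self.alpha * self.pos + (1.0 - self.alpha) * gnss_pos

fused = ComplementaryFilter1D()
imu_only = ComplementaryFilter1D()    # same model, never corrected
for step in range(200):               # 2 s of 100 Hz IMU samples
    for est in (fused, imu_only):
        est.predict(accel=0.5, dt=0.01)   # hovering UAV, pure sensor bias
    if step % 10 == 9:                # 10 Hz GNSS reports the true position, 0 m
        fused.correct(0.0)
print(f"drift without GNSS: {imu_only.pos:.2f} m, with GNSS: {fused.pos:.2f} m")
```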

[h]Control Systems

Control systems in UAS regulate the aircraft's flight parameters, such as attitude (orientation), altitude,
speed, and trajectory, to achieve stable and controlled flight. These systems utilize feedback from sensors
and navigation systems to compute control commands and actuate control surfaces, motors, or thrusters to
adjust the UAS's motion accordingly. Key components of control systems include:
1. Flight Control Computer (FCC): The flight control computer is the central processing unit that
runs control algorithms, computes control commands, and manages sensor data processing. The
FCC interfaces with sensors, navigation systems, and actuators to regulate the UAS's flight
parameters in real-time.
2. Attitude and Stabilization Control: Attitude and stabilization control systems maintain the UAS's desired orientation and stability by adjusting control surfaces, such as ailerons, elevators, and rudders, or by varying motor speeds and thrust vectoring. Proportional-integral-derivative (PID) controllers or more advanced control algorithms, such as model predictive control (MPC), are used to stabilize the UAS and reject disturbances (a minimal PID sketch appears after this list).
3. Altitude and Speed Control: Altitude and speed control systems regulate the UAS's vertical and
horizontal motion to maintain desired altitude and airspeed. Proportional-integral (PI) controllers
or state-feedback control laws adjust throttle or motor outputs to achieve altitude or speed targets
while compensating for external disturbances.
4. Waypoint Navigation: Waypoint navigation systems enable autonomous flight along predefined
paths or waypoints by computing trajectory plans and guidance commands based on mission
objectives and GPS coordinates. Path following algorithms, such as proportional navigation or
geometric path planning, guide the UAS along desired trajectories while avoiding obstacles and
maintaining safety margins.
5. Autonomous Decision Making: Advanced control systems in UAS incorporate autonomous
decision-making capabilities to adaptively respond to changing environmental conditions, mission
requirements, and safety constraints. Decision-making algorithms, such as reinforcement learning,
fuzzy logic, or artificial intelligence (AI), enable UAS to dynamically adjust their behavior and
mission plans based on real-time sensor feedback and situational awareness.
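
To ground the PID loop mentioned in the attitude-control item above, the following is a minimal Python sketch of a roll-angle stabilizer acting on a toy rigid-body model. The gains and dynamics are illustrative assumptions, not values from any real autopilot.

```python
class PID:
    """Textbook PID law: output = Kp*e + Ki*integral(e) + Kd*d(e)/dt."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Stabilize roll back to 0 degrees after a 10-degree disturbance.
# Toy dynamics: the commanded torque directly changes the roll rate.
pid = PID(kp=4.0, ki=0.5, kd=1.5)
roll, roll_rate, dt = 10.0, 0.0, 0.02
for _ in range(250):                   # 5 s simulated at 50 Hz
    torque = pid.update(0.0, roll, dt)
    roll_rate += torque * dt           # simplistic rigid-body response
    roll += roll_rate * dt
print(f"roll after 5 s: {roll:.3f} deg")  # settles close to zero
```

The proportional term drives the roll toward the setpoint, the derivative term damps the motion, and the integral term removes steady offsets such as a persistent crosswind torque.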

[h]Integration and Interoperability

Navigation and control systems in UAS are tightly integrated and interconnected, with sensor inputs
processed by onboard computers to generate control commands and actuate control surfaces or propulsion
systems accordingly. Interoperability between navigation and control subsystems ensures seamless
coordination and responsiveness in achieving desired flight objectives while maintaining stability,
accuracy, and safety.

Navigation and control systems play a critical role in enabling unmanned aerial systems (UAS) to
navigate, maneuver, and perform tasks autonomously or under remote control. By integrating sensors,
avionics, algorithms, and actuators, these systems enable precise positioning, stable flight, and adaptive
control in diverse environments and mission scenarios. Advances in navigation and control technologies
continue to enhance the capabilities and applications of UAS, driving innovation and enabling new
opportunities in fields such as aerial robotics, transportation, and infrastructure inspection.

[mh]Payloads and Sensors

Payloads and sensors are integral components of Unmanned Aerial Systems (UAS), enabling these
versatile platforms to perform a wide range of tasks across various industries and applications. From
aerial photography and surveillance to environmental monitoring and infrastructure inspection, payloads
and sensors play a crucial role in collecting data, capturing imagery, and performing specialized functions
from the air. This overview provides insight into the diverse types of payloads and sensors used in UAS,
their functionalities, applications, and advancements driving innovation in unmanned aerial technology.

[h]Types of Payloads and Sensors


1. Imaging Systems: Imaging systems are among the most common payloads used in UAS,
encompassing cameras, sensors, and optical instruments for capturing visual data and imagery.
These systems include:
o Visible Light Cameras: Visible light cameras capture high-resolution images and videos in
the visible spectrum, providing visual information for applications such as aerial
photography, cinematography, and surveillance.
o Infrared (IR) Cameras: Infrared cameras detect thermal radiation emitted by objects in the
infrared spectrum, enabling temperature measurements, night vision, and heat detection for
applications such as search and rescue, firefighting, and industrial inspection.
o Multispectral and Hyperspectral Cameras: Multispectral and hyperspectral cameras capture imagery across multiple spectral bands or narrow spectral ranges, enabling spectral analysis and classification of vegetation, soil, and environmental features for agriculture, forestry, and environmental monitoring (an NDVI example appears after this list).
o LiDAR (Light Detection and Ranging): LiDAR sensors emit laser pulses to measure
distances to objects and create detailed 3D maps of terrain, structures, and topography for
applications such as mapping, surveying, and urban planning.
2. Environmental Sensors: Environmental sensors measure physical, chemical, and biological
parameters of the atmosphere, water, and soil, providing data for environmental monitoring,
research, and management. These sensors include:
o Temperature and Humidity Sensors: Temperature and humidity sensors measure ambient
temperature and relative humidity levels, providing insights into weather conditions,
microclimate variations, and environmental changes.
o Air Quality Sensors: Air quality sensors detect pollutants, particulate matter, and gases such
as carbon dioxide, nitrogen dioxide, and ozone, enabling monitoring of air pollution levels
and assessment of health risks and environmental impacts.
o Weather Sensors: Weather sensors measure meteorological parameters such as wind speed,
wind direction, atmospheric pressure, and precipitation, facilitating weather forecasting,
climate research, and meteorological monitoring.
3. Remote Sensing Instruments: Remote sensing instruments capture data from the Earth's surface,
atmosphere, and oceans using electromagnetic radiation across various wavelengths. These
instruments include:
o Spectrometers: Spectrometers measure the intensity of electromagnetic radiation across
different wavelengths, enabling spectral analysis and identification of materials, chemicals,
and environmental features.
o Radiometers: Radiometers quantify the amount of electromagnetic radiation emitted or
reflected by objects, surfaces, or the Earth's atmosphere, providing insights into surface
properties, energy balance, and radiative forcing.
o Radar and Microwave Sensors: Radar and microwave sensors use radio waves to penetrate
through clouds, vegetation, and terrain, enabling remote sensing of subsurface features, soil
moisture, and biomass content for applications such as agriculture, hydrology, and geology.
4. Communications and Networking Systems: Communications and networking systems enable data
transmission, command and control, and real-time communication between the UAS and ground
control station (GCS). These systems include:
o Radio Frequency (RF) Transceivers: RF transceivers establish communication links
between the UAS and GCS, transmitting telemetry data, control commands, and video feeds
over radio frequencies.
o Satellite Communication Systems: Satellite communication systems provide long-range
communication capabilities for beyond-line-of-sight operations, enabling remote control
and monitoring of UAS in remote or inaccessible areas.
o Mesh Networks: Mesh networks enable peer-to-peer communication between multiple UAS
or ground nodes, forming self-organizing networks for collaborative missions, relay
communication, and distributed data processing.
5. Payload Delivery Systems: Payload delivery systems enable the deployment of payloads, cargo, or
packages from the UAS to the ground or target area. These systems include:
o Drop Mechanisms: Drop mechanisms release payloads or cargo from the UAS using
electromechanical actuators, pneumatic systems, or gravity, enabling aerial delivery of
supplies, medical aid, or emergency equipment in disaster zones or remote locations.
o Winch Systems: Winch systems lower or raise payloads on cables or tethers attached to the
UAS, facilitating vertical deployment or retrieval of instruments, sensors, or sampling
devices for environmental monitoring or scientific research.
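
As an illustration of the spectral analysis mentioned under multispectral cameras above, the following
minimal Python sketch computes the widely used Normalised Difference Vegetation Index (NDVI) from two
co-registered bands. The band arrays and reflectance values are hypothetical, and a real pipeline would
also handle radiometric calibration and georeferencing.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 suggest dense, healthy vegetation; values near zero
    or below suggest bare soil, water, or built surfaces.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    # Only divide where the denominator is non-zero (skip dark pixels).
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out

# Hypothetical 2x2 reflectance patches from the NIR and red bands.
nir_band = np.array([[0.50, 0.45], [0.10, 0.40]])
red_band = np.array([[0.08, 0.10], [0.09, 0.30]])
print(ndvi(nir_band, red_band))  # highest values where vegetation is densest
```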

[h]Applications of Payloads and Sensors in UAS

Payloads and sensors enable UAS to perform a wide range of applications across various industries and
sectors:

1. Agriculture: In agriculture, UAS equipped with multispectral cameras, LiDAR sensors, and
environmental monitors enable precision agriculture practices such as crop monitoring, soil
analysis, irrigation management, and pest detection, optimizing yields, reducing inputs, and
enhancing farm efficiency.
2. Infrastructure Inspection: UAS are used for infrastructure inspection and maintenance tasks,
including bridge inspection, power line inspection, pipeline monitoring, and building surveying,
using imaging systems, LiDAR scanners, and structural health monitoring sensors to identify
defects, assess condition, and prioritize maintenance needs.
3. Environmental Monitoring: UAS support environmental monitoring and conservation efforts by
collecting data on wildlife habitats, ecosystems, and natural resources using remote sensing
instruments, environmental sensors, and aerial surveying techniques, facilitating research,
conservation planning, and environmental management initiatives.
4. Disaster Response: UAS play a critical role in disaster response and emergency management by
providing situational awareness, damage assessment, and search and rescue support using aerial
photography, thermal imaging, and communication systems, enabling rapid response,
coordination, and resource allocation in disaster zones.
5. Mapping and Surveying: UAS are utilized for mapping, surveying, and cartography applications,
generating high-resolution maps, 3D models, and geographic information systems (GIS) data using
LiDAR, photogrammetry, and GPS technologies, supporting urban planning, land management,
and infrastructure development projects.
6. Media and Entertainment: UAS are employed in the media and entertainment industry for aerial
photography, cinematography, and filming applications, capturing breathtaking aerial footage,
panoramic views, and creative perspectives for films, documentaries, advertisements, and virtual
reality experiences.

[h]Advancements in Payloads and Sensors

Advancements in payloads and sensors for UAS continue to drive innovation and expand the capabilities
of unmanned aerial technology. Key advancements include:

1. Miniaturization and Lightweight Design: Payloads and sensors are becoming increasingly
compact, lightweight, and energy-efficient, enabling integration into smaller UAS platforms and
extending flight endurance and operational range.
2. Enhanced Sensing Technologies: Sensor technologies are evolving to provide higher resolution,
greater sensitivity, and improved spectral and spatial capabilities, enabling more precise data
collection and analysis for a wide range of applications, from agriculture and environmental
monitoring to infrastructure inspection and scientific research.
3. Intelligent Payload Integration: Intelligent payload integration and modular design enable UAS to
carry multiple sensors and payloads simultaneously, allowing for multi-mission capabilities and
adaptive payload configurations tailored to specific application requirements.

[mh]Civil and Military Applications

While the qualities required of a leader to be a good commander and a good decision-maker have remained
constant throughout human history in the face of the complexity of battle, the leader of tomorrow will
have to adapt to the use of new technologies. These will allow him to be better informed, and
consequently more reactive, in order to keep the initiative in the manoeuvre, but also to carry his
action further and to delegate certain tasks to the machines at his disposal. Such adaptations are not
trivial, because they force a reconsideration of existing military doctrines and can call into question
the very principle of hierarchy that makes the strength of armies. The military must therefore learn,
through training, how to use these new technologies, but also how to keep control of systems integrating
a certain form of autonomy. Above all, the military leader must preserve the essence of his identity: to
give meaning to military action and to command in order to achieve his goals.

Primarily, a military leader must command, which implies legitimate decision-making authority and a
responsibility towards the soldiers entrusted to him and towards the mission he must accomplish.

The command is the very expression of the personality of the leader. It depends on the tactical situation
which includes the risk and the obligations of the mission to be carried out.

To be a good military leader implies several additional qualities: to be demanding, to be competent, to
have high moral strength in the face of the difficulties of war, to have confidence in his own abilities
and in the means placed at his disposal, to take responsibility for his decisions and, in doing so, to
give responsibility to his subordinates, and finally to be able to decide in complete freedom.

He is the one who decides and commands. He is the one to whom all eyes turn in difficulty, but the
exercise of his command requires demanding discernment between reflection and action.

The military world is very demanding and dangerous. Having to take into account the danger for his
soldiers, the danger for himself and the responsibility of the mission he has been given, the military leader
should:

 discern in complexity (deploy true situational intelligence);
 decide in uncertainty (have the strength of character to accept calculated risks);
 act in adversity (unite energies, encourage collective action and make conscious decisions).

This forms the basis of the educational project of the Saint-Cyr Coëtquidan military academy, and
perfectly synthesises the objectives of a training system adapted to the officers of the 21st century.
However, this initial training must take into account the technological evolutions allowing military
decision-makers of today and tomorrow to reduce the fog of war.

[h]Military leader is accountable for the decision

What is decision-making for a military officer? It consists of choosing between different possibilities
and settling on one conclusion among the possible solutions, having analysed all the effects that this
decision implies.

In order to decide, the leader must master several areas: a perfect knowledge of the mission entrusted
to him, of the means at his disposal, and of his troops. Nothing is worse than indecision when the lives
of soldiers are in danger. His decision must call upon moral and intellectual courage.

“The unknown is the governing factor in war” said Marshal Foch. However, the role of the leader is
above all to be able to adapt and modify his analysis and the behaviour of his troop in order to respond to
unforeseen situations. This ability to adapt is essential to maintain the freedom of action that allows for
initiative on the battlefield, and to be able to innovate according to the constraints.

The leader must show discernment in action, to appreciate facts according to their nature and their fair
value. This implies being cautious in his choices and the scope of his choices.

Finally, the leader must be lucid and control his stress, pressure and emotions, in order to preserve
his “esprit d’initiative”.

[h]Information, the key to victory

To meet all these requirements, information is one of the major foundations of the chief's exercise of
command. It is the keystone of all military action, enabling him to keep the initiative and maintain
supremacy on the ground.

In fact, information allows the chief to plan the military action, taking into account the means at his
disposal, ensuring the transport logistics, and confronting the possible friendly and enemy modes of
action in order to determine the manoeuvre that he will conduct.

The management of the information received is reflected “en conduite” by the regular rhythm of reports
and situation updates to higher or subordinate levels, in order to anticipate threats and maintain a capacity
to react as quickly and efficiently as possible in the face of adversity or any obstacle hindering the
manoeuvre.

For the decision-making process to run smoothly, the information must be updated regularly because the
situation can change very quickly and the leader will have to adapt his analysis accordingly.

Thus, there is no single decision of the military commander in operation, but a continuum of decisions,
some of which are almost routine or implicit, while others require extensive analysis. Some decisions are
ultimately critical, as they can result in a favourable or tragic outcome to a given situation.

[h]What is fundamentally changing


This chapter addresses the change in the art of decision-making for a military officer, implied by the use
of some technologies that will gradually invade the battlefield.

Indeed, some technologies will allow the leader to be better informed, but also to be more reactive in
order to keep the initiative. Their management requires a mastery of new data management processes
resulting from the digitisation of the battlefield, in particular the possible influx of operational data from
the field and their synthesis for the military leader.

[h]A more accurate and faster remote information acquisition

The one who sees further and before the others is the one who dominates the military manoeuvre. This is
what enables him to gain a tactical advantage because the one who acts first with determination is most
often the one who wins. Moreover, the ability to see further and more accurately thanks to remote sensors
or cameras brings an undeniable advantage to the military leader, enabling him to react faster than his
enemy.

Today, distances are shrinking, and information can be transmitted in a few milliseconds to any point
on the planet, provided that a sensor capturing the information is available. This is done through
cyberspace, which must be secured for military forces so that they can be sure of the veracity of the
data they use. This immediacy of information is a new parameter in the art of command: it forces the
leader to make a quick analysis and to be reactive in his response.

It also raises the question of his capacity to process the information when there is too much data. In
that case, the data will have to be processed automatically as soon as the systems receive it, so as to
extract only the relevant information. And if these systems are unable to do this, the leader will have
to be assisted in the analysis and decision-making by a third party, which may itself be a machine. This
raises the question of the control of these decision aids on which he must rely.

[h]Act remotely to remove the danger and increase the area of action

One of the major military revolutions that began at the start of the 21st century in the Iraq and
Afghanistan wars is the robotisation of the battlefield. It is unavoidable and will gradually be introduced
into the battlefield because the use of unmanned robots (UAV, USV, UUV and UGV) offers many
advantages to the armies that will use them on the ground.

Firstly, it avoids exposing our own combatants, which is all the more important in our modern armies
where they are a scarce and expensive resource to train.

Secondly, it extends the area of perception and action of a military unit. In a sense, robots are the
fighter's “five remote senses”: his eyes (camera), his ears (reception), his mouth (transmission), his
touch (actuator arm) and even his senses of smell and taste (detection of CBRN products).

As tools placed at the disposal of the combatant, robots will allow him to control the battlefield by
remotely deploying effectors or sensors, giving control of the various dimensions and spaces of the
battlefield: on land, in the air, at sea and even in the electromagnetic spectrum. They will thus
progressively move the combatant behind the contact zone, away from the dangerous area, in order to
reduce the risks, or allow him to engage with the maximum of means at his disposal, significantly
reducing the vulnerability of the combatants.

Finally, the ability to act remotely while preserving the lives of his men will allow the leader to act
even before the enemy can deploy his forces for his manoeuvre.

Robotic systems will thus become new tactical pawns that the military leader will now use to prepare his
action, to facilitate his progress, allowing him new effects on the enemy, the terrain, the occupation of
space and on the rhythm of the action. Especially since these machines will eventually be more efficient,
more precise and faster for specific tasks than a human being can be. This is currently evident in
industrial manufacturing and assembly plants.

[h]The disruption of autonomy

This military revolution of deporting action with robotic systems is accompanied by another, no less
disruptive, that of the autonomy of these systems. Autonomy will allow for omnipresence of action in the
area, 24 hours a day, subject to energy sufficiency. It will allow the machines to adapt to the terrain and
its unforeseen events in order to carry on the mission entrusted to them by the military leaders.
Autonomous systems will allow them to react to complex situations by adapting their positioning
strategy, and even the effects they produce on the battlefield. For example, it may be an automatic
reorganisation of the swarm formation adopted by a group of robots to follow an advancing enemy,
followed by the decision to block an axis of progression with smoke or obstacles to hinder enemy
progression.

However, autonomy is not fundamentally new for a leader. A section or a platoon leader has combat
groups under his command, whose group leader who receives a mission has full autonomy to carry it out.
The new fact is that if robots are tactical pawns at the disposal of the combatant, and if they can have a
certain form of autonomy in the execution of their action, they do not have and will never have the
awareness of their action and the capacity of discernment which are characteristics of the human being.
This opens up a number of ethical questions regarding the opening of fire that will not be addressed in
this chapter.

[h]The contribution of new technologies to military decision-making

These upheavals are based on technologies that create new opportunities in military decision-making
processes.

[h]All deployed systems are interconnected

The digitisation of the battlefield stems from the constant trend towards the integration of electronic
components in all future military equipment, which, coupled with a means of transmission, allow for their
interconnection and the dissemination of the information collected. It affects all systems deployed in
the field (from weapons systems to military vehicles), right down to the dismounted combatant who, just
like any civilian with a smartphone, will be connected to the great digital web of the battlefield and
will therefore be traceable and reachable. Like every individual in civil society, every actor on the
battlefield can thus be located and communicated with.

[h]Enriched information

As explained above, technology will enable faster detection of threats on the battlefield. Moore's law
has sometimes been used to describe the increase in the capabilities of digital cameras, according to a
ratio of “twice as far”, “twice as cheap” or “twice as small” every three years. In effect, each
innovation allows one to see further with a smaller footprint. Digital zoom allows high magnification,
but at the cost of algorithmic processing of the image, which reduces definition; it is often paired
with optical zoom, which consists of adapting the focal length to the target being observed. Cameras can
now merge data from multiple sensors of different types: in particular thermal imaging, which covers a
large fraction of the spectrum and allows the thermal energy emitted by equipment or a human to be
viewed and measured, to which can be added light-intensification processes that amplify the residual
ambient light to recreate an image usable by the human eye in low-light conditions.

All of this fused data can enrich the field of vision of the combatant by superimposing additional data that
completes his knowledge of the tactical situation. This is the principle of augmented reality.
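
A minimal sketch of what such pixel-level fusion can look like, assuming two already co-registered,
normalised greyscale frames; operational systems perform far more sophisticated registration and
multi-sensor fusion than this weighted blend.

```python
import numpy as np

def fuse_visible_thermal(visible: np.ndarray, thermal: np.ndarray,
                         alpha: float = 0.7) -> np.ndarray:
    """Blend a visible-light frame with a co-registered thermal frame.

    `alpha` weights the visible channel; (1 - alpha) weights the thermal
    channel. Both inputs are assumed normalised to [0, 1] and aligned
    pixel-to-pixel (registration, the hard part in practice, is not shown).
    """
    return np.clip(alpha * visible + (1.0 - alpha) * thermal, 0.0, 1.0)

# Hypothetical 2x2 frames: a warm object (bottom-right pixel) that is
# barely visible in low light stands out clearly in the fused image.
vis = np.array([[0.20, 0.20], [0.20, 0.25]])
thm = np.array([[0.10, 0.10], [0.10, 0.95]])
print(fuse_visible_thermal(vis, thm))
```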

[h]The immediacy of information processing

Even when data acquisition and transmission are possible, the information must still be processed.
Processing requires readily accessible hardware and software resources offering the computing capacity
needed to react as quickly as possible, particularly in situations where the analysis time is too short
for a human to act alone. Embedded software can provide such capacity at the core of deployed systems,
but this capability can also be moved to a secure cloud, which may be either a tactical cloud, i.e. a
cloud deployed on the battlefield in support of the manoeuvre, or a more distant, sovereign and highly
secured cloud.

[h]To the detriment of human decision-making

This immediacy of information processing allows a hyper-reactivity of systems, foreshadowing the
concept of “hyperwar” formulated by General John Allen and Amir Husain in 2019, which puts forward the
idea that the advent of hyperwar is the next fundamentally transformative change in warfare.

“What makes this new form of warfare unique is the unparalleled speed enabled by automating decision-
making and the concurrency of action that become possible by leveraging artificial intelligence and
machine cognition… In military terms, hyperwar may be redefined as a type of conflict where human
decision-making is almost entirely absent from the observe-orient-decide-act (OODA) loop.
Consequently, the time associated with an OODA cycle will be reduced to near-instantaneous responses.
The implications of these developments are many and game changing”.

[h]A support for information processing

With regard to information processing, the volume of data produced is increasing exponentially, and the
accuracy and granularity of sensor data keep growing. This trend will become more and more pronounced
over time.

Military experts usually process observation data retrieved from the battlefield by satellites,
reconnaissance aircraft, drones or unattended ground sensors. However, as human resources are scarce and
the volume of data is constantly increasing, it will be necessary to delegate the processing of this
mass of data to AI algorithms in support of the human being, at the risk otherwise of not being able to
process all of it.

On the ground, the deployed combatant will be increasingly burdened cognitively by the complexity of the
systems to operate and the amount of information to process. It will be vital to automate the processing
of certain information in order to lighten this load, so that only what is really necessary is
presented, and in an extremely ergonomic way. This requires defining which data can be subjected to
artificial processing, and up to what hierarchical level its processing can be automated.

[h]The contribution of artificial intelligence

Automated management of routine, repetitive and time-consuming procedures could emerge. In a
headquarters, for example, the management of reports and the automatic production of summaries adapted
to each level of command would immediately make the chain of command more fluid. The AI could take the
form of a dashboard that stimulates the reflection of the commander and his advisers by dynamically
delivering relevant information and updated assessments.

During operational preparation, depending on the tactical situation, the leader must confront the modes
of action he envisages with the reference enemy situation and the possible enemy modes of action. Very
often he does not have the time to test his action against several enemy modes of action, and he only
anticipates certain non-compliant cases that he considers probable. Artificial intelligence could be
more exhaustive, confronting a greater number of possible enemy modes of action, and thus present a more
complete analysis of the options to the military leader, who could then decide accordingly.

[h]Reduction of the OODA decision cycle

The technologies listed above have a direct effect on the OODA decision cycle, which they will
profoundly reshape.

This concept was defined in 1960 by an American military pilot named John Boyd to formalise the decision
cycle in air combat. It has since been used to schematise any decision cycle. The author uses it here in
the light of the potential offered by the technologies detailed above (see Figure).

Figure: OODA cycle time reduction: a better reactivity.
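
In the spirit of the figure, the following purely illustrative calculation compares one pass through the
loop with and without automated observation and analysis. All stage latencies are hypothetical, chosen
only to suggest the order-of-magnitude shift; note that once sensing and analysis are automated, the
human 'decide' stage dominates the cycle, and by design the leader remains the bottleneck.

```python
def cycle_time(stages: dict[str, float]) -> float:
    """Total time for one pass through the OODA loop (seconds)."""
    return sum(stages.values())

# Hypothetical stage latencies in seconds, for illustration only.
human_only = {"observe": 30.0, "orient": 60.0, "decide": 20.0, "act": 5.0}
assisted   = {"observe": 0.5,  "orient": 2.0,  "decide": 20.0, "act": 0.1}

print(f"human-only cycle: {cycle_time(human_only):.1f} s")  # 115.0 s
print(f"assisted cycle:   {cycle_time(assisted):.1f} s")    # 22.6 s
```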


[h]Observe: a better detection

“Seeing without being seen” is essential in military operations, and remains a common adage. Technology
helps here, with the extended ranges made possible by long-range cameras and their deployment on robotic
systems. It can now also help overcome several natural detection constraints such as night, fog or
walls.

Moreover, digitised systems can operate 24 hours a day with great consistency, where humans are subject
to fatigue and inattention, avoiding the risk of missing information.

For surveillance or patrol missions, where human resources are often lacking, the leader can delegate to
systems the analysis of images of the area for the detection of movements and the potential presence of
enemies. It should be noted that this detection should filter out false alarms as much as possible, such as
the movement of leaves in the trees when the wind picks up.
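
A minimal sketch of such filtering, assuming normalised greyscale frames: a per-pixel difference
threshold combined with a minimum changed-area fraction, so that small scattered changes such as leaves
moving in the wind do not raise an alert. Both thresholds are illustrative assumptions, not tuned
values.

```python
import numpy as np

def detect_motion(prev_frame: np.ndarray, frame: np.ndarray,
                  diff_threshold: float = 0.15,
                  min_changed_fraction: float = 0.01) -> bool:
    """Naive frame-differencing motion detector with a false-alarm filter.

    A pixel counts as 'changed' if its absolute difference from the
    previous frame exceeds diff_threshold; an alert is raised only if
    changed pixels cover more than min_changed_fraction of the image.
    """
    changed = np.abs(frame.astype(float) - prev_frame.astype(float)) > diff_threshold
    return bool(changed.mean() > min_changed_fraction)
```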

[h]Orient: a better analysis

Seeing remotely will make it possible to identify a potential target from afar, to discriminate it (is
the target a combatant?) and to characterise its behaviour (is it hostile or not?). If these criteria
are met, it becomes a valid target that can easily be geolocated; this information is then transmitted
to the decision-making levels. The gain here is that of anticipating the analysis for better
decision-making.

The leader will also be able to rely on the automatic processing of data acquired within the digital
environment of the battlefield. Faced with the potential ‘infobesity’ of the battlefield, artificial
intelligence will enable massive data processing, subject to the availability of a computing capacity
directly embedded in remote robotic platforms, or by remote processing of information via long-distance
communications. It will allow constant monitoring of the analysis of captured images or sounds, a task
that the best human experts can only supervise because they are subject to fatigue and inattention. This is
particularly the case with satellite images or images captured by surveillance drones, which can monitor
an area 24 hours a day. Finally, it will also enable the detection of weak signals that would be invisible to
humans, by correlation between several distinct events, or by cross-checking.

There remain two essential components of situation analysis that a machine can never integrate: firstly,
instinct and intuition, which a machine cannot possess and which are the fruit of a lifetime of human
experience; and secondly, the transcendence of military action, which only a metaphysical dimension in
the literal sense can provide.

[h]Decide: a better reaction

The military commander is the decision-maker for military action. It is therefore up to him to take the
decision according to the information at his disposal. He can of course rely on a deputy or on operational
advisers who help him analyse the situation, if time permits.

For example, France is intervening in Mali and the Sahara as part of the Barkhane military operation to
combat Salafist jihadist armed groups infiltrating the entire Sahel region. Launched on 1 August 2014,
this operation replaces operations Serval and Épervier. The following scenario is fictitious: an armed
Reaper drone of the French army flies over a region of the Malian desert at night and its cameras
(incorporating AI for automatic motion detection processing of the captured images) detect a suspicious
movement. The sensor operator of the drone is alerted and zooms in on the area to detect a jihadist 4x4
occupied by armed personnel via its Infrared camera. This vehicle is moving towards a village 20
kilometres away. Setting up an operation with Special Forces is not possible because they are not in the
area, and there is a great risk that the occupants of the 4x4 will disperse once they reach the village. The
legal advisor on duty quickly confirms that the drone may fire on the target because no collateral
damage is possible in this desert area. The head of the operation gives the order for the drone to fire.

This example clearly shows the drastic reduction in the OODA decision cycle offered by the new
technologies: the chief detects and is informed as soon as possible by an automatic detection of a
suspicious movement of an enemy vehicle. He confirms with his image operator the Positive
identification (PID) of the target as an enemy. He then reports it to his hierarchy and receives the order to
open fire. He can thus, in compliance with IHL, open fire from a distance. The enemy has not even
spotted him.

There are still situations where time is critical and the leader will not have time to make a decision due to
the rapidity of the attack. The automation of response processes then becomes a possible option, i.e. he
can delegate to a machine the possibility of giving an appropriate response to a situation by itself. This is
already the case with missiles or ballistic threats, which require armies to use automatic systems to
counter them. This requires automatic systems that are faster and more precise than human beings (e.g.
coupling weapons and radar). Tomorrow, faced with future systems that will develop unpredictable
trajectory strategies (enemy missiles with AI), faced with saturating threats that risk overwhelming our
defences, faced with swarms of offensive robots, our systems will have to adapt in real time to counter the
threat. Only a certain autonomy of the defensive systems will make it possible to face them, an autonomy
which will have to remain under the control of the leader having these systems at his disposal.

[h]Act: a quicker and more accurate reaction

A quicker reaction: a man reacts in a few seconds, the machine in a few milliseconds or less. Where a
human thinks in a few seconds at best, the machine will analyse the parameters in a few milliseconds and
propose a response in near real time.

A more accurate action: A human shooter who moves, breathes and shakes is less accurate than a
machine that does not move, breathe or shake because it is not subject to emotion. Precision in action will
therefore increasingly be the prerogative of the machine.

The outcome of an engagement or a counter-measure may hinge on these factors of 10, 100 or even 1,000 in
reaction speed.

[h]Technology as a decision aid for the leader

Military decision-making is centred on the military leader, because he is at the heart of the command
situation. He takes responsibility for military action, a mission given to him by the legitimately elected
political power.

The leader must therefore control the decisions taken within the framework of military action because he
is the guarantor and he assumes the consequences.

What lessons can one learn from the opportunities offered by new technologies for military decision-
making and the possible resulting changes in the art of command?
[h]To reduce the “fog of war”

The leader must rely on technology to reduce the uncertainty and fog of war. It will allow him to be more
aware of his tactical situation by searching for intelligence. Furthermore, it will enable him to delegate to
machines the management of repetitive tasks that do not require constant situational intelligence.

Depending on the circumstances and if he has time to reflect, the digitisation of battlefield information
will also allow the leader to replay certain possible scenarios before taking a decision. Finally, it will give
him the possibility to select the information he has received that he deems important, to view it several
times (especially if the information is imprecise) before making a decision.

[h]For decision support

A digital aid will be welcome to help synthesise the contributions of the multiplying digital actors on
the ground with whom the leader is in contact, or whom he must command or coordinate.

One of the consequences of the digitisation of the battlefield is that it may lead to information overload
for the leader who is already very busy and focused on his tasks of commanding and managing. It is
already accepted in the military community that a leader can manage a maximum of seven different
information sources at the same time, and even less when under fire.

Delegating is one way to avoid cognitive overload. Thus, one possible solution is to create a “digital
assistant” who can support the leader in the information processing steps.

This digital deputy, an autonomous machine, will assist the leader in filtering and processing
information, supporting him throughout the decision-making process.

Nevertheless, the leader will have to fight against the easy way out, take a step back, allow himself time
to reflect, and reason with a critical sense when faced with machines that will think for him. This process
will help him fight against a possible inhibition of human reasoning. Artificial intelligence does not mean
artificial ignorance if it is used as an intellectual stimulant, although it can have this flaw.

[h]For an optimisation of his resources

The chief will be able to entrust machines with the execution of certain time-consuming and tedious tasks,
such as patrols or the surveillance of sectors, and thus conserve his human resources for missions where
they will have a higher added value.

The same applies to missions that require reactivity and precision, especially if there is a need to be
extremely quick to adapt to the situation. For example, it will be useful in the case of saturating threats,
where targeted destruction or multi-faceted and omnipresent threats such as swarms of drones must be
dealt with.

[mh]Safety and Regulations

Unmanned Aerial Systems (UAS), commonly known as drones, have emerged as transformative
technologies with diverse applications across various industries and sectors. As UAS continue to
proliferate and integrate into airspace, safety and regulations play a critical role in ensuring responsible
and safe operation, mitigating risks, and fostering public trust and confidence in unmanned aviation. This
overview provides insight into the importance of safety and regulations in UAS operations, the key
regulatory frameworks governing UAS, and the challenges and considerations in ensuring safe and
compliant unmanned flight.

[h]Importance of Safety in UAS Operations

Safety is paramount in UAS operations, encompassing measures to prevent accidents, minimize risks to
people and property, and ensure compliance with regulatory requirements. Safety considerations in UAS
operations include:

1. Airspace Safety: UAS must operate safely within airspace shared by manned aircraft, adhering to
altitude restrictions, flight corridors, and separation requirements to avoid collisions and conflicts
with other airspace users (a simple geofence check is sketched after this list).
2. Flight Safety: UAS must maintain safe flight operations, including pre-flight checks, in-flight
monitoring, and emergency procedures, to mitigate hazards such as loss of control, mechanical
failures, and environmental factors that could endanger people or property on the ground.
3. Public Safety: UAS operations must prioritize public safety, minimizing risks to bystanders,
pedestrians, and property in the vicinity of flight operations, and ensuring compliance with local
regulations and airspace restrictions in populated areas.
4. Data Security and Privacy: UAS operators must protect sensitive data collected during flight
operations, such as imagery, telemetry, and personal information, from unauthorized access,
interception, or misuse, safeguarding privacy rights and data security.
5. Environmental Protection: UAS operations must minimize environmental impacts, such as noise
pollution, wildlife disturbance, and habitat disruption, by adhering to environmental regulations,
flight restrictions, and best practices for responsible aerial operations.
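
As an illustration of the airspace-restriction checks mentioned in the first item above, the following
sketch tests whether an aircraft position falls inside a circular no-fly zone, using the haversine
great-circle distance. The zone centre and radius are hypothetical values.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS-84 points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_no_fly_zone(lat: float, lon: float,
                       zone_lat: float, zone_lon: float, radius_m: float) -> bool:
    """True if the position lies within the circular no-fly zone."""
    return haversine_m(lat, lon, zone_lat, zone_lon) <= radius_m

# Hypothetical zone: a 5 km circle around an aerodrome reference point.
print(inside_no_fly_zone(48.855, 2.350, 48.858, 2.294, 5_000))  # True (~4.1 km away)
```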

[h]Regulatory Frameworks for UAS

Regulatory frameworks govern UAS operations, setting standards, requirements, and guidelines for safe
and compliant operation in airspace. Key regulatory bodies and frameworks governing UAS include:

1. Federal Aviation Administration (FAA): The FAA is the primary regulatory authority for UAS
operations in the United States, responsible for establishing regulations, airworthiness standards,
and operational requirements for UAS under Title 14 of the Code of Federal Regulations (14
CFR).
o Part 107: Part 107 of the FAA regulations governs small UAS operations (aircraft weighing
less than 55 pounds) for commercial, non-recreational purposes, outlining requirements for pilot
certification, operating limitations, airspace restrictions, and safety measures (a simplified
pre-flight limit check is sketched after this list).
o Waivers and Authorizations: The FAA issues waivers and authorizations for UAS
operations that deviate from Part 107 regulations, such as flights beyond visual line of sight
(BVLOS), over people, at night, or in restricted airspace, subject to safety mitigations and
risk assessments.
2. European Union Aviation Safety Agency (EASA): EASA regulates UAS operations in the
European Union (EU), developing common safety rules and standards for UAS operations under
the EU UAS Regulation (Regulation (EU) 2019/947).
o Open Category: The EU UAS Regulation categorizes UAS operations into open, specific,
and certified categories, based on risk and operational requirements. The open category
encompasses low-risk operations with predefined limitations and operating conditions for
recreational and non-commercial UAS flights.
o Specific Category: The specific category requires operators to obtain operational
authorizations or declarations for higher-risk UAS operations, such as BVLOS, flights over
people, or operations in controlled airspace, following risk assessments and safety
mitigations.
o Certified Category: The certified category applies to UAS with higher mass or complexity,
requiring type certification, airworthiness standards, and operational approvals similar to
manned aircraft, for operations such as commercial air transport or beyond visual line of
sight (BVLOS) missions.
3. Civil Aviation Authority (CAA): National aviation authorities, such as the Civil Aviation
Authority (CAA) in the United Kingdom, regulate UAS operations and airspace management at
the national level, enforcing local regulations, airspace restrictions, and safety requirements for
UAS operators.
4. International Civil Aviation Organization (ICAO): ICAO develops global standards and
recommended practices (SARPs) for civil aviation, including UAS operations, through Annex 2
(Rules of the Air), Annex 8 (Airworthiness), and Annex 19 (Safety Management), guiding member
states in harmonizing regulations and ensuring safe and orderly UAS integration into airspace.
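
As a rough illustration of the Part 107 operating limitations mentioned above, the following sketch
checks a planned flight against a few representative limits. The figures are simplified for
illustration, waivers and subsequent rule changes are ignored, and operators must always consult the
current regulation.

```python
# Representative Part 107 limits (simplified; not a legal reference).
MAX_TAKEOFF_WEIGHT_LB = 55.0   # small UAS must weigh less than 55 lb
MAX_ALTITUDE_FT_AGL = 400.0    # unless within 400 ft of a structure
MAX_GROUNDSPEED_MPH = 100.0

def part_107_preflight_issues(weight_lb: float, altitude_ft_agl: float,
                              groundspeed_mph: float) -> list[str]:
    """Return a list of violated limits; an empty list means none found."""
    issues = []
    if weight_lb >= MAX_TAKEOFF_WEIGHT_LB:
        issues.append("aircraft at or above 55 lb: outside the small-UAS rule")
    if altitude_ft_agl > MAX_ALTITUDE_FT_AGL:
        issues.append("planned altitude above 400 ft AGL")
    if groundspeed_mph > MAX_GROUNDSPEED_MPH:
        issues.append("planned groundspeed above 100 mph")
    return issues

print(part_107_preflight_issues(4.4, 380, 45))    # []
print(part_107_preflight_issues(60.0, 500, 45))   # two issues reported
```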

[h]Challenges and Considerations in UAS Safety and Regulation

UAS safety and regulation face several challenges and considerations, including:

1. Dynamic Technological Landscape: Rapid advancements in UAS technology, including automation,
autonomy, and payload capabilities, pose challenges for regulatory frameworks to adapt and keep pace
with evolving operational requirements, safety standards, and emerging risks.
2. Complexity of Operations: UAS operations vary in complexity, from recreational hobbyist flights
to commercial aerial services, surveying, and beyond visual line of sight (BVLOS) missions,
requiring flexible regulations, risk-based approaches, and tailored safety measures to accommodate
diverse applications and operational environments.
3. Airspace Integration: Integrating UAS into shared airspace with manned aircraft, air traffic
management systems, and existing aviation infrastructure poses challenges for airspace
management, traffic coordination, and collision avoidance, necessitating collaboration among
regulatory authorities, industry stakeholders, and airspace users to ensure safe and efficient
airspace integration.
4. Safety Culture and Awareness: Fostering a safety culture and promoting awareness among UAS
operators, manufacturers, and stakeholders is essential for encouraging compliance with
regulations, adopting best practices, and mitigating risks associated with UAS operations,
emphasizing the importance of education, training, and public outreach initiatives.
5. Privacy and Data Protection: Addressing concerns over privacy invasion, data security, and misuse
of UAS technology requires regulatory measures, industry standards, and technological safeguards
to protect personal privacy rights, secure sensitive data, and ensure responsible use of UAS for
surveillance, monitoring, and data collection activities.
6. International Harmonization: Achieving international harmonization and mutual recognition of
UAS regulations, standards, and certifications is essential for facilitating cross-border UAS
operations, promoting interoperability, and avoiding regulatory fragmentation, enhancing safety,
efficiency, and global connectivity in unmanned aviation.

Safety and regulations are critical components of Unmanned Aerial Systems (UAS) operations, ensuring
responsible, safe, and compliant operation in airspace. Regulatory frameworks established by authorities
such as the FAA, EASA, CAA, and ICAO provide standards, requirements, and guidelines for UAS
operators, manufacturers, and stakeholders to follow, addressing airspace safety, operational
considerations, and risk management. As UAS technology continues to evolve and integrate into society,
addressing challenges such as airspace integration, technological advancements, and privacy concerns
requires collaborative efforts, innovation, and regulatory agility to foster a safe, sustainable, and thriving
ecosystem for unmanned aviation.

chapter 3: Unmanned Ground Systems (UGS)

[mh]Introduction to UGS

Unmanned Ground Systems (UGS) have emerged as innovative technological solutions with diverse
applications in various domains, including military, security, agriculture, transportation, and industrial
automation. Unlike their aerial counterparts, Unmanned Aerial Systems (UAS), UGS operate on land,
performing tasks ranging from reconnaissance and surveillance to logistics support and hazardous
environment exploration. This overview provides insight into the design, components, capabilities,
applications, and challenges of Unmanned Ground Systems (UGS), highlighting their impact on
industries, society, and technology.

[h]Design and Components of UGS

Unmanned Ground Systems (UGS) consist of several key components designed to enable autonomous or
remotely controlled ground operations:

1. Mobile Platform: The mobile platform serves as the chassis or base of the UGS, providing
mobility and locomotion capabilities to traverse various terrains and environments. Mobile
platforms can include wheeled, tracked, or legged configurations, with designs optimized for
stability, maneuverability, and payload capacity.
2. Sensors and Perception Systems: Sensors and perception systems enable UGS to perceive and
interpret their surroundings, facilitating navigation, obstacle avoidance, and situational awareness.
These systems include cameras, LiDAR, ultrasonic sensors, and radar for environmental sensing,
object detection, and terrain mapping.
3. Navigation and Localization Systems: Navigation and localization systems enable UGS to
determine their position, orientation, and trajectory relative to the environment. Global Navigation
Satellite Systems (GNSS), such as GPS or Galileo, inertial measurement units (IMUs), wheel
encoders, and odometry sensors provide accurate localization and motion estimation for
autonomous navigation (a minimal wheel-odometry update is sketched after this list).
4. Communication and Control Systems: Communication and control systems facilitate data
exchange, command and control, and remote operation of UGS. Wireless communication links,
such as Wi-Fi, Bluetooth, or cellular networks, enable real-time telemetry, mission planning, and
operator interaction, while onboard computers and control units process sensor data and execute
control algorithms.
5. Manipulators and Payloads: Manipulators and payloads are specialized equipment mounted on
UGS for performing tasks such as manipulation, object handling, and payload deployment.
Manipulators can include robotic arms, grippers, and end effectors for grasping, lifting, and
manipulating objects, while payloads can include sensors, cameras, actuators, or tools for specific
applications.
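
As an illustration of the odometry-based motion estimation mentioned under navigation and localization
above, the following sketch performs one dead-reckoning pose update for a differential-drive platform.
The geometry and sample values are hypothetical; real systems fuse such estimates with GNSS and IMU
measurements precisely because pure odometry drifts over time.

```python
import math

def odometry_step(x: float, y: float, theta: float,
                  d_left: float, d_right: float, wheel_base: float):
    """One dead-reckoning update for a differential-drive UGS.

    d_left / d_right are wheel travel distances since the last update
    (e.g. from wheel encoders); wheel_base is the lateral distance
    between the wheels. Returns the new pose (x, y, heading in radians).
    """
    d_center = (d_left + d_right) / 2.0          # travel of the chassis centre
    d_theta = (d_right - d_left) / wheel_base    # change in heading
    # Integrate along the mid-arc heading for better accuracy.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Hypothetical step: both wheels advance 0.10 m -> straight-line motion.
print(odometry_step(0.0, 0.0, 0.0, 0.10, 0.10, wheel_base=0.5))  # (0.1, 0.0, 0.0)
```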

[h]Capabilities of UGS

Unmanned Ground Systems (UGS) offer several capabilities that distinguish them from traditional
ground-based vehicles and equipment:
1. Autonomy and Automation: UGS can operate autonomously or with minimal human intervention,
performing tasks such as navigation, obstacle avoidance, and mission execution based on pre-
defined algorithms, sensor feedback, and environmental cues. Autonomous capabilities enable
UGS to operate in hazardous or inaccessible environments, such as disaster zones, minefields, or
industrial facilities.
2. Versatility and Adaptability: UGS platforms are highly versatile and adaptable, capable of
performing a wide range of tasks across diverse domains and applications. From reconnaissance
and surveillance to logistics support, search and rescue, and agricultural operations, UGS platforms
can be customized and configured with different sensors, payloads, and accessories to meet
specific mission requirements.
3. Mobility and Maneuverability: UGS platforms exhibit mobility and maneuverability characteristics
optimized for traversing various terrains, obstacles, and environments. Wheeled or tracked
platforms offer stability and traction on rough terrain, while legged or articulated platforms provide
agility and maneuverability in confined spaces or uneven terrain.
4. Sensing and Perception: UGS platforms are equipped with sensors and perception systems for
environmental sensing, object detection, and situational awareness. Cameras, LiDAR sensors, and
radar enable UGS to perceive obstacles, terrain features, and objects in their surroundings,
facilitating navigation, mapping, and obstacle avoidance.
5. Remote Operation and Telepresence: UGS platforms support remote operation and telepresence
capabilities, allowing operators to control the UGS from a distance using wireless communication
links and control interfaces. Remote operation enables operators to perform tasks in hazardous or
remote environments while maintaining a safe distance.

[h]Applications of UGS

Unmanned Ground Systems (UGS) have diverse applications across various industries and sectors:

1. Military and Defense: In the military and defense sector, UGS are used for reconnaissance,
surveillance, target acquisition, and force protection missions. UGS platforms provide soldiers
with situational awareness, intelligence gathering, and standoff capabilities, enhancing operational
effectiveness and reducing personnel risks in combat scenarios.
2. Security and Law Enforcement: UGS support security and law enforcement agencies in border
surveillance, perimeter security, crowd control, and disaster response operations. UGS platforms
equipped with cameras, sensors, and communication systems enable real-time monitoring, threat
detection, and rapid response to security incidents.
3. Agriculture and Farming: In agriculture, UGS are utilized for precision farming, crop monitoring,
and autonomous field operations. UGS platforms equipped with sensors, cameras, and agricultural
tools enable farmers to monitor crops, apply fertilizers and pesticides, and perform tasks such as
seeding, weeding, and harvesting with precision and efficiency.
4. Industrial Automation and Manufacturing: UGS play a crucial role in industrial automation and
manufacturing processes, performing tasks such as material handling, assembly, inspection, and
inventory management. UGS platforms equipped with robotic arms, grippers, and vision systems
enable automation of repetitive or hazardous tasks, improving productivity, quality, and worker
safety.
5. Search and Rescue: UGS support search and rescue operations in disaster scenarios, natural
disasters, and hazardous environments. UGS platforms equipped with sensors, cameras, and
communication systems enable first responders to conduct reconnaissance, locate survivors, and
deliver aid or supplies in inaccessible or dangerous areas.
[mh]Types of UGS

Unmanned Ground Systems (UGS) encompass a diverse range of platforms designed for various
applications and tasks across different industries and sectors. These UGS can be categorized based on
their mobility, size, functionality, and intended use. The following are some common types of UGS:

1. Wheeled UGS: Wheeled UGS are characterized by their mobility on wheels, offering stability,
speed, and versatility in navigating flat or moderately uneven terrain. These UGS typically feature
multiple wheels, ranging from two-wheel configurations, such as simple robotic platforms, to
multi-wheel configurations, including four-wheeled rovers and six-wheeled vehicles. Wheeled
UGS are commonly used for tasks such as reconnaissance, surveillance, logistics support, and
perimeter security in both military and civilian applications.
2. Tracked UGS: Tracked UGS utilize tracks or caterpillar treads for locomotion, providing enhanced
traction, maneuverability, and terrain traversability compared to wheeled platforms. Tracked UGS
are well-suited for operating in rough, rugged, or off-road environments, including rocky terrain,
sand, mud, and snow. These UGS are often employed in military applications for reconnaissance,
mine detection, and route clearance, as well as in agriculture for soil cultivation, spraying, and
seeding tasks.
3. Legged UGS: Legged UGS feature articulated legs or limbs for locomotion, offering agility,
adaptability, and mobility in complex or cluttered environments. These UGS can traverse
obstacles, stairs, rubble, and rough terrain that are challenging for wheeled or tracked platforms,
making them ideal for search and rescue, disaster response, and exploration missions. Legged UGS
can vary in the number of legs, ranging from four-legged quadrupeds to six-legged hexapods or
even more complex configurations, such as humanoid robots designed for manipulation and human
interaction.
4. Aerial UGS: Aerial UGS, also known as unmanned aerial vehicles (UAVs) or drones, are
platforms designed for aerial reconnaissance, surveillance, and payload delivery tasks. While
primarily airborne, these UGS may feature ground-based capabilities, such as vertical takeoff and
landing (VTOL), hover capability, and ground-based operations for takeoff, landing, and
maintenance. Aerial UGS come in various configurations, including fixed-wing aircraft, rotary-
wing helicopters, multirotor drones, and hybrid designs, each offering specific advantages in terms
of range, endurance, payload capacity, and maneuverability for different mission requirements.
5. Hybrid UGS: Hybrid UGS combine multiple mobility modes, such as wheeled, tracked, or legged
locomotion, to achieve versatility, adaptability, and terrain traversal capabilities across diverse
environments. These UGS may feature interchangeable or modular components that allow for
reconfiguration or adaptation to specific mission requirements, such as switching between wheeled
mode for speed on flat terrain and tracked mode for traction in rough terrain. Hybrid UGS offer a
flexible solution for applications such as reconnaissance, surveillance, inspection, and logistics
support, where versatility and adaptability are paramount.
6. Swarm UGS: Swarm UGS comprise multiple interconnected or coordinated units operating
collaboratively to achieve collective goals, such as exploration, mapping, surveillance, or
distributed sensing. These UGS may exhibit various mobility modes, sizes, and functionalities,
working together to perform tasks that are beyond the capabilities of individual units. Swarm UGS
leverage distributed sensing, communication, and coordination algorithms to achieve synergy,
redundancy, and scalability in accomplishing complex missions in dynamic or uncertain
environments.

[mh]Mobility and Terrain Adaptability


Mobility and terrain adaptability are fundamental characteristics of Unmanned Ground Systems (UGS),
enabling these platforms to navigate diverse environments, traverse obstacles, and perform tasks across
various terrains. From wheeled and tracked vehicles to legged robots and hybrid platforms, UGS exhibit a
wide range of mobility modes and capabilities designed to overcome challenges posed by different
terrains, including rough terrain, urban environments, and hazardous conditions. This overview explores
the importance of mobility and terrain adaptability in UGS, the key factors influencing mobility design,
and the strategies employed to enhance mobility and traversability in challenging environments.

[h]Importance of Mobility in UGS

Mobility is essential for UGS to fulfill their intended missions effectively and efficiently, allowing them
to move from one location to another, access remote or hard-to-reach areas, and accomplish tasks with
agility and precision. The importance of mobility in UGS is evident in various applications across
military, civilian, and industrial domains:

1. Military Operations: In military operations, UGS mobility is critical for reconnaissance,
surveillance, and tactical maneuvers in diverse battlefield environments. UGS platforms must
navigate rugged terrain, cross obstacles, and maintain mobility in hostile or contested areas to
gather intelligence, support maneuver forces, and enhance situational awareness without putting
human operators at risk.
2. Search and Rescue: UGS mobility is vital for search and rescue operations in disaster scenarios,
natural disasters, and hazardous environments. UGS platforms equipped with mobility capabilities,
such as tracked or legged locomotion, can traverse debris, rubble, and uneven terrain to locate
survivors, deliver aid or supplies, and assist first responders in accessing inaccessible or dangerous
areas.
3. Infrastructure Inspection: In infrastructure inspection and maintenance tasks, UGS mobility
enables access to critical infrastructure assets, such as bridges, pipelines, and power lines, for
inspection, monitoring, and maintenance. UGS platforms equipped with climbing, crawling, or
flying capabilities can traverse vertical or confined spaces, perform visual inspections, and detect
structural defects or anomalies without requiring human intervention or scaffolding.
4. Precision Agriculture: In precision agriculture, UGS mobility facilitates field operations such as
soil sampling, crop monitoring, and precision spraying. UGS platforms equipped with wheeled or
tracked locomotion can navigate crop rows, uneven terrain, and varying soil conditions to collect
data, apply inputs, and perform tasks with accuracy and efficiency, optimizing crop yields and
reducing resource inputs.

[h]Factors Influencing Mobility Design

Several factors influence the design and performance of mobility systems in UGS, including:

1. Terrain Characteristics: Terrain characteristics, such as slope, roughness, surface texture, and
obstacles, significantly impact UGS mobility and traversability. UGS platforms must be designed
to navigate different types of terrain, ranging from flat surfaces to steep slopes, mud, sand, rocks,
vegetation, and urban obstacles, while maintaining stability, traction, and maneuverability.
2. Payload Requirements: Payload requirements, including weight, size, and distribution, influence
the design and configuration of UGS mobility systems. UGS platforms carrying heavy payloads or
equipment may require robust chassis, suspension systems, and powertrain components to support
the additional load while maintaining mobility performance and stability.
3. Power Source and Endurance: The choice of power source, such as electric batteries, internal
combustion engines, or hybrid systems, affects UGS mobility, endurance, and range. Electric UGS
platforms offer quiet operation, low emissions, and precise control but may have limited endurance
and require recharging. Internal combustion engines provide longer endurance but produce noise
and emissions, while hybrid systems offer a balance between efficiency and flexibility (a rough
endurance estimate is sketched after this list).
4. Environmental Conditions: Environmental conditions, including weather, temperature, humidity,
and atmospheric pressure, impact UGS mobility and performance. UGS platforms must be
designed to withstand harsh environmental conditions, such as extreme temperatures, high winds,
rain, snow, and dust, while maintaining operational reliability and safety.
5. Sensor and Communication Systems: Sensor and communication systems integrated into UGS
platforms influence mobility design by adding weight, complexity, and power requirements. UGS
platforms equipped with sensors, cameras, and communication antennas must accommodate these
components without compromising mobility, balance, or structural integrity.
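
As a back-of-the-envelope illustration of the power and endurance trade-offs in the third item above,
the following sketch estimates endurance and range for a hypothetical electric platform. The
usable-capacity margin and all figures are assumptions for illustration.

```python
def endurance_hours(battery_wh: float, avg_power_w: float,
                    usable_fraction: float = 0.8) -> float:
    """Rough endurance: usable battery energy divided by average draw.

    usable_fraction reserves capacity for battery health and safety
    margin (an assumed value, not a specification).
    """
    return battery_wh * usable_fraction / avg_power_w

def range_km(endurance_h: float, avg_speed_kmh: float) -> float:
    """Rough range at a constant average speed."""
    return endurance_h * avg_speed_kmh

# Hypothetical platform: 1.2 kWh pack, 300 W average draw, 5 km/h patrol speed.
e = endurance_hours(1200, 300)
print(f"endurance ~{e:.1f} h, range ~{range_km(e, 5):.1f} km")  # ~3.2 h, ~16 km
```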

[h]Strategies for Enhancing Mobility and Terrain Adaptability

UGS designers and developers employ various strategies to enhance mobility and terrain adaptability,
including:

1. Modular Design: Modular design allows UGS platforms to adapt to different terrains,
environments, and mission requirements by incorporating interchangeable or configurable
components, such as wheels, tracks, legs, or attachments. Modular UGS platforms can be
reconfigured or customized with specialized mobility systems, sensors, or payloads to meet
specific application needs without requiring extensive redesign or reengineering.
2. Articulated Suspension: Articulated suspension systems enable UGS platforms to maintain
stability and traction on uneven terrain by adjusting wheel or track contact with the ground.
Articulated UGS platforms can articulate or tilt individual wheels or tracks to conform to surface
irregularities, minimize slippage, and maximize traction, enhancing mobility and traversability in
rough terrain conditions.
3. All-Terrain Traction: All-terrain traction systems, such as aggressive tire treads, cleats, or tracks,
provide UGS platforms with enhanced grip and traction on challenging surfaces, including mud,
sand, snow, and steep slopes. UGS platforms equipped with all-terrain traction systems can
maintain mobility and stability in adverse conditions, allowing them to traverse obstacles and
navigate difficult terrain with confidence.
4. Active Suspension and Stabilization: Active suspension and stabilization systems use sensors,
actuators, and control algorithms to adjust UGS platform dynamics, damping, and stability in real
time. Active suspension systems can compensate for terrain variations, vibrations, and
disturbances, enhancing ride comfort, reducing vehicle sway, and improving mobility performance
in rough or dynamic terrain environments (a minimal control-loop sketch follows this list).
5. Adaptive Locomotion: Adaptive locomotion strategies enable UGS platforms to adapt their
mobility modes and behaviors based on terrain conditions, obstacles, and mission objectives. UGS
platforms may employ hybrid locomotion modes, such as wheeled-legged or tracked-wheeled
configurations, to optimize mobility and traversability across different terrain types, transitioning
between modes as needed to overcome obstacles or terrain challenges.
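
To make the active suspension idea in item 4 concrete, the following minimal Python sketch implements
one step of a proportional-derivative (PD) control law of the kind such systems run at high rate. The
gains, sensor values and single-axis model are illustrative assumptions, not parameters of any specific
platform.

# Minimal sketch of one active-suspension control step (illustrative only);
# sensor readings and actuator interface are hypothetical placeholders.

def pd_suspension_step(ride_height, ride_rate, target_height,
                       kp=800.0, kd=120.0):
    """Return an actuator force (N) that drives ride height to the target."""
    error = target_height - ride_height      # m
    return kp * error - kd * ride_rate       # proportional-derivative law

# Example: chassis 3 cm below target and still moving downward.
force = pd_suspension_step(ride_height=0.27, ride_rate=-0.05,
                           target_height=0.30)
print(f"commanded force: {force:.1f} N")     # positive = push chassis up

In a real platform this loop would run per wheel at hundreds of hertz, with the force command sent to a
hydraulic or electromagnetic actuator.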

[h]Challenges and Future Directions

Despite advancements in mobility and terrain adaptability, UGS face several challenges and opportunities
for improvement:

1. Autonomy and Navigation: Enhancing UGS autonomy and navigation capabilities is critical for
enabling unmanned operation in complex or dynamic environments. UGS platforms must be
equipped with robust perception, planning, and control systems to autonomously navigate and
adapt to changing terrain conditions, obstacles, and hazards while maintaining safety and mission
objectives.
2. Human-Robot Interaction: Improving human-robot interaction (HRI) is essential for facilitating
collaboration and coordination between UGS and human operators in shared workspaces or
operational environments. UGS platforms must incorporate intuitive interfaces, teleoperation tools,
and supervisory control modes to enable seamless communication, coordination, and situational
awareness between humans and robots during mission execution.
3. Energy Efficiency: Enhancing energy efficiency and endurance is a key challenge for UGS
mobility, particularly for electric-powered platforms with limited battery capacity. UGS platforms
must optimize power consumption, energy recovery, and propulsion efficiency to extend
operational range, increase mission endurance, and reduce reliance on external power sources or
recharging.
4. Multi-Domain Integration: Integrating UGS with other unmanned systems, such as aerial drones,
maritime robots, and sensor networks, presents opportunities for multi-domain collaboration and
synergy in complex missions.

[mh]Sensor Integration

Sensor integration is a critical aspect of Unmanned Ground Systems (UGS), enabling these platforms to
perceive their environment, gather data, and make informed decisions autonomously or with minimal
human intervention. Sensors serve as the primary means through which UGS acquire information about
their surroundings, including terrain features, obstacles, environmental conditions, and potential hazards.
This overview delves into the importance of sensor integration in UGS, the types of sensors commonly
used, their functionalities, integration challenges, and future trends driving innovation in sensor
technology for unmanned ground applications.

[h]Importance of Sensor Integration in UGS

Sensor integration plays a pivotal role in enhancing the capabilities and functionality of Unmanned
Ground Systems (UGS) across various applications and industries. The importance of sensor integration
in UGS can be understood through the following aspects:

1. Perception and Situational Awareness: Sensors enable UGS to perceive their environment,
providing real-time information about terrain conditions, obstacles, and surrounding objects. This
situational awareness is crucial for navigation, obstacle avoidance, and decision-making, allowing
UGS to autonomously plan and execute tasks in dynamic or unfamiliar environments.
2. Object Detection and Classification: Sensors facilitate object detection and classification, allowing
UGS to identify and differentiate between various objects, such as vehicles, pedestrians, obstacles,
and terrain features. This capability is essential for tasks such as surveillance, reconnaissance, and
target tracking, enabling UGS to detect potential threats, monitor activities, and respond
accordingly.
3. Environmental Monitoring: Sensors enable UGS to monitor environmental conditions, including
temperature, humidity, air quality, and radiation levels. Environmental sensors provide valuable
data for applications such as agriculture, environmental monitoring, and disaster response,
allowing UGS to assess conditions, detect anomalies, and provide early warning alerts in
hazardous or critical situations.
4. Data Fusion and Analysis: Sensor integration facilitates data fusion and analysis, combining
information from multiple sensors to create a comprehensive understanding of the environment.
Data fusion algorithms process sensor data to extract meaningful insights, detect patterns, and
make informed decisions, enhancing the intelligence and autonomy of UGS in performing
complex tasks and missions (a simple fusion example is sketched after this list).
5. Communication and Networking: Sensors support communication and networking capabilities in
UGS, enabling data exchange, command and control, and collaboration with other unmanned
systems or human operators. Sensor data can be transmitted wirelessly to remote control stations,
command centers, or other UGS platforms, facilitating real-time monitoring, coordination, and
decision-making in distributed or networked operations.
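
As a concrete illustration of the data fusion idea in item 4, the short Python sketch below combines two
noisy range measurements of the same object by inverse-variance weighting, a common building block of
fusion pipelines. The sensor values and variances are invented for the example.

# Fusing two independent range estimates (e.g., LiDAR and radar) by
# inverse-variance weighting; all numbers are illustrative.

def fuse(z1, var1, z2, var2):
    """Combine two noisy measurements of the same quantity."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)            # estimate and its variance

lidar_range, lidar_var = 25.3, 0.01          # m, m^2 (precise sensor)
radar_range, radar_var = 24.8, 0.25          # m, m^2 (coarser sensor)
estimate, variance = fuse(lidar_range, lidar_var, radar_range, radar_var)
print(f"fused range: {estimate:.2f} m (variance {variance:.4f} m^2)")

Note how the fused estimate stays close to the more precise sensor while its variance drops below that of
either input.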

[h]Types of Sensors in UGS

UGS employ a variety of sensors to perceive their environment and gather data for navigation, obstacle
avoidance, terrain mapping, and mission-specific tasks. The types of sensors commonly used in UGS
include:

1. LiDAR (Light Detection and Ranging): LiDAR sensors emit laser pulses and measure the time
taken for the pulses to reflect off objects, generating precise 3D maps of the surrounding terrain
and obstacles. LiDAR sensors provide high-resolution spatial data for navigation, mapping, and
obstacle detection in both indoor and outdoor environments (a worked time-of-flight example
follows this list).
2. Cameras: Cameras are ubiquitous sensors used in UGS for visual perception, surveillance, and
situational awareness. Cameras capture still images or video footage of the environment, allowing
UGS to identify objects, track movement, and recognize patterns. Cameras can be equipped with
visible light, infrared, or thermal imaging capabilities to capture different types of imagery under
various lighting and environmental conditions.
3. Ultrasonic Sensors: Ultrasonic sensors emit high-frequency sound waves and measure the time
taken for the waves to bounce off objects, providing proximity sensing and obstacle detection
capabilities for UGS. Ultrasonic sensors are effective for short-range detection and collision
avoidance in confined spaces or indoor environments, complementing other sensors such as
LiDAR and cameras.
4. Inertial Measurement Units (IMUs): IMUs consist of accelerometers and gyroscopes that measure
acceleration, rotation, and orientation of the UGS platform. IMUs provide motion sensing and
localization capabilities, allowing UGS to estimate its position, velocity, and attitude relative to the
Earth's surface. IMUs are essential for navigation, motion control, and stabilization of UGS
platforms during movement.
5. GPS (Global Positioning System): GPS receivers enable UGS to determine their precise location
and navigation coordinates based on signals received from satellite constellations. GPS provides
accurate positioning information for UGS navigation, waypoint following, and geospatial
referencing in outdoor environments with clear line-of-sight to satellite signals.
6. Radar: Radar sensors emit radio waves and measure the time taken for the waves to reflect off
objects, providing long-range detection and tracking capabilities for UGS. Radar sensors are
effective for detecting moving targets, such as vehicles or aircraft, in all weather conditions,
including fog, rain, and darkness, making them valuable for surveillance, perimeter security, and
target tracking applications.
7. Environmental Sensors: Environmental sensors measure physical parameters such as temperature,
humidity, pressure, and air quality, providing information about the surrounding atmospheric
conditions. Environmental sensors enable UGS to monitor environmental variables, assess risks,
and make informed decisions in applications such as agriculture, environmental monitoring, and
disaster response.
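
To make the LiDAR time-of-flight principle from item 1 concrete, the following worked example in
Python computes range from the round-trip pulse time; the 333 ns delay is an illustrative value.

# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.

C = 299_792_458.0          # speed of light, m/s

def tof_distance_m(round_trip_s):
    return C * round_trip_s / 2.0

# A pulse returning after ~333 ns corresponds to a target ~50 m away.
print(f"{tof_distance_m(333e-9):.1f} m")

The same geometry applies to radar and (with the speed of sound) to ultrasonic sensors; only the wave
speed changes.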

[h]Challenges in Sensor Integration


While sensor integration offers significant benefits for UGS, it also presents several challenges and
considerations that must be addressed:

1. Sensor Fusion and Data Integration: Integrating data from multiple sensors poses challenges in
terms of data fusion, synchronization, and alignment. Sensor data may vary in format, resolution,
and accuracy, requiring algorithms for data fusion, calibration, and registration to create a coherent
and accurate representation of the environment.
2. Power Consumption and Efficiency: Sensors consume power, and UGS platforms often operate
with limited onboard power sources, such as batteries or fuel cells. Optimizing sensor power
consumption and efficiency is essential for extending UGS endurance, reducing energy
consumption, and maximizing mission duration without compromising sensor performance or
functionality.
3. Sensor Redundancy and Reliability: Redundancy is critical for ensuring sensor reliability and fault
tolerance in UGS operations. Redundant sensors can provide backup or alternative sensing
capabilities in case of sensor failures, errors, or environmental conditions that may affect sensor
performance. However, redundant sensors increase system complexity, weight, and cost,
necessitating careful trade-offs and design considerations (a minimal voting example is sketched
after this list).
4. Environmental Adaptability: Sensors must be robust and resilient to operate in harsh or
challenging environmental conditions, including temperature extremes, humidity, dust, vibration,
and electromagnetic interference. Environmental factors can affect sensor performance, accuracy,
and reliability, requiring ruggedized or specialized sensors designed for specific operating
environments.
5. Data Processing and Computational Resources: Sensor data processing and analysis require
computational resources, memory, and processing capabilities onboard UGS platforms. UGS must
be equipped with sufficient computing power, storage, and bandwidth to process sensor data in
real-time, execute algorithms for perception, navigation, and decision-making, and communicate
results to control stations or remote operators.
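
As a minimal illustration of the redundancy idea in item 3, the sketch below applies median voting across
three redundant readings so that a single faulty sensor cannot corrupt the estimate; the readings are
invented.

# Simple redundancy management: the median of three readings tolerates
# one arbitrary sensor failure. Values are illustrative.

from statistics import median

readings = [21.4, 21.6, 87.0]     # third sensor has failed high
print(f"voted value: {median(readings):.1f}")   # -> 21.6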

[mh]Civilian and Defense Applications

Civilian and defense applications represent two major domains where Unmanned Ground Systems (UGS)
play significant roles, offering innovative solutions for various tasks and missions. While civilian
applications focus on areas such as agriculture, infrastructure inspection, and disaster response, defense
applications encompass reconnaissance, surveillance, and tactical operations on the battlefield. This
overview explores the diverse applications of UGS in both civilian and defense sectors, highlighting their
contributions, benefits, and challenges in each domain.

[h]Civilian Applications of UGS

Civilian applications of Unmanned Ground Systems (UGS) span a wide range of industries and sectors,
leveraging autonomous or remotely controlled platforms to perform tasks efficiently, safely, and cost-
effectively. Some key civilian applications of UGS include:

1. Agriculture: In agriculture, UGS are utilized for precision farming, crop monitoring, and
agricultural operations. UGS platforms equipped with sensors, cameras, and agricultural tools can
navigate fields, monitor crop health, apply fertilizers and pesticides, and perform tasks such as
seeding, weeding, and harvesting with precision and efficiency. UGS enable farmers to optimize
resource utilization, increase crop yields, and reduce labor costs while minimizing environmental
impact.
2. Infrastructure Inspection: UGS support infrastructure inspection and maintenance tasks in sectors
such as construction, energy, transportation, and utilities. UGS platforms equipped with cameras,
LiDAR sensors, and other inspection tools can access critical infrastructure assets, such as bridges,
pipelines, power lines, and dams, to perform visual inspections, detect defects, and assess
structural integrity. UGS enable proactive maintenance, early detection of potential issues, and
timely repairs, enhancing infrastructure safety, reliability, and lifespan.
3. Environmental Monitoring: UGS platforms contribute to environmental monitoring and
conservation efforts by collecting data on air quality, water quality, soil conditions, and wildlife
habitats. UGS equipped with environmental sensors, cameras, and telemetry systems can survey
remote or inaccessible areas, such as forests, wetlands, and marine ecosystems, to monitor
ecosystem health, detect pollution, and assess environmental changes over time. UGS provide
valuable data for scientific research, conservation planning, and natural resource management
initiatives.
4. Search and Rescue: UGS play a vital role in search and rescue operations during natural disasters,
emergencies, and humanitarian crises. UGS platforms equipped with sensors, cameras, and
communication systems can assist first responders in locating survivors, assessing damage, and
delivering aid or supplies in hazardous or inaccessible areas. UGS enable rapid deployment,
remote reconnaissance, and situational awareness, augmenting human efforts and increasing the
effectiveness of search and rescue missions.
5. Surveillance and Security: UGS support surveillance and security applications in public safety, law
enforcement, and private security sectors. UGS platforms equipped with cameras, sensors, and
communication systems can patrol perimeters, monitor facilities, and detect intrusions or
suspicious activities in sensitive or high-risk areas. UGS provide real-time situational awareness,
threat detection, and response capabilities, enhancing security measures and deterring unauthorized
access or criminal activities.

[h]Defense Applications of UGS

Defense applications represent a significant domain for Unmanned Ground Systems (UGS), serving
various roles in military operations, security missions, and tactical scenarios. UGS platforms offer
capabilities for reconnaissance, surveillance, target acquisition, and force protection, enhancing the
effectiveness and efficiency of military forces. Some key defense applications of UGS include:

1. Reconnaissance and Surveillance: UGS platforms are extensively used for reconnaissance and
surveillance missions in military operations. UGS equipped with cameras, sensors, and
communication systems can conduct covert or overt surveillance, gather intelligence, and monitor
enemy activities in contested or hostile environments. UGS provide real-time situational
awareness, threat detection, and target tracking capabilities, enabling commanders to make
informed decisions and plan tactical operations based on actionable intelligence.
2. Force Protection: UGS contribute to force protection measures by providing perimeter security,
threat detection, and early warning capabilities to military units deployed in operational theaters.
UGS platforms equipped with sensors, cameras, and communication systems can patrol bases,
outposts, and convoys, detect hostile forces or improvised explosive devices (IEDs), and alert
personnel to potential threats or intrusions. UGS enhance force survivability, situational
awareness, and response readiness, reducing the risk of casualties and enhancing operational
effectiveness.
3. Counter-IED Operations: UGS play a crucial role in counter-IED (improvised explosive device)
operations, assisting military forces in detecting, neutralizing, and mitigating the threat of IEDs on
the battlefield. UGS platforms equipped with sensors, ground-penetrating radar (GPR), and
electronic countermeasures can search for IEDs, identify suspicious objects or anomalies, and
conduct route clearance operations to secure key supply routes and infrastructure. UGS enable safe
and effective detection of IEDs, minimizing the risk to personnel and vehicles from hidden
explosives.
4. Logistics Support: UGS platforms provide logistics support to military forces by autonomously
transporting supplies, equipment, and ammunition in combat zones or austere environments. UGS
equipped with cargo carriers, robotic arms, and navigation systems can navigate rough terrain,
transport payloads, and resupply forward operating bases (FOBs) or frontline units with critical
supplies. UGS enhance logistics efficiency, reduce manpower requirements, and minimize the
logistical footprint in theater, enabling agile and responsive support to deployed forces.
5. Urban Operations: UGS are employed in urban operations, such as reconnaissance, clearance, and
cordon-and-search missions in urban environments. UGS platforms equipped with sensors,
cameras, and communication systems can navigate streets, alleys, and buildings, gather
intelligence, and support infantry units in clearing buildings or securing objectives. UGS provide
urban warfare capabilities, situational awareness, and standoff distance, reducing the risk to
personnel and optimizing urban combat operations.

[h]Benefits and Challenges

UGS offer several benefits and advantages in both civilian and defense applications, including:

1. Increased Safety: UGS reduce the risk to human operators by performing tasks in hazardous or
dangerous environments, such as disaster zones, conflict areas, or contaminated sites, where
human presence may be impractical or risky.
2. Enhanced Efficiency: UGS improve operational efficiency by automating repetitive or labor-
intensive tasks, optimizing resource utilization, and reducing time, cost, and manpower
requirements for mission execution.
3. 24/7 Availability: UGS provide continuous surveillance, monitoring, and response capabilities,
operating day and night, in all weather conditions, without fatigue or downtime, ensuring
persistent coverage and rapid response to emergent situations.
4. Scalability and Flexibility: UGS platforms can be deployed individually or in swarms to scale
operations, cover large areas, or perform distributed tasks simultaneously, adapting to changing
mission requirements and operational needs.
5. Reduced Environmental Impact: UGS minimize environmental impact and ecological footprint
compared to manned vehicles or equipment, producing lower emissions, noise pollution, and
disturbance to natural habitats during operations.

chapter 4: Unmanned Maritime Systems (UMS)

[mh]Unmanned Maritime Systems


During maritime search and rescue operations, the safety of the rescuers is a major issue and must be
ensured in any circumstance. Therefore, these teams are often forced to adapt, or even to suspend their
operations, due to external factors and conditions such as lack of visibility or adverse atmospheric and/or
maritime conditions. On the other hand, it should be pointed out that rescue response time is a major
factor for success in these operations, due to the reduced survival time of victims that fall overboard.

Robotic assets can therefore complement the role of search and rescue teams, as they can operate in
dangerous scenarios and under adverse environmental conditions without putting human lives in danger.
There is nowadays a broad range of unmanned maritime systems (UMS) that can operate under different
environmental conditions, transport a multitude of payload sensing systems and perform distinctive
missions. Concerning maritime robotic tools for search and rescue operations, two works are worth
mentioning: the emergency integrated lifesaving lanyard (EMILY) system and the autonomous
Galileo-supported rescue vessel for persons overboard (AGAPAS) project. EMILY is a remotely operated
autonomous vessel that aims to assist lifeguards on crowded beaches, providing them with a safe and fast
response means. AGAPAS is a project oriented specifically to person-overboard situations, in which an
automatic system perceives that someone has fallen from the vessel and deploys an unmanned surface
vehicle (USV) capable of fetching the victim.

While these systems are operated independently, within the ICARUS project, particularly in the maritime
scenario, multiple heterogeneous unmanned platforms (aerial and surface) will cooperate in order to
detect and assist victims. This chapter addresses the adaptations of general-purpose UMS and the
development of novel assets, performed within the scope of the ICARUS project, to obtain an integrated
system able to respond to search and rescue requirements in complex and challenging environments.

[h]Overall concepts of operation and platforms

The assistance of UMS in search and rescue operations may include providing means for flotation and
thermal protection, preventing fatigue, drowning or hypothermia and thereby increasing the survival rate.
Furthermore, when conditions do not permit manned search and rescue operations, a fast and effective
intervention in the disaster scenario by the robotic assets makes it possible for the rescuers to evaluate and
remotely assist the victims, resuming action as soon as safety conditions are ensured.

Within the scope of the ICARUS project, a complementary approach to the use of UMS was followed. It
consisted of two classes of UMS: large, fast systems able to arrive at the disaster area in a short time, and
small, slower systems able to get close to survivors in the water, providing them flotation and thermal
protection without putting them in danger.

For the larger and faster systems, two different platforms were considered: U‐Ranger and Roaz II.
U‐Ranger is a 7 m long UMS, weighing more than 1000 kg and able to reach a top speed exceeding
40 kts. Roaz II is 4 m long, weighs up to 400 kg and is able to reach a maximum velocity of 10 kts. These
were existing platforms operated by partners of the ICARUS consortium, respectively Calzoni and INESC
TEC. Within the scope of this project, both platforms were adapted for search and rescue operations by
integrating adequate sensor suites, by endowing them with autonomous behaviours suited for these
operations and by incorporating the ability to carry and deploy smaller platforms on site.

The smaller platforms are the unmanned capsules (UCAPs), which were completely developed during the
project and consist of 1.5 m long UMS weighing up to 40 kg. Each of these vessels can be remotely
operated or execute autonomous missions and carries on its deck an uninflated life raft. Upon reaching
survivors, the life raft is automatically inflated, allowing the survivors to climb on board.

[h]U‐Ranger USV

The U‐Ranger (Figure) is a remotely controlled unmanned surface vehicle (USV) mainly tailored for
harbour and ship protection, able to perform intelligence, surveillance and reconnaissance (ISR)
operations and to patrol pre‐defined areas.

Figure: Calzoni U‐Ranger USV

The U‐Ranger can be equipped with different kinds of sensors like cameras and radar for surface area
control, sonar sensors for underwater control and other sensors for environment control. Table lists the
main technical characteristics of the system.

Boat type                  RHIB type, full aluminium (including tubes)
Length                     7 m
Beam                       2.5 m
Weight                     1400–1800 kg (depending on payloads)
Motor power                260 CV
Speed                      >40 kts
Autonomy                   >8 h
Safe operating conditions  Sea: SS3 Douglas; Wind: 6 Beaufort
Control range (VHF)        Up to 15 nm
Wide band range            >5 nm

Table: Calzoni U‐Ranger technical specification.

Within the scope of the ICARUS project, the U‐Ranger USV was equipped with a sensor and autonomous
behaviour payload from the Centre for Maritime Research and Experimentation (CMRE). The
autonomous behaviour payload is based on the Mission Oriented Operating Suite (MOOS), an
open‐source architecture. MOOS is a C++ cross‐platform middleware for robotics research. Its
advantages include its open‐source nature, flexibility, capacity for system growth, functionality across all
platforms, a large user community that contributes MOOS architecture modules for all to share,
scalability through distributed computation, protocols that already exist and need not be developed, and
considerable use in autonomous systems internationally, at CMRE and elsewhere. MOOS interacts with
hardware and the operator GUI through MOOS interface drivers and MOOS processes (autonomous
behaviour sets), each of which is an independent process linked to a central MOOS database by standard
internal process connections. Processes post data to the central MOOS database for access by any other
process. Processes subscribe to the data they require, drawing it from the MOOS database on notification
of updates. Communications between the sensor/behaviour payload and the control station on‐shore rely
on a worldwide interoperability for microwave access (WiMAX) link, while the very high frequency
(VHF) link bypasses the sensor/behaviour payload, allowing direct full manual control of the USV.
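
The publish/subscribe pattern described above can be illustrated in a few lines of Python. This is a
deliberately simplified sketch of the concept, not the actual MOOS C++ API: a central database stores
named variables, processes post updates to it, and subscribed processes are notified of changes.

# Simplified publish/subscribe database in the spirit of MOOS (not its API).

class TinyDB:
    def __init__(self):
        self.values, self.subscribers = {}, {}

    def subscribe(self, name, callback):
        """Register a callback to be notified when 'name' is updated."""
        self.subscribers.setdefault(name, []).append(callback)

    def post(self, name, value):
        """Store a value and notify every subscriber of that variable."""
        self.values[name] = value
        for cb in self.subscribers.get(name, []):
            cb(name, value)

db = TinyDB()
db.subscribe("NAV_SPEED", lambda n, v: print(f"{n} updated to {v} kts"))
db.post("NAV_SPEED", 12.5)   # a sensor driver posts; a behaviour is notified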

The sensor suite includes the following sensors:

 RADAR: Obstacle detection
 Laser scanner: Obstacle detection
 Weather station: In situ weather data
 Daylight camera: Survivor detection
 Thermal camera: Survivor detection

The thermal and daylight cameras allow night and day operations. Their fields of view and resolutions are
such that it is possible to detect a person in the water at 200 m. While the daylight camera is quite
sensitive to lighting conditions, and in particular to reflections of sunlight on the water surface, the
thermal camera can provide useful data almost independently of the environmental conditions. An
example of an image provided by this camera can be observed in Figure. Furthermore, the cameras are
mounted on a gyro‐stabilized platform that also allows their pan and tilt command.
Figure: Example image from the thermal camera on the U‐Ranger

The radar installed on the U‐Ranger operates in the X‐band (9.3–9.4 GHz) and can be configured with
different range settings from 50 m to 24 nautical miles. It has a rotation rate of 24 RPM and is interfaced
to the computational system using public‐domain C++/Java plugins/libraries (openCPN BR24 plugin and
openbr24 Java). This radar is able to reliably detect obstacles at ranges greater than 50–100 m. The major
drawback is the difficulty in detecting fast objects or making detections during sharp turns of the
U‐Ranger.

The laser scanner has four vertically stacked beams with a spacing of 0.8°, which are steered within a
110° angle and with 0.125° horizontal resolution. The maximum scanning frequency is 50 Hz and the
obstacle detection range is greater than 100 m. This scanner is mounted near the U‐Ranger bow on a
gyro‐stabilized platform (Figure).
Figure: Laser scanner mounting platform and electronics

The behaviour set implemented in the U‐Ranger MOOS system contains the following elements:

 Constant speed
o A classic part of MOOS‐IvP, always active.
 Station keeping
o The U‐Ranger is not able to stop in autonomous mode.
o The vehicle has a single thruster and rudder.
o The last waypoint is always considered a station‐keeping point.
 Waypoint behaviour
o The selection of waypoints is the responsibility of the C2I operator.
o Any waypoint except the last one can be selected as the UCAP deployment point.
 Operational region
o Pre‐defined operational area polygon.
 Obstacle avoidance

The available behaviours are combined in real time by a function optimizer in order to determine the
direction the USV should take.
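
The Python sketch below illustrates, in a highly simplified form, how such behaviour arbitration can
work: each behaviour scores every candidate heading, and the optimizer picks the heading with the best
weighted total. The scoring functions, weights and bearings are invented for the example and are far
simpler than the interval-programming solver actually used in MOOS‐IvP.

# Toy behaviour arbitration: waypoint-seeking versus obstacle avoidance.

def waypoint_score(heading, bearing_to_wp=45.0):
    """Prefer headings close to the bearing of the next waypoint."""
    return max(0.0, 100.0 - abs(heading - bearing_to_wp))

def avoid_score(heading, obstacle_bearing=50.0):
    """Penalize headings pointing at a detected obstacle."""
    return min(100.0, abs(heading - obstacle_bearing) * 4.0)

candidates = range(0, 360, 5)
best = max(candidates,
           key=lambda h: 1.0 * waypoint_score(h) + 2.0 * avoid_score(h))
print(f"chosen heading: {best} deg")   # steers near, but not at, the obstacle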

[h]Roaz II USV

Roaz II is a USV that can operate in fully autonomous mode or be remotely operated from a base station.
It can be configured to carry different sets of sensors and to perform several kinds of missions, including
environmental monitoring, harbour protection or bathymetric data gathering. Its main characteristics are
described in Table.
Boat type                  Autonomous marine surface vehicle / Roaz II
Length                     4.2 m
Beam                       2.0 m
Weight                     200–400 kg (depending on payload and batteries)
Propulsion                 Two independent electric thrusters
Speed                      10 kts
Autonomy                   >10 h (depending on operating speed)
Payload capacity           Weight: 200 kg (depending on installed batteries); Power: 1200 W
Control range (wide band)  Up to 1 nm

Table: Roaz II technical specification.

Roaz II is operated from a mission control station composed of a ruggedized computer and a set of
auxiliary devices including antennas. It is capable of executing autonomous missions defined by a list of
waypoints, using a differential GPS system and an inertial measurement unit. Telemetry as well as
payload data are transmitted in real time to the mission control station.

For navigation purposes, the vehicle uses a precision L1/L2 GPS receiver (Septentrio PolaRx2) and an
inertial measurement unit (Microstrain 3DM‐GX1) providing attitude information. Propulsion is achieved
through the use of two 2 kW electric thrusters, with the vehicle reaching a maximum speed of 10 kts. A
set of LiFePO4 batteries provides up to 8 h of autonomy. Communications with the control station are
provided by a WiFi link, allowing remote control, autonomous mission supervision and also transmission
of telemetry and payload data to shore.

Although this USV does not fulfil all the requirements for search and rescue (SAR) operations as defined
in the ICARUS project (mainly those related to maximum speed and range) and its adaptation would
require a great effort, its characteristics make it a valuable asset in many experiments as well as in the
final demonstration scenario.

For that purpose, a thermal camera, a visible camera and a radar similar to the one integrated in the
U‐Ranger were also integrated in Roaz II (Figure).
Figure: Roaz II USV

[h]Unmanned capsule

The UCAP (Figure) is a single‐hull vessel with a lower rear deck to accommodate the uninflated life raft
as well as the corresponding compressed gas bottle. The hull was fabricated in fibreglass, using a
custom‐made mould.
Figure: Unmanned capsule

The UCAP dimensions are 1.45 m (length) × 0.52 m (width) × 0.42 m (maximum height). It weighs 22 kg
and has a payload capacity exceeding 15 kg.

A jet drive unit provides the propulsion of the UCAP. This jet drive unit is attached to a brushless motor
and is capable of delivering a maximum force of 80 N with a power consumption of 800 W. This
maximum thrust ensures a top speed greater than 5 kts.

On‐board energy is provided by two packs of ZIPPY Flightmax 5000 mAh 6S1P LiPo batteries. This
solution provides about 220 Wh of total on‐board energy. Taking into account the efficiency of the
propulsion system, continuous operation at 1.5 m/s (about 3 kts) for 20 min (resulting in a range of about
1.8 km) should require about 100 Wh of energy, leaving 120 Wh for electronics and communications,
which consume about 10 Wh. The battery pack is enclosed in a watertight box located in the bow
compartment. This compartment also houses another watertight box with the on‐board computer,
navigation sensors and communications equipment. The bow compartment is also watertight, ensuring
double protection for electronics and batteries.
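
The quoted figures can be checked with simple arithmetic, as in the sketch below; the 22.2 V nominal
voltage assumed for a 6S LiPo pack is the only number not taken directly from the text.

# Back-of-the-envelope check of the UCAP energy budget.

packs, capacity_ah, nominal_v = 2, 5.0, 22.2       # two 5 Ah packs, 22.2 V assumed
print(f"total energy: {packs * capacity_ah * nominal_v:.0f} Wh")   # ~222 Wh

speed_mps, duration_s = 1.5, 20 * 60               # 1.5 m/s for 20 min
print(f"range: {speed_mps * duration_s / 1000:.1f} km")            # 1.8 km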

Navigation sensors include a PNI Trax AHRS and a Ublox Neo 6P GPS. The Trax AHRS is a low‐power
and low‐cost attitude and heading reference system with a static heading accuracy of 0.2° and an overall
accuracy better than 2° in the presence of magnetic distortions. The NEO 6P is a low‐cost GPS receiver
that operates at 10 Hz and outputs raw data, being supported by RTKLIB, an open‐source library that
implements differential and real‐time kinematic corrections using small and inexpensive receivers.

Communications with a control station are assured by a long‐range Wi‐Fi link that establishes a wideband
link over distances above 1 km (depending on the height of the shore station antenna over the water
surface and on the wave conditions).

A video camera is also installed on the UCAP. A video stream is fed to the control station for possible
assessment of victim conditions when the UCAP is close to them, as shown in Figure:

Figure: Image taken from the unmanned capsule on‐board camera (source: ICARUS).

Besides the described items, the on‐board electronics also include a load balancing and protection system
for the batteries, the motor controllers for the water jet motor and the direction servo, as well as the
triggering system for the inflation of the raft. The interconnections between all on‐board systems are
depicted in Figure:

Figure: System architecture—electronics


The on‐board software is composed of several modules that communicate with each other using a
message passing mechanism, as shown in Figure:

Figure: System architecture—software

These modules follow a hierarchical architecture similar to the one used in the other INESC TEC robotic
systems. At the lowest level, the modules that interact directly with the sensors and actuation devices
constitute a hardware abstraction layer. On top of these, two major modules are responsible for the
navigation (real time estimation of the UCAP position, velocity and attitude) and for the control
(execution of manoeuvres and other high level behaviours).

The navigation module processes data from the GPS and inertial measurement unit (IMU) systems. The
GPS system provides information about location and velocity. The IMU incorporates magnetometers,
accelerometers and gyroscopes, providing information about the yaw, pitch and roll states, as well as
accelerations and rotational movements decomposed along the three axes. A data fusion algorithm is used
to estimate the position of the capsule whenever the GPS receiver loses track, possibly due to excessive
roll or pitch caused by stronger waves. At the same time, the inertial data are used to obtain updated
information on the external disturbances, allowing a better characterization of the navigation environment.
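
A minimal sketch of the dead-reckoning fallback described above is given below: when the GPS drops
out, the position estimate is propagated from the last fix using heading and speed. The flat-earth 2-D
model, 1 Hz update rate and constant speed are simplifying assumptions made for illustration.

# Dead reckoning during a GPS outage (illustrative flat-earth model).

import math

def propagate(x, y, heading_deg, speed_mps, dt_s):
    """Advance a 2-D position estimate by one time step."""
    h = math.radians(heading_deg)                 # heading measured from north
    return (x + speed_mps * math.sin(h) * dt_s,   # east
            y + speed_mps * math.cos(h) * dt_s)   # north

x, y = 0.0, 0.0                  # last valid GPS fix, local metres
for _ in range(10):              # 10 s without GPS at 1 Hz
    x, y = propagate(x, y, heading_deg=90.0, speed_mps=1.5, dt_s=1.0)
print(f"estimated position after outage: ({x:.1f}, {y:.1f}) m")   # (15.0, 0.0)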

The UCAP carries a lightweight life raft such as the one presented in Figure. This life raft weighs 8 kg
(raft plus full inflation bottle) and its overall volume before inflation is 13 dm3.
Figure: Inflated life raft

The operation console consists of a mobile computer running software for real‐time monitoring and
control of unmanned capsules. A graphical interface (Figure) provides the operator with the most relevant
data concerning the state of a UCAP and allows him to control its operation. A joystick can optionally be
connected to the computer to simplify the interaction with the UCAP.
Figure: UCAP operation console

This console allows the operator to switch between different UCAP operating modes:

1. Idle mode: the UCAP actuation is shut down, causing it to drift according to external disturbances
(winds and currents).
2. Anchor mode: allows station keeping, where the UCAP loiters, compensating drifts caused by
winds, currents or other influences.
3. Waypoint navigation mode: autonomous operation of the UCAP following a sequence of
waypoints defined by the operator or imported from a previously defined file.
4. Remote control mode: the operator remotely controls the UCAP.
5. External mode: grants the control of the UCAP to an external entity. This mode is similar to the
idle mode except that another entity (for example the ICARUS C2I) can take control of the
vehicle.

Specific information about the UCAP that is displayed in the graphical interface includes

 Velocity
 Position
 Attitude (heading, pitch and roll)
 Actuation (thrust and direction commands)
 Battery level
 Real‐time power consumption and estimated endurance

When the UCAP is in autonomous operation, further information concerning the status of that operation
(distance to next waypoint, estimated time to next waypoint completion or distance to anchor point in
anchor mode) is also provided to the operator.
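
As an illustration of how such status figures can be derived, the sketch below computes the distance to
the next waypoint from two latitude/longitude pairs using the haversine formula, and an ETA from the
current speed. The coordinates and speed are invented; the actual console implementation may differ.

# Distance to next waypoint and ETA (illustrative values).

import math

def haversine_m(lat1, lon1, lat2, lon2, r=6_371_000.0):
    """Great-circle distance in metres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

dist = haversine_m(41.1770, -8.5960, 41.1785, -8.5940)   # made-up positions
speed = 1.5                                              # current speed, m/s
print(f"distance: {dist:.0f} m, ETA: {dist / speed:.0f} s")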

A heartbeat mechanism is implemented between the UCAPs and the operator console to support
emergency behaviour of the UCAP in case of communication link failure.
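
A minimal sketch of such a heartbeat watchdog is shown below; the timeout value and the idle-mode
reaction are illustrative assumptions rather than actual UCAP parameters.

# Heartbeat watchdog: trigger an emergency behaviour when the console
# stops sending periodic heartbeats. Timeout is illustrative.

import time

class HeartbeatMonitor:
    def __init__(self, timeout_s=3.0):
        self.timeout_s = timeout_s
        self.last_beat = time.monotonic()

    def beat(self):
        """Called whenever a heartbeat message arrives from the console."""
        self.last_beat = time.monotonic()

    def link_ok(self):
        return time.monotonic() - self.last_beat < self.timeout_s

monitor = HeartbeatMonitor(timeout_s=0.5)
monitor.beat()                   # last heartbeat received
time.sleep(1.0)                  # simulated communication outage
if not monitor.link_ok():
    print("link lost: switching to emergency behaviour (actuation idle)")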

[h]Unmanned capsule deployment system

The deployment system consists of a mechanical structure and a release system that can easily be
modified or redesigned to suit several carrier USVs (Figure). The structure consists of a ramp that allows
gravity to be the main force driving the UCAP forward during launch. It is made of anodized aluminium
bars to keep the overall weight low and to resist corrosion from salt water. When placed on the launching
ramp, the UCAP sits on rubber rollers that allow movement in the forward direction while constraining it
in the transversal direction.

Figure: UCAP deployment system installed on Roaz II USV


The release system is composed of an electric latching device and the electronic system required for its
command. This system is housed in a watertight box and is composed of a microcontroller, a power
amplifier and a battery. The system is connected to the carrier USV communications infrastructure so that
it can receive a remote command to release the UCAP. This command can be issued from a command
line or a graphical interface running on any Linux‐based device.

The deployment system can be easily integrated on a carrier USV (Figure), requiring only the following
operations:

Figure: Deployment system installed on the U‐RANGER USV

 Mechanical integration of the ramp
 Connection of the electronics box to the USV communications network
 Configuration of the microcontroller to use the carrier USV communications network

Mounting the UCAP on the deployment system is a simple operation that can be performed while the
carrier USV is moored next to a pier (Figure). The UCAP can be mounted directly on the rollers of the
ramp with the help of a rubber boat.
Figure: Two UCAPs mounted on the deployment system installed on the U‐RANGER

Upon mounting the UCAP on the ramp, it must be secured to the release mechanism. For that purpose, a
rope with a metal pin attached to it is fastened to the stern of the UCAP; when the UCAP is in place, that
pin is attached to the latching device.

Afterwards, to release the UCAP, a simple command needs to be sent to the microcontroller in the
electronics box of the deployment system (Figure).

Figure: UCAP being launched from the U‐RANGER

This chapter addresses the work performed within the scope of the ICARUS project on the
development of complementary unmanned maritime systems technologies for search and rescue. This
work aimed at delivering a set of tools that can act not only as a part of the ICARUS toolset but can
also be used independently. On the one hand, such developments consisted in endowing medium and
large scale unmanned surface vehicles with augmented perception and autonomy capabilities so that
they could perform search and rescue operations in complex environments with the presence of other
vessels and victims in the water, reporting situational awareness information back to the control
stations. On the other hand, the concept of the unmanned capsule, a small‐size platform able to carry a
life raft and inflate it close to victims, was prototyped, and its integration as a payload of larger
unmanned platforms was also addressed. These developments and their extensive validation in several
field trials and demonstrations carried out along the project are therefore a relevant contribution to the
real‐world deployment of robotic platforms in search and rescue operations, complementing the
operation of traditional search and rescue teams.

[mh]Surface and Subsurface Vehicles

The realm of transportation has witnessed an evolutionary journey marked by the advent of surface and
subsurface vehicles. These modes of transportation have not only transformed the way humans navigate
the world but also revolutionized industries, commerce, and connectivity. From the humble beginnings of
wheeled carts to the sophisticated submersibles exploring the ocean depths, the development of surface
and subsurface vehicles represents a testament to human ingenuity and the relentless pursuit of
innovation. In this exploration, we delve into the fascinating evolution of these vehicles, tracing their
origins, advancements, and the profound impact they have had on society.

Origins of Surface Vehicles: The story of surface vehicles dates back thousands of years to the invention
of the wheel. Ancient civilizations such as the Mesopotamians and Egyptians utilized wheeled carts for
transportation of goods and people. Over time, these primitive carts evolved into more sophisticated
modes of transportation, including horse-drawn carriages and early automobiles powered by steam
engines. The 19th and 20th centuries witnessed rapid advancements in surface vehicle technology, with
the invention of the internal combustion engine and the mass production of automobiles, revolutionizing
personal mobility and shaping modern cities and infrastructure.

Advancements in Surface Vehicles: The evolution of surface vehicles has been characterized by
continuous innovation and technological breakthroughs. The introduction of electric vehicles in the late
19th century offered a cleaner and more sustainable alternative to traditional gasoline-powered cars. In
recent decades, advancements in materials science, aerodynamics, and propulsion systems have led to the
development of high-speed trains, electric cars, and autonomous vehicles. These innovations not only
enhance efficiency and safety but also pave the way for a more sustainable future by reducing carbon
emissions and dependence on fossil fuels.

The Emergence of Subsurface Vehicles: While surface vehicles have dominated terrestrial transportation,
the exploration of subsurface environments has long captured the imagination of engineers and explorers.
The development of submarines in the 19th century marked the beginning of mankind's journey into the
depths of the ocean. These early submarines, powered by hand-cranked propellers and later by diesel
engines, played a crucial role in naval warfare and underwater exploration. In the 20th century,
advancements in technology, such as nuclear power and advanced sonar systems, enabled the
construction of larger and more capable submarines capable of extended underwater operations.

Exploring the Depths: Subsurface vehicles have expanded beyond traditional submarines to include a
diverse range of underwater craft, including remotely operated vehicles (ROVs) and autonomous
underwater vehicles (AUVs). These vehicles are equipped with advanced sensors, cameras, and
manipulators, allowing scientists to explore and study the ocean depths with unprecedented precision and
detail. From mapping the seafloor to studying marine life and ecosystems, subsurface vehicles have
revolutionized our understanding of the world's oceans and their importance to life on Earth.

Challenges and Future Directions: Despite the remarkable advancements in surface and subsurface
vehicle technology, numerous challenges remain. Surface vehicles face issues such as traffic congestion,
air pollution, and dependence on finite energy resources. Subsurface vehicles encounter obstacles such as
extreme pressure, corrosive environments, and limited endurance. Addressing these challenges requires
continued investment in research and development, as well as collaboration between government,
industry, and academia. Future directions in transportation include the development of hyperloop systems
for high-speed travel, electric aircraft for urban mobility, and autonomous underwater vehicles for deep-
sea exploration.

The evolution of surface and subsurface vehicles has been a testament to human innovation and
ingenuity. From the invention of the wheel to the exploration of the ocean depths, these vehicles have
transformed the way we travel, conduct commerce, and explore the world around us. As we stand on the
cusp of a new era of transportation, characterized by sustainability, efficiency, and connectivity, the
journey of surface and subsurface vehicles serves as a reminder of the endless possibilities that await us
on the road ahead.

[mh]Autonomy and Navigation

The integration of autonomy and navigation systems in unmanned vehicles represents a pivotal
advancement in transportation, exploration, and industry. From self-driving cars navigating city streets to
autonomous drones surveying remote landscapes, these technologies are reshaping our world in profound
ways. In this exploration, we delve into the intricate mechanisms behind autonomy and navigation in
unmanned vehicles, examining their evolution, capabilities, and the transformative impact they have on
various sectors.

Evolution of Autonomy and Navigation: The evolution of autonomy and navigation in unmanned vehicles
can be traced back to early experiments in robotics and artificial intelligence. In the 20th century,
researchers began developing rudimentary autonomous systems for military applications, such as
unmanned aerial vehicles (UAVs) for reconnaissance and surveillance. These early systems relied on pre-
programmed instructions and basic sensor inputs to navigate their surroundings. However, advancements
in computing power, sensor technology, and machine learning algorithms have propelled autonomy and
navigation to new heights in recent decades.

Capabilities of Autonomous Systems: Autonomous systems leverage a combination of sensors, actuators,
and algorithms to perceive their environment, make decisions, and execute tasks without human
intervention. These systems can navigate complex and dynamic environments, adapt to changing
conditions, and interact with other agents in their surroundings. Key capabilities of autonomous systems
include perception, planning, and control. Perception involves sensing and interpreting the environment
using sensors such as cameras, lidar, and radar. Planning entails generating a sequence of actions to
achieve a desired goal while accounting for obstacles and constraints. Control involves executing these
actions with precision and efficiency to accomplish tasks effectively.
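
The three capabilities described above are often organized as a perceive-plan-act loop. The Python sketch
below shows that structure schematically; each stage is a stub standing in for a real perception, planning
or control module.

# Schematic perceive-plan-act loop (each stage is a placeholder stub).

def perceive():
    return {"obstacle_ahead": False}        # stand-in for sensor fusion

def plan(world):
    return "replan" if world["obstacle_ahead"] else "continue"

def act(action):
    print(f"executing: {action}")           # stand-in for actuation

for _ in range(3):                          # one iteration per control tick
    act(plan(perceive()))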

Navigation in Unmanned Vehicles: Navigation plays a critical role in enabling unmanned vehicles to
move from one location to another while avoiding obstacles and hazards. Traditional navigation methods
rely on GPS, inertial navigation systems, and odometry to determine position and orientation relative to a
reference frame. However, these methods may be limited in environments with poor GPS reception or
high levels of interference. To address these challenges, unmanned vehicles employ a variety of sensor
fusion techniques, including simultaneous localization and mapping (SLAM), visual odometry, and
terrain-relative navigation. These techniques integrate data from multiple sensors to create a robust and
accurate representation of the vehicle's surroundings, enabling precise navigation in diverse conditions.

Applications Across Industries: Autonomy and navigation have revolutionized a wide range of industries,
from transportation and logistics to agriculture and construction. In the transportation sector, autonomous
vehicles offer the promise of safer, more efficient, and environmentally friendly mobility solutions. Self-
driving cars can reduce traffic congestion, improve road safety, and provide greater accessibility to
individuals with mobility challenges. In logistics, autonomous drones and robots are transforming
warehouse operations, last-mile delivery, and inventory management. In agriculture, autonomous tractors
and drones enable precision farming techniques, such as soil mapping, crop monitoring, and targeted
pesticide application. In construction, unmanned vehicles perform tasks such as site surveying, 3D
mapping, and building inspection with speed and accuracy.

Challenges and Future Directions: Despite the tremendous potential of autonomy and navigation in
unmanned vehicles, several challenges remain to be addressed. These include regulatory hurdles, safety
concerns, ethical considerations, and technological limitations. Regulatory frameworks must evolve to
ensure the safe and responsible deployment of autonomous systems on public roads and in shared
airspace. Safety standards and best practices are essential to mitigate the risks associated with
autonomous operations, particularly in high-stakes environments such as healthcare and defense. Ethical
considerations, such as privacy, cybersecurity, and algorithmic bias, must also be carefully addressed to
build trust and acceptance among stakeholders.

Autonomy and navigation are driving a paradigm shift in unmanned vehicles, unlocking new possibilities
for exploration, innovation, and societal impact. From autonomous cars navigating busy city streets to
drones surveying remote landscapes, these technologies are reshaping the way we live, work, and interact
with our environment. As we continue to push the boundaries of autonomy and navigation, it is essential
to address challenges collaboratively and responsibly, ensuring that these technologies serve the greater
good and enhance the human experience. With continued investment in research, development, and
collaboration, the future of autonomy and navigation in unmanned vehicles holds limitless potential for
positive transformation.

[mh]Sonar and Imaging Systems

Sonar and imaging systems play a crucial role in the navigation and exploration capabilities of unmanned
vehicles, enabling them to perceive and map their surroundings in diverse environments. From
underwater drones mapping the ocean floor to aerial drones surveying inaccessible terrain, these
technologies provide valuable insights and facilitate a wide range of applications across industries. In this
exploration, we delve into the intricate mechanisms behind sonar and imaging systems in unmanned
vehicles, examining their evolution, capabilities, and the transformative impact they have on various
sectors.

Evolution of Sonar and Imaging Systems: The evolution of sonar and imaging systems in unmanned
vehicles is rooted in centuries-old principles of acoustic and electromagnetic sensing. Sonar, which stands
for Sound Navigation and Ranging, was first developed in the early 20th century for underwater detection
and navigation. Early sonar systems relied on the transmission and reception of acoustic waves to
measure distances and detect objects underwater. Over time, advancements in transducer technology,
signal processing algorithms, and underwater communication protocols have greatly enhanced the
resolution, range, and reliability of sonar systems. Similarly, imaging systems, including cameras, lidar,
and radar, have undergone significant advancements, enabling unmanned vehicles to capture high-
resolution images and 3D reconstructions of their surroundings in real-time.

Capabilities of Sonar and Imaging Systems: Sonar and imaging systems enable unmanned vehicles to
perceive their environment through a combination of acoustic and electromagnetic sensing modalities.
Sonar systems use transducers to emit pulses of sound waves into the water, which bounce off objects and
return to the vehicle as echoes. By analyzing the time delay and intensity of these echoes, sonar systems
can generate detailed maps of the underwater terrain and detect underwater objects such as wrecks, reefs,
and marine life. Imaging systems, on the other hand, capture electromagnetic radiation, such as visible
light or radio waves, to create visual representations of the environment. These systems can provide high-
resolution images, depth maps, and 3D reconstructions of the terrain, enabling precise navigation and
situational awareness for unmanned vehicles.
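
The echo-ranging principle described above reduces to simple arithmetic, as the short sketch below
shows; the 1500 m/s sound speed is a nominal value that in practice varies with temperature, salinity and
depth.

# Sonar echo ranging: range = (sound speed x round-trip time) / 2.

SOUND_SPEED_SEAWATER = 1500.0    # m/s, nominal (an assumption)

def echo_range_m(round_trip_s):
    return SOUND_SPEED_SEAWATER * round_trip_s / 2.0

# An echo returning after 0.4 s puts the seafloor about 300 m below.
print(f"{echo_range_m(0.4):.0f} m")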

Applications Across Industries: Sonar and imaging systems have a wide range of applications across
industries, from maritime exploration and defense to environmental monitoring and infrastructure
inspection. In the maritime sector, unmanned underwater vehicles equipped with sonar systems are used
for bathymetric mapping, underwater archaeology, and offshore resource exploration. These vehicles can
survey large areas of the ocean floor with high resolution and accuracy, providing valuable data for
scientific research and commercial activities. In the defense sector, sonar and imaging systems are
employed for naval reconnaissance, mine countermeasures, and anti-submarine warfare, enabling military
forces to detect and neutralize underwater threats with precision and efficiency. In the environmental
sector, unmanned aerial vehicles equipped with imaging systems are used for habitat monitoring, wildlife
conservation, and disaster response, providing valuable insights into ecosystem health and environmental
changes.

Challenges and Future Directions: Despite the significant advancements in sonar and imaging technology,
several challenges remain to be addressed. These include limitations in range, resolution, and data
processing capabilities, as well as environmental factors such as water turbidity and electromagnetic
interference. Additionally, the integration of sonar and imaging systems with other sensors and navigation
algorithms presents technical and logistical challenges that require interdisciplinary collaboration and
innovative solutions. Future directions in sonar and imaging technology include the development of
compact and lightweight sensors, advanced signal processing algorithms, and autonomous data analysis
techniques. These advancements will enable unmanned vehicles to operate more effectively in
challenging environments and expand the scope of applications across industries.

Sonar and imaging systems are indispensable tools for unmanned vehicles, providing essential
capabilities for navigation, exploration, and situational awareness in diverse environments. From
underwater drones mapping the ocean depths to aerial drones surveying remote landscapes, these
technologies enable us to explore and understand our world in ways that were previously unimaginable.
As we continue to push the boundaries of sonar and imaging technology, it is essential to address
challenges collaboratively and innovatively, ensuring that these technologies serve the greater good and
enhance the human experience. With continued investment in research, development, and collaboration,
the future of sonar and imaging systems in unmanned vehicles holds limitless potential for positive
transformation and discovery.

[mh]Marine Exploration and Surveillance


Marine exploration and surveillance represent critical aspects of oceanic research, security, and resource
management. From unlocking the mysteries of the deep sea to safeguarding maritime borders and
ecosystems, these endeavors play a vital role in understanding and protecting our planet's largest and least
explored frontier. In this exploration, we delve into the intricacies of marine exploration and surveillance,
examining their methodologies, technologies, and the profound impact they have on various sectors.

The Importance of Marine Exploration: The oceans cover over 70% of the Earth's surface and harbor a
vast array of ecosystems, resources, and biodiversity. Marine exploration seeks to unlock the secrets of
these underwater worlds, from the sunlit shallows to the abyssal depths. Scientists, explorers, and
researchers utilize a variety of tools and techniques to study marine environments, including remote
sensing, underwater robotics, and deep-sea submersibles. Marine exploration yields valuable insights into
oceanography, geology, biology, and climate science, informing conservation efforts, resource
management, and policy decisions.

Technological Advances in Marine Exploration: Advancements in technology have revolutionized the
field of marine exploration, enabling scientists to access and study previously inaccessible regions of the
ocean. Remotely operated vehicles (ROVs) and autonomous underwater vehicles (AUVs) equipped with
cameras, sensors, and sampling devices can descend to great depths and collect data with unprecedented
precision and detail. These vehicles are capable of mapping underwater terrain, surveying marine life, and
collecting samples of water, sediment, and organisms for analysis. Additionally, satellite-based remote
sensing techniques provide a broader perspective on ocean dynamics, including sea surface temperature,
salinity, and ocean currents.

Applications of Marine Exploration: Marine exploration has diverse applications across scientific
research, resource management, and industry. In the realm of oceanography, exploration efforts
contribute to our understanding of ocean circulation, climate change, and the carbon cycle. Marine
biologists study deep-sea ecosystems to uncover new species, understand ecological interactions, and
discover potential biomedical compounds. Resource managers utilize exploration data to identify and
protect marine habitats, assess fish stocks, and plan for sustainable development. In the offshore energy
sector, exploration informs the location and extraction of oil, gas, and mineral resources, while
minimizing environmental impact.

The Role of Surveillance in Maritime Security: Maritime surveillance encompasses a range of activities
aimed at monitoring, detecting, and deterring threats to maritime security and sovereignty. These threats
include piracy, smuggling, illegal fishing, and maritime terrorism, which pose risks to global commerce,
safety, and stability. Surveillance efforts involve a combination of sensors, platforms, and intelligence
gathering techniques, including radar, sonar, satellite imagery, and unmanned aerial vehicles (UAVs).
These assets provide maritime authorities with real-time situational awareness and enable rapid response
to emerging threats.

Technological Innovations in Maritime Surveillance: Technological innovations are driving advancements in maritime surveillance capabilities, enhancing detection, tracking, and identification of
maritime threats. Integrated sensor networks combine data from multiple sources, including radar, AIS
(Automatic Identification System), and electro-optical sensors, to create a comprehensive picture of
maritime activity in a given area. Machine learning algorithms analyze vast amounts of data to identify
anomalous behavior and potential threats, enabling more efficient allocation of resources and proactive
response strategies. Unmanned aerial vehicles (UAVs) equipped with high-resolution cameras and
infrared sensors provide aerial surveillance over large maritime areas, complementing traditional surface
and subsurface assets.
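
To make the anomaly-detection step concrete, the sketch below shows one simple way such a check might work on AIS track data: it flags position reports whose implied speed between consecutive fixes exceeds a plausible maximum. This is a minimal illustration under stated assumptions, not any authority's actual algorithm; the threshold, data layout, and function names are invented for the example.

    import math

    def haversine_nm(lat1, lon1, lat2, lon2):
        # Great-circle distance between two fixes, in nautical miles.
        r_nm = 3440.065  # mean Earth radius in nautical miles
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r_nm * math.asin(math.sqrt(a))

    def flag_speed_anomalies(track, max_knots=30.0):
        # track: time-ordered list of (timestamp_seconds, lat, lon) AIS fixes.
        # Returns indices of fixes whose implied speed from the previous fix
        # exceeds max_knots, a crude cue for spoofed or mis-keyed reports.
        anomalies = []
        for i in range(1, len(track)):
            t0, la0, lo0 = track[i - 1]
            t1, la1, lo1 = track[i]
            hours = (t1 - t0) / 3600.0
            if hours <= 0:
                continue
            speed = haversine_nm(la0, lo0, la1, lo1) / hours
            if speed > max_knots:
                anomalies.append(i)
        return anomalies

Operational systems fuse many more cues (heading changes, loitering, rendezvous patterns) and typically learn thresholds from historical traffic rather than fixing them by hand.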

Challenges and Future Directions: Despite the progress made in marine exploration and surveillance,
numerous challenges persist, including technological limitations, environmental concerns, and
governance issues. Technological challenges include the development of robust, reliable, and cost-
effective sensor systems capable of operating in harsh marine environments. Environmental concerns
center around the impact of human activities on marine ecosystems, including pollution, habitat
destruction, and overfishing. Governance issues involve international cooperation, legal frameworks, and
resource allocation for marine research, conservation, and security.

Marine exploration and surveillance are essential endeavors that contribute to our understanding of the
oceans and help safeguard maritime interests and ecosystems. Through the use of advanced technologies
and interdisciplinary collaboration, researchers and authorities can explore the depths of the ocean,
uncovering new discoveries and addressing emerging challenges. As we continue to push the boundaries
of marine exploration and surveillance, it is crucial to prioritize sustainability, cooperation, and
responsible stewardship of our planet's oceans. By investing in research, innovation, and international
partnerships, we can unlock the full potential of marine exploration and surveillance for the benefit of
present and future generations.

[mh]Environmental Impacts and Regulations

The intersection of human activity and the environment has profound consequences for the health and
sustainability of ecosystems worldwide. As societies pursue economic development and technological
progress, they must grapple with the environmental impacts of their actions and enact regulations to
mitigate harm and preserve natural resources. In this exploration, we delve into the complex relationship
between human activities, environmental impacts, and regulatory frameworks, examining the challenges,
solutions, and ongoing efforts to achieve a harmonious balance between progress and preservation.

Understanding Environmental Impacts: Human activities, ranging from industrial production and
transportation to agriculture and urbanization, exert a range of pressures on the environment, resulting in
various environmental impacts. Pollution, habitat destruction, deforestation, species extinction, and
climate change are among the most pressing challenges facing ecosystems worldwide. Pollution, in the
form of air, water, and soil contamination, poses significant risks to human health, biodiversity, and
ecosystem function. Habitat destruction, driven by land conversion, urban sprawl, and infrastructure
development, disrupts ecosystems, displaces wildlife, and threatens vulnerable species. Deforestation,
primarily driven by agriculture, logging, and urban expansion, contributes to biodiversity loss, soil
erosion, and carbon emissions. Species extinction, resulting from habitat loss, pollution, climate change,
and overexploitation, undermines ecosystem resilience and disrupts ecological balance. Climate change,
fueled by greenhouse gas emissions from burning fossil fuels, deforestation, and industrial processes,
poses existential threats to global ecosystems, economies, and societies.

Regulatory Frameworks for Environmental Protection: To address these environmental challenges, governments, international organizations, and civil society groups have developed a range of regulatory
frameworks aimed at protecting natural resources, mitigating pollution, and promoting sustainable
development. These frameworks encompass a variety of laws, regulations, policies, and agreements at the
local, national, and international levels. Environmental laws establish standards for air and water quality,
waste management, land use planning, and biodiversity conservation. Regulatory agencies oversee
compliance with environmental regulations, enforce penalties for non-compliance, and promote pollution
prevention and control measures. International agreements, such as the Paris Agreement on climate
change, the Convention on Biological Diversity, and the United Nations Framework Convention on
Climate Change, provide frameworks for global cooperation on environmental issues, setting targets, and
guidelines for emissions reduction, biodiversity conservation, and sustainable development.

Challenges in Environmental Regulation: Despite the progress made in environmental regulation, several
challenges persist, hindering effective implementation and enforcement of environmental laws and
policies. These challenges include inadequate funding and resources for regulatory agencies, insufficient
capacity and expertise in monitoring and enforcement, regulatory loopholes and inconsistencies, political
and economic pressures from industry stakeholders, and lack of public awareness and engagement.
Additionally, the transboundary nature of many environmental issues, such as air and water pollution,
deforestation, and climate change, requires coordinated action and cooperation among multiple
jurisdictions and stakeholders, posing governance challenges and barriers to effective regulation.

Innovations in Environmental Regulation: In response to these challenges, policymakers, regulators, and advocates are exploring innovative approaches to environmental regulation that leverage technology,
data, and stakeholder engagement to achieve better outcomes for people and the planet. These approaches
include the use of remote sensing and satellite imagery for monitoring environmental changes, the
development of digital platforms and citizen science initiatives to collect and share environmental data,
the adoption of market-based mechanisms such as carbon pricing and emissions trading to incentivize
pollution reduction and climate mitigation, and the promotion of green finance and sustainable investment
practices to support environmentally sound projects and initiatives. Additionally, there is growing
recognition of the importance of indigenous knowledge, traditional ecological knowledge, and
community-based approaches to environmental management, which prioritize local participation, cultural
sensitivity, and social equity.
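
To illustrate the incentive logic behind the market-based mechanisms mentioned above in the simplest terms, the sketch below compares a firm's per-tonne abatement costs with a permit price: the firm undertakes every reduction measure cheaper than the price and buys allowances for the rest. The numbers are invented for illustration and do not reflect any real scheme.

    def abatement_decision(abatement_costs, permit_price):
        # abatement_costs: per-tonne cost of each available reduction measure.
        # Under an emissions price, a cost-minimizing firm implements every
        # measure cheaper than the permit price and buys allowances otherwise.
        abate = [c for c in abatement_costs if c < permit_price]
        buy = [c for c in abatement_costs if c >= permit_price]
        return len(abate), len(buy)

    measures = [12.0, 25.0, 48.0, 90.0]  # EUR per tonne CO2, hypothetical
    print(abatement_decision(measures, permit_price=60.0))  # -> (3, 1)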

Case Studies in Environmental Regulation: Several case studies illustrate the complexities and dynamics
of environmental regulation in practice, highlighting both successes and challenges in addressing
environmental impacts and achieving sustainability goals. For example, the implementation of emissions
trading schemes in Europe and elsewhere has led to significant reductions in greenhouse gas emissions
from regulated industries while providing economic incentives for innovation and investment in clean
technologies. Similarly, the establishment of marine protected areas and fisheries management plans has
helped to conserve biodiversity, restore degraded ecosystems, and promote sustainable fishing practices,
benefiting both ecosystems and coastal communities. However, challenges remain in areas such as air
pollution control, water quality management, and waste management, where regulatory enforcement,
public awareness, and political will are critical to achieving desired outcomes.

Future Directions in Environmental Regulation: Looking ahead, the future of environmental regulation
will be shaped by a combination of technological innovation, policy reform, public engagement, and
international cooperation. Key priorities include strengthening regulatory frameworks for emerging
environmental issues such as plastic pollution, electronic waste, and chemical contamination, integrating
environmental considerations into economic decision-making processes, enhancing transparency and
accountability in environmental governance, promoting green and circular economy models that
minimize resource use and waste generation, and fostering partnerships between governments,
businesses, academia, and civil society to address complex environmental challenges collaboratively.
Additionally, there is a growing recognition of the interconnectedness of environmental issues with social
justice, equity, and human rights, highlighting the need for inclusive and participatory approaches to
environmental regulation that prioritize the needs and perspectives of marginalized communities and
future generations.

Environmental regulation is essential for safeguarding the health and integrity of ecosystems, protecting
human health and well-being, and promoting sustainable development for present and future generations.
By addressing the root causes of environmental degradation, mitigating pollution, conserving natural
resources, and promoting responsible stewardship of the planet, regulatory frameworks can help to
achieve a balance between human activities and the environment. However, addressing complex
environmental challenges requires sustained political will, cross-sectoral collaboration, and concerted
efforts at the local, national, and global levels. With innovative approaches, inclusive governance
structures, and collective action, we can build a more resilient, equitable, and sustainable future for people
and the planet.

chapter 5: Unmanned Space Systems (USS)

[mh]Introduction to USS

Unmanned space systems (USS) represent a groundbreaking frontier in space exploration, enabling
humanity to probe the cosmos, gather scientific data, and expand our understanding of the universe
without the need for human presence. From robotic rovers exploring the surface of Mars to sophisticated
telescopes observing distant galaxies, USS have revolutionized our approach to space exploration and
scientific discovery. In this exploration, we delve into the multifaceted realm of unmanned space systems,
examining their diverse applications, technological innovations, and the profound implications they have
for our exploration of the final frontier.

Exploring Planetary Surfaces: One of the most prominent applications of USS is the exploration of
planetary surfaces, where robotic spacecraft and rovers traverse alien landscapes, conduct scientific
experiments, and search for signs of life. Mars, in particular, has been a focal point of unmanned
exploration efforts, with multiple missions deploying rovers such as Spirit, Opportunity, and Curiosity to
study its geology, climate, and potential for habitability. These rovers are equipped with a suite of
scientific instruments, including cameras, spectrometers, and drills, allowing them to analyze rocks, soil
samples, and atmospheric conditions in situ. Similar missions have been conducted on other celestial
bodies, including the Moon, Venus, and asteroids, shedding light on the diverse geology and composition
of our solar system.

Observing the Cosmos: Unmanned space systems play a crucial role in observing and studying celestial
objects beyond our solar system, providing astronomers with unprecedented views of distant stars,
galaxies, and cosmic phenomena. Space telescopes such as the Hubble Space Telescope, the James Webb
Space Telescope, and the Chandra X-ray Observatory have revolutionized our understanding of the
universe, capturing breathtaking images and data that have reshaped our understanding of cosmic
evolution, dark matter, and the nature of black holes. These telescopes operate in various regions of the
electromagnetic spectrum, from optical and infrared to X-ray and gamma-ray, allowing scientists to
explore the universe across a wide range of wavelengths and uncover hidden mysteries.

Enabling Interplanetary Missions: Unmanned space systems serve as vital enablers for interplanetary
missions, providing crucial support functions such as communications, navigation, and propulsion.
Communication satellites in Earth orbit relay data between spacecraft and ground stations, allowing for
real-time command and control of missions across vast distances. Deep space networks, such as NASA's
Deep Space Network and ESA's Estrack network, provide global coverage for tracking and
communicating with spacecraft throughout the solar system. Advanced propulsion systems, such as ion
thrusters and solar sails, enable spacecraft to travel efficiently and reach distant destinations with
precision and speed. These capabilities are essential for conducting complex missions to planets, moons,
comets, and asteroids, paving the way for future human exploration of the solar system and beyond.

Monitoring Earth's Environment: Unmanned space systems also play a critical role in monitoring and
studying Earth's environment, providing valuable data on climate change, natural disasters, and
environmental trends. Earth observation satellites capture high-resolution imagery and data on land use,
vegetation, oceans, and atmosphere, facilitating research in areas such as climate modeling, disaster
response, agriculture, and urban planning. These satellites track changes in sea level, ice extent,
deforestation, and air quality, enabling policymakers, scientists, and communities to make informed
decisions and take timely action to address environmental challenges.

Technological Innovations and Future Directions: The field of unmanned space systems is characterized
by continuous technological innovation and exploration of new frontiers. Future directions in USS
include the development of autonomous spacecraft capable of self-navigation and decision-making, the
deployment of swarms of small satellites for distributed sensing and communication networks, and the
exploration of extraterrestrial resources for potential utilization in space exploration and settlement.
Advances in artificial intelligence, robotics, and additive manufacturing are driving new capabilities and
mission concepts, opening up exciting possibilities for the future of space exploration and scientific
discovery.

Unmanned space systems represent a pinnacle of human ingenuity and exploration, pushing the
boundaries of what is possible and expanding our horizons beyond the confines of Earth. From planetary
exploration and cosmology to Earth observation and interplanetary travel, USS have transformed our
understanding of the cosmos and our place within it. As we continue to push the frontiers of space
exploration, it is essential to prioritize collaboration, innovation, and responsible stewardship of the space
environment to ensure that future generations can continue to explore, discover, and thrive in the vastness
of space.

[mh]Propulsion and Guidance Systems

Propulsion and guidance systems are the backbone of unmanned space systems (USS), providing the
means to navigate the vast expanse of space and reach distant destinations with precision and efficiency.
From the intricate maneuvers of spacecraft exploring the outer reaches of our solar system to the
controlled descent of robotic landers onto alien surfaces, these systems enable humanity to push the
boundaries of exploration and scientific discovery. In this exploration, we delve into the complexities of
propulsion and guidance systems in USS, examining their technological intricacies, operational
challenges, and the transformative impact they have on our understanding of the cosmos.

Propulsion Systems: Propulsion systems are responsible for generating the thrust necessary to propel
spacecraft through the vacuum of space and overcome the gravitational forces of celestial bodies. The
choice of propulsion system depends on factors such as mission requirements, payload mass, distance to
the target, and available resources. Chemical propulsion systems, such as liquid-fueled rockets and solid
rocket motors, have been the workhorses of space exploration since the dawn of the Space Age, providing
high thrust levels and reliable performance for launching spacecraft into orbit and beyond. Electric
propulsion systems, such as ion thrusters and Hall effect thrusters, offer higher efficiency and lower fuel
consumption than traditional chemical rockets, making them ideal for long-duration missions and
interplanetary travel. These systems generate thrust by accelerating ions or plasma through an electric
field, producing gentle, continuous propulsion over extended periods.
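
The efficiency gap between chemical and electric propulsion can be made concrete with the Tsiolkovsky rocket equation, delta-v = Isp * g0 * ln(m0 / mf). The sketch below compares the propellant fraction each option needs for the same maneuver budget; the specific-impulse values are typical textbook figures, not data for any particular engine.

    import math

    G0 = 9.80665  # standard gravity, m/s^2

    def propellant_fraction(delta_v, isp):
        # Tsiolkovsky: delta_v = isp * G0 * ln(m0 / mf).
        # Returns the fraction of initial mass that must be propellant.
        mass_ratio = math.exp(delta_v / (isp * G0))  # m0 / mf
        return 1.0 - 1.0 / mass_ratio

    dv = 5000.0  # m/s, an illustrative interplanetary maneuver budget
    for label, isp in [("chemical (Isp ~450 s)", 450.0),
                       ("ion thruster (Isp ~3000 s)", 3000.0)]:
        print(f"{label}: {propellant_fraction(dv, isp):.1%} of initial mass is propellant")

For this budget the chemical stage must devote roughly two thirds of its mass to propellant, while the ion thruster needs under a sixth, which is why electric propulsion suits long-duration missions despite its low thrust.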

Guidance Systems: Guidance systems play a crucial role in steering spacecraft along their intended
trajectories, maintaining orientation, and achieving precise maneuvers in the vacuum of space. These
systems rely on a combination of sensors, actuators, and computational algorithms to measure spacecraft
position and velocity, calculate optimal trajectories, and execute course corrections as needed. Inertial
guidance systems use gyroscopes and accelerometers to measure changes in spacecraft orientation and
acceleration relative to a reference frame, allowing for autonomous navigation and attitude control.
Celestial navigation systems use star trackers and Sun sensors to determine spacecraft orientation relative
to celestial reference points, providing accurate pointing and alignment for scientific instruments and
imaging payloads. Radar and lidar systems enable spacecraft to rendezvous with other spacecraft or
celestial bodies, providing accurate range measurements and velocity data for docking, landing, or orbital
maneuvers.
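
As a toy illustration of the inertial dead-reckoning idea described above, the sketch below integrates body-frame accelerometer readings and a yaw-rate gyro into a planar position estimate. Real inertial navigation works in three axes and must correct for sensor bias and drift using star trackers or radio tracking; the function names and the simple Euler integration here are assumptions made for clarity.

    import math

    def dead_reckon(samples, dt, x=0.0, y=0.0, vx=0.0, vy=0.0, heading=0.0):
        # samples: list of (accel_forward, yaw_rate) body-frame IMU readings.
        # Integrates heading from the gyro, rotates acceleration into the
        # world frame, then integrates velocity and position (planar only).
        for accel, yaw_rate in samples:
            heading += yaw_rate * dt
            ax = accel * math.cos(heading)
            ay = accel * math.sin(heading)
            vx += ax * dt
            vy += ay * dt
            x += vx * dt
            y += vy * dt
        return x, y, heading

    # Thrust gently for 100 s while slowly turning. Unmodeled errors grow
    # quadratically in position, which is why inertial estimates are
    # periodically corrected by external references.
    imu = [(0.01, 0.001)] * 1000
    print(dead_reckon(imu, dt=0.1))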

Operational Challenges: Propulsion and guidance systems face a myriad of operational challenges in the
harsh environment of space, including exposure to extreme temperatures, vacuum conditions, radiation,
micrometeoroids, and orbital debris. These challenges require robust, reliable, and redundant design
features to ensure the safety and success of space missions. For example, propulsion systems must
withstand the thermal stresses of launch and re-entry, operate efficiently in the vacuum of space, and
remain operational for extended periods without maintenance or refueling. Guidance systems must
provide accurate navigation and control capabilities despite the effects of radiation-induced electronics
failures, sensor degradation, and communication delays. Additionally, mission planners must carefully
balance the trade-offs between propulsion performance, payload mass, mission duration, and power
requirements to optimize mission success within the constraints of available resources.

Technological Innovations: Technological innovations are driving advancements in propulsion and guidance systems, enabling new capabilities and mission concepts that were once thought impossible.
Electric propulsion systems, such as ion thrusters and Hall effect thrusters, are becoming increasingly
prevalent in deep space missions due to their high efficiency and long-duration capabilities. Additive
manufacturing techniques, such as 3D printing, are revolutionizing the design and manufacturing of
propulsion components, reducing costs, lead times, and material waste. Autonomous navigation
algorithms, powered by artificial intelligence and machine learning, are enhancing spacecraft autonomy
and decision-making capabilities, enabling autonomous rendezvous and docking operations, hazard
avoidance maneuvers, and autonomous landing on planetary surfaces.

Applications in USS: Propulsion and guidance systems have diverse applications across a wide range of
USS missions, including planetary exploration, Earth observation, satellite deployment, and
interplanetary travel. Planetary exploration missions, such as the Mars rovers and spacecraft exploring the
outer planets, rely on propulsion systems to reach their destinations, perform orbit insertion maneuvers,
and execute controlled landings on planetary surfaces. Earth observation satellites utilize propulsion
systems for orbit maintenance, inclination changes, and deorbit maneuvers at the end of their operational
lifetimes. Satellite deployment missions, such as the deployment of constellations of small satellites or
CubeSats, require precise guidance and propulsion systems to achieve the desired orbit and separation
parameters. Interplanetary travel missions, such as the Voyager probes and New Horizons spacecraft, use
propulsion systems to escape Earth's gravity, perform gravity assist maneuvers, and maintain course
corrections during long-duration missions to distant destinations.

Future Directions: Looking ahead, the future of propulsion and guidance systems in USS is ripe with
possibilities, driven by advances in technology, exploration objectives, and international collaboration.
Key priorities include the development of advanced propulsion technologies, such as nuclear thermal
propulsion and solar sails, for faster, more efficient interplanetary travel. The integration of propulsion
and guidance systems with artificial intelligence and autonomous control algorithms will enable
increasingly autonomous and responsive spacecraft operations, reducing reliance on ground-based control
and communication. Additionally, international cooperation and partnerships will be essential for pooling
resources, sharing expertise, and achieving common goals in space exploration and scientific discovery.

Propulsion and guidance systems are the driving force behind unmanned space systems, enabling humanity to explore the cosmos, unlock its mysteries, and expand our understanding of the universe and our place within it. From the launch pads of Earth to the distant reaches of interstellar space, these systems
provide the means to navigate the vast expanse of space and reach distant destinations with precision and
efficiency. As we continue to push the boundaries of exploration and technological innovation, it is
essential to prioritize collaboration, innovation, and responsible stewardship of the space environment to
ensure the success and sustainability of future space missions. With continued investment in research,
development, and international cooperation, the future of propulsion and guidance systems in USS holds
limitless potential for unlocking the secrets of the cosmos and advancing humanity's journey into the final
frontier.

[mh]Scientific Instruments and Payloads

Scientific instruments and payloads are the heart and soul of unmanned space systems (USS), providing
the means to observe, measure, and analyze the cosmos from the vantage point of space. From telescopes
peering into the depths of the universe to sensors probing the atmospheres of distant planets, these
instruments enable scientists to unravel the mysteries of the cosmos and expand our understanding of the
universe. In this exploration, we delve into the intricate world of scientific instruments and payloads in
USS, examining their diverse capabilities, technological innovations, and the transformative impact they
have on our exploration of the cosmos.

Observational Instruments: Observational instruments are designed to capture and analyze electromagnetic radiation from celestial objects across the electromagnetic spectrum, from radio waves
and microwaves to visible light, ultraviolet, X-rays, and gamma rays. Telescopes are the primary
observational instruments used in space exploration, ranging from optical telescopes such as the Hubble
Space Telescope and the James Webb Space Telescope to radio telescopes such as the Atacama Large
Millimeter/submillimeter Array (ALMA) and the Very Large Array (VLA). These telescopes capture
images and spectra of distant galaxies, stars, planets, and other celestial objects, providing valuable data
on their composition, structure, and evolution.

Remote Sensing Instruments: Remote sensing instruments are designed to capture and analyze radiation
reflected or emitted by Earth's surface and atmosphere, providing valuable data for Earth observation,
environmental monitoring, and natural resource management. These instruments include imaging sensors,
such as cameras and spectrometers, which capture high-resolution images and spectra of Earth's surface,
oceans, and atmosphere, and passive and active remote sensing instruments, such as radar and lidar,
which measure surface elevation, vegetation biomass, and atmospheric composition. Remote sensing
satellites, such as the Landsat series and the European Space Agency's Sentinel missions, provide a
wealth of data on land use, land cover, vegetation health, and atmospheric conditions, enabling scientists,
policymakers, and resource managers to monitor and respond to environmental changes.
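
Much of the vegetation-health monitoring mentioned above rests on simple band arithmetic such as the normalized difference vegetation index, NDVI = (NIR - Red) / (NIR + Red), computable directly from Landsat or Sentinel-2 reflectance bands. The sketch below assumes the two bands are already loaded as arrays; the array names and values are placeholders.

    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        # Normalized Difference Vegetation Index from reflectance arrays.
        # Values near +1 indicate dense, healthy vegetation; values near 0
        # or below indicate bare soil, water, or clouds.
        nir = nir.astype(np.float64)
        red = red.astype(np.float64)
        return (nir - red) / (nir + red + eps)  # eps avoids divide-by-zero

    # Toy 2x2 scene: left column vegetated, right column bare ground.
    nir_band = np.array([[0.50, 0.30], [0.55, 0.28]])
    red_band = np.array([[0.08, 0.25], [0.07, 0.26]])
    print(np.round(ndvi(nir_band, red_band), 2))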

Planetary Exploration Instruments: Planetary exploration instruments are designed to study the surfaces,
atmospheres, and interiors of celestial bodies within our solar system, including planets, moons, asteroids,
and comets. These instruments include cameras, spectrometers, and imagers, which capture high-
resolution images and spectra of planetary surfaces and atmospheres, as well as sensors and detectors for
measuring temperature, pressure, magnetic fields, and radiation levels. Robotic landers and rovers, such
as the Mars rovers and the Philae lander, deploy a suite of scientific instruments to study the geology,
chemistry, and climate of alien worlds, providing valuable insights into their origins and potential for
habitability.

Particle Detectors and High-Energy Observatories: Particle detectors and high-energy observatories are designed to study subatomic particles and cosmic rays, providing insights into the fundamental properties of matter and the origins of the universe. Space-based detectors such as the Alpha Magnetic Spectrometer (AMS) measure the energy, charge, and trajectory of cosmic-ray particles, complementing ground-based accelerators such as the Large Hadron Collider (LHC), which study particles produced in high-energy collisions. High-energy observatories such as the Fermi Gamma-ray Space Telescope and the Cherenkov Telescope Array observe the gamma-ray and cosmic-ray radiation emitted by celestial objects and astrophysical phenomena, in which particles are accelerated to near-light speeds.

Technological Innovations: Technological innovations are driving advancements in scientific instruments and payloads, enabling new capabilities and mission concepts that were once thought impossible.
Miniaturization and microfabrication techniques allow scientists to develop compact and lightweight
instruments with high sensitivity and resolution, suitable for deployment on small satellites and CubeSats.
Additive manufacturing techniques, such as 3D printing, enable rapid prototyping and customization of
instrument components, reducing costs, lead times, and material waste. Advances in data processing and
analysis algorithms enable real-time processing of large volumes of data, allowing scientists to extract
meaningful insights and discoveries from complex datasets.

Applications in USS: Scientific instruments and payloads have diverse applications across a wide range
of USS missions, including astronomy, planetary exploration, Earth observation, and fundamental physics
research. In astronomy, these instruments enable astronomers to study the properties and behavior of
celestial objects, such as stars, galaxies, and black holes, and to probe the early universe, cosmic
evolution, and the nature of dark matter and dark energy. In planetary exploration, these instruments
provide valuable data on the composition, geology, and climate of planets, moons, and asteroids,
informing our understanding of planetary formation and evolution. In Earth observation, these
instruments monitor environmental changes, natural disasters, and climate trends, providing critical data
for disaster response, resource management, and policy decisions. In fundamental physics research, these
instruments explore the fundamental forces and particles of the universe, shedding light on the origins and
structure of matter and the nature of space and time.

Future Directions: Looking ahead, the future of scientific instruments and payloads in USS is ripe with
possibilities, driven by advances in technology, exploration objectives, and international collaboration.
Key priorities include the development of advanced imaging and spectroscopy techniques for studying
exoplanets and detecting signs of life beyond our solar system, the deployment of next-generation
telescopes and observatories for probing the early universe and the nature of dark matter and dark energy,
and the integration of artificial intelligence and machine learning algorithms for autonomous data analysis
and discovery. Additionally, international cooperation and partnerships will be essential for pooling
resources, sharing expertise, and achieving common goals in space exploration and scientific discovery.

Scientific instruments and payloads are the eyes and ears of unmanned space systems, enabling humanity
to observe, measure, and analyze the cosmos with unprecedented precision and detail. From the depths of
interstellar space to the surface of distant planets, these instruments provide valuable data and insights
that reshape our understanding of the universe and our place within it.

[mh]Space Exploration Missions


The human exploration of the Universe is a real challenge for both the scientific and engineering communities. The space technology developed so far has allowed scientists to achieve outstanding results (e.g. missions around and landings on the Moon, the International Space Station as an outpost of the human presence, satellites and spacecraft investigating and exploring Solar System planets as well as asteroids and comets), but further steps are required both to overcome existing problems and to attain new and exceptional goals. One of the harshest problems is the operating environment in which astronauts and rovers have to work. Indeed, outer space and extraterrestrial planets have physical properties so different from Earth's that space machinery must be designed specifically for them and crews must be suitably trained to adapt. Nevertheless, the entire assembly, integration, and test campaign of a space product is carried out on Earth at 1 g. Given such different ambient conditions, each phase in the life cycle of a space product is delicate and should therefore be carefully engineered. The testing and operational phases in particular carry the most risk, because the environmental conditions on the ground differ from those the product will face in service. Micro- and zero-gravity environments can neither be found on Earth nor be reproduced practically and realistically. In the past, for astronaut training, only parabolic flights and underwater conditions led to some limited success, but their drawbacks, especially in terms of cost and risk, exceeded the possible benefits, so today they see only limited use.

The outstanding development of computer science techniques such as virtual and augmented reality has provided an interesting way to deal with such problems. Indeed, computer simulation of (portions of) real worlds is currently the best practice for feeling concretely present in a totally different environment without physically being there. The high realism, immersion, and ease of interaction of Virtual Reality (VR in the following) are the key ideas for such applications. Realism means faithfully reproducing those environments, and not only from a graphical point of view: physical and functional simulation of object presence, behavior, and (mutual) interaction is fundamental too, especially for those disciplines (e.g. electrical, thermal, mechanical) that depend heavily on ambient reactions. Immersion enhances perception of the new environment and allows users (e.g. astronauts, engineering disciplines, manufacturing) to behave as if it were their real world. Finally, interaction is the user's capability to communicate with the simulation: the easier it is, the more effective and expressive the experience; the more intuitive, the less time specialist and unskilled users alike need to practice it; and the more suitable, the better the available VR capabilities are exploited.

The space industry can largely benefit from the virtual simulation approach. In this context, the main help for aerospace disciplines lies in improving the mission planning phase; its advantages are to allow realistic digital mock-up representations, to support collaborative multidisciplinary engineering tasks, and to simulate both critical ground and flight operations. But benefits can arise in a number of other ways too. First, due to the one-of-a-kind nature of space products, the only version of the product available on the ground after launch is its digital representation. Second, complex scientific data can be successfully represented by VR applications. Indeed, data belonging to the astrophysics and space mission domains are usually very hard to understand in all their relationships, essence, and meaning, especially those describing invisible properties such as radiation. In that sense, a suitable graphical representation can help scientists (and non-specialized audiences too) to improve their knowledge of those data. Finally, VR laboratories can be organized to host virtual training of human crews, by exploiting their capability for direct interaction and physical behavior simulation.

Thales Alenia Space - Italy (TAS-I from now on) experience in virtual reality technologies is mainly focused on considerably enhancing the use of such tools. Two main research branches can be found there: user interaction with the virtual product/environment, and management of the data cycle (that is, from data production to exchange among engineering teams). In the former case, the research is devoted to virtual reality technologies themselves, with an emphasis on ways to visualize different scenarios and large amounts of data, while in the latter case the focus is on system data modeling. Put together, they shall converge towards a complex system architecture for collaborative, human and robotic space exploration. Our vision entails a unique framework to support the development and maintenance of a common vision of such a complex system. Therefore, recent advances in the entertainment and games domains coexist alongside the most up-to-date methodologies to define the most complete and reliable Model-Based System Engineering (MBSE) approach. This multidisciplinary view shall have an impact on the way each actor conceives its own activity. For instance, engineering activity will benefit from such a representation because the big picture is at its disposal at any level of detail; it should be easier to prevent possible problems by detecting weak and critical points, to improve the organization of the entire system, and to check whether all the mandatory requirements and constraints are met. Astronauts themselves find it a worthwhile experience for gaining skills and capabilities, especially from a training viewpoint. Finally, scientific missions can be planned more carefully, because simulating several scenarios requires a fraction of the time, can be easily customized through suitable sets of parameters, and provides valuable feedback in several forms (especially simulation data and sensory perceptions). Collecting and analyzing data and information from such simulations can help diminish crash and failure risks and consequently increase the chance that mission targets will actually be achieved. For all the aforementioned reasons, the pillars of our research policy include, but are not limited to: concurrent set-up and accessibility; several elements of 4D (space + time), 3D and 2D features for data manipulation and representation; exploitation of immersive capabilities; ease of interfacing with highly specialized tools and paradigms; user-friendly capabilities; and adaptability and scalability to work in several environments (e.g. from desktop workstations to CAVEs).

This chapter is organized as follows: sections 2 and 3 introduce the space domain context and the current state of the art of VR systems in this field, focusing especially on its requirements, the key points of view of our research, and the objectives we aim to meet. Progress in the VR field, focusing on collaborative features and the interdisciplinary approach put into practice, is described in section 4. Section 5 is targeted at modeling complex space scenarios, while in section 6 some practical examples of space applications are given. The chapter ends with section 7, which presents some final remarks and a possible road map for further improvements and future work.

[h]Motivation and goals

The aerospace domain embraces an enormous variety of scientific and engineering fields and themes. Given the intrinsic complexity of the matter, space challenges can be successfully tackled only when the inputs from all those disciplines are conveniently blended together in a collaborative way. The lack of both a common vision and suitable management tools to coordinate so many subjects can indeed limit the sphere of activity and the incisiveness of research in the space sciences. Computer science in general, and VR in particular, can play an indispensable role in helping space scientists make a significant qualitative leap.

To substantiate the previous claim, we chose to discuss a specific theme from a practical point of view: the simulation of hostile environments. In this context, the term "hostile" refers, without loss of generality, to those places where either stable or temporary human presence is extremely difficult because of harsh physical conditions. That is certainly the case for outer space and extraterrestrial planets, but the definition can be extended to some environments on Earth too. In the latter case, it can even denote built-up areas that, after an extreme event such as a natural disaster, are temporarily unreachable, have limited communication links, or have altered topography. Virtual reproduction of such environments is a particularly interesting activity from several points of view. To that end, three main steps can be outlined. For each of them, examples will be discussed in detail, aiming to present our modus operandi as well as some results we have achieved. Given that practical stamp, this chapter has been outlined to highlight the following ideas.

First of all, simulation of environments helps different disciplines complement each other. Indeed, we consider our virtual reality tools as connectors between apparently separate research areas. As connectors, we collect data coming from different disciplines but describing similar phenomena, and visualize them as a whole. This multidisciplinary approach is intended to overcome the partial views single disciplines may have and, at the same time, create new ways to make them interact and combine their knowledge. Practically speaking, the simulation of a specific environment in our case mainly involves research areas like astronomy, geology, and meteorology, which hardly overlap: the sight they have individually is usually staggered, because they focus on different fields of investigation; their integrated sight, by spanning all possible features, allows a complete description. A similar case also applies to engineering disciplines, for instance when models of space machinery have to be conceived. Such machinery can be seen as a complex set of different sub-systems (e.g. mechanical, plumbing, energy), working together but designed separately by several groups of people, who typically have the know-how in their own field but may have some limitations in figuring out requirements from other engineering sources. Nevertheless, the real machinery has to work properly despite these splits in engineering knowledge domains. In this context, VR supplies methodologies able to simulate a portion of the real world(s) according to its most relevant features (i.e. inputs from different fields of knowledge) in a unified way.

Second, for the sake of coherence with real operational environments, space simulations must reach the highest possible degree of realism. In this field, realism gains a double meaning, referring to both visual appearance and correct physical behavior models. To that end, it is mandatory to deal with the most advanced techniques in computer graphics and engines for physical simulation. This double-faced realism is the basic element of all the functional aspects of the simulation itself. Issues like aesthetics and photo-realism are polished especially well when thorough visual feedback is required, such as for training sessions, virtual assembly procedures, and terrain exploration missions. 3D models like terrain chunks, robotic machinery, and natural elements are a natural means of showing huge collections of complex data and conveying the right amount of information. Therefore, scientific and information visualization techniques should be carefully taken into account for informative simulation purposes. At this stage, advice and expertise from the disciplines are fundamental to achieving the most plausible results, especially in properly modeling operational behaviors in such hostile environments and showing the related information.

Last but not least, our discussion shall implicitly focus on the interactions among several actors when they are dealing with virtual simulations. Indeed, we strongly believe in the usefulness of such simulations for space disciplines, especially in terms of information flows and the gain each involved side can obtain from them. From that point of view, virtual reality facilities could potentially be the natural outlet for many disciplines. Indeed, their intrinsic way of conceiving multidisciplinary activities could lead to a novel way of thinking about (and maybe rearranging) the engineering process itself. In this case, the aim of this part is to discuss our own experience in such a field. In that sense, we introduce a typical example of a complex scenario where many interactions among several actors are required. Possible case studies include collaborative meetings, engineering briefings, training sessions, and ergonomic and psychological surveys. The focus is on emphasizing the virtuous cycle among participants, aiming at the improvement of both the simulation and all the engineering aspects of the represented product(s) (see Figure).
Figure: A pictorial representation of the relationships occurring in the VR simulation process. VR is the core of the process and, by interacting with it, communications among all the actors generate added value, in terms of both ease of connection and knowledge acquisition/generation.

The use of virtual reality (VR) and immersive technologies for design, visualization, simulation, and training in support of aerospace research has become an increasingly important medium in a broad spectrum of applications, such as hardware design, industrial control, and training for aerospace systems or complex control rooms. VR applications provide a panorama of unlimited possibilities for remote space exploration, and their flexibility and power can affect many aspects of future space programs and missions. Modeling and interactive navigation of virtual worlds can provide an innovative environment that can be thought of as an excellent medium for brainstorming and the creation of new knowledge, as well as for synthesizing and sharing information from a variety of sources. Moreover, virtual worlds can serve as a platform for carrying out experiments with greater flexibility than those conducted in the real world. In this section, we review projects and works related to the use of VR in the fields of (1) planet rendering (see section 3.1), (2) remote space exploration (see section 3.2), and (3) virtual prototyping (see section 3.3).

[h]Planet rendering

Recently, Google Earth, one of the most popular tools for visualizing the terrestrial environment in 3D, has enabled users to fly virtually over the Mars and Moon surfaces, providing a three-dimensional view that aids public understanding of space science. Moreover, it has given researchers a platform for sharing data similar to what Google Earth provides for Earth scientists. The Mars mode includes global 3D terrain, detailed maps of the Mars rover traverses, and a complete list of all satellite images taken by the major orbital cameras. Likewise, the Moon mode includes global terrain and maps, featured satellite images, detailed maps of the Apollo surface missions, and geologic charts. Similar purposes and results are achieved by a number of 3D astronomy programs and planetarium software packages. Limited 3D modeling capability is their major drawback; nonetheless, their usefulness in terms of public outreach has been definitively demonstrated by the increasing interest in space exploration among the public audience.

In any case, such tools are somewhat limited in providing suitable services to the space industry. The importance of supporting scientists' and engineers' work with highly specialized, immersive facilities is a milestone at the Jet Propulsion Laboratory, where the contribution of 3D Martian soil modeling to accurately planning the Sojourner rover's sorties during the Mars Pathfinder mission has been clearly documented. The need for a well-structured and comprehensive reproduction of the large amounts of data collected by Mars probes (especially the Mars Pathfinder and Mars Global Surveyor missions) brought researchers to lay stress on VR coupled with astronomy and cartography applications. Indeed, the frontiers of knowledge can reach unprecedented results when coupled with tailored VR tools: new research directions have put effort both into increasing the overall visual quality of the virtual scenes and into improving user interaction with those VR facilities. In particular, the first real immersive environment is the one described in Head's work. His ADVISER system (Advanced Visualization in Solar System Exploration and Research) was conceived as a new form of problem-solving environment, in which scientists can directly manipulate massive amounts of cartographic data sets represented as 3D models. Its novelty lay in integrating hardware and software technologies into a very powerful whole, able to extend and improve scientists' capabilities in analyzing such data as if they were physically on the planet's surface. A complementary attempt to place virtual and augmented reality tools side by side was MarsView, in which the authors added a force-feedback device to a topographic map viewer in order to enrich the user's experience. The haptic interface favors a more intuitive 3D interaction in which physical feeling allows users to actually touch the Martian surface as they pan around and zoom in on details. The golden age of Mars exploration from the late 1990s onward has generated an impressive mass of data, whose main challenge lies in the tools for its analysis. In this sense, the examples above illustrate how that challenge can be faced efficiently by exploiting simulation and interaction capabilities. Nowadays, these are considered indispensable for saving time, working effectively, and grasping complex interactions and relationships.

[h]Virtual remote space exploration

Interactive 3D computer graphics, virtual worlds, and VR technology, along with computer and video game technology, support the creation of realistic environments for such tasks as dock landing and planetary rover control, and the effective simulation of the space-time evolution of both the environment and the exploration vehicles. The major characteristics of the available virtual worlds have been surveyed in the literature, along with the potential of virtual worlds for remote space exploration and other space-related activities. A number of NASA-sponsored activities in virtual worlds are described there, such as 'NASA CoLab Island' and 'Explorer Island' in Second Life (the latter providing spacecraft models and a Mars terrain surface model based on real NASA data), 'SimConstellation', which explores a broad range of lunar mission scenarios, and 'SimStation', which simulates the operation of the ISS and trains astronauts to work on the space shuttle and space station. That survey also describes some tools for virtual space activities, including Google Mars 3D and Google Moon.

Landing on planets and their subsequent exploration in space missions require precise information about the landing zone and its surroundings. The use of optical sensors mounted on the landing unit helps to acquire data on the surface during descent. The retrieved data enable the creation of
unit helps to acquire data of the surface during descent. The retrieved data enables the creation of
navigation maps that are suitable for planetary exploration missions executed by a robot on the surface. In
a Virtual Testbed approach is used to generate close–to–reality environments for testing various landing
scenarios, providing artificial descent images test–data with a maximum of flexibility for landing
trajectories, sensor characteristics, lighting and surface conditions. In particular, a camera simulation is
developed including a generic camera model described by a set of intrinsic parameters distortions;
moreover, further camera effects like noise, lens flare and motion blur can be simulated, along with the
correct simulation of lighting conditions and reflection properties of materials in space. Besides these
images are generated algorithmically, the known data in the Virtual Testbed can be used for ground truth
verification of the map–generation algorithms. The work in describes a Human Mars mission planning
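
As a sketch of what such a generic camera model involves, the code below projects a 3D point into pixel coordinates using a pinhole model with simple polynomial radial distortion. The intrinsic parameter names follow common computer-vision convention, and the values are placeholders rather than those of any actual testbed camera.

    def project_point(p_cam, fx, fy, cx, cy, k1=0.0, k2=0.0):
        # Pinhole projection of a camera-frame 3D point (x, y, z), z > 0,
        # with radial distortion applied in normalized image coordinates.
        x, y, z = p_cam
        xn, yn = x / z, y / z                # normalized image plane
        r2 = xn * xn + yn * yn
        d = 1.0 + k1 * r2 + k2 * r2 * r2     # radial distortion factor
        u = fx * (xn * d) + cx               # pixel column
        v = fy * (yn * d) + cy               # pixel row
        return u, v

    # Placeholder intrinsics for an illustrative 1024x1024 descent camera.
    print(project_point((0.5, -0.2, 10.0), fx=800.0, fy=800.0,
                        cx=512.0, cy=512.0, k1=-0.05))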

Another study describes human Mars mission planning based on the Orbiter space flight simulator, in which the authors used Orbiter to create and investigate a virtual prototype of the design reference mission known as 'Mars for Less'. The Mission Simulation Toolkit (MST) is a software system developed by NASA as part of the Mission Simulation Facility (MSF) project, which was started in 2001 to facilitate the development of autonomous planetary robotic missions. The MST contains a library that supports surface rover simulation, covering characteristics such as simulation set-up, steering and locomotion control, simulation of rover-terrain interaction, power management, rock detection, and graphical 3D display. In further work carried out by the NASA Ames Research Center, visualization and surface-reconstruction software for Mars Exploration Rover science operations is analyzed and described. It is based on a 'stereo pipeline', a tool that generates accurate and dense 3D terrain models with high-resolution texture mapping from stereo image pairs acquired during the Mars Exploration Rover (MER) mission.

With regard to lunar environment modeling, a realistic virtual simulation environment for a lunar rover has also been presented, in which fractional Brownian motion techniques and real statistical information are used to model the lunar terrain and stones, forming a realistic virtual lunar surface whose main features can easily be expressed as simulation parameters. In this work, a dynamics simulation model is developed that considers the mechanics of wheel-terrain interaction and the articulated-body dynamics of the lunar rover's suspension mechanism. A lunar rover prototype has been tested in this environment, including its mechanical subsystem, its motion control algorithm, and a simple path-planning system.
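
To give a flavor of the fractal terrain technique mentioned above, the sketch below builds a one-dimensional height profile by midpoint displacement, halving the random perturbation at each subdivision. The roughness parameter and scale are illustrative assumptions; published lunar-surface models fit such parameters to real terrain statistics and work in two dimensions.

    import random

    def midpoint_displacement(levels, roughness=0.55, scale=1.0, seed=42):
        # 1D fractal height profile: repeatedly subdivide, displacing each
        # midpoint by a random amount that shrinks by `roughness` per level.
        rng = random.Random(seed)
        heights = [0.0, 0.0]
        for _ in range(levels):
            next_h = []
            for a, b in zip(heights, heights[1:]):
                mid = (a + b) / 2.0 + rng.uniform(-scale, scale)
                next_h.extend([a, mid])
            next_h.append(heights[-1])
            heights = next_h
            scale *= roughness  # smaller perturbations at finer scales
        return heights

    profile = midpoint_displacement(levels=8)  # 257 sample points
    print(len(profile), min(profile), max(profile))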

[h]Virtual prototyping

Prototypes or mock-ups are essential in the design process. Generally, a mock-up is a scale model of a product, frequently full size. It is used for studying, training, testing, and manufacturability analysis. Prototyping, which is the use of mock-ups for designing and evaluating candidate designs, can occur at any stage of the design process. At a later stage, mock-ups are completed in every detail and can be used for testing ergonomic aspects. However, physical prototypes can be expensive and slow to produce, and can thus delay the detection of problems or mismatches in the solution under development.

Computer science offers the opportunity to reduce or replace physical prototypes with virtual prototypes (VP). A VP is a computer-based simulation of a physical prototype, with a comparable degree of functional realism but with the potential to add extra functionality. By using VPs, different design alternatives can be visualized immediately, allowing users to give real-time feedback about the alternatives and their use. Furthermore, changes to the solutions can be made interactively and more easily than with a physical prototype, which means that more prototypes can be tested in a fraction of the time and at a fraction of the cost otherwise required. This last feature is particularly crucial for the development of 'one-of-a-kind' or 'few-of-a-kind' products.

The use of VR can help take full advantage of virtual prototyping. In order to test the design optimization of a VP in the same way as a physical mock-up, a human-product interaction model is required. Ideally, the VP should be seen, heard, and touched by all the people involved in its design, as well as by the potential users. In this scenario, VR plays a meaningful role, since it allows different alternative solutions to be evaluated and compared in quite a realistic and dynamic way, using stereoscopic visualization, 3D sound rendering, and haptic feedback. VR therefore provides a more realistic interaction with prototypes than is possible with CAD models. By using VR tools, not only aesthetic but also ergonomic features can be evaluated and optimized. There are several approaches to ergonomic analysis in a VR scenario: the first involves a human operator interacting with the virtual environment through haptic and/or tactile interfaces, while the second is based on virtual human models that interact with the VP in a pure simulation technique. These virtual human models can be agents, created and controlled by the computer, or avatars, controlled by a real human.

[h]VR for collaborative engineering

Model Based System Engineering (MBSE) is the term currently used to denote the transition from system data management through documents (e.g. specifications, technical reports, interface control documents) to standard-based, semantically meaningful models that can be processed and interfaced by engineering software tools. MBSE methodologies enable a smoother use of VR in support of engineering teams, which represents one of its most interesting applications.

The core of an MBSE approach is the so-called system model, that is, the collection of different models representing one of the possible baselines of the product and formally describing its characterizing features throughout the product life cycle. In particular, MBSE provides a consistent representation of data from the system requirements through the design and analysis phases, finally including the verification and validation activities. With respect to a more document-centric approach, the different characteristics of a product are defined more clearly, from its preliminary definition up to a more detailed representation. This ensures less sensitivity to errors than the traditional document-centric view, which is still widely used for system design. MBSE methodologies have demonstrated the capability to manage system information more efficiently than existing approaches, introducing advantages that draw attention particularly because of their commercial implications. Indeed, over the last decade many industrial domains have been adopting a full-scale MBSE approach in their research, development, and applications, as demonstrated by the initiatives of INCOSE (the International Council on Systems Engineering). There is no unique way to approach MBSE. The main discriminating factor is the definition of concepts, as a semantic foundation derived from the analysis of the system engineering process. The resulting conceptual data model shall be able to support product and process modeling, with a particular emphasis on the data to be exchanged during the engineering activities, considering both people and computer tools. The selection or definition of the modeling and notation meta-models is specific to the needs of a particular domain, and even to an engineering culture, but it shall be compatible with current efforts, so as to ensure compatibility between tools and companies.

A joint team from TAS-I and Politecnico di Torino is currently involved in research focusing on the latest developments in this domain, with a particular emphasis on active participation in the related European initiatives. Worthwhile experiences include the Concurrent Design Facilities for the preliminary phases (led by ESA's experience with its CDF, but also by the ASI CEF&DBTE and by industrial practices inside TAS-I) and the ongoing ESA Virtual Spacecraft Design study for more advanced phases. The current developments aim to build on the above-mentioned initiatives while staying in line with the ongoing standardization and language definition efforts (e.g. ECSS-E-TM-10-25, ECSS-E-TM-10-23, OMG SysML, Modelica). The definition of a system model generally involves several engineering disciplines more deeply than the traditional approach does: the project team is composed of experts from very different engineering and scientific areas. In this context, VR definitely becomes a useful tool for managing the available data, providing the technology necessary for effective collaboration between different disciplines. VR allows direct viewing of data and information that are often difficult to read for those who may not have a technical background but who are otherwise involved in the design process of a given system.

The MBSE methodology is commonly characterized by the definition of all the processes, methods and tools that support and improve the engineering activities. It is possible to consider some of the experiences that are evolving within various organizations' system engineering structures and procedures and that are spreading through technical publications and studies. For instance, Telelogic Harmony-SE® represents a subset of a well-defined development process identifiable with Harmony. In this case, activities such as requirements analysis, system functional analysis and architectural design are properly related to each other within the context of the life-cycle development process. Another example is the INCOSE Object-Oriented Systems Engineering Method (OOSEM). The model-based approach it introduces is characterized by the use of OMG SysML™ as an instrument to outline the system model specification. This language enables a well-defined representation of systems, supporting the analysis, design and verification activities. The IBM Rational Unified Process for Systems Engineering (RUP SE) for Model-Driven Systems Development (MDSD) may be considered an interesting methodology along the same lines. This process is derived from the Rational Unified Process (RUP) and is used for software development in government organizations and industry. The Vitech Model-Based System Engineering (MBSE) Methodology is another example, where a common System Design Repository is linked to four main concurrent activities: Source Requirements Analysis, Functional/Behavior Analysis, Architecture/Synthesis and, finally, Design Validation and Verification. The elements that characterize the methodologies presented above, as well as other similar initiatives, are particularly suitable for the management of complex situations, which become difficult to handle as product development progresses over time. For instance, the study of hostile environments, such as the analysis of certain space mission scenarios, generally leads to the definition of highly complex systems. In this case, the need to manage a considerable amount of data in a coherent and flexible way has expedited the spread of model-based methods. As the complexity of the systems under analysis grows, it often becomes increasingly difficult to achieve proper collaboration while avoiding potential design errors. MBSE provides the necessary tools to formally relate the various aspects of a given system. Representing hostile environments, as well as the data generated, through VR techniques brings many advantages. Building on the data structured by the MBSE approach, VR makes it possible to present the system architecture in an extended manner, while ensuring greater availability of information. Another benefit is the clarity with which VR can show, for instance, the development phases of a given system. A virtual model directly connected to the information network of a unique data structure also ensures access to the most current representation of the system.

Based on the progress made in recent years, VR can generate an ever more faithful representation of reality and of the physical phenomena under analysis. It is therefore possible to build virtual environments in which to conduct realistic simulations of the scenarios where the system may operate, making use of the time variable as well (the 4D). The advantages of this capability lie in the ability to reproduce situations for which the construction of a real mock-up would require a substantial economic investment. This becomes evident especially in the aerospace industry, where the complexity of the systems involved, the high number of changes to manage and the possible operational scenarios all call for limiting the number of physical prototypes that are built. Today the space domain is becoming an open worldwide market, so there is a clear trend towards reducing the costs incurred during a project, which mostly affect the tests made on real physical systems. The generation of virtual models also has the advantage of allowing different design alternatives to be analyzed directly. Through the use of VR, more people may be involved at the same time in project activities in which equivalent system configurations are discussed. Generally, the development of virtual environments becomes necessary when critical situations must be faced. VR allows the consideration of environments that cannot normally be reproduced on Earth, as in the case of a space mission scenario: gravity, dust. In a virtual model it is instead possible to recreate some of the characteristic features potentially found in these situations. Moreover, it is possible to manage the system variables to modify the scenario appropriately, thus considering other conditions for the system under analysis. This capability would be difficult to reproduce with real physical elements, mainly because of the economic investment it would require. The simulations that can be realized in a VR environment also avoid all situations that are potentially unsafe for the user. This characteristic is of particular interest for human space activities, where certain actions may often lead to harmful situations.

MBSE techniques applied to space projects are often associated with 2D diagram-based models (e.g. an activity diagram in SysML, a control loop visualized in Simulink), or with 3D virtual models (e.g. a virtual mock-up built with a CAD application, or multi-physics analysis visualized with CAE tools). These visualization techniques have reached a high degree of maturity in the last decade, deriving from different experiences at discipline level. Just as an example, a SysML-like representation is closer to a software engineer than to a mechanical engineer. In a multidisciplinary team, the integration of discipline-level data in a system-level Virtual Environment represents an effective way to assure the full understanding of the key system issues by the whole team: a WYSIWYG at product level, much as a modern word processor is for a document. Figure shows a simplified example of the integration of tools in VR. The CAD model is used to define the physical configuration and to retrieve the related drawing. Current applications also allow the user to calculate and/or store in the same CAD model relevant properties such as mass, moments of inertia (MOI) and center of gravity position. Such values are of interest to the whole team, and through dedicated interfaces those properties may be extracted and related to the system architecture (product structure, interfaces between elements). If, in the same integrated environment, the CAD model is linked with the system model providing input for simulations (e.g. mass properties for spacecraft dynamics), then the Virtual Environment allows a project team to visualize them all in the same place.
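
As an illustration of this kind of linking, the sketch below shows how CAD-derived mass properties might be aggregated at system level so that the composite values can be fed to a simulation. The Element schema, names and numbers are hypothetical, not the actual TAS-I data model:

from dataclasses import dataclass

@dataclass
class Element:
    """A product-tree element carrying CAD-derived mass properties (hypothetical schema)."""
    name: str
    mass_kg: float
    cog_m: tuple  # centre of gravity (x, y, z) in the spacecraft reference frame

def system_mass_and_cog(elements):
    """Aggregate element masses and centres of gravity at system level."""
    total = sum(e.mass_kg for e in elements)
    cog = tuple(sum(e.mass_kg * e.cog_m[i] for e in elements) / total for i in range(3))
    return total, cog

rover = [Element("chassis", 120.0, (0.0, 0.0, 0.4)),
         Element("battery", 35.0, (-0.2, 0.0, 0.3)),
         Element("arm", 18.0, (0.5, 0.1, 0.6))]
mass, cog = system_mass_and_cog(rover)
print(f"total mass: {mass} kg, CoG: {cog}")

The same aggregated values could then be handed, for instance, to a spacecraft dynamics simulation, which is exactly the kind of cross-tool traffic the system model is meant to mediate.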

The above-mentioned approach may be used to visualize products and their properties (with precise values, such as mass properties or nominal values). As long as the product elements are linked with the virtual reality elements, their behavior may also be associated through the related parameters (e.g. instantaneous position). Behaviors are represented by functions (e.g. Provide Locomotion, with ports related to the Distribute Electrical Energy function and to the Environment functions for the terrain). Each function (or composition of functions) can be represented by a model able to provide simulation capabilities. Figure shows an example, at data level, of the linking between virtual reality and Modelica code through the system model. The integration of simulation models allows the Virtual Environment to be the collector of engineering discipline analyses, but a complete system-level simulator is still far from being implemented in this way and is the subject of our current research. The integration of several simulations requires a simulation process manager and a revision of the simulation models to include coupled effects. As explained in previous sections, the virtual environment may contain its own simulation capabilities, thanks to an embedded physics engine able to simulate e.g. collisions, dynamics and soft bodies. These features may be used for rapid prototyping of the simulation, providing quick feedback during concept and feasibility studies, as well as during the evaluation of alternatives.

Product and operational simulations do not exhaust the VR support capabilities for a project team. VR with embedded simulation capabilities may also be used to validate part of the AIT (Assembly Integration and Test) planning, supporting the definition and simulation of procedures, or for training purposes. Procedures can be created in VR, validated, and then made available in Augmented Reality (AR) format so as to guide hands-free assembly task execution (see Figure).
Figure: A scheme showing the connection between the MBSE approach and the VR environment, seen as a natural end point to improve all the design phases and to support teams during their work.

Figure: VR association to the system model at data level (modeled in the Modelica language), where each object in a virtual world has a formal description at a higher level.

Figure: Current experiments performed in TAS-I focusing on the evaluation of the potential benefits of a VR/AR approach: user feedback is encouraged to improve their implementation.

[h]Modeling environments

Since space environments are extreme compared to Earth's, careful modeling of them is mandatory before undertaking any scientific mission. The study of real operative conditions spans from understanding physical laws to defining the geological composition of the surface, from measuring magnetic fields to analyzing natural phenomena. Of course, the better the knowledge, the greater the likelihood of mission success: failure factors such as malfunctions, mechanical crashes, accidents and technical unsuitability become less likely, while crew safety, decision support optimization, cost reduction and scientific throughput increase accordingly. The added value of VR in this context is its ability to support this need for realism in a smart and effective way.

[h]Physics laws

Technically speaking, a physics engine is software providing a numerical simulation of systems under given physical laws. The most common dynamics investigated by such engines comprise fluid dynamics and both rigid- and soft-body dynamics. They are usually based on a Newtonian model, and their contribution to virtual worlds is to handle interactions among several objects/shapes. This way it is possible to model object reactions to ambient forces and therefore create realistic and complex software simulations of situations that might be hard to reproduce in reality: for instance, by changing the gravity constant to the Moon's (roughly one sixth of the terrestrial value), it is possible to handle objects as if they were really on Earth's satellite; similarly, precise space module conditions could be achieved in order to train astronauts in a (close to) zero-gravity environment. The great advantages of these solutions are cheapness, flexible customization and safety. Indeed, with respect to other commonly adopted solutions, such as parabolic flights, they do not require expensive settings to work: a modern PC with standard hardware, graphics card and processing power is more than enough to perform simulations of medium complexity. At the same time, setting up virtual world behaviors relies mainly on customizable parameters as inputs for the simulation algorithms. Lastly, digital mock-ups can be stressed up to very extreme conditions without physical breakage occurring, and end users are not exposed to any risk while running a simulation.
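
To make the gravity-scaling idea concrete, here is a minimal sketch of our own (not part of any physics engine API) that integrates the same free fall under terrestrial and lunar gravity; only the gravity constant differs between the two runs:

# Minimal sketch: the same ballistic integrator run under Earth and Moon gravity.
G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2, roughly one sixth of the terrestrial value

def drop_time(height_m, g, dt=1e-4):
    """Integrate a free fall from height_m until the object reaches the ground."""
    z, v, t = height_m, 0.0, 0.0
    while z > 0.0:
        v += g * dt  # semi-implicit Euler: update velocity first, then position
        z -= v * dt
        t += dt
    return t

print(f"2 m drop on Earth: {drop_time(2.0, G_EARTH):.2f} s")
print(f"2 m drop on Moon:  {drop_time(2.0, G_MOON):.2f} s")

On the Moon the same drop takes roughly 2.5 times longer (the square root of the gravity ratio), and this is the kind of behavioral change the engine propagates to every simulated interaction.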

The two main components a modern physics engine typically provides concern rigid-body dynamics, that is, a collision detection/collision response system, and the dynamics simulation component responsible for solving the forces affecting the simulated objects. More complex cores allow engines to successfully deal with particle/fluid, soft-body, joint and cloth simulations. Given all those features, it appears clear why a physics engine allows the study of natural and artificial phenomena under ambient conditions different from Earth's: for example, testing dust behavior at Mars gravity (natural phenomena), or driving a Martian rover by acting on velocity, friction and external forces (artificial phenomena). Virtual reality simulations are so flexible that specific tests can be repeated several times in a row. This can be done for a variety of scenarios: for instance, training a crew in performing particularly difficult actions could lead to finding the best practice for a given task; simulating different terrain conformations could help in finding possible obstacles in the way of an autonomous robotic vehicle; pushing the use of some mechanical component to the limit could suggest how resilient it is to external stresses, its risk threshold and so on.

When physics engine results are connected to suitable input/output devices able to return perceptions to the user, the realism of the simulation definitely increases. Feedback making the user feel lifelike forces and sensations (e.g. the bumps of an irregular terrain while driving a rover, or the weight of moving objects) pushes specific studies further in complex fields. For example, by means of a haptic feedback device and a motion capture suit it is possible to perform ergonomic and feasibility studies (e.g. a reachability test to check whether an astronaut is able to reach an object and then perform a particular action like screwing a bolt). On the other side, a primary limit on physics engine realism is the precision of the numbers representing the positions of, and forces acting upon, objects. The direct consequences are that rounding errors can affect final computations (even heavily when precision is too low) and that simulated results can drastically differ from predicted ones if small numerical fluctuations are not properly taken into account. To avoid such problems, several tests on well-known phenomena should be performed before any other simulation, in order to determine the margin of error and the level of trust that can be placed in the results.
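
The calibration idea can be illustrated with a toy test on a well-known phenomenon: comparing a coarse numerical integration of free fall against its closed-form solution gives an estimate of the margin of error as a function of the time step. This is a deliberately simplified sketch, not the validation procedure actually used:

import math

def simulated_fall(height_m, g=9.81, dt=0.01):
    """Explicit Euler free fall; the coarse time step introduces numerical error."""
    z, v, t = height_m, 0.0, 0.0
    while z > 0.0:
        z -= v * dt
        v += g * dt
        t += dt
    return t

def analytic_fall(height_m, g=9.81):
    """Closed-form fall time: t = sqrt(2h/g), the known reference value."""
    return math.sqrt(2.0 * height_m / g)

for dt in (0.1, 0.01, 0.001):
    err = abs(simulated_fall(10.0, dt=dt) - analytic_fall(10.0))
    print(f"dt={dt}  error={err:.4f} s")

Running such checks on phenomena with known answers tells you how finely the simulation must be stepped before its predictions on unknown scenarios can be trusted.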

[h]Terrain modeling

To model planetary surfaces like those of the Moon and Mars, a Digital Elevation Model (DEM) is required. Technically speaking, it looks like a grid or a raster-graphic image where elevation values are provided at regularly spaced points called posts. Reference DEMs come from the NASA High Resolution Imaging Science Experiment and Lunar Reconnaissance Orbiter missions (HiRISE and LRO respectively) and represent the most up-to-date and precise advances in space geology measurements and cartographic imagery. In general, ground data can be derived at a post spacing of about 4x the pixel scale of the input imagery. Since HiRISE images are usually between 0.25 and 0.5 m/pixel, each post describes about 1-2 m. Vertical precision is also very accurate, on the order of tens of centimeters. The altitude computation is a very time-intensive procedure and requires several stages as well as careful pre- and post-processing of the data, sophisticated software, and specialized training. During this process, image elaboration techniques can inherently introduce some artifacts, but despite this a near-optimal reconstruction satisfying the modeling constraints is largely possible. For more detailed information about the complete (Mars) DEM computation process, see the cited references and on-line resources; for a visual reference, see the figure below.

Inserting a terrain model into a virtual scene is only the first step we perform to achieve environmental reconstruction. Indeed, the description of a planet can be more complicated than it appears at first glance. In the next sub-sections we describe how to enrich the simulation of a planetary terrain by inserting more typical landscape elements and modeling the natural phenomena occurring on planetary surfaces.
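
As a concrete illustration of the numbers above, the sketch below derives the post spacing from the pixel scale and turns a toy DEM grid into mesh vertices; the function names and the parameterized 4x factor are our own illustrative choices, not part of any DEM toolchain:

import numpy as np

def post_spacing(pixel_scale_m, factor=4.0):
    """Ground post spacing derivable from input imagery (about 4x the pixel scale)."""
    return factor * pixel_scale_m

print(post_spacing(0.25), post_spacing(0.5))  # -> 1.0 m and 2.0 m posts

def dem_to_vertices(dem, spacing_m):
    """Turn a DEM grid of elevations into 3D vertices for a terrain mesh."""
    rows, cols = dem.shape
    xs, ys = np.meshgrid(np.arange(cols) * spacing_m, np.arange(rows) * spacing_m)
    return np.stack([xs, ys, dem], axis=-1).reshape(-1, 3)

dem = np.random.default_rng(0).uniform(-2.0, 2.0, size=(4, 4))  # toy elevations
print(dem_to_vertices(dem, 1.0).shape)  # (16, 3) vertices, ready for triangulation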

Figure: Examples of DEM processing of Victoria Crater pictures. From left to right: high-resolution photo (from http://photojournal.jpl.nasa.gov/catalog/PIA08813); the original DEM computed as described in the cited reference; a front view of our final 3D model (raw model + texture).

[h]Rocks

Almost every image taken by astronauts and/or robotic instrumentation shows Mars (and to some extent the Moon too) to be a very rocky planet. But those details do not appear in the reference DEMs, despite their astonishing resolution. Even if such small details cannot yet be captured by advanced laser instrumentation, the presence of rocks and stones poses a severe challenge for robotic equipment, because they increase the chance of a mechanical crash in case of collisions. For the sake of better plausibility, then, we have to add rock models to the reconstructed surface. In this respect, studies made for Mars are really useful, because they describe a statistical distribution of rocks, with particular emphasis on the terrains visited during rover missions, like the Pathfinder site. Moreover, they estimate both the density and the rock size-frequency distributions according to simple mathematical functions, so that a complete description of the area is furnished. Those data turn out to be really useful, especially during landing operations or when a site has to be explored to assess the risks of performing exploration tasks. For instance, those model distributions estimate that the chance of a lander impacting a >1 m diameter rock in the first 2 bounces is <3% and <5% for the Meridiani and Gusev landing sites, respectively.

Our 3D rock models are inserted onto the terrain by following that statistical approach and according to site-specific parameters such as the total number of models, their size and their type. During simulation sessions, that distribution can be changed. The aim is clearly to force operational situations in order to analyze the reactions of the simulated equipment under extreme conditions. In particular, thanks to the collision detection engine, it is possible to evaluate impact resistance factors so as to guarantee the highest possible level of safety. From a modeling point of view, the rock generation procedure can be summarized as follows (see the sketch after this paragraph): i) generate a random set of points (rock vertices) in a given 3D space; ii) compute the convex hull in order to create the external rock surface; iii) compute the mesh of the given volume; iv) adjust and refine the model (e.g., simulate erosion or modify the outer appearance with respect to shape and roundness) in order to give it a more realistic look; v) statistically compute the site on the planet surface where the rock will be laid; vi) put the rock onto that site according to the normal direction at that point. Examples of rock skeletons (that is, after the first three steps of the previous algorithm) are shown in the figure, while complete rocks can be seen in many figures throughout this paper.
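
The first three steps of the procedure can be sketched in a few lines using SciPy's convex hull; the shaping parameters (point count, radius, vertical flattening) are illustrative assumptions, and steps iv)-vi) are omitted:

import numpy as np
from scipy.spatial import ConvexHull

def generate_rock(n_vertices=40, radius_m=0.5, seed=None):
    """Steps i-iii: random points, convex hull, triangle mesh of the surface."""
    rng = np.random.default_rng(seed)
    # i) random points scattered on a jittered, vertically squashed sphere
    pts = rng.normal(size=(n_vertices, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    pts *= radius_m * rng.uniform(0.6, 1.0, size=(n_vertices, 1))
    pts[:, 2] *= 0.7  # flatten vertically for a more rock-like silhouette
    # ii)-iii) the convex hull gives the external surface as triangles into pts
    hull = ConvexHull(pts)
    return pts, hull.simplices  # candidate vertices, triangle index array

verts, faces = generate_rock(seed=42)
print(len(verts), "points,", len(faces), "surface triangles")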

Figure: Examples of generated rocks with different shapes and numbers of vertices.

[h]Dust

Another issue is represented by the presence of a huge quantity of dust lying on the soil. When any perturbation of the stillness occurs (such as a rover transit or an astronaut's walk), a huge amount of small dust particles is displaced: they can form big clouds that rise up quickly and remain in suspension for a long period afterwards (because of the lower gravity). The scientific literature describes this phenomenon mainly for the Moon, because of the several lunar missions undertaken in the '70s and '80s. For instance, studies show in detail the typical behavior of dust particles emitted by a rover wheel: schemes and formulas are given (for instance, to determine the angle of ejection or the distance a particle covers during its flight) with the aim of characterizing this unavoidable effect, which should definitely be modeled in our simulations since it affects any operational progress. Indeed, both the visual appearance and the physical behavior of dust have to be carefully represented: the former, to test driving sessions under limited visibility conditions or to find a set of manoeuvres that lift as little dust as possible; the latter, because avoiding malfunctions, especially for those modules directly exposed to dust interaction (e.g. solar panels, radiators and wheel joints), is still a highly complex engineering challenge.
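
For the physical side, a first-order sketch of a single particle's flight is straightforward: with no atmosphere (as on the Moon) an ejected particle follows a ballistic arc, so its range grows as gravity shrinks. The speed and angle below are illustrative, not values from the cited studies:

import math

G_MOON = 1.62  # m/s^2

def particle_range(speed_ms, ejection_angle_deg, g=G_MOON):
    """Ballistic flight distance of a dust particle emitted by a wheel.

    Simplifications: no atmosphere (true on the Moon) and flat terrain."""
    a = math.radians(ejection_angle_deg)
    return speed_ms ** 2 * math.sin(2 * a) / g

# The same ejection carries a particle roughly 6x farther on the Moon than on Earth.
print(f"Moon:  {particle_range(3.0, 35.0):.2f} m")
print(f"Earth: {particle_range(3.0, 35.0, g=9.81):.2f} m")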

[h]Atmosphere events

A thin atmosphere surrounds Mars. Even if it cannot be compared to Earth's, some weak weather activity happens in it all the same: winds blow and seasons rotate. The presence of winds in particular can be an issue, especially during delicate tasks such as a capsule landing. Therefore even this factor should be simulated efficiently.

The Mars Climate Database (MCD) offers an interesting set of data particularly suitable for that purpose. Indeed, it collects several observations (e.g. temperature, wind, chemical composition of the air and so on), caught at different sites and over different periods, and aims towards the definition of a complete 3D Global Climate Model (GCM) for Mars; further details on such models can be found in the cited references. A complete predictive model of Martian atmospheric behavior is still far from complete, but some good approximations can be achieved through a simplified version of Earth's weather models. In particular, and without loss of generality, a simplified version of the equations described in the literature has been considered throughout our experiments, where the simplification comes from the distinctive features of the Martian atmosphere, such as extreme rarefaction, the (almost complete) absence of water vapor and heat exchange, lower gravity and so on. Technically speaking, they are Navier-Stokes equations describing the 3D wind directions and the changes in pressure and temperature. Since our interest is in describing the weather situation at a given interval of time and over a limited area of the planet (typically a landing site), they are used to define a Local Area Model whose input data come from the MCD itself. In other words, the goal is to adapt global models to a smaller (meso) scale for which both precision and accuracy can be guaranteed, at most for short-term forecasts. However, care must be taken in initializing the data, because even small errors can have a much larger impact over such reduced areas.
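
As a toy illustration of what one meso-scale step involves, the sketch below advects a temperature field with a prescribed wind using an upwind finite-difference scheme; it stands in for only one term of the full equations, and the grid, wind and time step are illustrative:

import numpy as np

def advect_temperature(T, u, v, dx_m, dt_s):
    """One explicit upwind step of 2D temperature advection by a wind field.

    A deliberately tiny stand-in for the meso-scale dynamics: the real model
    also integrates pressure, density and vertical motion."""
    dTdx = (T - np.roll(T, 1, axis=1)) / dx_m  # backward differences (upwind for u, v > 0)
    dTdy = (T - np.roll(T, 1, axis=0)) / dx_m
    return T - dt_s * (u * dTdx + v * dTdy)

T = np.full((32, 32), 210.0)  # K, uniform Martian-like temperature field
T[12:20, 12:20] = 230.0       # a warm patch to be carried by the wind
u = np.full_like(T, 8.0)      # m/s eastward wind (e.g., initialized from MCD entries)
v = np.full_like(T, 2.0)      # m/s northward wind
T = advect_temperature(T, u, v, dx_m=1000.0, dt_s=50.0)
print(T.max(), T.min())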

First results on the Pathfinder site showed a good approximation in describing the wind activity, compared against different MCD entries. Visualizing them in a 3D environment (see Figure) therefore represents a first step towards the full definition and integration of a Martian weather 'forecast' predictor. When this result is achieved robustly, mission definition will gain another powerful tool to ensure reliability and safety.

Figure: Representation of winds in the Martian atmosphere at several time intervals. Arrows point in the blowing directions, while their size encodes wind strength.

The goal of this section is to show how the virtual reality paradigm can be adopted for real applications in the space industry domain. The case studies described in the following represent only a small part of the most innovative activities undertaken at TAS-I. Nevertheless, they are truly representative of how flexible and effective VR simulations are for several challenging practical problems.

[h]Rover driving

This is maybe the best example to explain the tight collaboration among several scientific disciplines when several kinds of data need to be represented in a single visualization application. Indeed, it comprises contributions from: astronomy and geology (high-resolution planet surfaces and rock modeling); physics (handling the behavior of objects according to specific environmental conditions); technical engineering disciplines (setting up the 3D rover model as a logical set of layers and sub-systems, considering for each of them its working functionality both stand-alone and in collaboration with all the others); ergonomics (understanding astronauts' requirements for a comfortable and safe life on board, and therefore designing suitable tools); human-computer interaction (designing interfaces that help the crew understand the surrounding environment and take actions accordingly).

Figures 8 to 13 show many of the features mentioned above. We present two different scenarios: on Mars (Figures 8-10) and on the Moon (Figures 11-13). In the former case, we reconstructed an area of approximately 1 km2 containing Victoria Crater, an impact crater located at 2.05°S, 5.50°W and about 730 meters wide. In the latter case, our attention is paid to Linné Crater in Mare Serenitatis, at 27.7°N 11.8°E. The goal is to drive a (prototype of a) pressurized rover, that is, an exploratory machine with a cabin for a human crew, onto those surfaces, avoiding both falling into pits and crashing against natural obstacles (mainly massive rocks, such as those depicted in Figures 8 and 9). The task is made more difficult by the presence of huge clouds of dust which, depending on the specific planetary conditions, are usually thicker and broader and take more time than on Earth to dissolve completely. Since visibility can be extremely reduced in those situations, being able to rely on secure instrumentation, prior knowledge of the terrain to be explored and accurate training sessions is essential, because any error could have devastating consequences for crew and equipment. Therefore, astronauts should be able to fully understand all the risks, the policies to avoid them and how to approach every step in such missions. In this context, a VR simulation offers a reliable tool to safely undertake such training. To help the crew perform their duty, a suitable, basic interface has been built. It stands on the rightmost side of the screen, where a double panel is shown. In the first panel, at the top right corner, parameters such as roll, pitch and yaw angles, battery level, speed, acceleration and outside temperature are mapped onto a deformable hexagon, to keep them always under control. Their values are continuously updated during the simulation to immediately reflect the current situation. If all of them are kept under a pre-defined safety threshold, the whole hexagon is green. When an alert occurs, the respective parameter turns red: in this case, the crew should take appropriate countermeasures to face the danger (for instance, by reducing the rover speed). In the second control panel, a small bird's-eye-view map of the surroundings is depicted. On this map, small red circles represent potential hazards, such as huge rocks. When the rover reduces its safety distance too much (that is, it could collide with a rock), a red alert appears, so that a corrective manoeuvre can be undertaken in time. To help the drivers, a blue cylinder is also projected in front of the vehicle: it points out where the rover will be after a configurable, small amount of time (e.g., 20 seconds) if no change of course occurs. The driving commands are given through a suitable interface that aims to reproduce the corresponding controls to be mounted on the rover (e.g. control sticks, levers, steering wheel and so on). They could be haptic interfaces (with or without force feedback) or, as in our case, Wii-motes. The direction as well as the intensity of the applied force is shown by a couple of green arrows.
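
The threshold logic driving the green/red hexagon can be sketched as follows; parameter names and limit values are illustrative, not the actual TAS-I settings:

# Illustrative limits: (min_allowed, max_allowed) per telemetry parameter.
LIMITS = {"pitch_deg": (-15.0, 15.0), "speed_ms": (0.0, 4.0), "battery_pct": (20.0, 100.0)}

def panel_colors(telemetry):
    """Return 'green'/'red' per parameter, mirroring the hexagon alert logic."""
    return {name: "green" if LIMITS[name][0] <= value <= LIMITS[name][1] else "red"
            for name, value in telemetry.items()}

print(panel_colors({"pitch_deg": 18.0, "speed_ms": 3.2, "battery_pct": 45.0}))
# -> {'pitch_deg': 'red', 'speed_ms': 'green', 'battery_pct': 'green'}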

Figure: The interface for rover driving in the Mars scenario. When accidents are likely to occur, a red hazard alert is shown to the driver (picture below).

Figure: Danger: the rover is falling into the crater and crashing against a rock. Another hazardous manoeuvre: the rover is rolling over after going around a bend.

Figure: The Martian dust. Since Mars is a rocky and sandy planet, ejecting dust is very likely to happen. The thickness and density of dust clouds depend on several factors, including the speed the rover is traveling at. The presence of dust can be a problem for safe driving, by building up on solar panels, and through unpredictable effects when intruding into exposed mechanical parts.

Figure: The Lunar scenario. A close look at the surface of the lunar Linné Crater. The surface reproduced here is only a small portion of the whole lunar soil: indeed, lunar DEMs do not yet cover the whole surface of the terrestrial satellite.

Figure: A possible look for a scouting expedition rover, with both external and inner views. The ergonomics of designing machines and tools to explore planets is a crucial aspect of the whole scientific mission set-up, besides its operational functionality. Among the main requirements we can cite comfort, habitability and safety.

Figure: Dust behaviour modeling: in a physical test environment (leftmost image) and on the Moon, where dust is lifted up by the rover transit. Dust emitters are positioned on the rover's wheels. In principle, only a small number of dust particles are modeled; the dust cloud is then rendered for realistic simulations by adding visual effects and simulating the presence of more particles around the original ones. All the equations used to simulate particle trajectories have been taken from the cited literature. Similar works for Mars are still missing to the best of our knowledge; therefore, we adapted the lunar equations to match the Martian environment.

[h]Planet landing

Another essential task (and another typical example where cooperation among disciplines is strictly essential) is bringing all the machinery required for the scientific mission onto the extra-terrestrial surface. This operation is usually performed by a lander, which can be thought of as a composition of at least three distinct parts: the capsule, the propulsion system, and the anchoring units. The first module carries all the machinery to settle on the ground; the second is used during both take-off and landing and aims at balancing loads and thrusts and avoiding sharp and compromising movements; the last is the first to touch the soil and has to soften the landing and provide stability. This kind of operation is really delicate, because in case of failure the equipment is very likely to be lost, damaged or subject to malfunctions. To avoid such a possibility, care in choosing the landing site is mandatory: scientifically interesting sites can be landing targets only if flat terrain, almost rock-free and without any other obstacle, is present in the surroundings. Therefore, accurate research should be performed prior to the implementation of the mission itself. During the VR tests, different landing sites can be tried until the most appropriate one is found (see the first two pictures in the Figure). Those trials are suitable for a couple of other things. First of all, testing the endurance, impact absorption, breaking and tensile strength and other mechanical properties of the lander legs: in this case, series of physical simulations should be set up to test changes in parameters and find the right combination that guarantees maximum safety in real operative environments (see the last picture in the Figure). Then, since dust clouds are a major challenge, blind landing should be taken into account. In this case, both automatic and manual landing operations have to deal with complementary sensors (e.g. sonar and radar) integrating previous knowledge of the targeted site. Here VR simulations can help scientists find the best descent plan according to the assumed hypotheses and the real operative situations, which can be surprisingly different from the former. Therefore, plan corrections should be undertaken to face problems such as malfunctions, higher speeds, errors in measuring heights, winds (on Mars) and other unpredictable events.

Figure: Preparing a landing mission. From left to right: a scale model of the targeted terrain; its 3D elaboration (from a scanned cloud of points) for VR applications; physical tests on the landing legs. The legs are composite models with many joints connecting all the parts, whose mechanical properties are the subject of several research efforts at TAS-I. The red and green arrows display the strength and direction of the applied forces.

[h]Visualizing radiations

Scientific visualization is an interdisciplinary field whose objective is to graphically represent scientific data so that scientists can understand them and gain more detailed insight into them. It usually deals with 3D structures and phenomena coming from several science branches such as astronomy, architecture, biology, chemistry, medicine, meteorology and so forth. Computer graphics plays a central role because of its techniques for rendering complex objects and their features (among others: volumes, surfaces, materials and illumination sources) and for dealing with their evolution in time. Visualization is essential for managing complex systems and for displaying events that are invisible (i.e., they cannot be perceived because they occur at micro or even smaller scales, or outside the optical frequency band). In those cases, visual metaphors should be used to show such phenomena and thus keep the audience aware of their existence, effects and consequences. This approach has been successfully applied to projects investigating how radiation will affect human health and electronic components during space missions. In particular, we focused on representing the Van Allen radiation belt surrounding the Earth. This area is located in the inner region of the magnetosphere and is mainly composed of energetic charged particles coming from cosmic rays and the solar wind. The purpose of this study is to show how radiation spreads and accumulates on and all around the whole spaceship volume, given the significant time spaceships spend in orbit. This way, it is possible to design suitable countermeasures to shield against all the potential risks. As shown in the Figure, the belt has been represented as a ball of threads enveloping the Earth, getting thicker and thicker as time flows and spaceships orbit our planet. At the same time, a color scale gives the observer a feeling of danger, ranging from cold colors (low risk) to warm colors (highest damage) (Figure).
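
The cold-to-warm scale can be sketched as a simple linear blue-to-red interpolation over the dose range; the actual visualization may well use a richer palette:

def dose_to_color(dose, dose_min, dose_max):
    """Map a radiation dose onto a cold-to-warm (blue-to-red) color scale."""
    t = max(0.0, min(1.0, (dose - dose_min) / (dose_max - dose_min)))
    return (t, 0.0, 1.0 - t)  # RGB in [0, 1]: blue = low risk, red = high damage

for d in (0.0, 0.5, 1.0):
    print(d, dose_to_color(d, 0.0, 1.0))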
Figure: Radiation hitting a spaceship as it orbits the Earth (first two images: cumulative amount of electrons; last two images: protons).

Figure: Other representations of the Van Allen belt, showing the integral and differential radiation doses. As they exceed tolerance limits, an alert message is shown in red.

[h]Cargo accommodation

The International Space Station (ISS) is the farthest outpost of the human presence in space and can be thought of as a habitable satellite. Since 1999, its pressurized modules have hosted astronauts whose main goal is to conduct experiments in several fields by exploiting its micro-gravity and space environment research facilities. Shuttle services have provided over the years a continuous turnover of astronauts as well as supplies, vital items and scientific equipment. However, carrying provisions and other material back and forth is far from a simple task, at least in its design phase. Indeed, the most difficult challenge is how to fit the greatest number of items into a cargo vehicle so that time, money and fuel are saved while providing the best possible service. In other words, it means facing the well-known knapsack problem on a larger scale. The CAST (Cargo Accommodation Support Tool) program was established to work out that problem by optimizing the loading of transportation vectors such as Columbus and the ATV (Automated Transfer Vehicle). Practically speaking, it has to find the optimal placement of items (usually bags) into racks; the main focus is on properly balancing the load. This means finding the best center of mass position for each rack in the vector, such that resource waste is minimal, no safety issues occur and the smallest possible number of journeys is needed. The balancing problem can be solved algorithmically through an interactive, multi-stage process, where problems such as item-rack correlation, rack configuration and item, rack and cargo accommodation have to be addressed. The result is a series of 3D points whose final configuration corresponds to how bags have to be stored in the racks according to the given constraints. A visual representation of these is particularly useful, as it can serve as a practical guide to help people during the load/unload phases. In order to allow users to test several configurations at run-time and analyze how they affect the final cargo accommodation, direct interaction has been guaranteed through Wii-motes, data gloves and force-feedback haptic devices. Moreover, in order to guarantee the best possible simulation, physical constraints have been added too. Thus, the ease of picking and moving objects is affected by object masses and weights; collision detection among bags and racks limits movements when changing object positions and guarantees at the same time the consistency of the results (that is, impossible positions cannot occur).
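
A minimal sketch of the underlying balance check follows: it aggregates the centre of mass of the bags placed in a rack and verifies that it stays within a tolerance of a target position. The bag data, tolerance and target are illustrative, not CAST values:

def rack_center_of_mass(bags):
    """bags: list of (mass_kg, (x, y, z)) placements in the rack frame."""
    total = sum(m for m, _ in bags)
    com = tuple(sum(m * p[i] for m, p in bags) / total for i in range(3))
    return total, com

def is_balanced(bags, target, tol_m=0.05):
    """True if the aggregate centre of mass is within tol_m of the target on each axis."""
    _, com = rack_center_of_mass(bags)
    return all(abs(com[i] - target[i]) <= tol_m for i in range(3))

bags = [(12.0, (0.1, 0.2, 0.3)), (8.0, (-0.1, 0.1, 0.5)), (15.0, (0.0, -0.2, 0.4))]
total, com = rack_center_of_mass(bags)
print(total, com, is_balanced(bags, target=(0.0, 0.0, 0.4)))

In the interactive application, moving a bag re-runs exactly this kind of check, which is why the displayed center of mass updates after every change.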

Figure: Visualizing bag placement in the ATV cargo racks. From left to right: the bags inside their container as they arrive at the ISS; the ATV module (in transparency) after docking with the ISS; a close look at the bags to see the photo-realistic textures describing them. Such an application could be used as a guide during load/unload operations and to recognize single bags.

Figure: A schematic view of bags and racks to solve the balancing problem in a graphical way. Moving a bag (in green) changes the center of mass position and therefore the balance optimization. Physical constraints can limit the bag movements. The center of mass position is graphically updated after every change.
[h]Understanding natural risks

Although TAS-I's experience in modeling 3D terrains is principally devoted to reconstructing extra-terrestrial soils, we can present here an example of an application involving Earth territories. The work comes from the Alcotra-Risknat (Natural Risks) project. Alcotra is a European Commission-approved program for cross-border cooperation between Italy and France. In the context of improving the quality of life of people and the sustainable development of economic systems along the Alpine frontier between the two countries, special care is given to strengthening public and technical services in the field of natural risk protection through a web-based platform. Among the objectives, we can mention the need to provide innovative technological strategies to manage territory policies efficiently; to develop environmental awareness based on sustainability and responsible resource-management paradigms; and to coordinate civil defense facilities and equipment in the cross-border areas. Given this context, our main contribution consisted of a 4D physically realistic simulation demo of a landslide that occurred at Bolard in the upper Susa Valley. Thanks to stereoscopic vision and 3D sound effects, we developed interactive and highly immersive scenarios for citizens' risk-awareness purposes. The demo consists of a 3D model simulating the physical propagation of debris and rock slides at a mountain site (see Figures 19 and 20). The simulation was built on real geological data coming from in situ measurements, given the local terrain morphology and orography at that time. Photos and videos of the period have been used both to reproduce the slide path along the affected mountainside and to reproduce the likely appearance (e.g., color, density, speed and so on) of the slide itself.

Figure: Scenario for the RiskNat simulation: a landslide in the Susa Valley (Piemonte, Italy). From left to right: the mountain rising above the village of Bolard; three different views of the mudslide. The cartographic data have a resolution of about 5 meters. The slide run has been modeled according to density, viscosity and speed parameters very close to the original ones.

Figure: Scenario for the RiskNat simulation: the village of Bolard. From left to right: the village before the arrival of the landslide; the flow of mud, rocks and debris at the village gate; all the buildings submerged just after the flood.

The COSE Center facility is an innovative, highly technologically equipped laboratory, currently involved in developing both VR and AR applications to support internal research at TAS-I. After being successfully used in several fields such as the entertainment industry, these technologies have been satisfactorily introduced into the management of complex production projects as well, with the aim of improving the quality of the whole engineering chain, from the collection and validation of requirements through to the final realization of the product itself. TAS-I's application of them to its products is twofold. First, as a new, integrating tool in all the decision-making phases of a project, supporting manual engineering tasks and other well-known instruments (e.g., CAD) and overcoming their limitations. Second, as a set of interactive simulation tools able to realistically reproduce hostile, extra-terrestrial environments and therefore to help the disciplines properly understand operational behavior under extreme conditions. The VR facilities can be considered a center of attraction for improving knowledge, technical skills and know-how. The COSE Center research activities have thereby reached several positive results in simplifying the team approach to complex products and projects. Among them, we can cite better interaction with customers and suppliers, and among multidisciplinary experts too, as well as improved effectiveness of evaluation and assessment by the program teams thanks to a tightly collaborative approach. The good results achieved thanks to the VR lab have been reached because the system structure and behavior are shown to the team in a more realistic way. Running several simulation sessions that stress virtual models under different conditions is a fast and economical way to collect data about product requirements, limitations and strong points. Practically speaking, the set of virtual tools adopted at TAS-I and the current research results have led, in some cases, engineering disciplines to rethink both their relationship to the system being implemented and the necessity to focus on new critical aspects that emerged during interactive sessions. In other cases, engineers decided to optimize their internal processes based on the results obtained through virtual tool analysis.

In the future, we aim to improve the capabilities of our VR facility in several research directions. First of all, by implementing new features and applications according to the needs of the engineering fields, and by allowing more natural interaction with them through specific devices (e.g., new tracking devices, touch-screen devices, improved AR interfaces and so on). Second, by involving a higher number of disciplines in order to achieve the most complete vision possible of the environment to be simulated. A complete simulator of hostile environments is still far from being implemented, but our efforts tend towards that end. This means that the physics engine features will be extended to encompass a wider range of dynamics to be reproduced. It also means that tighter cooperation with scientists is mandatory to enforce the realism of a simulation.

Preface

In the past few decades, the rapid advancement of technology has ushered in a new era of exploration and
innovation with the proliferation of unmanned systems across various domains – air, ground, sea, and
space. These unmanned systems, also known as unmanned vehicles or drones, have revolutionized the
way we perceive, interact with, and explore the world around us. From unmanned aerial vehicles (UAVs)
soaring through the skies to autonomous ground vehicles traversing rugged terrain, from unmanned
underwater vehicles (UUVs) diving into the depths of the oceans to robotic spacecraft venturing into the
vastness of space, these unmanned systems are reshaping industries, pushing the boundaries of human
knowledge, and transforming our understanding of the universe.

The introduction of unmanned systems has fundamentally altered the landscape of countless sectors,
including aerospace, defense, transportation, environmental monitoring, agriculture, and scientific
research. These systems offer a plethora of advantages, such as enhanced safety, increased efficiency,
reduced costs, and access to remote or hazardous environments. They enable tasks to be performed with
precision, autonomy, and adaptability, opening up new possibilities for exploration, discovery, and
innovation.

This book serves as a comprehensive introduction to unmanned systems across air, ground, sea, and space
domains, providing readers with an in-depth understanding of the principles, technologies, applications,
and challenges associated with these transformative technologies. Through a multidisciplinary approach,
this book explores the design, operation, and impact of unmanned systems, examining the underlying
engineering principles, sensing and perception capabilities, autonomy and control systems, and real-world
applications.

Each chapter delves into a specific domain of unmanned systems, offering insights into the unique
characteristics, challenges, and opportunities within air, ground, sea, and space environments. From the
aerodynamics of UAVs to the navigation algorithms of autonomous vehicles, from the hydrodynamics of
UUVs to the propulsion systems of spacecraft, readers will gain a comprehensive understanding of the
intricacies and complexities of unmanned systems across diverse domains.

By delving into the fascinating world of unmanned systems, this book aims to inspire curiosity, spark
innovation, and cultivate a deeper appreciation for the transformative potential of these remarkable
technologies. Whether you are a student, researcher, engineer, or enthusiast, this book provides a
comprehensive guide to the exciting field of unmanned systems, offering insights into the past, present,
and future of exploration and innovation in air, ground, sea, and space.

About the book

In recent years, rapid technological advancements have ushered in a new era of exploration and
innovation marked by the proliferation of unmanned systems across air, ground, sea, and space
domains. Unmanned aerial vehicles (UAVs), autonomous ground vehicles, unmanned underwater
vehicles (UUVs), and robotic spacecraft are revolutionizing industries and expanding human
capabilities in aerospace, defense, transportation, and scientific research. This book serves as a
comprehensive introduction to unmanned systems, offering insights into their principles, technologies,
applications, and challenges. Each chapter delves into specific domains, exploring the engineering
principles, sensing capabilities, autonomy, and real-world applications of unmanned systems. By
examining the aerodynamics of UAVs, navigation algorithms of autonomous vehicles, hydrodynamics
of UUVs, and propulsion systems of spacecraft, readers gain a deeper understanding of these
transformative technologies. Whether readers are students, researchers, engineers, or enthusiasts, this
book aims to inspire curiosity and foster appreciation for the remarkable potential of unmanned
systems in shaping the future of exploration and innovation across air, ground, sea, and space
environments.
