
SPECIAL REPORT:

UNMANNED SYSTEMS
MARCH 2020

CONTENTS

FEATURES
2 Artificial Intelligence and Autonomous Vehicles
6 Underwater Drone Technology
10 The CAN Bus: Driving the Future of Autonomous Military Vehicles
14 Autonomous "Wingman" Vehicles
18 Microturbine Propulsion for UAVs
20 Eyes in the Sky

APPLICATION BRIEFS
22 Parcel Delivery Drone
23 Robotic Helicopter

TECH BRIEFS
24 Aircraft Vertical Takeoff and Landing
24 Neural Lander Uses AI to Land Drones Smoothly
25 Lightweight Sensing and Control System for Unmanned Aerial Vehicle Monitoring
25 Flying Robot Mimics Rapid Insect Flight
26 Software Enables Robot to Self-Right After Overturning

ON THE COVER
Seventy years ago, military aviation moved from reciprocating engines to vastly more reliable turbojets and turboprops. The same cannot be said for UAVs. The industry has been slow to innovate and develop turbine propulsion systems for small aircraft because it is far more difficult to produce high-performance small turbines than large ones. But that may be about to change. See the article on page 18.
(Photo by sibsky2016/Shutterstock.com)

UNMANNED SYSTEMS SPECIAL REPORT MARCH 2020 1


ARTIFICIAL
INTELLIGENCE AND
AUTONOMOUS VEHICLES
The use of artificial intelligence (AI) based machine learning technologies in autonomous vehicles is on the rise. Helping to drive this trend is the availability of a new class of embedded AI processors. A good example is NVIDIA's Jetson family, which includes small form factor system on modules (SoMs) that provide GPU-accelerated parallel processing. These high-performance, low-power devices are designed to support the deep learning and computer vision capabilities needed to build software-defined autonomous machines. They derive massive computing capabilities from the use of a parallel processing GPU device with many cores, enabling next-gen computing devices to take on many of the tasks that were historically handled by humans or multiple traditional computers.

How AI provides navigation and obstacle avoidance for autonomous vehicles on land, air, and sea gets a lot of attention, but machine learning is also being used in other ways on unmanned vehicles. These machine learning applications are usually related to the types of sensors that are onboard a particular platform. For example, a number of our customers are using AI engines in unmanned situational awareness airborne applications, taking and processing data from different types of sensors, such as cameras, radars, and LiDARs, which are used for surveillance and detection, and then reporting that data back to operators.

The ability of unmanned aircraft to monitor areas of interest, staying up in the air for long periods of time, makes them a compelling alternative to the use of human crews, especially in battlefield situations. In one application, a homeland security agency is protecting critical national energy infrastructure by using drones to monitor oil pipelines. In addition to surveillance applications, AI engines are now being used on unmanned air vehicles for in-vehicle health monitoring. Sensors onboard the platform are monitored by a predictive maintenance data hub that uses AI to monitor the health of various onboard subsystems, which helps to schedule maintenance and other updates.

The available market for military use of AI engines is relatively small compared to the industrial applications for which most of these devices were originally intended. The larger opportunities for vendors of GPU-based AI engines are likely found in places like factory floors and production lines. That said, these devices are increasingly used in the commercial vehicle market onboard autonomous test cars. For military applications, the use of these devices on unmanned platforms deployed in harsh environmental conditions requires expertise in packaging and ruggedization to ensure that devices built for industrial use can perform optimally when deployed in the battlefield.



NVIDIA's Jetson TX2i small form factor system on module (SoM).

The NVIDIA Jetson TX2i module provides a great example of how an industrial AI engine can be adapted for use onboard military platforms, most of which are especially sensitive to any additional size, weight, and power (SWaP) burden. Another key issue for defense applications is long lifecycle support, since commercial graphics processors are notorious for short product lifecycles. The good news for military system designers is that SoMs like the Jetson offer some of the best ratios of FLOPS of computing performance per watt of power consumed. The TX2i, specifically, combines a high-performance 6-core Arm processor with a powerful 256-core NVIDIA GPU that is compatible with NVIDIA's CUDA development libraries and software applications. Optimized to reduce SWaP, this device is packaged in a small form factor pre-integrated with processor, graphics, RAM, Flash storage, and I/O. With its support for extended operating temperatures (all of its components are rated to the full industrial temperature range of -40°C to +85°C or beyond), the TX2i can be effectively cooled using straightforward conduction cooling, eliminating the need for complex cooling alternatives or fans that have undesirable moving parts.

For enhanced reliability, the module uses RAM with Error Correcting Code (ECC) support, which is particularly useful to mitigate a potential single event upset (SEU) at high altitude due to solar radiation. Even better, NVIDIA supports the industrial version of this Jetson device with a ten-year lifecycle, which matches the long program requirements typical of military systems. Compare that to the five-year lifecycle of other embedded Jetson devices and an even shorter two-year lifecycle for consumer-grade graphics devices.

To ease and speed application software development, the TX2i is supported by NVIDIA's JetPack Software Development Kit (SDK), which includes a large number of free CUDA developer tools and comprehensive libraries for building AI applications. TensorRT, a deep learning inference runtime, and the CUDA development environment are just a couple of the free tools bundled in JetPack. If the system integrator is already familiar with the NVIDIA Jetson development environment, they will be comfortable with this technology, and because the software development environment is portable across the entire Jetson family ecosystem, costs are reduced through a build-once, deploy-multiple-times approach.

NVIDIA also offers lab-grade developer kits at extremely low cost. Designed to jumpstart application development, these boards use standard commercial connectors and enable customers to get their program started with very little investment in the lab before porting it over to a rugged platform when they are ready to take the next step toward deployment.

An example of a rugged mission computer ideal for use in deployed machine learning applications is Curtiss-Wright's DuraCOR 312, a miniature system (5.2" x 5.4" x 2.0") based on the industrial Jetson TX2i SoM. The DuraCOR 312 can be used in a wide range of AI-based applications and is especially useful for computer vision. For example, the mission computer can be deployed on a small drone and flown around a platform of interest to inspect for damage or any visible anomalies. At sea, the system can be used to locate and identify surface vessels. In the battlefield, a DuraCOR 312 can be launched on a small unmanned aircraft to provide object detection and visual identification, providing real-time actionable intelligence on enemy troop strength and deployment without needing to request and wait for air support, such as a large helicopter, to provide surveillance.

In the case of visual detection, the video captured by the unmanned aircraft's camera is sent to the GPU, which runs a machine learning inference, basically scanning the




Curtiss-Wright's DuraCOR 312 miniature system based on NVIDIA's industrial Jetson TX2i SoM.

images to identify, as it's been trained to do, any objects of interest, such as a soldier or tank, and it will tag the detected objects with time and location data. The inference code, which is stored on the DuraCOR 312, was developed earlier on a larger system using an algorithm of choice. Images associated with the objects that you want to detect are formatted appropriately and then used to train the system to recognize them. After the system is trained satisfactorily to the desired percentage of accuracy, the machine-learning model, written using a deep learning framework such as Caffe or TensorFlow, is then run through TensorRT, which helps to compress the software code for use on small systems deployed at the edge of the battlefield network. The inference code, once downloaded onto the DuraCOR 312, can then be deployed on a drone and begin to process incoming images to identify those objects it was trained to recognize. The results can then be downlinked to the warfighter in real time or stored on the drone until it returns for offline post-mission analysis. How much storage is required on the aircraft will be determined by the distance and duration of the mission and the amount of imagery captured.

When deployed in battlefield environments, cooling the TX2i's power-efficient ARMv8 processor cores and CUDA GPU is aided by an aluminum heat spreader on top of the module that makes contact with all of the hot components. The heat spreader attaches directly to the DuraCOR 312's system chassis using phase change material, with the metal of the chassis acting as an extension of the heatsink. To further adapt the TX2i for use in rugged military conditions, Curtiss-Wright submits all system components to additional ruggedization. The DuraCOR 312 also uses miniature versions of traditional rugged circular military connectors that, when mated, provide full sealing against dust and water. In fact, the chassis can be fully immersed and will experience no water ingress at all. What's more, all the circuit boards are covered with a conformal coating to protect against humidity and corrosion. Designed for SWaP optimization, this compact AI-engine-based mission computer weighs less than 2.0 lbs. and requires less than 25W of power.

The DuraCOR 312 is fully compliant with the extremely demanding MIL-STD-810G, MIL-STD-461F, MIL-STD-1275D, MIL-STD-704F, and RTCA/DO-160G environmental, power, and EMI requirements, including high altitude, wide temperature, humidity, extreme shock and vibration, and noisy electrical environments. The unit also provides an aerospace-grade power supply in a fanless IP67-rated mechanical package that handles harsh shock and vibration and operates over extended temperatures without requiring a cold plate or airflow.

Through our experience collaborating with companies like Cisco Systems to deliver rugged network switch solutions for challenging embedded applications, we've learned the benefits, both for the customer and for the COTS vendor, that come from partnering with a technology pacesetter. Now, by leveraging NVIDIA's best-in-class industrial AI engines and packaging them for the military COTS user, we can bring proven deep learning technologies, supported by a rich and familiar development environment, that are viable for the unique requirements of unmanned military platforms. This approach reduces program risk and provides system designers with the shortest and easiest learning curve to shorten time to deployment.

This article was written by Mike Southworth, senior product manager, Curtiss-Wright Defense Solutions (Salt Lake City, UT). For more information, visit http://info.hotims.com/72998-502.
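The tag-and-store workflow described above can be sketched in Python. This is an illustrative toy, not Curtiss-Wright's software: the detection list, location, and storage-sizing numbers are all hypothetical stand-ins for a real TensorRT inference engine and actual mission parameters.

```python
import datetime

def tag_detections(detections, when, location):
    """Attach time and location metadata to each detected object,
    mirroring the tagging step described in the article."""
    return [
        {"label": label, "confidence": conf,
         "time": when.isoformat(), "location": location}
        for label, conf in detections
    ]

def onboard_storage_bytes(mission_hours, frames_per_second, bytes_per_frame):
    """Rough sizing of onboard storage for offline post-mission analysis:
    total imagery = mission duration x capture rate x frame size."""
    return mission_hours * 3600 * frames_per_second * bytes_per_frame

# Hypothetical detector output: (label, confidence) pairs from the GPU inference.
detections = [("tank", 0.91), ("soldier", 0.84)]
records = tag_detections(
    detections,
    when=datetime.datetime(2020, 3, 1, 12, 0, 0),
    location=(44.67, -84.73),  # illustrative lat/lon, not from the article
)

# A notional 2-hour mission at 5 fps with ~200 KB compressed frames:
storage = onboard_storage_bytes(2, 5, 200_000)  # 7.2 GB
```

The same arithmetic works in reverse: given a fixed storage budget, it bounds how long the drone can record before results must be downlinked in real time instead.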



Underwater Drone
Technology

Finding ways to overcome physical limitations so that humans can dive deeper and stay underwater longer has been an ongoing quest. Back in the 15th century, Leonardo da Vinci drew sketches of a submarine and a robot. Had he thought to combine the two concepts, he would have created a prototype of an unmanned underwater vehicle, or underwater drone. Instead, the world had to wait another 500 years.

In the 1950s, in response to sea mines that would go on to destroy or damage more naval ships than all other types of attacks, the U.S. Navy began developing an unmanned underwater vehicle (UUV) that could dive 10,000 feet (rated depth) and function for four straight hours. In the succeeding decades, the Navy developed UUVs that could retrieve lost equipment such as torpedoes or a missing atomic bomb, save submarine crews lost at sea, or locate shipwrecks, including the Titanic. At the beginning of the 21st century, UUVs increasingly became a faster, cheaper, and safer alternative to sending humans and trained animals to defuse sea mines. For example, it takes a team of human divers 21 days to clear sea mines in one square mile, but a UUV such as the Hydroid REMUS can complete the task in only 16 hours.

How Do Unmanned Underwater Vehicles (UUVs) Work?
Unmanned underwater vehicles consist of remotely operated vehicles (ROVs) and autonomous underwater vehicles (AUVs).

Remotely Operated Vehicles (ROVs)
ROVs were developed first and are controlled by humans from a distance. Their sizes vary. All ROVs include built-in sensors, a tether, a tether management system, a flotation pack, and a thruster. Most ROVs have sensors for video cameras and lights. They may also have sonars, magnetometers, water samplers, and instruments that measure water clarity, water temperature, water density, and sound velocity. An ROV might also include a 3D camera to improve the pilot's perception of the underwater scenario.

Linked to a host ship by an umbilical tether that houses energy cables and communication cables, ROVs use a tether management system (TMS). The TMS makes it possible to adjust the length of the tether to minimize the underwater effect of cable drag. By employing a flotation pack and a thruster, heavy-duty ROVs can achieve precise motion control even in high-current waters.
Hydroid REMUS 6000.



A-size autonomous underwater drone.

Table 1: Comparison of underwater drones (not all models are included).
• General Dynamics (Bluefin) Bluefin-21 — Type: AUV; weight: 1,650 lbs.; rated depth: 15,000 ft.; power source: lithium battery, rechargeable; estimated endurance: 25 hours at 3 knots; navigation: inertial navigation; applications: research, military.
• Boeing Echo Voyager — Type: AUV; weight: 100,000 lbs.; rated depth: 11,000 ft.; power source: rechargeable battery and marine diesel generator; estimated endurance: 3 months, or 280 km; navigation: inertial navigation, Doppler velocity log, depth sensors; applications: oil and gas, military.
• Lockheed Martin Marlin — Type: AUV; weight: 3,500 lbs.; rated depth: 12,000 ft.; power source: battery, rechargeable; estimated endurance: 60 hours over 14 days, 100 km; navigation: Doppler-aided inertial navigation, USBL, terrain navigation; applications: oil and gas, military.
• Teledyne Gavia — Type: AUV; weight: 105 lbs.; rated depth: 3,000 ft.; power source: battery, rechargeable; estimated endurance: 7 hours; navigation: inertial navigation, Doppler velocity meter, USBL; applications: research, military.
• Saab Sabertooth — Type: AUV/ROV hybrid; weight: 1,400 lbs.; rated depth: 3,600 ft.; power source: battery, rechargeable; estimated endurance: 8 hours; navigation: IMU/Doppler and terrain navigation; applications: oil and gas, military.
• Kongsberg HUGIN 1000 — Type: AUV; weight: 1,400 lbs.; rated depth: 9,000 ft.; power source: battery, rechargeable; estimated endurance: 100 hours at 4 knots; navigation: inertial navigation, micro-navigation; applications: research, oil and gas, military.
• Hydroid REMUS 600 — Type: AUV; weight: 500 to 850 lbs.; rated depth: 1,800 ft.; power source: lithium battery, rechargeable; estimated endurance: up to 24 hours; navigation: inertial navigation, acoustic Doppler, sonar; applications: research, military.
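Endurance figures like those in Table 1 translate directly into mission range, since distance equals speed multiplied by time and one knot is one nautical mile per hour. A quick Python illustration using the Bluefin-21's quoted 25 hours at 3 knots (the calculation is mine, not from the table):

```python
NM_TO_KM = 1.852  # one nautical mile in kilometers

def range_nm(endurance_hours, speed_knots):
    """Mission range from endurance and cruise speed (1 knot = 1 nm/h)."""
    return endurance_hours * speed_knots

# Bluefin-21, per Table 1: 25 hours of endurance at 3 knots.
r_nm = range_nm(25, 3)   # 75 nautical miles
r_km = r_nm * NM_TO_KM   # about 139 km
```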

Smaller ROVs are used mainly for observation and inspection in the research, military, or recreational sectors because they can only withstand shallow deployment (around 1,000 feet or 300 meters) and have limited horsepower. On the other hand, the largest and most heavy-duty ROVs are suitable for the oil and gas industry because they can dive as deep as 20,000 feet and perform multiple tasks like lifting, drilling, construction, and pipeline inspection.

Autonomous Underwater Vehicles (AUVs)
In recent years, military applications have taken the next step and begun using AUVs, which, unlike ROVs, operate independently of human control and do not have to be tethered to a larger vessel. Having exchanged the tether for more mobility and freedom, AUVs must face the problems of communicating with the ships, navigating on their own, and powering themselves over a long period.

Communication: Water distorts signals, so the usual means of communication, including GPS, mobile (3G, 4G), Wi-Fi, radar signals, and optical navigation such as LiDAR, will all be ineffective for AUVs. While useful, very low frequency (VLF) and extremely low frequency (ELF) signals can only enable one-way communication from the ship to the drone, since VLF or ELF transmitters are too bulky to mount on the drone. Also, VLF and ELF have long wavelengths and thus extremely low bandwidth (1-2 kbps) that restricts the rate of data transfer. Acoustics may therefore be the most suitable communication solution for AUVs, but challenges include signal diffusion, relatively low bandwidth, and power consumption. At 20 kbps, the bandwidth offered by acoustic modems is still insufficient for real-time video, meaning that the human operator on the ship will only see delayed images. The use of high-definition cameras may supplement the collection of information.

Navigation: Signal distortion by water also hampers how well AUVs orient themselves. Therefore, most AUVs use inertial navigation. Inertial navigation supplements dead reckoning



calculation of the drone's current position, based on the previously determined position and known or estimated speed over time and course, with motion sensor (accelerometer) and rotation sensor (gyroscope) data. Also, AUVs use a Doppler velocity meter, which acoustically estimates the drone's velocity relative to the sea bottom, and ultra-short baseline (USBL) positioning, which works by emitting a regular acoustic signal from a surface reference point, such as a support ship's underwater beacon.

Power: AUVs must run reliably and safely for days, even weeks, without human intervention. Therefore, they need more efficient ways to move around or consume energy. For example, by converting buoyant, vertical motion to horizontal motion with its wings, an underwater glider propels itself forward with little energy. While most AUVs use rechargeable lithium batteries, some of the larger AUVs use aluminum-based semi-fuel cells that are higher-maintenance.

Independence: Researchers are also trying to make AUVs more independent by developing machine intelligence solutions that enable AUVs to navigate complex situations and solve problems by themselves.

Applications
With stronger processing power and a bigger power supply, AUVs can take on more challenging projects, such as locating the sunken Argentine navy submarine ARA San Juan in 2018 and recovering the black boxes from the crashed Air France Flight AF447. AUVs are also poised to stretch the range of applications they take on. Used for ship hull inspection, wreck inspection, nuclear reactor decontamination, exploration, mining, drilling, and wreck recovery, they have caught the attention of researchers investigating their capabilities for weather forecasting, hurricane hunting, and environmental conservation.

Among the major players in the AUV space (see Table 1), several have especially close ties with the U.S. military. The REMUS drones by Hydroid probably have the oldest and most recognizable brand, mainly due to high-profile operations involving REMUS drones, such as sea mine detection during Operation Iraqi Freedom. Also, Canada, Japan, and NATO countries are using REMUS AUVs in their militaries. Of the many types of REMUS drones, the REMUS 6000 is the biggest unit.

General Dynamics' Bluefin-21 is most famous for its involvement in the search for the wreckage of the missing Malaysia Airlines Flight 370 in 2014. The U.S. Navy had previously ordered several Bluefin-21 UUVs for its Black Pearl AUV program. Currently, the U.S. Navy is looking to improve the Bluefin-21, especially its two-way communication capability. Meanwhile, General Dynamics is developing the Knifefish, which is based on the Bluefin-21 and has improved capabilities in detecting mines and working with littoral combat ships.

Boeing's Echo Voyager is, by far, the biggest AUV around (hence dubbed "XLUUV"). At close to 100,000 lbs., it has the longest endurance and one of the deepest rated depths. Echo Voyager is also completely autonomous, as it does not have to be launched from, controlled by, or recovered to a support vessel. What's more, Boeing is building the Orca, which is based on Echo Voyager and promises more warfare capabilities than just mine removal, so that it can potentially substitute for the much more expensive littoral combat ships. Lastly, Boeing has been acquiring AUV start-ups to expand its portfolio.

Lockheed Martin is another contractor making extra-large drones, in this case the Marlins. Marlins also have extended endurance, range, power, and rated depth. In addition, the Marlin can generate real-time, 3D, geo-referenced models of surrounding environments from the data it collects. Lockheed Martin is also maintaining a presence in the small AUV space with its line of A-size drones that act as training targets for submarines, aircraft, and ship crews. Like Boeing, Lockheed Martin is also acquiring AUV start-ups as part of its expansion in the sector.

Challenges and Future Perspective
Manufacturers aim to continually improve the functionality, performance, eco-friendliness, and power efficiency of their UUVs. Saab has already launched an AUV/ROV hybrid that can switch between roles during a mission, and there will be more hybrid launches in the future. Also, the sector is turning to biomimicry to improve propulsion and maneuverability, as in the case of Festo's AquaJelly and EvoLogics' BOSS Manta Ray. Moreover, there is concern that USBL technology disrupts marine life and could interfere, for example, with the acoustic communications whales use when feeding, breeding, and breaching. Lastly, researchers are experimenting with combining different battery and power systems with supercapacitors, extracting energy from seawater, installing underwater power stations to refuel the UUV, and transferring power wirelessly.

Manufacturers are likely responding to the continual commitment from the government. Within the U.S. Department of Defense, the Navy receives the biggest budget to develop unmanned systems. The Navy plans to have its squadron of UUVs by 2020 and ultimately replace trained animals with Kongsberg/Hydroid's REMUS MK 18 Mod 2 Kingfish in sea mine removal operations. Meanwhile, navies in the UK, France, Russia, Japan, and China continue to use UUVs or develop related technologies to stay competitive. For example, China is working to incorporate artificial intelligence into drones, and Russia is developing nuclear-powered drones. Whether for the purposes of exploration, research, rescue, military, or defense, one thing is clear — advanced technologies used in underwater drones will continue to develop.

This article was written by John W. Koon, contributing editor, Aerospace & Defense Technology.
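The dead-reckoning navigation described earlier boils down to integrating heading and speed over time from a known starting fix. A minimal Python sketch follows; it is illustrative only, since a real AUV fuses gyro, accelerometer, and Doppler velocity data in a full inertial navigation filter, and the sample log here is invented.

```python
import math

def dead_reckon(start_xy, samples, dt):
    """Integrate (heading_deg, speed_m_s) samples over time step dt to
    estimate position: the dead-reckoning calculation an AUV falls back
    on between external fixes such as USBL updates."""
    x, y = start_xy
    for heading_deg, speed in samples:
        theta = math.radians(heading_deg)
        x += speed * math.sin(theta) * dt  # east component
        y += speed * math.cos(theta) * dt  # north component
    return x, y

# Hypothetical log: 10 s heading due east at 2 m/s, then 10 s due north.
samples = [(90.0, 2.0)] * 10 + [(0.0, 2.0)] * 10
pos = dead_reckon((0.0, 0.0), samples, dt=1.0)  # roughly (20.0, 20.0)
```

Because each step adds a little sensor error, the estimate drifts over time, which is exactly why the article pairs inertial navigation with Doppler velocity measurements and USBL position fixes.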



The CAN Bus:
DRIVING THE FUTURE OF AUTONOMOUS
MILITARY VEHICLES

It's a crisp November day in Michigan, and a convoy of British and American resupply vehicles is rumbling along at a comfortable 25 miles per hour. In the lead is a British Army Rheinmetall MAN Military Vehicles (RMMV) HX-60 truck, trailed closely by two U.S. Army Oshkosh Light Medium Tactical Vehicles (LMTVs). In total, there are zero humans operating this convoy.

This was the scene in November 2017, when the US-UK Coalition Assured Autonomous Resupply (CAAR) demonstration took place at Camp Grayling, Michigan — the result of three years of research and engineering to examine the use of unmanned ground vehicles (UGVs) in resupply situations to tackle so-called "last mile" issues regarding supplying troops on the front lines of active conflicts around the world.

CAAR is a collaboration among the United Kingdom's Defence Science and Technology Laboratory (Dstl); the U.S. Army's Tank Automotive Research, Development, and Engineering Center (TARDEC); and the U.S. Army Armament Research, Development, and Engineering Center (ARDEC). While this project had been in the works for three years, the U.S. military has been exploring autonomous vehicle technology in earnest since at least the 1980s. In the period since then, there have been huge leaps in autonomous vehicle technology, and the military has seen already dangerous and logistically difficult resupply issues multiplied in the era of improvised explosive devices (IEDs) and protracted conflicts in remote and desolate terrains.

The technologies powering these UGVs are cutting-edge. Many could hardly have been envisioned a decade ago, but the secret weapon of nearly all of these vehicles is a technology that has been standard in civilian vehicles for nearly 30 years: the Controller Area Network, or CAN bus.

CAN bus board.

What is a Controller Area Network (CAN)?
A controller area network (CAN) is a message-based protocol that allows internal systems to communicate with one another without a central computer. CAN technology is used in applications as wide-ranging as agriculture, robotics, industrial automation, and medical systems, but it is best known for its use in the automotive industry. In today's connected vehicles, the CAN bus facilitates communication



between microcontrollers (MCUs) along a larger vehicle bus, without the use of a central computer. For example, the cruise control system can quickly communicate with the anti-lock braking system to disengage when a quick stop is needed.

The more complex vehicles become, with ever-more interconnected MCUs needing to transfer information, the more important the reliability of the vehicle bus becomes. And with each model year bringing new cameras, sensors, and display screens, the efficiencies that CAN provides in the physical layer of a vehicle become more attractive. In the past, cars were limited in their features due to the finite amount of space for the physical cables and complex wiring required for each system to communicate. CAN allows for a leaner networked system that underlies not only the connected vehicles of today, but also the drive-by-wire functionality necessary for the autonomous vehicles of tomorrow.

How is CAN Being Used in Military and Defense Vehicles?
As previously mentioned, the CAN bus has been the communication standard for embedded systems in vehicles for decades, and even huge leaps in vehicle technology like electric and autonomous vehicles have continued to utilize the CAN bus due to its flexibility and reliability. These same features make CAN an ideal component for autonomous military and defense vehicles, including UGVs and unmanned aerial vehicles (UAVs), or drones.

Military Vehicles
Commercial autonomous vehicles must have highly attuned sensors when navigating city streets — able to sense changing road conditions, other vehicles, and pedestrians. Tactical military vehicles, on the other hand, must be prepared for off-road conditions in every kind of hostile environment. The obstacles are greater and the stakes are higher. That means a higher priority must be placed on sensors and algorithms that can calculate and make split-second decisions, and the need for near-instantaneous, error-free communication is critical. CAN enables all of these complex systems to communicate with the clarity and speed that are necessary when lives are on the line.

In addition to its functionality, CAN's inherent ruggedness is a clear draw: it performs just as consistently in extreme heat and cold as it does in arid, dusty climates and extremely wet conditions.

What Sensors Are Attached to the CAN Bus?
The technologies that enable autonomous driving can vary slightly, but they all require advanced vision and sensing equipment to "see" the road ahead, as well as high-powered software to make decisions based on that visual information. Most autonomous military vehicles would support some combination of the following sensors, among others:
• Light Detection and Ranging (LiDAR) technology, for creating a 3D map of the road ahead;
• Color cameras, for determining the changing position of the road and other obstacles in front of the vehicle;
• Infrared cameras, adding another layer of complexity to obstacle-sensing; and
• GPS, for navigation and creating a larger contextual map that the vehicle can reference.

The CAN bus allows for communication between embedded systems within a UAV, as well as the transfer of information between a UAV and the remote operator.

With autonomous vehicles, and especially autonomous tactical vehicles, the in-vehicle networks supporting the advanced vision and sensing technologies require a higher bandwidth connection like those provided by Ethernet or FlexRay. But these connections can combine with CAN or CAN FD (CAN with flexible data rate) to create a robust network that is flexible when performing tasks that require high data throughput, and quick and reliable when performing simpler communication tasks.

Many military vehicles use the CAN bus to log and transfer periodic operational data that are reviewed by maintenance personnel (or, more likely, computer algorithms) for predictive maintenance — in other words, analyzing operational data to look for potential vehicle maintenance issues so that they can be addressed before they become critical.
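The periodic operational data described above travels in CAN frames whose 8-byte payloads pack several signals. A minimal Python sketch using the standard struct module follows; the signal layout shown (coolant temperature, engine speed, battery voltage) is hypothetical, not a published military or SAE-J1939 mapping.

```python
import struct

# Hypothetical layout of one 8-byte CAN data field (big-endian):
#   uint16 coolant temperature in 0.1 degC, uint16 engine speed in rpm,
#   uint16 battery voltage in millivolts, uint16 reserved.
FRAME_FMT = ">HHHH"

def encode_health(temp_c, rpm, volts):
    """Pack health-monitoring signals into an 8-byte CAN data field."""
    return struct.pack(FRAME_FMT, int(temp_c * 10), rpm, int(volts * 1000), 0)

def decode_health(data):
    """Unpack the same 8-byte field back into engineering units."""
    t, rpm, mv, _ = struct.unpack(FRAME_FMT, data)
    return {"temp_c": t / 10, "rpm": rpm, "volts": mv / 1000}

payload = encode_health(87.5, 1800, 27.3)  # 8 bytes on the wire
signals = decode_health(payload)
```

A predictive-maintenance logger would decode frames like this one as they arrive and flag trends, such as a coolant temperature creeping upward over successive missions, before they become failures.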



[Figure: camera, LiDAR, GPS, ultrasonic sensors, radar, and infrared sensors feeding a data collection and processing unit.] This illustration represents an example layout of sensors on a military vehicle.

Drones
The use of the CAN bus isn't just limited to UGVs. Unmanned aircraft systems (UASs) have also adopted CAN technology for its low-latency, reliable communication capabilities. In fact, there's even a UAVCAN protocol designed for aerospace and robotic applications.

The CAN bus allows for communication between embedded systems within a UAV, as well as the transfer of information between a UAV and the remote operator. For instance, the CAN bus can allow the flight controller to manage the throttle on the electronic speed controller (ESC), but it also allows the ESC to return hard real-time data to the flight controller, including temperature, amperage, voltage, warning signals, etc., via live telemetry. The real-time data, transferred within microseconds, allows remote pilots to react immediately, making for much safer and more reliable UAV flight operations.

MilCAN
To account for the many issues specific to military vehicles, a working group of the International High Speed Data Bus-Users Group (IHSDB-UG) developed the MilCAN higher layer protocol in 1999, with the goal of creating a standard interface for utilizing the CAN bus in military vehicle development. There are two versions of MilCAN: MilCAN A and MilCAN B.

Widely used in armored vehicles, MilCAN A uses 29-bit identifiers and a frame format similar to SAE J1939. Designed with mission-critical use in mind, MilCAN A prioritizes message transmission and defines 1-Mbit/s, 500-kbit/s, and 250-kbit/s communication rates. MilCAN B is an extension of the CANopen application layer, using 11-bit identifiers and only periodically allowing data to be transmitted via the bus. MilCAN B supports data rates from 10 kbit/s to 1 Mbit/s. Both protocols were developed to specialize the use of CAN around deterministic data transfer, so the specifications can also be used for non-military applications.

The Future of CAN in Autonomous Military Vehicles
Don't expect robotic armies to be taking over the front lines any time soon, but with successful pilot programs like CAAR demonstrating the viability of heavy-duty, military-grade autonomous vehicles, there is hope for making extremely risky situations — like resupply missions — a little safer. And for the foreseeable future, the CAN bus will be along for that journey.

This article was written by Jesse Paliotto, director of marketing, Kvaser (Mission Viejo, CA). For more information, visit http://info.hotims.com/72993-501.



AUTONOMOUS "WINGMAN" VEHICLES
The Future of Military Unmanned Vehicle Technology

By 2025, the Army sees ground troops conducting foot patrols in urban terrain with robots, called Squad Multipurpose Equipment Transport vehicles, that carry rucksacks and other equipment alongside soldiers. Overhead, unmanned aircraft will also serve as spotters to warn troops so they can engage the enemy on their own terms, according to the Army's new strategy on robotic and autonomous systems.

The US Army's Futures Command is the most important administrative reorganization of the modern Army. Responding to the world's changing priorities — especially the "near peer" threat of ascendant Russia and China — the Army is no longer modernizing, but re-inventing its ground vehicle fleet against new realities. Just like the U.S. Air Force stopped inventing better jets and pilot aids and moved to unmanned aerial vehicles (UAVs) for "dull, dirty and dangerous" missions, the Army envisions multiple autonomous vehicle concepts. Instead of a heavier Abrams main battle tank, or a replacement to the aging M113 APC, autonomous "wingman" vehicles may replace some of the human-heavy tasks on the future battlefield.

The questions are many: What is to be the deployment doctrine, and what sensor capabilities are needed? Are these new vehicles armed or merely for C4ISR? And indeed, are they actually "autonomous," or controlled, like USAF UAVs, by remote operators from mobile ground stations? The good news is that, regardless of how the doctrine or vehicle takes shape, the Army's own MVD (Multi-Function Video Display) system can be "appliqued" from the Type II MRAP Mine Clearing Vehicle and used to remotely control all aspects of tomorrow's "wingman" autonomous vehicles.

In this article, we briefly examine what the Army's Futures Command has published on the future vehicle fleet, autonomy, and the evolving use cases. The article examines the type of sensors, control, and telemetry needed between an autonomous vehicle and its "chase Mobile Ground Station," and describes the capabilities of the MVD on MRAP vehicles and how it directly applies to the battlefield of the future. Brief references to Next Generation Combat Vehicles will be made, although at the time of writing, this program is still evolving and far from settled.

(Background photo illustration by Peggy Frierson; inset: U.S. Army)

National Defense Strategy Stays Ahead of Russia and China
In less than 20 years, the US has changed its defense doctrine multiple times in response to world events like 9/11, urban warfare in Iraq, fighting ISIS — and now, a realization that Russia and China are legitimate



battlefield foes. While the first part of the Department of Defense's mission statement says America still intends to effectively fight in two battle theaters simultaneously (a carryover from World War II), the second half of the mission statement, guaranteeing to "prevail using overwhelming force," has DoD planners worried.

In early 2018, the Pentagon's National Defense Strategy (NDS) recognized that on land, sea, and in the air, the reality is that both Russia and China pose serious threats. Military planners believe there is no longer 100 percent confidence in America's technical and weapons superiority. In Syria, DoD planners watched Russia "test bed" new offensive and defensive strategies and technology, and concluded that America needed to re-evaluate many of our deployed platforms.

The Army brass knew something had to change. While weapons like the M1A2 Abrams main battle tank were designed for head-to-head European combat against Russian tanks, the reality shown in Crimea is that Russian EW and cyber capabilities will affect Army infrastructure sooner than Russian armored vehicles. Apparently with little effort, Russia was able to "see" and "kill" the opposing forces' command and control structure. Currently using a fixed and slow-to-erect Forward Operating Base (FOB) concept dating to World War I/II, the Army has realized that speed — coupled with a "shoot and scoot" model — is now essential for modern warfare.

New Army Cross-Functional Teams (CFTs) and NGCV
In late 2017, Army Chief of Staff Mark Milley outlined a set of eight cross-functional teams (CFTs) designed to address new battlefield realities, and more importantly, to bring essential capabilities right to the warfighter as quickly as possible. The CFTs are listed in Figure 1. The CFTs bring together multiple Army organizations, labs, vendors, and industry to openly discuss out-of-the-box ways to address the (primarily) Russian threat. From a ground vehicle perspective, platforms will have increased lethality and mobility, effectively inter-communicate in a denied network and GPS environment, and bring back the "overwhelming force" part of the DoD's mission statement. In the past, bigger was better. In the near future — by 2025, in fact — smaller, more nimble, networked, and autonomous vehicles will enter the ground vehicle fleet. Sensors will be an essential enabler to realize an autonomous vehicle fleet.

Army Cross Functional Teams:
• Long-Range Precision Fires (LRPF)
• Next Generation Combat Vehicle (NGCV)
• Future Vertical Lift (FVL)
• Network (including Precision Navigation and Timing, PNT)
• Air and Missile Defense (AMD)
• Soldier Lethality (including Synthetic Training Environment, STE)
• Precision Navigation & Timing
• Synthetic Training Environment
Figure 1. The U.S. Army's Cross Functional Teams (CFTs). Note that the Next Generation Combat Vehicle isn't a single vehicle, but a set of at least five different platform types.

At the Future Ground Combat Vehicles Summit near TACOM in Detroit, Michigan, in December 2018, Col. Warren Sponsler, Deputy Director, NGCV CFT, Army Futures Command, contrasted the difference between today's one-on-one/vehicle-to-vehicle approach versus the future multi-domain approach (Figure 2, upper left frame). In the lower frame, each ground vehicle is networked to all other joint battlefield assets. This is essentially no different from today's digital battlefield concept, except it's still not a reality. Joint assets are rarely directly inter-connected due to interoperability constraints, relying instead on intermediaries like FOBs, AWACS, E2C, and other human latency-prone "switch" points. In the upper right frame, however, the NGCV CFT envisions new platforms supplementing ground vehicles by providing reconnaissance and offensive strike. Shown here, autonomous drone swarms complement the indicated autonomous vehicles. Both are inter-linked to human-operated ground vehicles and dismounted soldiers.

Enter NGCV: Not One, but Five
While NGCV is one of the Army's top eight CFTs, NGCV itself is not a single vehicle but instead a set of CFTs that is creating over five different new ground vehicle concepts that all rely on sensors like cameras, LiDAR, multiple battlefield networks, and massive on-board processing and digital storage. Designed to supplement or replace some current weapons platforms like the M113 and Bradley Fighting Vehicle (BFV or M2), the concept of Manned Unmanned Teaming (MUM-T), currently used on the Army's Apache helicopter, will become a ground reality as long as the technology enables the platform.

Each of the five NGCV CFT concepts includes some form of autonomy. The Robotic Combat Vehicles (RCVs) shown in Figure 3 are light, medium, and heavy RCVs designed to act as a "wingman" to other ground vehicles. Used ahead of a strike force, the RCVs might enable reconnaissance, offensive weaponry, mobile FOB/command post capability, and even "feint" operations to draw an enemy away from the main Army fighting force.

Technology Required for Autonomy
Despite the positive press in the commercial world about self-driving cars from Google, Uber, Tesla, and others, the DoD believes it will be the first to actually deploy self-driving vehicles. Michael Griffin, defense undersecretary for research and engineering, told Congress in April 2018: "We're going to have self-driving vehicles in theater for the Army before we'll have self-driving cars on the streets." To realize his vision — plus the NDS top-down goals, the Army's top eight, and the NGCV CFT's five vehicle-type portfolio



— dramatic improvements in embedded, deployed technology are needed.

That's because the Army's use case is different from the commercial one. Civilian self-driving cars require both a sensor infrastructure built into roadways and intersections, along with pre-defined inter-vehicle protocols that allow vehicles to talk to each other. On the battlefield, there are no pre-defined roadways with embedded sensors, and always-on inter-vehicle communication would be either a homing beacon for an enemy missile or another network to be hacked or denied.

Instead, onboard sensors in Robotic Combat Vehicles (RCVs) need to see, understand, and navigate the terrain around them. High-resolution closed-circuit TV (ccTV) cameras mounted on the vehicle will provide a 360-degree view, and embedded computers will fuse the scene into a digital "cylinder," as if an operator were looking all around from inside the cylindrical scene. In some cases where the RCV is in a leader-follower role, a remote operator might actually be viewing the scene in real time, but onboard the RCV, machine learning is compressing the video while reducing it to predictable quadrants to save RF bandwidth. In this role, passive sensors like ccTV, LiDAR, GPS, and position/navigation (pos/nav) using EMP-resistant fiber optic gyros will be used.

Figure 2. The Army's NGCV CFT's vision for future warfare relies both on joint assets and Army-owned autonomous platforms like UAS drones and autonomous ground vehicles. (Source: NGCV CFT and Future Ground Combat Vehicles Summit 2018)

Artificial intelligence (AI) — different from machine learning in predicting outcomes never previously encountered or reasonably linearly extrapolated — will go many steps further by merging myriad battlefield sensors, metadata, onboard databases, and occasional use of active sonar and radar sensors. By consuming as much data as possible from any available battlefield sensor feed, whether on the vehicle or not, AI can autonomously drive the vehicle without any operator input beyond a designated destination or objective.

Figure 3. The NGCV portfolio consists of over five new ground vehicle platforms. The RCV concept relies on manned/unmanned teaming (MUM-T) like the Army's Apache strike helicopter.

As more sensors and data are added to the vehicle, the embedded computing challenge grows exponentially. For example, while a quad-core, 8th Generation Intel Core i7 (Coffee Lake) small form factor computer can decode/encode streaming HD H.264 video in real time at 60 Hz, it will struggle with two simultaneous HD streams. If each vehicle requires two HD cameras per corner (90-degree field-of-view per camera, with overlap), these eight HD video feeds would require server-class processing just for the CODECs. Image processing requires even more processing power, lower latency, and faster networks.
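The scale of that load is easy to estimate. The sketch below computes the raw (uncompressed) bit rate of an eight-camera suite like the one described above; the 1080p resolution and 16 bits per pixel (e.g., YUV 4:2:2) are illustrative assumptions, since exact camera formats are not specified.

```python
# Back-of-envelope bit-rate estimate for an eight-camera vehicle suite.
# Resolution, frame rate, and pixel depth here are assumed values.
def raw_gbps(width, height, fps, bits_per_pixel, cameras):
    """Raw, uncompressed video bandwidth in Gbit/s."""
    return width * height * fps * bits_per_pixel * cameras / 1e9

per_camera = raw_gbps(1920, 1080, 60, 16, 1)  # about 2 Gbit/s per camera
suite = raw_gbps(1920, 1080, 60, 16, 8)       # about 16 Gbit/s for 8 cameras
```

Even before any image processing, the raw suite exceeds a 1 Gbit/s link by an order of magnitude, which is why codecs, 10 Gbit/s networking, and server-class processing all enter the picture.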



Image processing — sensor fusion, enhancement, anti-aliasing, target tracking, compression, and identifying areas of interest — now becomes a digital signal processing challenge requiring a dozen or more Intel Xeon-class server cores and multiple high-performance computing (HPC) chassis, all small enough to be mounted inside the vehicle. Onboard networks from cameras and sensors to the servers and operator displays need at least gigabit throughput, and likely 10 Gbit/s for sensor growth. Finally, since these are forward-deployed RCVs, they will likely sustain battle damage, so fault-tolerant processors and redundant in-vehicle networks are needed — requiring multi-gigabit Ethernet switches and mass storage.

MVD: Right, and Right Now
Figure 4 shows the DoD's "Unmanned Systems Integrated Roadmap: 2017-2042." The latest version, it identifies numerous key technologies needed for future autonomous vehicles (Table 1).

Figure 4. The DoD's "Unmanned Systems Integrated Roadmap" in 2017 charts the way for all service branches.

In 2017, General Micro Systems announced a contract win with the Army for the Multi-Function Video Display (MVD) system, designed to bring video and sensor data to Type II Mine Clearing Vehicles on the MRAP chassis. Accepting input from multiple high-resolution video cameras, FLIR, and the Vehicle Optic Sensor System (VPOSS) — all called "vehicle enablers" — the system converts real-time data to ultra-low-latency video-over-IP and transmits it over an industry-standard 1 Gbit/s Ethernet network, first to a rugged, embedded Intel Xeon-class server and mass storage system, then outward to multiple operator smart screen consoles, each with embedded Intel Core i7 processors. Except for the sensors, GMS builds 100 percent of this intelligent, machine-learning, open-standards network/server/display system. The software that brings the system to life is from the Army's Night Vision Labs and RDECOM and is used to provide situational awareness, operator hand-off, and a common user interface (UI) to disparate operator stations and workloads.

MVD makes available today, not in the future, many of the key technologies identified in Table 1 that are essential for RCVs as part of the Army's NGCV CFT. The Type II MRAP on which the system is installed is not robotic, but MVD helps control the platform's robotic arm. However, due to the under-one-frame latency of MVD, the vehicle's operators can drive the vehicle completely "head down" by only looking at the display. This is possible without inducing nausea because the under-one-frame latency presents outside views practically in real time. This capability is absolutely essential for NGCV leader-follower vehicles, where operators are looking out of sensors on the RCV but are located separately and distant from the RCV.

GMS is working on enhancements to MVD that will tick off even more of the technologies in Table 1. For example, the GMS X422 "Lightning," announced last October, brings AI to ground vehicles by adding two Nvidia Tesla V100 GPGPU "data center accelerators" in a conduction-cooled vetronics chassis. An upgraded embedded server and open-standard network switch raise the in-vehicle speed to 10 Gbit/s, providing more bandwidth for high speed and myriad more sensors. However, except for the addition of the AI chassis (about the size of a shoebox), the system remains exceptionally small. It's designed to fit into ground vehicles wherever there's room: under a seat, mounted vertically on a bulkhead, or tucked behind a piece of existing equipment.

Robotics advancements: Prioritized common/open architectures; Common data repositories; Autonomous modeling and simulation advancements; Machine learning advancements; Artificial Intelligence advancements; SWaP/miniaturization advancements; Swarming capabilities; Augmented Reality; Virtual Reality.
Sensor advancements: Collision avoidance; Leader-follower; GPS-denied solutions; Cyber resilience and robustness; Information assurance solutions; Increased network and spectrum capacity; Human-machine interface advancements; Autonomous data strategy adaptation.
Table 1. Key technologies identified by the DoD for unmanned systems.

NGCV and Autonomous Vehicle Enablers
As this article has outlined, autonomous "wingman" vehicles are poised to replace many of the human-heavy tasks on the future battlefield. As the Army works to integrate the most advanced technologies into MVD on MRAPs and other Next Generation Combat Vehicles, the US will be more prepared to respond to the "near peer" threats we are seeing.

This article was written by Chris A. Ciufo, CTO, General Micro Systems (Rancho Cucamonga, CA). For more information, visit http://info.hotims.com/72993-500.



MICROTURBINE PROPULSION
FOR UAVS
Seventy years ago, military aviation moved from reciprocating engines to vastly more reliable turbojets and turboprops. Shortly after, the commercial air transport industry followed suit, enabling modern air transport. Today, virtually all large aircraft rely on turbine propulsion, yet small aircraft, both manned and unmanned, have not exploited the advantages of turbines for propulsion. While UAVs have become integral for both commercial and military aerial missions, the continued use of reciprocating engines has limited the UAV market's ability to reach its true growth potential. The industry has been slow to innovate and develop turbine propulsion systems for small aircraft because it is far more difficult to design and produce high-performance small turbines than large ones.

Today's engine market is riddled with inefficiencies and hazards that jeopardize the safety and reliability of UAVs. Current engines powering drone delivery are unreliable and run on highly volatile and dangerous fuel, minimizing the impact of a drone fleet and raising costs. A major percentage of vehicle losses, with their payloads, are attributed to engine failure. Plus, with the need for frequent overhauls, customers have to purchase multiple engines for a single vehicle so that, when one engine is being worked on, they can continue operating with an alternate engine. This severely limits the potential for the entire industry.

Artist's illustration of a Monarch RP microturbine engine.

Designing a Solution
After spending five years working to design and manufacture a world-class gas turbine engine small enough to fit into Group 3 (max gross takeoff



weight less than 1320 lbs) and Group 4 (MGTOW of more than 1320 lbs), the team at UAV Turbines has achieved this engineering feat with the launch of its Monarch propulsion system. The Monarch turboprop was carefully designed to outperform conventional reciprocating engines in several ways:
• Monarch propulsion systems will provide more time in the air and less time being serviced on the ground, with upwards of a 2,000-hour increase in operating time between overhauls when compared to available Class 3 engines.
• The Monarch system's variable-pitch propeller will enable UAVs to climb faster and reach greater dash speeds, enabling greater performance and efficiency in both commercial and military aircraft.
• The reliability of Monarch systems eliminates the need for extra engines for a single aircraft.
• The flexibility to run efficiently on all types of heavy fuels, such as jet fuel, makes Monarch propulsion safer and more convenient than engines running on volatile aviation gasoline.
• Monarch RP generates useful on-board electrical power that is 2-3x greater than what is produced by conventional engines.

In designing this system, the team at UAV Turbines had to overcome two key issues facing the UAV engine industry: designing a small, powerful engine that was both reliable and ran on safer fuel. There are several fundamental problems that had to be overcome, starting with the fact that the size and weight of these systems are major constraints. A little over 1,300 pounds really isn't much.

How It Works
To achieve significant thrust, the heavy fuel (typically JP-8 or Jet A) burns in a very small space at temperatures similar to those in larger engines. The internal distances, however, are much smaller, so managing thermal gradients and the resulting stress becomes more difficult. The low air flows in these engines call for the use of very small air passages and very high-speed turbomachinery. The core rpm in the first system UAV Turbines is releasing for flight test is right around 100,000 rpm at cruise speed. UAV Turbines has built even smaller experimental engines with rotor speeds of 200,000 rpm. Managing tolerances and designing to process capabilities becomes critical. New engineering approaches were essential to deal with the extreme internal thermal stresses and tight tolerances.

One solution to the thermal issue was to separate the hot section from the engine's bearing cavity by placing the turbine rotor at the end of the shaft, cantilevered, so that the bearings are located in a cold section of the engine core. Overhung systems have been used before, but there are critical fits between the components that have to provide intimate contact from zero cold to full-speed hot — and all transients in between. The designers ended up balancing on the limits of what can be made with what will operate.

Impeller and turbine for a 197,000 RPM turbogenerator, one of several 10 HP electrical generator prototypes built and operated in the UAVT development program.

Another factor is that this is not just an engine, but a propulsion system. To put that power to work, it's necessary to step down from 100k core rpm to 6k rpm via an extremely efficient, lightweight gearbox. This drives a variable-pitch propeller so that the power can be efficiently converted to thrust through each mission stage, from takeoff to landing. The engine itself runs at essentially constant speed, so pilot throttle changes result in pitch changes to the propeller blades via a variable-pitch mechanism as calculated by the engine FADEC (Full Authority Digital Engine Control). The propeller itself was designed with small pusher installations in mind. It was designed with three blades to reduce noise from the propeller interacting with the wing wakes, increase ground clearance, and reduce the chance of ground strikes during takeoff or landing. The entire system must then be integrated into a flyable package, a classic systems challenge.

Today's military and commercial organizations are urgently in need of a more reliable and safer propulsion system for their UAVs. Reliable, lightweight, fuel-efficient microturbine engines may be the answer to provide propulsion and power generation in small to medium-sized UAVs.

This article was written by Kirk Warshaw, CEO, UAV Turbines (Miami, FL). For more information, visit http://info.hotims.com/72996-502.



EYES IN THE SKY
For Drones, Combining Vision Sensor and IMU Data Leads to More Robust Pose Estimation

Drones (i.e., quadrotors) are a popular and increasingly widespread product used by consumers as well as in a diversity of industrial, military, and other applications. Historically under the control of human operators on the ground, they're becoming increasingly autonomous as the cameras built into them find use not only for capturing footage of the world around them but also in understanding and responding to their surroundings. Combining imaging with other sensing modalities can further bolster the robustness of this autonomy.

When a drone flies, it needs to know where it is in three-dimensional space at all times, across all six degrees of freedom for translation and rotation. Such pose estimation is crucial for flying without crashes or other errors. Drone developers are heavily challenged when attempting to use a single IMU or vision sensor to measure both orientation and translation in space. A hybrid approach combining IMU and vision data, conversely, improves the precision of pose estimation for drones based on the paired strengths of both measuring methods.

The IMU sensor measures acceleration, with information about the orientation derived from its raw output data. In theory, such acceleration measurements could also be used to derive translation. However, to calculate such results, developers need to integrate twice, a process that results in increased errors. Therefore, the IMU sensor alone is not an accurate source of precise location information.

In contrast, the vision sensor is quite good at measuring location; it's sub-optimal at determining orientation, however. Particularly with wide view angles and long-distance observation, it's quite complicated for the vision system alone to measure orientation with adequate precision. A hybrid system of paired IMU and vision data can provide a more precise measurement for the full six degrees of pose in space, providing better results than using either the IMU or the vision sensor individually.
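The drift-versus-fusion tradeoff can be demonstrated in one dimension. The sketch below is plain Python under stated assumptions (the 1 kHz IMU rate, 20 Hz position fixes, accelerometer bias, and noise levels are all illustrative): dead reckoning integrates a biased accelerometer twice and drifts, while a linear Kalman filter that folds in low-rate position fixes, standing in for the vision system, keeps the error bounded.

```python
import math
import random

def run(dt=0.001, duration=10.0, fuse_vision=True, seed=7):
    """1-D toy: integrate a biased, noisy accelerometer at 1 kHz and
    optionally correct with 20-Hz position fixes via a Kalman filter."""
    rng = random.Random(seed)
    bias, acc_noise, vis_noise = 0.3, 0.05, 0.05   # assumed sensor errors
    vision_every = 50                              # 50 steps -> 20 Hz fixes
    p = v = 0.0                                    # ground-truth state
    pe = ve = 0.0                                  # estimated state
    P = [[1.0, 0.0], [0.0, 1.0]]                   # estimate covariance
    q = 0.5 ** 2                                   # process (accel) variance
    R = vis_noise ** 2                             # measurement variance
    for k in range(int(duration / dt)):
        a = math.sin(k * dt)                       # true acceleration
        p += v * dt + 0.5 * a * dt * dt            # advance ground truth
        v += a * dt
        am = a + bias + rng.gauss(0.0, acc_noise)  # IMU reading
        # Predict: propagate the estimate with the (imperfect) IMU sample.
        pe += ve * dt + 0.5 * am * dt * dt
        ve += am * dt
        p00, p01, p10, p11 = P[0][0], P[0][1], P[1][0], P[1][1]
        P = [[p00 + dt * (p01 + p10) + dt * dt * p11 + q * 0.25 * dt ** 4,
              p01 + dt * p11 + q * 0.5 * dt ** 3],
             [p10 + dt * p11 + q * 0.5 * dt ** 3,
              p11 + q * dt * dt]]
        # Update: fold in a low-rate "vision" position fix.
        if fuse_vision and (k + 1) % vision_every == 0:
            z = p + rng.gauss(0.0, vis_noise)
            S = P[0][0] + R
            k0, k1 = P[0][0] / S, P[1][0] / S      # Kalman gain for H = [1 0]
            y = z - pe
            pe += k0 * y
            ve += k1 * y
            P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                 [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
    return abs(pe - p)                             # final position error (m)
```

With a 0.3 m/s² bias, pure double integration (`run(fuse_vision=False)`) drifts by roughly 0.5·b·T², about 15 m after 10 s, while the fused run stays well under a meter in this toy setup; a real hybrid pipeline estimates full 6-DoF pose and the IMU biases themselves.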



coordinate frame of reference for Combining IMU
both orientation and translation and Vision Data
data, as well as to minimize the noise One challenge
produced by the sensors. A common with hybrid systems
approach to create the reference is that the captured
frame leverages linear Kalman vision data is
filters, which have the capability to often very slow
merge both IMU and vision data for from a frame rate
hybrid pose estimation purposes. standpoint, usually
For a vision system mounted on well below 100 Hz,
or embedded in the drone, SLAM while the IMU data
(simultaneous localization and comes across at
mapping) provides spatial awareness high frequency,
for the drone by mapping its sometimes well
environment to ensure that it does over 1 KHz. The root Considering the limited integration space in most drones,
not collide with trees, buildings, of the resultant implementing a robust setup for synchronization of the IMU and
other drones, or other objects. implementation vision data can get complicated.
problem lies in
Factors to Consider When finding a way to obtain information propeller movement, which influences
Building a Hybrid Sensor-Based from both systems at the exact the vision components of the design.
Drone System same time. SLAM techniques such Both the vision sensor and IMU have
Several key factors influence as Continuous Trajectory Estimation individual local reference systems to
the measurement quality. First can approximate the drone’s measure their own pose, which need
off, the quality results of an IMU’s movement by assuming that the to be calibrated to ensure a hybrid
measurements highly depend on drone’s speed is continuous. SLAM has a common reference frame.
the quality of the IMU selected Developers can integrate both Each system must be calibrated
for the design. Inexpensive IMUs the IMU and vision data into an individually first, and then co-calibrated
tend to generate high noise levels, environment with a common with respect to each other, in order to
which can lead to various errors and reference frame, allowing them to receive the position in the common
other deviations. More generally, assign measurements to a specific reference frame. Multiple data sets are
proper calibration is necessary to part of the continuous trajectory. In- available to test aerial vehicle pose
comprehend the filter’s characteristics, between any two image acquisitions, estimation, also in hybrid fashion,
e.g. the sensor’s noise model. multiple IMU measurements provide with the developed pipelines. The
Individual sensors, even from the additional reference points regarding most commonly used dataset is
same model and manufacturer, this will have slightly different noise patterns that require consideration.

On the vision side, the implementation specifics fundamentally depend on whether a global or rolling shutter image sensor is being used. With a global shutter image sensor, every pixel is illuminated at the same time, with no consequent readout distortion caused by object motion. With a more economical rolling shutter image sensor, conversely, distortion can occur due to readout time differences between pixels. IMU information can correct for rolling shutter artifacts, historically by the use of various filter methods. Nowadays, IMU sensor noise can also be corrected by deep learning-based processing.

[…] trajectory. When in the air, the drone will then constantly be time-synchronized and updated with IMU data. And every time a vision image is received, it then corrects the IMU information.

Hardware Requirements, Implementation, and Testing

Considering the drones' limited integration space and more generally resource-limited embedded nature, implementing a robust setup for synchronization of the IMU and vision data is not straightforward. Light and lean components with powerful processing units are necessary in a limited memory footprint while consuming little power. Stabilized sensors and software filtering are also essential for SoC developers as a result of the extreme jitter due to […]

[…] EuRoC, which provides raw vision and IMU data to test algorithms and compare against other methods.

While visible light image sensors are often included in autonomous system designs, they're not necessarily the sole required sensor technology in all implementation situations. By combining them with other sensor technologies, however, the resultant "sensor fusion" configuration can deliver a robust implementation in all possible usage scenarios, such as for semi- and fully autonomous vehicles, industrial robots, drones, and other autonomous device applications.

This article was written by Ute Häußler, corporate editor, FRAMOS GmbH (Taufkirchen, Germany). For more information, visit http://info.hotims.com/72993-504.
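The constant IMU/vision time-synchronization described in this article can be sketched in a few lines: high-rate IMU samples are buffered, and each lower-rate camera frame is paired with an IMU value interpolated at the frame's timestamp. This is an illustrative sketch only; the function, data layout, and sample rates are hypothetical, not FRAMOS code.

```python
from bisect import bisect_left

def interpolate_imu(imu_samples, t):
    """Linearly interpolate an IMU reading (here a single gyro axis) at
    camera timestamp t. imu_samples is a time-sorted list of (time, value)."""
    times = [s[0] for s in imu_samples]
    i = bisect_left(times, t)
    if i == 0:                      # frame older than all buffered IMU data
        return imu_samples[0][1]
    if i == len(imu_samples):       # frame newer than all buffered IMU data
        return imu_samples[-1][1]
    (t0, v0), (t1, v1) = imu_samples[i - 1], imu_samples[i]
    w = (t - t0) / (t1 - t0)        # interpolation weight in [0, 1]
    return v0 + w * (v1 - v0)

# IMU sampled at 1 kHz; a camera frame arrives between samples 10 and 11
imu = [(k * 0.001, 0.1 * k) for k in range(20)]
print(interpolate_imu(imu, 0.0105))  # ≈ 1.05
```

In a real system, the same lookup would run once per camera frame against a ring buffer of recent IMU samples, with clock offsets calibrated beforehand.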

UNMANNED SYSTEMS SPECIAL REPORT MARCH 2020 21


APPLICATION BRIEFS
Parcel Delivery Drone
Airbus Helicopters
Marignane Cedex, France
+33 (0)4 42 85 60 51
http://helicopters.airbus.com

Airbus has begun shore-to-ship trials in Singapore with its Skyways parcel delivery drone. This marks the first time drone technology has been deployed in real port conditions to deliver a variety of small, time-critical maritime essentials to working vessels at anchorage.

The maiden shore-to-ship delivery flight was made to Swire Pacific Offshore's Anchor Handling Tug Supply vessel 'M/V Pacific Centurion', 1.5 km from the shoreline of Singapore's Marina South Pier, carrying 1.5 kg of 3D-printed consumables. Landing safely on the ship deck and depositing its cargo to the shipmaster, the Skyways unmanned air vehicle swiftly returned to its base, with the entire flight taking less than ten minutes.

The trials are being undertaken in conjunction with partner Wilhelmsen Ships Services, one of the world's leading maritime logistics and port services companies. During the trials, Airbus' Skyways drone will lift off from the pier with a payload capability of up to 4 kg and navigate autonomously along pre-determined 'aerial corridors' to vessels as far as 3 km from the coast.

The use of unmanned aircraft systems in the maritime industry paves the way for possible enlargement of existing ship agency services' portfolios, speeding up deliveries by up to six times, lowering delivery costs by up to 90%, reducing carbon footprint, and significantly mitigating the risks of accidents associated with launch-boat deliveries.

Airbus and Wilhelmsen Ships Services signed an agreement in June 2018 to drive the development of an end-to-end unmanned aircraft system for safe shore-to-ship deliveries. The collaboration marries Airbus' expertise in aeronautical vertical lift solutions with Wilhelmsen's wealth of experience in ship agency services. A landing platform and control center were set up at the Marina South Pier in November 2018 through the facilitation of the Maritime and Port Authority of Singapore. The maritime agency also designated anchorages for vessels to anchor off the pier for the trials, while the Civil Aviation Authority of Singapore worked with Airbus and Wilhelmsen to ensure the safety of the trials.
Skyways is an experimental
project aimed at establishing
seamless multi-modal
transportation networks in smart
cities. Through Skyways, Airbus
aims to develop an unmanned
airborne infrastructure solution
and address the sustainability
and efficiency of unmanned
aircraft in large urban and
maritime environments. Having
demonstrated the ability to
deliver parcels safely and reliably
to vessels anchored off the
coast of Singapore, Skyways will
soon be commencing another
trial phase delivering air parcels
autonomously in an urban
environment, at the National
University of Singapore.
For Free Info Visit http://info.hotims.com/72993-462.



Robotic Helicopter
Steadicopter Ltd.
Migdal HaEmek, Israel
+972-4-9592959
www.steadicopter.com

Steadicopter, one of the companies involved in the Rotary Unmanned Aerial Systems (RUAS) industry, recently unveiled its next-generation Black Eagle 50 advanced lightweight unmanned robotic helicopter. New capabilities include an inertial navigation system with no dependence on GPS, as well as support for naval missions.

Steadicopter's Black Eagle 50 unmanned helicopter has been upgraded with several additional new features and is tailored for naval missions with its robust mechanical and electronic capabilities that support flight in maritime environmental conditions. The company also recently signed a cooperative agreement with Israel Shipyards for the marketing of the Black Eagle as part of the defense, intelligence, and surveillance systems installed on its OPV (Offshore Patrol Vessel) family. The OPV vessels, based on the Nirit Class SAAR 4.5, are heavily armed with advanced weapons systems and can be equipped with helicopter carrying capability.

The Black Eagle 50 features a special inertial navigation system capability, based on input from the system's inertial and other sensors. Through a unique smart navigation algorithm, this input enables continuation of the flight and the mission without relying on GPS, giving the Black Eagle 50 a significant advantage in GPS-denied areas.

In terms of mission capability, the Black Eagle 50 is a VTOL (Vertical Take-Off and Landing) robotic observation system, suitable for tactical maritime and land Intelligence, Surveillance, Target Acquisition and Reconnaissance (ISTAR) missions. A data link connects the aircraft with the ground controller, enabling the transmission of live video and data between the two. The system has a steady hovering endurance of up to 3 hours and a flight endurance of up to 4 hours.

At only 27 kg, the Black Eagle 50 is extremely lightweight and compact, with a maximum takeoff weight of 35 kg and a payload capacity of 5 kg. It has a communication range of up to 150 km, depending on the client's requirements, and a service ceiling of up to 10,000 ft. Its total length is just 2,540 mm, while its maximum airspeed is 70 knots (126 km/h) and its cruising speed 45 knots (81 km/h).

For Free Info Visit http://info.hotims.com/73000-464.



TECH BRIEFS
Aircraft Vertical Takeoff and Landing
A vertical-takeoff UAV enables long-endurance missions and is easy to transport.
Langley Research Center, Hampton, Virginia

NASA's Langley Research Center developed an inexpensive, long-endurance, vertical takeoff and landing (VTOL) unmanned aerial vehicle (UAV). It is capable of flying for 24 hours, landing in a 50 × 50 zone, and can be loaded into the back of a cargo van for easy transport. In addition, it can land in either a horizontal or vertical flight configuration.

The vehicle can be used in any application where UAVs are needed to take off from, or land in, an unprepared landing area. Commercial applications where extended-mission UAV capabilities are beneficial include law enforcement, firefighting, crop surveys, pipeline surveys, and oil field management.

This VTOL aircraft is a modification of a conventional single-prop aircraft design. The addition of vertically oriented, stowable tail rotors and an articulating forward rotor capable of pivoting from a horizontal to a vertical orientation enables VTOL capabilities. It combines the speed and fuel efficiency of fixed-wing aircraft with the hoverability and flexibility of rotary aircraft. The design is expected to avoid the compromises in performance that are typically made in the development of VTOL aircraft.

The unique design of the aircraft enables VTOL capabilities.

NASA is actively seeking licensees to commercialize this technology. Please contact The Technology Gateway at LARC-DL-technologygateway@mail.nasa.gov to initiate licensing discussions. Follow this link for more information: https://technology.nasa.gov/patent/LAR-TOPS-230.

Neural Lander Uses AI to Land Drones Smoothly


The system employs a deep neural network to overcome the challenge of ground-effect turbulence.
California Institute of Technology, Pasadena

Landing multi-rotor drones smoothly is difficult. Complex turbulence is created by the airflow from each rotor bouncing off the ground during a descent. This turbulence is not well understood, nor is it easy to compensate for, particularly for autonomous drones. That is why takeoff and landing are often the two most difficult parts of a drone flight. Drones typically wobble and inch slowly toward a landing until power is finally cut and they drop the remaining distance to the ground.

A system was developed that uses a deep neural network to help autonomous drones "learn" how to land more safely and quickly while using less power. The system, called the Neural Lander, is a learning-based controller that tracks the position and speed of the drone and modifies its landing trajectory and rotor speed to achieve the smoothest possible landing. It has the potential to help drones fly more smoothly and safely, especially in the presence of unpredictable wind gusts.

Deep neural networks (DNNs) are AI systems inspired by biological systems like the brain. The "deep" part of the name refers to the fact that data inputs are churned through multiple layers, each of which processes incoming information in a different way to tease out increasingly complex details. DNNs are capable of automatic learning, which makes them ideally suited for repetitive tasks.

To make sure that the drone flies smoothly under the guidance of the DNN, a technique known as spectral normalization was used that smooths out the neural net's outputs so it doesn't make wildly varying predictions as inputs or conditions shift. Improvements in landing were measured by examining deviation from an ideal trajectory in 3D space. Three types of tests were conducted: a straight vertical landing, a descending arc landing, and flight in which the drone skims across a broken surface, such as over the edge of a table, where the effect of turbulence from the ground would vary sharply.

The new system decreases vertical error by 100 percent, allowing for controlled landings, and reduces lateral drift by up to 90 percent. In experiments, the system achieved actual landing rather than getting stuck about 10 to 15 centimeters above the ground, as unmodified



conventional flight controllers often do. Further, during the skimming test, the Neural Lander produced a much smoother transition as the drone transitioned from skimming across the table to flying in the free space beyond the edge.

Besides its obvious commercial applications, the new system could prove crucial to autonomous medical transports that could land in difficult-to-reach locations (such as gridlocked traffic).

For more information, contact Robert Perkins at rperkins@caltech.edu; 626-395-1862.

Lightweight Sensing and Control System for Unmanned Aerial Vehicle Monitoring
The control system allows semi-autonomous flight.
Langley Research Center, Hampton, Virginia

A new sensing and control system for unmanned aerial vehicles (UAVs) allows for semi-autonomous flight. Pilots need not leave the ground to conduct routine monitoring and surveillance quickly and cost-effectively. Such systems are particularly useful during long flight segments or over remote locations, or for scientific applications such as atmospheric monitoring or crop monitoring, which might require long and repeated sampling in a specific pattern. The small, lightweight technology can be quickly adapted to a specific configuration.

Increasing demand for smaller UAVs (e.g., sometimes with wingspans on the order of 6" and weighing less than one pound) generated a need for much smaller flight and sensing equipment. NASA Langley's new sensing and flight control system for small UAVs includes both an active flight control board and an avionics sensor board. Together, these compare the status of the UAV's position, heading, and orientation with the pre-programmed data to determine and apply the flight control inputs needed to maintain the desired course.

To satisfy the small form-factor system requirements, microelectromechanical systems (MEMS) are used to realize the various flight control sensing devices. MEMS-based devices are commercially available single-chip devices that lend themselves to easy integration onto a circuit board. The system uses less energy than current systems, allowing solar panels mounted on the vehicle to generate the system's power. While the lightweight technology was designed for smaller UAVs, the sensors could be distributed throughout larger UAVs, depending on the application.

NASA is actively seeking licensees to commercialize this technology. Please contact The Technology Gateway at LARC-DL-technologygateway@mail.nasa.gov to initiate licensing discussions. Follow this link for more information: https://technology.nasa.gov/patent/LAR-TOPS-33.
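The compare-and-correct loop the two boards perform can be illustrated with a toy proportional heading controller: compute the error between the pre-programmed course and the sensed heading, then command a correction. This is purely a sketch with a hypothetical gain, not NASA's flight code.

```python
def heading_correction(desired_deg, actual_deg, gain=0.05):
    """Proportional control-surface command from heading error.
    The error is wrapped to [-180, 180) so the controller always turns
    the short way around the compass. The gain value is hypothetical."""
    error = (desired_deg - actual_deg + 180.0) % 360.0 - 180.0
    return gain * error

# UAV heading 350 degrees, desired course 10 degrees: +20 degree error
print(heading_correction(10.0, 350.0))  # 1.0 (gain 0.05 x 20)
```

A real autopilot would run similar loops for altitude and attitude as well, typically with integral and derivative terms added for steady tracking.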

Flying Robot Mimics Rapid Insect Flight


The autonomous, free-flying robot enables new drone applications.
Delft University of Technology, Delft, The Netherlands

Flying animals both power and control flight by flapping their wings. This enables small natural flyers such as insects to hover close to a flower but also to rapidly escape danger. Animal flight has always drawn the attention of biologists, who not only study their complex wing motion patterns and aerodynamics but also their sensory and neuro-motor systems during such agile maneuvers. Recently, flying animals have also become a source of inspiration for robotics researchers, who try to develop lightweight flying robots that are agile, power-efficient, and even scalable to insect sizes.

Researchers have developed the DelFly Nimble, a novel insect-inspired flying robot with flapping wings, beating 17 times per second, that not only generate the lift force needed to stay airborne but also control the flight via minor adjustments in the wing motion. Inspired by fruit flies, the robot's control mechanisms have proved to be highly effective, allowing it not only to hover on the spot and fly in any direction but also to be very agile. The robot has a top speed of 25 km/h and can perform aggressive maneuvers such as 360-degree flips, resembling loops and barrel rolls. The 33-cm-wingspan, 29-gram robot has, for its size, excellent power efficiency, allowing five minutes of hovering flight, or a flight range of more than 1 km, on a fully charged battery. The robot has a thrust-to-weight ratio of more than 1.3 and is capable of carrying an additional payload of up to 4 grams (e.g., a camera system with a live video feed, additional sensors, etc.). The exceptional agility can be demonstrated by 360-degree flips around the pitch or roll axes or rapid transitions from hover to forward or sideways flight,



and vice versa. At full throttle, the robot reaches a top speed of 7 m/s (~25 km/h). Like in quadrotors or helicopters, but also like in insects, forward/backward and sideways flight is achieved by pitching and rolling the robot's body into the respective direction. To control the body orientation (attitude), the robot needs to be able to produce torques around the three orthogonal body axes. To this end, the robot is equipped with two independent flapping mechanisms — one for each wing pair on the sides of the robot. These are complemented with two rotary servo actuators — one adjusting the dihedral angle by changing the relative orientation of the two flapping mechanisms and the other actuating the tips of the left and right wing-pair roots. Rolling is achieved by driving the two wing pairs at different flapping frequencies, which results in a thrust difference creating the torque around the roll axis.

The robot has potential for novel applications as it is lightweight, safe around humans, and able to fly more efficiently than more traditional drone designs, especially at smaller scales.

The DelFly Nimble in stationary (hovering) flight. (Figure callouts: left and right wing-pair flapping mechanisms, dihedral angle control mechanism, wing root adjustment mechanism, autopilot and radio receiver, battery.)

For more information, contact Matěj Karásek at M.Karasek@tudelft.nl; +31 6 8384 1669.
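The roll mechanism, a thrust difference produced by unequal flapping frequencies, can be sketched numerically. The quadratic thrust-frequency relation and all constants below are illustrative assumptions, not measured DelFly Nimble values.

```python
def roll_torque(freq_left, freq_right, k_thrust=2.0e-5, arm=0.05):
    """Roll torque from a flapping-frequency difference between wing pairs.
    Assumes each wing pair's thrust scales with flapping frequency squared
    (coefficient k_thrust) and acts at `arm` meters from the roll axis.
    These constants are hypothetical, chosen for illustration only."""
    thrust_left = k_thrust * freq_left ** 2
    thrust_right = k_thrust * freq_right ** 2
    # Positive torque rolls the robot toward the slower (right) wing pair
    return arm * (thrust_left - thrust_right)

# Nominal 17 Hz flapping; slowing the right wing pair commands a roll
print(roll_torque(17.0, 16.0))  # ≈ 3.3e-05 N·m
```

The onboard autopilot would close the loop around such a model, trimming each pair's frequency until the sensed roll rate matches the command.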

Software Enables Robots to Self-Right After Overturning


These robots feature an open systems architecture that enables them to get up after falling.
Army Research Laboratory, Adelphi, Maryland; Johns Hopkins University Applied Physics Laboratory, Baltimore, Maryland;
and Northrop Grumman Corp., Falls Church, Virginia

The Advanced Explosive Ordnance Disposal Robotic System (AEODRS) features a modular open systems architecture that enables the robot to be self-righting after a fall. The self-righting ability is extended to robots with a greater number of degrees of freedom (joints).

The software looks at all possible geometries and orientations that the robot could find itself in. Each additional joint adds a dimension to the search space. This analysis was made possible by the Range Adversarial Planning Tool (RAPT), a software framework for testing autonomous and robotic systems.

An adaptive sampling algorithm looks for transitions — states in which the robot could transition from a stable configuration to an unstable one, thus causing the robot to tip over. The software effectively predicts where those transitions might be so the space can be searched efficiently.

The AEODRS's eight degrees of freedom were analyzed and it was determined that it can right itself on level ground no matter what initial state it finds itself in. The analysis also generates motion plans showing how the robot can reorient itself.

The Advanced Explosive Ordnance Disposal Robotic System Increment 1 Platform was used to demonstrate the self-righting ability.

For more information, contact the ARL Public Affairs Office at 301-394-3590.
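The stable-to-unstable transitions the sampler hunts for can be illustrated with a toy planar model: a configuration is stable while the center of mass stays over the support footprint, and bisection locates the angle where that flips. The model, its geometry, and its masses are hypothetical; this is a conceptual sketch, not RAPT code.

```python
import math

def is_stable(arm_angle_deg, base_half_width=0.2, arm_length=0.5,
              base_mass=5.0, arm_mass=5.0):
    """Toy planar stability test: the configuration is stable while the
    combined center of mass stays above the support footprint.
    All geometry and mass values are hypothetical."""
    arm_x = arm_length * math.cos(math.radians(arm_angle_deg))
    com_x = (base_mass * 0.0 + arm_mass * arm_x) / (base_mass + arm_mass)
    return abs(com_x) <= base_half_width

def find_transition(lo=0.0, hi=90.0, tol=1e-4):
    """Bisect for the arm angle where stability flips, i.e. the kind of
    stable-to-unstable transition an adaptive sampler hunts for.
    Assumes lo is an unstable configuration and hi a stable one."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if is_stable(mid):
            hi = mid          # transition lies below the stable sample
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(find_transition())  # tipping angle in degrees for this toy model
```

Each extra joint would add a dimension to this search, which is why the real analysis relies on adaptive sampling rather than exhaustive scanning.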

