UNMANNED SYSTEMS
MARCH 2020
The NVIDIA Jetson TX2i module provides a great example of how an industrial AI engine can be adapted for use onboard military platforms, most of which are especially sensitive to any additional size, weight, and power (SWaP) burden. Another key issue for defense applications is long lifecycle support, since commercial graphics processors are notorious for short product lifecycles. The good news for military system designers is that SOMs like the Jetson offer some of the best ratios of FLOPS of computing performance per watt of power consumed. The TX2i, specifically, combines a high-performance 6-core Arm processor with a powerful 256-core NVIDIA GPU that is compatible with NVIDIA's CUDA development libraries and software applications. Optimized to reduce SWaP, the device is packaged in a small form factor pre-integrated with processor, graphics, RAM, Flash storage, and I/O. With its support for extended operating temperatures (all of its components are rated to the full industrial temperature range of -40°C to +85°C or beyond), the TX2i can be effectively cooled using straightforward conduction cooling, eliminating the need for complex cooling alternatives or fans with undesirable moving parts.

For enhanced reliability, the module uses RAM with Error Correcting Code (ECC) support, which is particularly useful to mitigate a potential single event upset (SEU) at high altitude due to solar radiation. Even better, NVIDIA supports the industrial version of this Jetson device with a ten-year lifecycle, which matches the long program requirements typical of military systems. Compare that to the five-year lifecycle of other embedded Jetson devices and an even shorter two-year lifecycle for consumer-grade graphics devices.

To ease and speed application software development, the TX2i is supported by NVIDIA's JetPack Software Development Kit (SDK), which includes a large number of free CUDA developer tools and comprehensive libraries for building AI applications. TensorRT, a deep learning inference runtime, and the CUDA development environment are just a couple of the free tools bundled in JetPack. A system integrator already familiar with the NVIDIA Jetson development environment will also be comfortable with this technology, and since the software development environment is portable across the entire Jetson family ecosystem, costs are reduced through a build-once, deploy-multiple-times approach.

NVIDIA also offers lab-grade developer kits at an extremely low cost. Designed to jumpstart application development, these boards use standard commercial connectors and enable customers to get their program started with very little investment in the lab before porting it over to a rugged platform when they are ready to take the next step toward deployment.

An example of a rugged mission computer ideal for deployed machine learning applications is Curtiss-Wright's DuraCOR 312, a miniature system (5.2" x 5.4" x 2.0") based on the industrial Jetson TX2i SoM (see image: Curtiss-Wright's DuraCOR 312 miniature system based on NVIDIA's industrial Jetson TX2i SoM). The DuraCOR 312 can be used in a wide range of AI-based applications and is especially useful for computer vision. For example, the mission computer can be deployed on a small drone and flown around a platform of interest to inspect for damage or any visible anomalies. At sea, the system can be used to locate and identify surface vessels. On the battlefield, a DuraCOR 312 can be launched on a small unmanned aircraft to provide object detection and visual identification, delivering real-time actionable intelligence on enemy troop strength and deployment without needing to request and wait for air support, such as a large helicopter, to provide surveillance.

In the case of visual detection, the video captured by the unmanned aircraft's camera is sent to the GPU, which runs a machine learning inference, basically scanning the
images to identify, as it has been trained to do, any objects of interest, such as a soldier or tank, and it tags the detected objects with time and location data. The inference code, which is stored on the DuraCOR 312, was developed earlier on a larger system using an algorithm of choice. Images associated with the objects to be detected are formatted appropriately and then used to train the system to recognize them. After the system is trained to the desired percentage of accuracy, the machine-learning model, written using a deep learning framework such as Caffe or TensorFlow, is run in TensorRT, which helps to compress the software code for use on small systems deployed at the edge of the battlefield network. The inference code, once downloaded onto the DuraCOR 312, can then be deployed on a drone and begin to process incoming images to identify those objects it was trained to recognize. The results can then be downlinked to the warfighter in real time or stored on the drone until it returns for offline post-mission analysis. How much storage is required on the aircraft is determined by the distance and duration of the mission and the amount of imagery captured.

When deployed in battlefield environments, cooling the TX2i's power-efficient ARMv8 processor cores and CUDA GPU is aided by an aluminum heat spreader on top of the module that makes contact with all of the hot components. The heat spreader attaches directly to the DuraCOR 312's system chassis using phase change material, making the metal of the chassis an extension of the heatsink. To further adapt the TX2i for rugged military conditions, Curtiss-Wright submits all system components to additional ruggedization. The DuraCOR 312 also uses miniature versions of traditional rugged circular military connectors that, when mated, provide full sealing against dust and water. In fact, the chassis can be fully immersed and will experience no water ingress at all. What's more, all the circuit boards are covered with a conformal coating to protect against humidity and corrosion. Designed for SWaP optimization, this compact AI-engine-based mission computer weighs less than 2.0 lbs. and requires less than 25 W of power.

The DuraCOR 312 is fully compliant with the extremely demanding MIL-STD-810G, MIL-STD-461F, MIL-STD-1275D, MIL-STD-704F, and RTCA/DO-160G environmental, power, and EMI conditions, including high altitude, wide temperature, humidity, extreme shock and vibration, and noisy electrical environments. The unit also provides an aerospace-grade power supply in a fanless IP67-rated mechanical package that handles harsh shock and vibration and operates over extended temperatures without requiring a cold plate or airflow.

Through our experience collaborating with companies like Cisco Systems to deliver rugged network switch solutions for challenging embedded applications, we've learned the benefits, both for the customer and for the COTS vendor, that come from partnering with a technology pacesetter. Now, by leveraging NVIDIA's best-in-class industrial AI engines and packaging them for the military COTS user, we can bring proven deep learning technologies, supported by a rich and familiar development environment, that are viable for the unique requirements of unmanned military platforms. This approach reduces program risk and provides system designers with the shortest and easiest learning curve to shorten time to deployment.

This article was written by Mike Southworth, senior product manager, Curtiss-Wright Defense Solutions (Salt Lake City, UT). For more information, visit http://info.hotims.com/72998-502.
Finding ways to overcome physical limitations so that humans can dive deeper and stay underwater longer has been an ongoing quest. Back in the 15th century, Leonardo da Vinci drew sketches of a submarine and a robot. Had he thought to combine the two concepts, he would have created a prototype of an unmanned underwater vehicle, or underwater drone. Instead, the world had to wait another 500 years.

In the 1950s, in response to sea mines that would go on to destroy or damage more naval ships than all other types of attacks, the U.S. Navy began developing an unmanned underwater vehicle (UUV) that could dive 10,000 feet (rated depth) and function for four straight hours. In the succeeding decades, the Navy developed UUVs that could retrieve lost equipment such as torpedoes or a missing atomic bomb, save submarine crews lost at sea, or locate shipwrecks, including the Titanic. At the beginning of the 21st century, UUVs increasingly became a faster, cheaper, and safer alternative to sending humans and trained animals to defuse sea mines. For example, it takes a team of human divers 21 days to clear sea mines in one square mile, but a UUV such as the Hydroid REMUS can complete the task in only 16 hours.

How Do Unmanned Underwater Vehicles (UUVs) Work?

Unmanned underwater vehicles consist of remotely operated vehicles (ROVs) and autonomous underwater vehicles (AUVs).

Remotely Operated Vehicles (ROVs)

ROVs were developed first and are controlled by humans from a distance. Their sizes vary. All ROVs include built-in sensors, a tether, a tether management system, a flotation pack, and a thruster. Most ROVs have sensors for video cameras and lights. They may also have sonars, magnetometers, water samplers, and instruments that measure water clarity, water temperature, water density, and sound velocity. An ROV might also include a 3D camera to improve the pilot's perception of the underwater scenario.

Linked to a host ship by an umbilical tether that houses energy and communication cables, ROVs use a tether management system (TMS). The TMS makes it possible to adjust the length of the tether to minimize the underwater effect of cable drag. By employing a flotation pack and a thruster, heavy-duty ROVs can achieve precise motion control even in high-current waters.
(Image: Hydroid REMUS 6000.)
| | General Dynamics (Bluefin) | Boeing | Lockheed Martin | Teledyne | Saab | Kongsberg | Hydroid |
|---|---|---|---|---|---|---|---|
| Product example | Bluefin-21 | Echo Voyager | Marlin | Gavia | Sabertooth | HUGIN 1000 | REMUS 600 |
| Weight | 1,650 lbs. | 100,000 lbs. | 3,500 lbs. | 105 lbs. | 1,400 lbs. | 1,400 lbs. | 500 to 850 lbs. |
| Rated depth | 15,000 ft. | 11,000 ft. | 12,000 ft. | 3,000 ft. | 3,600 ft. | 9,000 ft. | 1,800 ft. |
| Power source | Lithium battery, rechargeable | Battery, rechargeable, and marine diesel generator | Battery, rechargeable | Battery, rechargeable | Battery, rechargeable | Battery, rechargeable | Lithium battery, rechargeable |
| Estimated endurance | 25 hours at 3 knots | 3 months; or 280 km | 60 hours over 14 days; 100 km | 7 hours | 8 hours | 100 hours at 4 knots | Up to 24 hours |
| Navigation | Inertial navigation | Inertial navigation, Doppler velocity log, depth sensors | Doppler-aided inertial navigation, USBL, terrain navigation | Inertial navigation, Doppler velocity meter, USBL | IMU/Doppler and terrain navigation | Inertial navigation, micro-navigation | Inertial navigation, acoustic Doppler, sonar |
| Application | Research, military | Oil and gas, military | Oil and gas, military | Research, military | Oil and gas, military | Research, oil and gas, military | Research, military |
Smaller ROVs are used mainly for observation and inspection in the research, military, or recreational sectors because they can only withstand shallow deployment (around 1,000 feet or 300 meters) and have limited horsepower. On the other hand, the largest and most heavy-duty ROVs are suitable for the oil and gas industry because they can dive as deep as 20,000 feet and perform multiple tasks like lifting, drilling, construction, and pipeline inspection.

Autonomous Underwater Vehicles (AUVs)

In recent years, military applications have taken the next step and begun using AUVs, which, unlike ROVs, operate independently of human control and do not have to be tethered to a larger vessel. Having exchanged the tether for more mobility and freedom, AUVs must face the problems of communicating with their ships, navigating on their own, and powering themselves over a long period.

Communication: Water distorts signals, so the usual means of communication, including GPS, mobile (3G, 4G), Wi-Fi, radar signals, and optical navigation such as LiDAR, will all be ineffective for AUVs. While useful, very low frequency (VLF) and extremely low frequency (ELF) signals can only enable one-way communication from the ship to the drone, since VLF or ELF transmitters are too bulky to mount on the drone. Also, VLF and ELF have long wavelengths and thus extremely low bandwidth (1-2 kbps) that restricts the rate of data transfer. Acoustics may therefore be the most suitable communication solution for AUVs, but challenges include signal diffusion, relatively low bandwidth, and power consumption. At 20 kbps, the bandwidth offered by acoustic modems is still insufficient for real-time video, meaning that the human operator on the ship will only see delayed images. The use of high-definition cameras may supplement the collection of information.

Navigation: Signal distortion by water also hampers how well AUVs orient themselves. Therefore, most AUVs use inertial navigation. Inertial navigation supplements dead reckoning, the calculation of the drone's current position based on the previously known position.
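The dead-reckoning step that inertial navigation builds on can be sketched in a short Python example: advance the previous fix using heading and speed over a time step. The local-grid coordinates and the numbers in the usage line are illustrative only.

```python
import math

def dead_reckon(north_m, east_m, heading_deg, speed_mps, dt_s):
    """Advance a position estimate from the previous fix using
    heading and speed: the dead-reckoning update that inertial
    sensors (gyros, accelerometers) refine in a real AUV.
    Positions are metres on a flat local grid for simplicity."""
    heading = math.radians(heading_deg)
    # North component grows with cos(heading), east with sin(heading)
    north = north_m + speed_mps * dt_s * math.cos(heading)
    east = east_m + speed_mps * dt_s * math.sin(heading)
    return north, east

# Travel due east (heading 090) at 2 m/s for 60 s from the origin
pos = dead_reckon(0.0, 0.0, 90.0, 2.0, 60.0)
```

In practice each update inherits the error of the previous one, which is why dead reckoning drifts over time and long-endurance AUVs periodically correct it with acoustic or terrain-based fixes.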
The CAN Bus: Driving the Future of Autonomous Military Vehicles

It's a crisp November day in Michigan, and a convoy of British and American resupply vehicles is rumbling along at a comfortable 25 miles per hour. In the lead is a British Army Rheinmetall MAN Military Vehicles (RMMV) HX-60 truck, trailed closely by two U.S. Army Oshkosh Light Medium Tactical Vehicles (LMTVs). In total, there are zero humans operating this convoy.

This was the scene in November 2017, when the US-UK Coalition Assured Autonomous Resupply (CAAR) demonstration took place at Camp Grayling, Michigan — the result of three years of research and engineering to examine the use of unmanned ground vehicles (UGVs) in resupply situations, tackling so-called "last mile" issues in supplying troops on the front lines of active conflicts around the world.

CAAR is a collaboration among the United Kingdom's Defence Science and Technology Laboratory (Dstl); the U.S. Army's Tank Automotive Research, Development, and Engineering Center (TARDEC); and the U.S. Army Armament Research, Development, and Engineering Center (ARDEC). While this project had been in the works for three years, the U.S. military has been exploring autonomous vehicle technology in earnest since at least the 1980s. Since then, there have been huge leaps in autonomous vehicle technology, and the military has seen already dangerous and logistically difficult resupply issues multiplied in the era of improvised explosive devices (IEDs) and protracted conflicts in remote and desolate terrains.

The technologies powering these UGVs are cutting-edge. Many could hardly have been envisioned a decade ago, but the secret weapon of nearly all of these vehicles is a technology that has been standard in civilian vehicles for nearly 30 years: the Controller Area Network, or CAN bus.

What is a Controller Area Network (CAN)?

A controller area network (CAN) is a message-based protocol that allows internal systems to communicate with one another without a central computer. CAN technology is used in applications as wide-ranging as agriculture, robotics, industrial automation, and medical systems, but it is best known for its use in the automotive industry. In today's connected vehicles, the CAN bus facilitates communication

(Image: a CAN bus board.)
(Illustration: an example layout of sensors on a military vehicle, including ultrasonic sensors, radar, infrared sensors, and data collection and processing.)
Drones

The use of the CAN bus isn't limited to UGVs. Unmanned aircraft systems (UASs) have also adopted CAN technology for its low-latency, reliable communication capabilities. In fact, there's even a UAVCAN protocol designed for aerospace and robotic applications.

The CAN bus allows for communication between embedded systems within a UAV, as well as the transfer of information between a UAV and the remote operator. For instance, the CAN bus can allow the flight controller to manage the throttle on the electronic speed controller (ESC), but it also allows the ESC to return hard real-time data to the flight controller, including temperature, amperage, voltage, and warning signals, via live telemetry. This real-time data, transferred within microseconds, allows remote pilots to react immediately, making for much safer and more reliable UAV flight operations.

MilCAN

To account for the many issues specific to military vehicles, a working group of the International High Speed Data Bus-Users Group (IHSDB-UG) developed the MilCAN higher layer protocol in 1999, with the goal of creating a standard interface for utilizing the CAN bus in military vehicle development. There are two versions of MilCAN: MilCAN A and MilCAN B.

Widely used in armored vehicles, MilCAN A uses 29-bit identifiers and a frame format similar to SAE J1939. Designed with mission-critical applications in mind, MilCAN A prioritizes message transmission and defines 1 Mbit/s, 500 kbit/s, and 250 kbit/s communication rates. MilCAN B is an extension of the CANopen application layer, using 11-bit identifiers and only periodically allowing data to be transmitted via the bus. MilCAN B supports data rates from 10 kbit/s to 1 Mbit/s.

Both protocols were developed to specialize the use of CAN around deterministic data transfer, so the specifications can also be used for non-military applications.

The Future of CAN in Autonomous Military Vehicles

Don't expect robotic armies to be taking over the front lines any time soon, but with successful pilot programs like CAAR demonstrating the viability of heavy-duty, military-grade autonomous vehicles, there is hope for making extremely risky situations — like resupply missions — a little safer. And for the foreseeable future, the CAN bus will be along for that journey.

This article was written by Jesse Paliotto, director of marketing, Kvaser (Mission Viejo, CA). For more information, visit http://info.hotims.com/72993-501.
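The message prioritization behind MilCAN A rests on CAN's bitwise arbitration: when two nodes transmit at once, the frame with the numerically lower identifier wins the bus. The sketch below illustrates this with a 29-bit extended identifier packed from a J1939-style field layout; the exact field map is an assumption for illustration, not the actual MilCAN A identifier definition.

```python
def make_extended_id(priority, subfield, source):
    """Pack a 29-bit extended CAN identifier from illustrative
    fields: 3-bit priority, 18-bit subfield, 8-bit source address.
    This mimics J1939-style addressing; it is NOT the exact
    MilCAN A field map, which is defined in the MilCAN spec."""
    assert priority < 8 and subfield < (1 << 18) and source < 256
    return (priority << 26) | (subfield << 8) | source

def arbitration_winner(id_a, id_b):
    """On a CAN bus, dominant (0) bits win bit-by-bit arbitration,
    so the frame with the lower identifier is transmitted first."""
    return min(id_a, id_b)

urgent = make_extended_id(0, 0x100, 0x10)   # high priority (low value)
routine = make_extended_id(6, 0x100, 0x10)  # low priority (high value)
winner = arbitration_winner(urgent, routine)
```

Because the priority field occupies the most significant bits, a designer can guarantee that safety-critical traffic always pre-empts routine telemetry, which is the deterministic behavior MilCAN A formalizes.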
Autonomous "Wingman" Vehicles

By 2025, the Army sees ground troops conducting foot patrols in urban terrain with robots, called Squad Multipurpose Equipment Transport vehicles, that carry rucksacks and other equipment alongside soldiers. Overhead, unmanned aircraft will also serve as spotters to warn troops so they can engage the enemy on their own terms, according to the Army's new strategy on vehicle technology.
The US Army's Futures Command is the most important administrative reorganization … Instead of a heavier Abrams main battle tank, or a replacement for the aging M113 APC, autonomous "wingman" vehicles may replace some of the human-heavy tasks on the future battlefield.

The questions are many: What is to be the deployment doctrine and the sensors' capabilities? Are these "wingman" autonomous vehicles …? In this article, we briefly examine what the Army's Futures Command has published on the future vehicle fleet, autonomy, and the evolving use cases. The article examines the type of sensors, control, and telemetry needed between an autonomous vehicle and …

… Army organizations, labs, vendors, and industry to openly discuss out-of-the-box ways to address the (primarily) Russian threat. From a ground … changed its defense doctrine multiple times in response to world events like 9/11, urban warfare in Iraq, fighting ISIS — and now, a realization that Russia and China are legitimate … driving cars on the streets." To realize his vision — plus the NDS top-down goals, the Army's top eight, and the NGCV CFT's five vehicle-type portfolio …

Figure 1. The U.S. Army's Cross Functional Teams (CFTs). Note that the Next Generation Combat Vehicle isn't a single vehicle, but a set of at least five different platform types.
… and redundant in-vehicle networks are needed, requiring multi-gigabit Ethernet switches and mass storage.

MVD: Right, and Right Now

Figure 4 shows the DoD's "Unmanned Systems Integrated Roadmap: 2017-2042." The latest version, it identifies numerous key technologies needed for future autonomous vehicles (Table 1).

In 2017, General Micro Systems (GMS) announced a contract win with the Army for the Multi-Function Video Display (MVD) system, designed to bring video and sensor data to Type II Mine Clearing Vehicles on the MRAP chassis. Accepting input from multiple high-resolution video cameras, FLIR, and the Vehicle Optic Sensor System (VPOSS) — all called "vehicle enablers" — the system converts real-time data to ultra-low-latency video-over-IP and transmits it over an industry-standard 1 Gbit/s Ethernet network, first to a rugged, embedded Intel Xeon-class server and mass storage system, then outward to multiple operator smart-screen consoles, each with embedded Intel Core i7 processors. Except for the sensors, GMS builds 100 percent of this intelligent, machine-learning, open-standards network/server/display system. The software that brings the system to life is from the Army's Night Vision Labs and RDECOM and is used to provide situational awareness, operator hand-off, and a common user interface (UI) to disparate operator stations and workloads.

… of sensors on the RCV but are located separately and distant from the RCV. GMS is working on enhancements to MVD that will tick off even more of the technologies in Table 1. For example, the GMS X422 "Lightning," announced last October, brings AI to ground vehicles by adding two Nvidia Tesla V100 GPGPU "data center accelerators" in a conduction-cooled vetronics chassis. An upgraded embedded server and open-standard network switch raise the in-vehicle speed to 10 Gbit/s, providing more bandwidth for high speed and myriad more sensors. However, except for the addition of the AI chassis (about the size of a shoebox), the system remains exceptionally small. It's designed to fit into ground vehicles wherever there's room: under a seat, mounted vertically on a bulkhead, or tucked behind a piece of existing equipment.

Figure 4. The DoD's "Unmanned Systems Integrated Roadmap" in 2017 charts the way for all service branches.

NGCV and Autonomous Vehicle Enablers

As this article has outlined, autonomous "wingman" vehicles are poised to replace many of the human-heavy tasks on the future battlefield. As the Army works to integrate the most advanced technologies into MVD on MRAPs and other Next Generation Combat Vehicles, the US will be more prepared to respond to the "near peer" threats we are seeing.

| Robotics advancements | Sensor advancements |
| --- | --- |
| Prioritized common/open architectures | Collision avoidance |
| Common data repositories | Leader-follower |
| Autonomous modeling and simulation advancements | GPS-denied solutions |
| Machine learning advancements | Cyber resilience and robustness |
| Artificial intelligence advancements | Information assurance solutions |
| SWaP/miniaturization advancements | Increased network and spectrum capacity |
| Swarming capabilities | Human-machine interface advancements |
| Augmented reality | Autonomous data strategy adaptation |
| Virtual reality | |

Table 1. Key technologies identified by the DoD for unmanned systems.

This article was written by Chris A. Ciufo, CTO, General Micro Systems (Rancho Cucamonga, CA). For more information, visit http://info.hotims.com/72993-500.
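The jump from 1 Gbit/s to 10 Gbit/s in-vehicle Ethernet matters because multi-camera video-over-IP consumes bandwidth quickly. A rough Python estimate illustrates this; the 10:1 compression ratio is an assumption chosen for low-latency codecs, not a figure from GMS.

```python
def streams_per_link(link_gbps, width, height, fps, bits_per_pixel, compression):
    """Estimate how many camera streams fit on an Ethernet link.
    The compression ratio is an assumption for illustration;
    real video-over-IP rates depend on the codec and content."""
    raw_bps = width * height * fps * bits_per_pixel
    stream_bps = raw_bps / compression
    return int(link_gbps * 1e9 // stream_bps)

# 1080p30 cameras, 24-bit color, assumed 10:1 low-latency compression
n_1g = streams_per_link(1, 1920, 1080, 30, 24, 10)    # 1 Gbit/s link
n_10g = streams_per_link(10, 1920, 1080, 30, 24, 10)  # 10 Gbit/s link
```

Under these assumptions a 1 Gbit/s link carries only a handful of streams, while 10 Gbit/s leaves headroom for many more cameras plus sensor and storage traffic.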
Drones (i.e., quadrotors) are a popular and increasingly widespread product used by consumers as well as in a diversity of industrial, military, and other applications. Historically under the control of human operators on the ground, they're becoming increasingly autonomous as the cameras built into them find use not only for capturing footage of the world around them but also in understanding and responding to their surroundings. Combining imaging with other sensing modalities can further bolster the robustness of this autonomy.

When a drone flies, it needs to know where it is in three-dimensional space at all times, across all six degrees of freedom, for translation and rotation. Such pose estimation is crucial for flying without crashes or other errors. Drone developers are heavily challenged when attempting to use a single IMU or vision sensor to measure both orientation and translation in space. A hybrid approach combining IMU and vision data, conversely, improves the precision of pose estimation for drones based on the paired strengths of both measuring methods.

The IMU sensor measures acceleration, with information about orientation derived from its raw output data. In theory, such acceleration measurements could also be used to derive translation. However, to calculate such results, developers need to integrate twice, a process that results in increased errors. Therefore, the IMU sensor alone is not an accurate source of precise location information. In contrast, the vision sensor is quite good at measuring location; it is suboptimal at determining orientation, however. Particularly with wide view angles and long-distance observation, it is quite complicated for the vision system alone to measure orientation with adequate precision. A hybrid system of paired IMU and vision data can provide a more precise measurement for the full six degrees of pose in space, providing better results than using either the IMU or the vision sensor individually.

The most challenging issues in such a sensor fusion configuration are to determine a common …
… distortion can occur due to read-out time differences between pixels. IMU information can correct for rolling-shutter artifacts, historically through various filter methods. Nowadays, the noise of the IMU sensor can also be reduced by deep learning-based processing.

Synchronization of the IMU and vision data is not straightforward. Light and lean components with powerful processing units are necessary, in a limited memory footprint, while consuming little power. Stabilized sensors and software filtering are also essential for SoC developers as a result of the extreme jitter due to …

… semi- and fully autonomous vehicles, industrial robots, drones, and other autonomous device applications.

This article was written by Ute Häußler, corporate editor, FRAMOS GmbH (Taufkirchen, Germany). For more information, visit http://info.hotims.com/72993-504.
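The double-integration problem described above (integrating noisy acceleration twice to obtain position) can be demonstrated in a few lines of Python. The noise level and sample rate are arbitrary values chosen for illustration.

```python
import random

def integrate_twice(accels, dt):
    """Integrate acceleration samples twice (acceleration ->
    velocity -> position), as an IMU-only position estimate would."""
    velocity, position = 0.0, 0.0
    for a in accels:
        velocity += a * dt
        position += velocity * dt
    return position

random.seed(0)
dt, n = 0.01, 10_000            # 100 s of samples at 100 Hz (assumed rates)
true_accel = 0.0                # the drone is actually stationary
noisy = [true_accel + random.gauss(0.0, 0.05) for _ in range(n)]

# Even at rest, sensor noise accumulates into a nonzero position drift
drift = integrate_twice(noisy, dt)
```

Because each integration accumulates the errors of the last, the position estimate wanders further from the truth the longer it runs, which is exactly why IMU data is fused with absolute references such as vision.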
A system was developed that uses a deep neural network to help autonomous drones "learn" how to land more safely and quickly while using less power. The system, called the Neural Lander, is a learning-based controller that tracks the position and speed of the drone and …

… in a different way to tease out increasingly complex details. DNNs are capable of automatic learning, which makes them ideally suited for repetitive tasks. To make sure that the drone flies smoothly under the guidance of the DNN, a technique known as spectral …

The new system decreases vertical error by 100 percent, allowing for controlled landings, and reduces lateral drift by up to 90 percent. In experiments, the system achieved actual landing rather than getting stuck about 10 to 15 centimeters above the ground, as unmodified …
(Figure labels: left and right wing-pair flapping mechanisms; dihedral angle control mechanism.)
… to be self-righting after a fall. The self-righting ability is extended to robots with a greater number of degrees of freedom (joints). … one, thus causing the robot to tip over. The software effectively predicts where those transitions might be so the space can be searched efficiently.