This is the author pre-publication version. This paper does not include the changes arising from the revision, formatting and publishing process.
The final paper that should be used for referencing is:
F. Cappello, S. Ramasamy, R. Sabatini, "Multi-Sensor Data Fusion Techniques for RPAS Detect, Track and Avoid", SAE Technical Paper 2015-01-2475, 2015. DOI: 10.4271/2015-01-2475
2015-01-2475

Multi-Sensor Data Fusion Techniques for RPAS Detect, Track and Avoid

Francesco Cappello, Subramanian Ramasamy, Roberto Sabatini


RMIT University – SAMME, Melbourne, Australia

Copyright © 2015 SAE International

Abstract

Accurate and robust tracking of intruders and objects is of growing interest amongst the Remotely Piloted Aircraft System (RPAS) scientific community. The ability of Multi-Sensor Data Fusion (MSDF) algorithms to support effective Detect, Track and Avoid (DTA) mission- and safety-critical tasks, and to aid in accurately predicting future states of intruders, is explored in detail. RPAS are currently not equipped to routinely access all classes of airspace, primarily due to the incipient development of a certifiable on-board Detect-and-Avoid (DAA) system. By employing a unified approach to cooperative and non-cooperative DAA, as well as providing enhanced Communication, Navigation and Surveillance (CNS) services, a cohesive logical framework for the development of an airworthy DAA capability can be achieved. In order to perform effective detection of intruders, a number of high-performance, reliable and accurate avionics sensors and/or systems are adopted, including non-cooperative sensors (visual and thermal cameras, LIDAR, acoustic sensors, etc.) and cooperative systems (Automatic Dependent Surveillance-Broadcast (ADS-B), Traffic Collision Avoidance System (TCAS), etc.). In this paper, MSDF techniques required for sensor and system information fusion are critically analysed and their algorithms are presented. An Unscented Kalman Filter (UKF) and a more advanced Particle Filter (PF) are adopted to estimate the state vector of the objects for maneuvering and non-maneuvering tasks. Furthermore, an artificial neural network is conceptualised to exploit the use of statistical learning methods, which act on the combined information obtained from the UKF and PF.

Introduction

Communication, Navigation and Surveillance (CNS) technologies are, by definition, systems employed to support a seamless global Air Traffic Management (ATM) system [1]. Current and future improvements in CNS technologies will benefit the overall performance of ATM systems in both the civil and military sectors of the aerospace industry. The development of these systems will lead to improvements in the handling and transfer of information, efficient surveillance services using Automatic Dependent Surveillance (ADS) and ADS-Broadcast (ADS-B), and improved accuracy, integrity, availability and continuity of the navigation system [2]. Consequently, separation distances between aircraft and/or Remotely Piloted Aircraft Systems (RPAS) will be reduced, leading to a better demand and capacity balance of the airspace. More specifically, the development of innovative communication systems will lead to more direct and efficient air-ground linkages, interoperability across all applications, and better data formats and bandwidth, leading to a reduction in communication errors [3]. In the foreseeable future, navigation systems will provide high-integrity, high-reliability and worldwide all-weather navigation services [1]. Four-dimensional navigation accuracy will be developed, allowing better airport and runway utilization and greater cost savings from the reduction of ground-based navigation aids. The progression towards area navigation (RNAV) and Global Navigation Satellite System (GNSS) based operations will result in worldwide navigation coverage supporting Non-Precision Approach/Precision Approach (NPA/PA) capabilities, which are currently supported by the majority of airport infrastructure around the globe [4]. These improvements will offer a reduction in workload to the on-board and remotely operating pilot, thus promoting a less stressful environment. The key expected benefits of the developed CNS/ATM technologies are listed below:

 Reduction in human errors by the on-board and remotely operating pilot
 Increased airspace capacity and efficiency
 Reduced separation distances between aircraft

Future surveillance systems will continue to use traditional Secondary Surveillance Radar (SSR), and Mode-S transponders will be gradually introduced in high-density continental airspace [1]. As the implementation of ADS expands across continents, it will provide a better surveillance picture. Aircraft position will be automatically transmitted to ATM by the Next Generation Flight Management System (FMS), along with other information such as altitude, speed and distance, through communication satellites and data links. ADS and ADS-B provide seamless position reporting between communication and navigation technologies and are also capable of providing information during transoceanic operations. ADS-B is a cooperative surveillance technology in which an aircraft determines its position using satellite navigation and periodically broadcasts it, enabling the aircraft to be tracked. Aircraft-to-aircraft ADS-B transmission will also operate to provide situational awareness and facilitate adequate, yet safe, separation and conflict avoidance distances. Therefore, ADS-B technology is a key enabler for carrying out efficient Detect, Track and Avoid (DTA) tasks for all types of aircraft, including RPAS.
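As an illustration of the kind of state information an ADS-B broadcast makes available to a tracking system, the following sketch converts a position/velocity report into a simple track state vector. This is a minimal, hypothetical illustration: the field names and units are assumptions for clarity, not a decoding of the actual ADS-B message format.

```python
from dataclasses import dataclass
import math

@dataclass
class AdsbReport:
    """Minimal ADS-B state-vector report (illustrative field set, not the wire format)."""
    icao_address: str     # 24-bit airframe identifier (hex string)
    latitude_deg: float
    longitude_deg: float
    altitude_ft: float
    ground_speed_kt: float
    track_deg: float      # course over ground, degrees clockwise from north

def to_track_state(report: AdsbReport):
    """Convert a report to a [lat, lon, alt_m, v_north, v_east] state vector,
    suitable for ingestion by a tracking filter in the cooperative branch."""
    speed_ms = report.ground_speed_kt * 0.514444        # knots -> m/s
    track_rad = math.radians(report.track_deg)
    v_north = speed_ms * math.cos(track_rad)
    v_east = speed_ms * math.sin(track_rad)
    return [report.latitude_deg, report.longitude_deg,
            report.altitude_ft * 0.3048,                # ft -> m
            v_north, v_east]
```

Each broadcast received from an intruder would yield one such state vector, which the surveillance tracker treats as a (noisy) measurement of the intruder's kinematic state.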
This research builds upon the development of a highly robust and effective Detect-and-Avoid (DAA) system for RPAS based on both non-cooperative sensors and cooperative systems [5]. Currently the RPAS sector of the aerospace industry is experiencing unparalleled levels of growth across the civil and military domains. RPAS technology is being proposed as an alternative to manned aircraft in an increasing number of applications, especially in dull, dirty and dangerous (D3) missions and safety-critical tasks. Some application domains include surveillance, aerial mapping, border patrol and fire detection. In particular, small to medium-sized RPAS have the ability to perform tasks with higher maneuverability [6]. The evolution of RPAS in the near future presents a series of challenges to develop systems which provide key functionalities (i.e., key operational and technical capabilities) to the RPAS platform and which will enable constant access into all classes of airspace [7]. Identifying the commonalities and differences between manned and Remotely Piloted Aircraft (RPA) is the first step toward developing a regulatory framework that will provide, at a minimum, an equivalent level of safety for RPAS integration into all classes of airspace and aerodromes [8]. Safety is represented for manned aircraft by a consolidated system of rules regarding the design, manufacture, maintenance and operation of aircraft, including Air Traffic Management (ATM) procedures, systems and services, personnel licensing, and aerodrome operations [9]. These rules have to be applied to RPAS on the basis of the equivalence principle, referring to the need to maintain a safety standard at least equivalent to the one required for manned aircraft [9]. In manned aviation the pilot, together with supporting systems, contributes to safe flight by avoiding traffic, ground obstacles and dangerous weather [10]; electronic sensors and systems will be adopted to provide key functionalities to the RPAS that satisfactorily mimic this function of the pilot, in order to DTA objects and obstacles on a collision path. A shortcoming of RPA is that the situational awareness of the pilot is restricted to the field of view (FOV) of the Human Machine Interface (HMI) display screen. DAA functions and techniques will provide increased autonomy to the platform, consequently reducing pilot workload and stress, and in addition offering increased safety and situational awareness [11]. To facilitate civilian, military and academic research into the key systems and software which RPAS require to be integrated into all classes of airspace, a roadmap has been developed by the International Civil Aviation Organization (ICAO) as part of the Aviation System Block Upgrades (ASBU) [12]. Specifically, RPAS integration entails insertion into the air traffic management system, DAA (air and ground, including situational awareness) and weather awareness [10]. The DAA capability will enable RPAS to equal or exceed the performance of the see-and-avoid ability of the pilot in manned aircraft systems [7]. Both cooperative systems and non-cooperative sensors for DAA tasks are being developed to enable RPAS to routinely access all classes of airspace [5, 7]. Both cooperative and non-cooperative systems/sensors are used for the DAA function measurements; the hardware devices include passive and active Forward Looking Sensors (FLS), Traffic Collision Avoidance System (TCAS) and Automatic Dependent Surveillance-Broadcast (ADS-B) [5, 7]. Enhanced surveillance solutions based on the ADS-B system and TCAS are required for addressing cooperative DAA functions [5, 7]. A number of ground/space based sensors/systems are used for surveillance of the surrounding environment of an aircraft [7]. The progressive development of GPS-based ADS-B systems shall gradually change the role of radar in civil surveillance applications [13]. New systems which are to be developed should be interoperable and compatible with current systems in use [13]. The airspace surveillance communication dynamics, where ADS-B and other technologies such as Satellite Communications (SATCOM) are used, will all eventually be fused with onboard non-cooperative sensors on the aircraft. The emphasis of this paper is on fusing cooperative systems and non-cooperative sensor measurements for enhanced airborne surveillance (DTA) solutions, as illustrated in Figure 1.

[Figure: Detect, Track and Avoid loop driven by cooperative systems, non-cooperative sensors and naturally inspired sensors]

Figure 1. Airborne surveillance system and DAA functions.

In Intelligent Surveillance Systems (ISS), technologies such as computer vision, pattern recognition, and artificial intelligence are developed to identify abnormal behaviors in videos. As a result, fewer human observers can monitor more scenarios with high accuracy [14]. This means that surveillance operations are constantly evolving towards an increasingly autonomous nature where human interaction is needed less frequently. Multiple sensing is the ability to sense the environment with the concurrent use of several, possibly dissimilar, sensors [15]. The reason for multiple sensing is to be able to perform more tasks, more efficiently, in a reliable way [15]. These techniques are increasingly being applied to aerial robotic systems, such as Vision Based Navigation (VBN) techniques for approach and landing tasks [16, 17]. However, due to the inherent robotic nature of the RPAS, its engineering is focusing on an increasingly intelligent approach to design and integration. Therefore, surveillance systems onboard RPAS are also being developed to reduce the workload of pilots and to increase the autonomy of the vehicle. Surveillance systems use a number of sensors to obtain information about the surrounding environment, visual and thermal sensors being two of the many sensors used. To fully exploit the fact that more than one sensor is being used, the information is fused using an approach called data fusion. Data fusion is implemented in research areas such as signal processing, pattern and image recognition, artificial intelligence, and information theory. A common feature of these architectures is that they incorporate multiple levels of information processing within the data fusion process [18, 19]. More specifically, sensor fusion is the combining of sensor data from unlike sources so that the resulting information has less uncertainty than if these sources were used individually. Current state-of-the-art airborne sensors and
MSDF methods for surveillance detection and tracking tasks have employed non-cooperative and cooperative sensors/systems for DAA solutions [7]. Data fusion of surveillance sensors utilizes these sensors/systems in order to detect and track obstacles and intruders. A non-cooperative collision avoidance system for RPAS using pulsed Ka-band radar and optical sensors has been developed [20]. An acoustic-based DAA system for small RPAS has been adopted, employing a multi-sensor platform with acoustic sensors that allows detection of obstacles and intruders in a 360º Field of View (FOV) and performs quick-reaction manoeuvres for avoidance [21]. This system allows all-weather operations and offers advantages in terms of power consumption and cost. An approach to the definition of encounter models and their applications to DAA strategies, presented in [7], takes into account both cooperative and non-cooperative scenarios. Ground-Based DAA (GBDAA) systems using electronic sensors are also currently being developed. A summary of airborne surveillance systems currently available is provided in Table 1.

Table 1. Airborne surveillance systems [adapted from 1].

Airspace                                              | Current system                                                  | Future system
Oceanic/continental en-route airspace with high/low-  | Primary radar/SSR, voice position reports, OMEGA/LORAN-C, NDB   | ADS, RNAV/RNP, GNSS
density traffic                                       |                                                                 |
Continental airspace with high-density traffic        | Primary radar, SSR Mode A/C                                     | SSR [Mode A/C or Mode S], ADS
Terminal areas with high-density traffic              | Primary radar, SSR Mode A/C                                     | SSR [Mode A/C or Mode S], ADS

These ground-based systems provide information for manoeuvre decisions in terminal-area operations [22]. In regards to image processing techniques, a target that is well camouflaged against visual detection will be hard (or even impossible) to detect in the visual bands, whereas it may still be detectable in the thermal bands [23]. A combination of visible and thermal imagery may then allow both the detection and the unambiguous localization of the target (represented in the thermal image) with respect to its background (represented in the visual image) [24]. Significant outcomes are expected from novel Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) systems, in line with the large-scale and regional ATM modernisation programmes including Single European Sky ATM Research (SESAR) and the Next Generation Air Transportation System (NextGen). The key enabling elements required for the evolution of the CNS/ATM and Avionics (CNS+A) framework have been identified as part of these programmes [25-30]. In this paper a novel technique for data fusion is introduced and illustrated with a number of architectures based on visual/thermal sensor fusion with ADS-B/TCAS systems. This architecture will be employed for DAA applications for both manned aircraft and RPAS. The paper is arranged as follows: an overview of the methodology adopted for data fusion is presented, followed by a list of algorithms adopted for the fusion of cooperative system and non-cooperative sensor information for tracking. Furthermore, the architectures adopted for ADS-B/TCAS and visual/thermal fusion are also described.

DTA Multi-Sensor Architecture and Data Fusion

MSDF is an effective way of optimizing large volumes of data and is implemented by combining information from multiple sensors to achieve inferences that are not feasible from a single sensor or source [31]. MSDF is an emerging technology applied to many areas in the civilian and military domains, such as automated target recognition, battlefield surveillance, guidance and control of autonomous vehicles, monitoring of complex machinery, medical diagnosis, and smart buildings [32]. Techniques for MSDF are drawn from a wide range of areas including pattern recognition, artificial intelligence and statistical estimation [31]. There are many algorithms for data fusion which optimally or sub-optimally combine the information to derive a processed best estimate. Mathematically, the type of system determines whether it can be optimally estimated: in general, linear systems can be optimally estimated with the original Kalman Filter, while non-linear systems are sub-optimally estimated with approximation techniques which linearize the non-linear system. Real-life navigation and surveillance applications are almost always described by non-linear equations, therefore approximation techniques are commonly applied; some techniques include the Extended Kalman Filter (EKF), Unscented Kalman Filter (UKF), Square Root-UKF (SR-UKF), Cubature Kalman Filter (CKF), Point Mass Filter (PMF) and the Particle Filter (PF) [16, 17, 33]. In the past three decades, the EKF has become the most widely used algorithm in numerous nonlinear estimation applications [34]. In comparison to the UKF, the EKF is difficult to manipulate (i.e. computationally intractable) due to the derivation of the Jacobian matrices [16]. Furthermore, the accuracy of the propagated mean and covariance is limited to a first-order Taylor series expansion, which is caused by its linearization process [35]. The UKF overcomes the limitations of the EKF by providing derivative-free higher-order approximations, approximating a Gaussian distribution rather than approximating an arbitrary nonlinear function [36, 37]. The UKF is more accurate and robust in navigation applications, also providing much better convergence characteristics. The UKF uses sigma points and a process known as the Unscented Transform (UT) to evaluate the statistics of a nonlinearly transformed random variable [35]. Due to the advancements in modern computing technologies, algorithms such as the UKF and the PF can now be implemented effectively for real-time systems. The aim of this research is to adopt MSDF techniques to provide the best estimate of track data obtained from cooperative systems and non-cooperative sensors. The UKF is used to process the cooperative system measurements and a PF is adopted to combine the non-cooperative sensor measurements. The PF or Sequential Monte Carlo (SMC) methods are a broad and well known category of Monte Carlo
algorithms developed in the last two decades to provide approximate solutions to intractable inference problems [38]. The methodology is used for nonlinear filtering problems seen in Bayesian statistical inference and signal processing. When using the PF for filtering or estimation problems, the observations are used to infer the internal states of the system. The system dynamic process and the sensor measurements are both affected by random noise, which is accounted for in the filtering process. The objective is to compute the posterior distributions of the states or hidden variables from the noisy and partial observations. The main objective of a PF is thus estimating the posterior density of the hidden state variables using the observation variables. The PF is designed for a hidden Markov Model, where the system consists of hidden and observable variables. The dynamical system describing the evolution of the state variables is also known probabilistically and is used in the prediction and updating process of the PF. A set of particles is initialised in the PF to generate a sample distribution, where each particle is assigned a weight representing the probability of that particle being sampled from the probability density function. Weight disparity leading to weight collapse is a common issue encountered in these filtering algorithms; however, it can be mitigated by including a resampling step before the weights become too uneven [39]. ADS-B and TCAS are exploited in the MSDF architecture, which provides position and velocity estimates for the detection and tracking solution. The MSDF technique used to fuse the cooperative systems is the UKF (MSDF C), the MSDF technique used to fuse the non-cooperative sensors is the PF (MSDF NC), and the technique used to fuse the MSDF C and MSDF NC outputs is based on memory-based learning. Future research will aim to use the non-maneuvering tracking algorithms to train the NN. A typical architecture for MSDF is illustrated in Figure 2.

[Figure: cooperative systems (ADS-B, TCAS) feed the C MSDF block; non-cooperative sensors (visual, thermal, LIDAR, MMW radar, acoustic) feed the NC MSDF block; both feed the DTA MSDF block, which outputs tracks of intruders/objects]

Figure 2. Multi-Sensor Data Fusion for DAA.

In order to estimate the nonlinear functions, approximation methods are used. On-board trajectory replanning with dynamically updated constraints based on the intruder and host dynamics is at present used to generate obstacle avoidance trajectories [7, 40]. The states of the tracked obstacles, obtained by adopting the UKF/PF or other MSDF techniques, are used in order to predict the trajectory in a given time horizon. A coarse-resolution radar based DAA solution is implemented for mini RPAS and its information is fused with data from the ADS-B system. In most realistic problems the real world may be described by nonlinear differential equations, and the measurements may not be a linear function of those states. In the present case the measurements are a linear function of the states, but the model of the real world is nonlinear. Previous research included the use of the IMM filter to fuse the measurements in the cooperative/non-cooperative unified approach [7]. In order to predict the track of the target object/aircraft, MSDF techniques are used to obtain the best estimate and then a probabilistic model is applied. What is being proposed is that the UKF is used to obtain a better estimate, so that a better prediction will be obtained because the errors are essentially reduced. The detection information provided by several awareness sensors should be evaluated and prioritized taking into account the rest of the UAS information, such as telemetry, flight plan and mission information [41]. A multi-sensor surveillance system includes functions such as detection and tracking which utilize cooperative and non-cooperative information sources in a multi-sensor architecture. The output of an effective multi-sensor tracker provides a unified surveillance picture with a relatively small number of confirmed tracks and improved target localization [19]. Global multi-sensor fusion and tracking is capable of processing (potentially anomalous) AIS tracks and contact-level or track-level data from other sensors to produce a single, consolidated surveillance picture [19]. In this paper, we focus on high-performance multi-sensor fusion, which also provides the basis for anomaly-detection work [12]. The following filters have been used for tracking of intruders [37].

Obstacle detection and tracking algorithms

Tracking of a single non-maneuvering object
1. The optimal Bayesian filter
2. The Kalman Filter
3. The Extended Kalman Filter
4. The Unscented Kalman Filter
5. The Point Mass Filter
6. The Particle Filter

Maneuvering object tracking
1. The optimal Bayesian filter
2. Generalized pseudo-Bayes filters (of order 1 and order 2)
3. Interacting Multiple Model filter
4. Bootstrap filter (auxiliary)
5. Extended Kalman auxiliary particle filter

Single object tracking in clutter
1. The optimal Bayesian filter
2. Nearest neighbor filter
3. Particle filter
4. Extended Kalman auxiliary particle filter
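The weight-collapse safeguard described above for the PF (a resampling step taken before the weights become too uneven) is commonly triggered by monitoring the effective sample size. A minimal sketch follows; the systematic-resampling scheme and the effective-sample-size threshold are implementation assumptions, not prescribed by the architecture:

```python
import numpy as np

def resample_if_degenerate(particles, weights, threshold_ratio=0.5, rng=None):
    """Systematic resampling, triggered when the effective sample size
    drops below a fraction of the particle count (weight-collapse guard).
    Weights are assumed normalized to sum to 1."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    n_eff = 1.0 / np.sum(weights ** 2)        # effective sample size
    if n_eff >= threshold_ratio * n:
        return particles, weights             # weights still well spread
    # Systematic resampling: one uniform draw gives n evenly spaced positions
    positions = (rng.uniform() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                      # guard against round-off
    idx = np.searchsorted(cumulative, positions)
    # Copy high-weight particles and reset the weights to uniform
    return particles[idx], np.full(n, 1.0 / n)
```

After resampling, particles with negligible weight are discarded and high-weight particles are duplicated, so the particle cloud again concentrates its computational effort where the posterior density is significant.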
Single- and multiple-object tracking in clutter: object-existence-based approach
1. Optimal Bayes' recursion
2. Joint Probabilistic Data Association (JPDA)
3. Interacting Multiple Models – Joint Integrated Probabilistic Data Association (IMM-JIPDA)

Multiple-object tracking in clutter: random-set-based approach

Random Finite Set (RFS) formalism
1. The optimal Bayesian multi-object tracking filter
2. The Probabilistic Hypothesis Density (PHD) approximation

Approximate filters
1. Gaussian mixture PHD filter
2. Particle PHD filter
3. Gaussian mixture CPHD algorithm

Object-existence-based tracking filters
1. The Integrated Probabilistic Data Association (IPDA) filter
2. Joint IPDA derivation from the Gaussian mixture PHD filter

The above are several well-known Bayesian algorithms for recursive state estimation. These algorithms, known as filters, result from the general Bayesian recursive equation [37]. These filtering algorithms are viewed as algorithms for the single, multiple, maneuvering and non-maneuvering object tracking problem. Maneuvering objects are those objects whose dynamical behavior changes over time. An object that suddenly turns or accelerates displays a maneuvering behavior with regard to its tracked position [37]. While the definition of a maneuvering object extends beyond the tracking of position and speed, historically it is in this context that maneuvering object tracking theory developed [37]. In the future, this service has to incorporate 4D navigation in order to know at what moment the RPAS is going to arrive at the different flight plan waypoints. All flight phases the RPAS goes through to perform a mission are also provided (such as on-ground, departure, en-route, mission or arrival). This data is going to be important in selecting a suitable conflict reaction. As a result of this prioritization, the ADF service may choose between two possible outputs: collision avoidance maneuver required or self-separation maneuver required [41]. A simple false track discrimination scheme may be described by the track status propagation illustrated in Figure 3.

[Figure: false track discrimination scheme, in which a track with tentative status is compared against a track threshold and then confirmed or terminated based on new measurements]

Figure 3. False track discrimination scheme.

 Each new track has the tentative status, which may be changed by confirmation or termination using subsequent measurements;
 if the probability of object existence rises above a predetermined track confirmation threshold, the track becomes confirmed, and stays confirmed until termination; and
 when the probability of object existence falls below a predetermined track termination threshold, the track is terminated and removed from memory.

The Unscented Kalman Filter

The UKF is an alternative to the EKF which shares its computational simplicity while avoiding the need to derive and compute Jacobians, and achieves greater accuracy [37]. The UKF is based upon the UT process; this method approximates the moments of a non-linearly transformed random variable. The UT can be used as the basis of a non-linear filtering approximation which has the same form as the EKF but is derived in a quite different manner. The posterior PDF of $x_k$ using Bayes' rule is given by:

p(x_k \mid y_{1:k}) = \frac{p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})}{p(y_k \mid y_{1:k-1})}   (1)

The UKF approximates the joint density of the state and measurement, conditional on the measurement history, by a Gaussian density expressed as:

p(x_k, y_k \mid y_{1:k-1}) \approx \mathcal{N}\!\left( \begin{bmatrix} x_k \\ y_k \end{bmatrix}; \begin{bmatrix} \hat{x}_{k|k-1} \\ \hat{y}_{k|k-1} \end{bmatrix}, \begin{bmatrix} P_{k|k-1} & P^{xy}_{k|k-1} \\ (P^{xy}_{k|k-1})^{\mathsf T} & S_k \end{bmatrix} \right)   (2)
This implies that:

p(x_k \mid y_{1:k}) \approx \mathcal{N}(x_k; \hat{x}_{k|k}, P_{k|k})   (3)

In general, the moments appearing in these equations cannot be computed exactly and so must be approximated. The UT is used for this purpose. The state prediction is performed by predicting the mean and covariance matrix of the transformed state, where the dynamic model is given by:

x_k = f_{k-1}(x_{k-1}) + v_{k-1}   (4)

where the statistics of $x_{k-1}$ given $y_{1:k-1}$ are available from the previous update. The predicted mean and covariance matrix are given by:

\hat{x}_{k|k-1} = E\!\left( f_{k-1}(x_{k-1}) \mid y_{1:k-1} \right)   (5)

P_{k|k-1} = E\!\left( \left( f_{k-1}(x_{k-1}) - \hat{x}_{k|k-1} \right)\left( f_{k-1}(x_{k-1}) - \hat{x}_{k|k-1} \right)^{\mathsf T} \mid y_{1:k-1} \right) + Q_{k-1}   (6)

Approximations to the predicted mean and covariance matrix are obtained using the UT. Sigma points $\chi^j_{k-1|k-1}$ and weights $w^j$ are selected to match the mean $\hat{x}_{k-1|k-1}$ and the covariance matrix $P_{k-1|k-1}$ of $p(x_{k-1} \mid y_{1:k-1})$. With the transformed sigma points $f_{k-1}(\chi^j_{k-1|k-1})$, where $j = 0, \ldots, 2n$, the UT approximations to the predicted mean and covariance matrix are [37]:

\hat{x}_{k|k-1} \approx \sum_{j=0}^{2n} w^j f_{k-1}(\chi^j_{k-1|k-1})   (7)

P_{k|k-1} \approx \sum_{j=0}^{2n} w^j \left( f_{k-1}(\chi^j_{k-1|k-1}) - \hat{x}_{k|k-1} \right)\left( f_{k-1}(\chi^j_{k-1|k-1}) - \hat{x}_{k|k-1} \right)^{\mathsf T} + Q_{k-1}   (8)

The measurement prediction: the predicted statistics for the measurement $y_k$ are moments of the transformation given by:

y_k = h_k(x_k) + w_k   (9)

where the statistics of $x_k$ given $y_{1:k-1}$ are available from the prediction step. Hence we have:

\hat{y}_{k|k-1} = E\!\left( h_k(x_k) \mid y_{1:k-1} \right)   (10)

P^{xy}_{k|k-1} = E\!\left( \left( x_k - \hat{x}_{k|k-1} \right)\left( h_k(x_k) - \hat{y}_{k|k-1} \right)^{\mathsf T} \mid y_{1:k-1} \right)   (11)

S_k = E\!\left( \left( h_k(x_k) - \hat{y}_{k|k-1} \right)\left( h_k(x_k) - \hat{y}_{k|k-1} \right)^{\mathsf T} \mid y_{1:k-1} \right) + R_k   (12)

Let $\chi^j_{k|k-1}$ and $w^j$ denote the sigma points and weights, respectively, selected to match the predicted mean and covariance matrix. The transformed sigma points are $\zeta^j = h_k(\chi^j_{k|k-1})$, $j = 0, \ldots, 2n$. The UT approximations to the moments are [37]:

\hat{y}_{k|k-1} \approx \sum_{j=0}^{2n} w^j \zeta^j   (13)

P^{xy}_{k|k-1} \approx \sum_{j=0}^{2n} w^j \left( \chi^j_{k|k-1} - \hat{x}_{k|k-1} \right)\left( \zeta^j - \hat{y}_{k|k-1} \right)^{\mathsf T}   (14)

S_k \approx \sum_{j=0}^{2n} w^j \left( \zeta^j - \hat{y}_{k|k-1} \right)\left( \zeta^j - \hat{y}_{k|k-1} \right)^{\mathsf T} + R_k   (15)

The resulting expression for the posterior PDF is given by:

p(x_k \mid y_{1:k}) \approx \mathcal{N}(x_k; \hat{x}_{k|k}, P_{k|k})   (16)

where:

\hat{x}_{k|k} = \hat{x}_{k|k-1} + P^{xy}_{k|k-1} S_k^{-1} \left( y_k - \hat{y}_{k|k-1} \right)   (17)

P_{k|k} = P_{k|k-1} - P^{xy}_{k|k-1} S_k^{-1} \left( P^{xy}_{k|k-1} \right)^{\mathsf T}   (18)

As with the EKF, the UKF is strongly reminiscent of the KF, with the main difference being the dependence of the posterior covariance matrix on the observed measurements. In the UKF this occurs because the sigma points generated in the UT process are moment approximations and are determined from state estimates which are measurement dependent [37]. Generally the UKF runs at the same order of computational expense as the EKF; however, it generally performs much better. This can be attributed to the increased accuracy of the moment approximations in the UKF, arising from the UT process. In fact, the UKF achieves the same level of accuracy as the second-order EKF, which requires the derivation of Jacobian and Hessian matrices [37]. Before concluding, we note that the assumption of additive noise in the dynamic and measurement equations has been made only for the sake of convenience [37]. The UT is equally applicable for non-additive noise. For instance, if the dynamic equation is $x_k = f_{k-1}(x_{k-1}, v_{k-1})$, then the quantity undergoing a non-linear transformation is the augmented variable $z_{k-1} = [x_{k-1}^{\mathsf T}, v_{k-1}^{\mathsf T}]^{\mathsf T}$, which has the statistics $E(z_{k-1} \mid y_{1:k-1}) = [\hat{x}_{k-1|k-1}^{\mathsf T}, 0^{\mathsf T}]^{\mathsf T}$ and $\operatorname{cov}(z_{k-1} \mid y_{1:k-1}) = \operatorname{diag}(P_{k-1|k-1}, Q_{k-1})$ [37]. The UT can be applied to the random variable $z_{k-1}$ transformed through the function $f_{k-1}$ to approximate the predicted state statistics [37]. Since $z_{k-1}$ is of larger dimension, a larger number of sigma points will be required than for the case where the dynamic noise is additive [37]. Similar comments hold for a measurement equation which is non-linear in the measurement noise [37]. A summary of the UKF is given below.

Algorithm Review

1. Determine sigma points $\chi^j_{k-1|k-1}$ and weights $w^j$ to match the mean $\hat{x}_{k-1|k-1}$ and covariance matrix $P_{k-1|k-1}$.
2. Compute the transformed sigma points $f_{k-1}(\chi^j_{k-1|k-1})$, $j = 0, \ldots, 2n$.
3. Compute the predicted state statistics:

\hat{x}_{k|k-1} = \sum_{j=0}^{2n} w^j f_{k-1}(\chi^j_{k-1|k-1})   (19)

P_{k|k-1} = \sum_{j=0}^{2n} w^j \left( f_{k-1}(\chi^j_{k-1|k-1}) - \hat{x}_{k|k-1} \right)\left( f_{k-1}(\chi^j_{k-1|k-1}) - \hat{x}_{k|k-1} \right)^{\mathsf T} + Q_{k-1}   (20)
4. Determine sigma points ξ^j and weights w^j, j = 1, ..., p, to match the mean x̂_{k|k-1} and covariance matrix P_{k|k-1}.

5. Compute the transformed sigma points ζ^j = h(ξ^j), j = 1, ..., p.

6. Compute the predicted measurement statistics:

ŷ_{k|k-1} = Σ_j w^j ζ^j   (21)

S_{k|k-1} = Σ_j w^j (ζ^j - ŷ_{k|k-1})(ζ^j - ŷ_{k|k-1})^T + R_k   (22)

P^{xy}_{k|k-1} = Σ_j w^j (ξ^j - x̂_{k|k-1})(ζ^j - ŷ_{k|k-1})^T   (23)

7. Compute the posterior mean and covariance matrix, with the gain K_k = P^{xy}_{k|k-1} S^{-1}_{k|k-1}:

x̂_{k|k} = x̂_{k|k-1} + K_k (y_k - ŷ_{k|k-1})   (24)

P_{k|k} = P_{k|k-1} - K_k S_{k|k-1} K_k^T   (25)

The Particle Filter

There are different methods for target tracking, but particle filtering (PF) is one of the most important and widely used. The PF is an implementation of Bayesian recursive filtering using the sequential Monte Carlo sampling method [42]. The PF is used in statistics and signal processing, where it is known as the bootstrap filter and the Monte Carlo filter respectively [43]. The target state is represented by x_k, where k is the time index, and y_{1:k} is the set of all observations. The posterior density function p(x_k | y_{1:k}) is represented by a set of weighted samples. The observations in the PF method can be non-Gaussian. The main idea of the PF is the estimation of the Probability Distribution Function (PDF) of the target object by a set of weighted samples {(x_k^{(i)}, w_k^{(i)})}, i = 1, ..., N, such that Σ_i w_k^{(i)} = 1 [43]. Based on the observations, the likelihood of each particle is applied in the weight assignment at each step:

w_k^{(i)} = p(y_k | x_k^{(i)}) / Σ_{j=1}^{N} p(y_k | x_k^{(j)})   (26)

The PF consists of two stages, prediction and update. In the prediction stage, the next-step particles are calculated by using the current-step particles and the target state transition equations. In the update stage, the weights are assigned to the new particles by the following equation [43]:

w_k^{(i)} ∝ w_{k-1}^{(i)} p(y_k | x_k^{(i)}) p(x_k^{(i)} | x_{k-1}^{(i)}) / q(x_k^{(i)} | x_{k-1}^{(i)}, y_k)   (27)

where q(·) is the importance density function which is used in the sampling step. The posterior density function is estimated by using the following equation:

p(x_k | y_{1:k}) ≈ Σ_{i=1}^{N} w_k^{(i)} δ(x_k - x_k^{(i)})   (28)

The next-step state vector is estimated by using the weighted average over the current-step particles with the following equation [43]:

x̂_k = Σ_{i=1}^{N} w_k^{(i)} x_k^{(i)}   (29)

Sequential Importance Resampling Particle Filter

One of the basic challenges in the implementation of the PF is the appropriate choice of the proposal density function. A common suboptimal choice of proposal density function is the transitional prior distribution [43]:

q(x_k | x_{k-1}^{(i)}, y_k) = p(x_k | x_{k-1}^{(i)})   (30)

By substituting (30) into each particle's weight, a new weighting can be calculated based on:

w_k^{(i)} ∝ w_{k-1}^{(i)} p(y_k | x_k^{(i)})   (31)

In each step of the SIR algorithm, re-sampling is done after state estimation. After re-sampling, the weights of all particles are equal to 1/N, where N is the number of particles. Hence the above equation is modified as follows:

w_k^{(i)} ∝ p(y_k | x_k^{(i)})   (32)

SIR Algorithm

Initialise: generate the particles using the prior distribution of the target.

FOR i = 1:N

a. Prediction: generate particles using the dynamic model

x_k^{(i)} = f(x_{k-1}^{(i)}, v_{k-1})   (33)

b. Weighting: calculate the weight for each particle

w_k^{(i)} = p(y_k | x_k^{(i)})   (34)

END FOR

Normalize the particle weights by

ŵ_k^{(i)} = w_k^{(i)} / Σ_{j=1}^{N} w_k^{(j)}   (35)

Resampling: resample the particles to obtain a new particle set with equal weights, and repeat from stage 2:

{(x_k^{(i)}, ŵ_k^{(i)})} → {(x_k^{(i*)}, 1/N)}   (36)
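As an illustration, the SIR loop above (equations 30-36) can be sketched in a few lines of NumPy for a one-dimensional tracking problem. The dynamic model, noise levels, particle count and target trajectory below are illustrative assumptions, not values from this paper:

```python
import numpy as np

def sir_step(particles, y, f, sigma_v, sigma_w, rng):
    """One SIR cycle: predict with the dynamic model (eq. 33), weight by
    the measurement likelihood (eq. 34), normalise (eq. 35), resample (eq. 36)."""
    N = particles.size
    # a. Prediction: sample from the transitional prior p(x_k | x_{k-1}^(i))
    particles = f(particles) + rng.normal(0.0, sigma_v, N)
    # b. Weighting: w_k^(i) = p(y_k | x_k^(i)); Gaussian measurement noise assumed
    w = np.exp(-0.5 * ((y - particles) / sigma_w) ** 2)
    w /= w.sum()
    # Weighted state estimate (eq. 29), computed before resampling
    x_hat = np.sum(w * particles)
    # Resampling: draw N particles with probability w; weights reset to 1/N
    particles = rng.choice(particles, size=N, p=w)
    return particles, x_hat

# Illustrative run: near-stationary target at x = 2 observed in noise
rng = np.random.default_rng(1)
particles = rng.uniform(-5.0, 5.0, 500)
for _ in range(20):
    y = 2.0 + rng.normal(0.0, 0.3)
    particles, x_hat = sir_step(particles, y, lambda x: x, 0.2, 0.3, rng)
```

After a few cycles the weighted estimate x_hat settles in the neighbourhood of the true state; multinomial resampling is used here for simplicity, though systematic resampling is a common lower-variance alternative.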
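Returning to the UKF, the seven-step Algorithm Review above can likewise be condensed into a short NumPy sketch. Additive Gaussian noise is assumed, and the basic symmetric sigma-point set with a tuning parameter κ is one common choice adopted here for illustration (the paper does not prescribe a specific set):

```python
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Symmetric sigma-point set: 2n+1 points and weights matching (mean, cov)."""
    n = mean.size
    L = np.linalg.cholesky((n + kappa) * cov)  # columns satisfy sum_i L_i L_i^T = (n+k)P
    pts = [mean] + [mean + L[:, i] for i in range(n)] + [mean - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_step(x, P, y, f, h, Q, R):
    """One UKF cycle following steps 1-7 of the Algorithm Review."""
    # Steps 1-3: propagate sigma points through f, form eqs. (19)-(20)
    chi, w = sigma_points(x, P)
    chi_f = np.array([f(c) for c in chi])
    x_pred = w @ chi_f
    P_pred = sum(wi * np.outer(c - x_pred, c - x_pred) for wi, c in zip(w, chi_f)) + Q
    # Steps 4-6: propagate new sigma points through h, form eqs. (21)-(23)
    xi, w = sigma_points(x_pred, P_pred)
    zeta = np.array([h(s) for s in xi])
    y_pred = w @ zeta
    S = sum(wi * np.outer(z - y_pred, z - y_pred) for wi, z in zip(w, zeta)) + R
    Pxy = sum(wi * np.outer(s - x_pred, z - y_pred) for wi, s, z in zip(w, xi, zeta))
    # Step 7: posterior mean and covariance via the gain K = Pxy S^-1, eqs. (24)-(25)
    K = Pxy @ np.linalg.inv(S)
    x_post = x_pred + K @ (y - y_pred)
    P_post = P_pred - K @ S @ K.T
    return x_post, P_post

# Illustrative one-step run on a linear model (where the UT is exact)
x0, P0 = np.zeros(2), np.eye(2)
x1, P1 = ukf_step(x0, P0, np.array([1.0]),
                  f=lambda s: s, h=lambda s: s[:1],
                  Q=0.01 * np.eye(2), R=np.array([[0.1]]))
```

For the linear model the sketch reproduces the Kalman filter exactly; for non-linear f and h the same code applies unchanged, which is the practical appeal of the UT.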

Statistical Learning Techniques

Statistical learning theory is a framework for machine learning drawing from the fields of statistics and functional analysis [44]. Statistical learning concerns itself with finding a predictive function for a given task. Computer vision, speech recognition and bioinformatics are areas in which statistical learning theory has been successfully applied. The goal of learning is prediction. Learning falls into many categories, including supervised learning, unsupervised learning, online learning and reinforcement learning. From the perspective of statistical learning theory, supervised learning is best understood [45]. Supervised learning involves learning from a training set of data. Every point in the training set is an input-output pair, where the input maps to an output. The learning problem consists of inferring the function that maps between the input and the output in a predictive fashion, such that the learned function can be used to predict the output from future input. A major problem that arises in machine learning is overfitting: because learning is a prediction problem, the goal is not to find a function that most closely fits the (previously observed) data, but to find one that will most accurately predict the output from future input [46]. Some examples of statistical learning techniques include, but are not limited to, artificial neural networks, instance-based learning, learning with hidden variables and kernel machines, which are succinctly depicted in Figure 4 [47].

[Figure 4 diagram: statistical learning methods linked to instance-based learning, artificial neural networks, learning with hidden variables and kernel machines.]

Figure 4. Artificial intelligence techniques (Adapted from [33]).

The ability to make rapid and effective decisions is difficult in itself, and this difficulty is increased by the fact that the environment is fraught with uncertainty [48]. In this case the RPA or agent can handle uncertainty by using the methods of probability and decision theory, but first it must learn its probabilistic theories of the world from experience. The state-of-the-art methods used for learning probability models are primarily based on Bayesian networks. Learning models are used to store and recall specific instances. In addition, neural network learning and kernel machines are prevalent techniques used for information learning. Heuristics based MSDF techniques are illustrated in Figure 5.

The detection of a maneuver can rely on the properties of the innovation process of the filter that is matched to the non-maneuvering system. When a maneuver takes place, the residuals are no longer zero-mean white noise and their statistical properties change. The proposed detection algorithm is based on statistical testing theory. The decision problem can be formulated as a statistical test of two hypotheses which checks the probability density function of the innovation process. Hypothesis H0 represents a non-maneuvering object and the alternative hypothesis H1 represents a maneuvering object. Information is represented as real vectors in artificial neural networks. This uniformity allows for arbitrary hierarchical combinations of neural classifiers, because the features or outputs of one classifier can simply be regarded as inputs of another classifier, and the outputs of several classifiers can be combined by concatenation of the real output vectors. The equations for an artificial neural network learning algorithm are presented below:

Function: Perceptron-Learning (examples, network) returns a perceptron hypothesis.

Inputs: examples, a set of examples, each with input x = x_1, ..., x_n and output y; network, a perceptron with weights W_j, j = 0, ..., n, and activation function g.

Repeat: for each e in examples do

in = Σ_j W_j x_j[e]   (36)

Err = y[e] - g(in)   (37)

W_j ← W_j + α × Err × g'(in) × x_j[e]   (38)

Until some stopping criterion is satisfied.

Return: Neural-Net-Hypothesis (network).

The method adopted to fuse the measurements from cooperative and non-cooperative sensors is the artificial neural network. Neural networks appear particularly well suited to the combination of inputs from completely different sources, so they, together with the corresponding learning and training strategies, are used for these problems. Artificial neural networks have become a standard paradigm for pattern recognition, and they are also well suited to the formulation of sensor-, information- or decision-fusion scenarios [49]. The heuristics based MSDF scheme is shown in Figure 5.
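A minimal sketch of the perceptron update in equations (36)-(38) can be written directly in NumPy. The sigmoid activation, learning rate, epoch-based stopping criterion and the AND-gate training set are illustrative assumptions, not details fixed by the paper:

```python
import numpy as np

def g(s):
    """Sigmoid activation function; its derivative is g(s) * (1 - g(s))."""
    return 1.0 / (1.0 + np.exp(-s))

def perceptron_learning(examples, W, alpha=0.5, epochs=2000):
    """Repeat the update of eqs. (36)-(38) over the examples until the
    stopping criterion (here, a fixed epoch count) is satisfied."""
    for _ in range(epochs):
        for x, y in examples:
            s = W @ x                  # in = sum_j W_j x_j            (36)
            err = y - g(s)             # Err = y - g(in)               (37)
            W = W + alpha * err * g(s) * (1.0 - g(s)) * x   # W_j update (38)
    return W

# Illustrative training set: logical AND, with a constant bias input x_0 = 1
examples = [(np.array([1.0, a, b]), float(a and b)) for a in (0, 1) for b in (0, 1)]
W = perceptron_learning(examples, np.zeros(3))
```

After training, thresholding the output g(W @ x) at 0.5 reproduces the AND truth table; the same loop applies to any linearly separable mapping between sensor-derived feature vectors and class labels.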

[Figure 5 diagram: non-parametric learning techniques, comprising instance-based learning (nearest-neighbour methods, kernel methods) and artificial neural networks (neural structures, single-layered NN, multi-layered NN, learning NN structures).]

Figure 5. Heuristics based MSDF (Adapted from [33]).

Typically, three-layer networks are used for classification. The first layer or input layer contains the data to be analyzed; here, it is usually assumed that the data are given as real vectors of fixed dimensionality (perhaps after some appropriate pre-processing) [49]. The second layer or hidden layer contains non-linear combinations of the data, which can be regarded as features that are formed by neural-network learning or training [49]. The third layer or output layer represents the decision of the neural network. For a classification problem with k classes there are typically k output neurons. Ideally, the network should respond with the jth unit vector to the jth class. In practice, the output value of the jth unit (typically between 0 and 1) is regarded as the strength of evidence for, or belief in, the object belonging to class j [49]. After proper training of the network, the output of the jth unit should actually be an estimate of the a posteriori probability p(c_j | d) of class j [49]. Figure 7 illustrates the artificial intelligence techniques used to fuse the cooperative and non-cooperative sensor measurements. In the proposed method the RBF or the HRBF neural networks (NN) are applied as probability density function (pdf) estimators [50]. The neural networks are trained for a certain observation scenario; in this step, the innovation process of a filter tracking a non-maneuvering object is used as the learning data. The RBF neural network consists of three layers: an input, a middle and an output layer [50]. The input layer corresponds to the input vector space. Each input neuron is fully connected to the middle layer neurons. The middle layer consists of m neurons, each of which computes an activation function, usually the Gaussian function (39) [51]:

y_i(x) = (1 / ((2π)^{d/2} σ_i^d)) exp(-‖x - c_i‖² / (2σ_i²))   (39)

where c_i is the center of the activation function, σ_i is the width of the activation function and d is the dimension of the estimated pdf. The output layer consists of neurons which correspond to the classification problem and is fully connected to the middle layer. Each output layer neuron computes a linear weighted sum of the outputs of the middle layer, as shown in equation (40) [51]:

f_j(x) = Σ_{i=1}^{m} w_{ij} y_i(x)   (40)

where w_{ij} is the weight value between the ith middle layer neuron and the jth output layer neuron. For a multidimensional pdf, the activation function can be given by [51]:

y_i(x) = (1 / ((2π)^{d/2} |Σ_i|^{1/2})) exp(-(1/2)(x - c_i)^T Σ_i^{-1} (x - c_i))   (41)

where Σ_i is the covariance matrix. Application of an activation function of the form (41) allows the number of neurons in the middle layer to be reduced. The learning task of the RBF network is split into two separate subtasks [50]. The first is the determination of the activation function parameters (centers and widths) and the second is the calculation of the output weights w. The centers are associated with the clusters of input data, represented by the average vector x̄. The self-organization of this clustering process is usually achieved by competition among the neurons, according to the "Winner Takes All" (WTA) or "Winner Takes Most" (WTM) strategy [50]. After adaptation of the centers, the width parameter σ is adjusted so as to provide a continuous approximation of the data. Simultaneously, the weights of the output neurons are calculated; the sum of all weights is always equal to one [50].

Vision and Thermal Fusion

In this section we aim to provide a basic conceptual understanding of the fusion of cooperative systems and non-cooperative sensors. Owing to the significant detail required to explain all the processes involved in fusing all the sensors and systems, one example is taken for this paper [52]: the fusion concept of combining ADS-B and TCAS systems with a fused visual and thermal configuration. The medium/filter used to fuse these measurements is an artificial neural network, which can be trained to improve its performance. The proposed training used to teach the system covers both non-manoeuvring and manoeuvring tracking. The image fusion between the thermal and visual sensors is then carried out. In recent years, much development has been undertaken by the computer vision community in the area of image sensing technologies using a multi-sensor framework, and applications of this technology are found in many civil and military areas such as medical imaging and remote sensing. Single-sensor operations and technologies do not provide the information and advantages of multi-sensor systems. A benefit of multi-sensor systems is that information that is not normally considered useful can be exploited, for example complementary information about the region surveyed, so that image fusion provides an effective method to enable comparison and analysis of such data [53]. By definition, image fusion is the process of combining information from two or more images of a scene to enhance viewing or understanding of the scene [54]. The fusion process must preserve all relevant information in the fused image [55]. The methodology of image fusion and the computational processes involved were developed at the MIT Lincoln Laboratory, where the models were inspired by biological models of colour vision for the fusion of IR radiation and visible light [56], as found in certain species of snakes (i.e., rattlesnakes and pythons). In the colour image fusion methodology the individual image inputs are first enhanced by filtering them with a feedforward center-surround shunting neural network. The operation serves to [57]:

1. Enhance spatial contrast in the individual visible and IR bands.

2. Create both positive and negative polarity IR contrast images.

3. Create two types of single-opponent colour contrast images.

The resulting single-opponent colour contrast images represent greyscale fused images that are analogous to the IR-depressed visual and IR-enhanced visual cells of the rattlesnake [24]. The observer evaluation of the fusion process is critical to the effectiveness of the system and to operator ergonomics. The most natural approach would be to evaluate the fusion process against an ideal fused image, a 'ground truth' to which all others can be compared; however, such a reference is rarely available with real systems. In a multi-sensor system, images from several input sources can only be fused into one output if the inputs contain spatially and spectrally independent information; this is also rarely the case in real fusion applications, so the generally accepted objective is summarised as follows: only the most visually important input information is used to generate the output [24]. Considering that the most common application, as in our case, is for display purposes, the preservation of perceptually important visual information becomes a critical issue. The ideal fused output image should therefore contain all the important visual information, as perceived by an observer viewing the input images. In addition, a single-level fusion system must ensure that no distortions or other 'false' information are introduced in the fused image [58]. The visual and thermal image fusion process is illustrated in Figure 6.

[Figure 6 diagram: begin fusion process; common field of view; visual image and thermal image coefficient map generation; fusion decision map generation; detection; segmentation; target tracking; stop fusion process.]

Figure 6. Vision/thermal image fusion process.

The tracking data obtained from the visual and thermal image fusion process is combined with ADS-B measurements as illustrated in Figure 7.

[Figure 7 diagram: begin fusion process; ADS-B and TCAS data fusion alongside visual and thermal image fusion; processing and data sorting (UKF, PF); artificial neural network; multi-sensor track update; track management; fused measurements; stop fusion process.]

Figure 7. Vision/thermal image and ADS-B measurements fusion process.

The typical performance criteria include [58]:

1. Identification and localisation of visual information in the input and fused images.

2. Evaluation of the perceptual importance.

3. Measurement (quantification) of the accuracy with which input information is represented in the fused image.

4. Distinguishing true scene (input) information from fused artefacts in the output fused image.

Conclusions

This paper summarizes initial research activities conducted on surveillance MSDF methods for mini RPAS, in particular target detection and tracking methods for single and multiple target situations. Multi-sensor architectures in which information from the sensors is combined lead to significantly more robust on-board surveillance systems. Multi-sensor systems enrich the surveillance picture and provide an effective basis for anomaly detection processing. Civil and military aircraft and RPAS can significantly benefit from this architecture, in that aircraft can operate in high density traffic with high performance. A conceptual analysis of ground-based and airborne sensor methods has been carried out. Cooperative and non-cooperative sensors/systems were utilised for DAA function measurements: passive and active Forward Looking Sensors (FLS), the Traffic Collision Avoidance System (TCAS) and Automatic Dependent Surveillance – Broadcast (ADS-B). Enhanced surveillance solutions based on the Automatic Dependent Surveillance – Broadcast (ADS-B) system and the Traffic Collision Avoidance System
(TCAS) are required for addressing cooperative DAA functions. The emphasis of this paper is on fusing cooperative system and non-cooperative sensor measurements for enhanced airborne surveillance and detect, track and avoid (DTA) solutions. This review discussed the application of MSDF techniques for non-cooperative and cooperative sensor/system fusion and data processing.

References

1. ICAO, "Global Air Navigation Plan for CNS/ATM Systems," Doc 9750, 1998.
2. ICAO, "Manual of Air Traffic Services Data Link Applications," Doc 9694-AN/955, 1999.
3. ICAO, "Manual on Required Communication Performance (RCP)," Doc 9869-AN/462, 2006.
4. EUROCONTROL, "European Organisation for the Safety of Air Navigation," Edition 2.1, 1999.
5. Ramasamy, S., and Sabatini, R., "Unifying Cooperative and Non-Cooperative UAS Sense-and-Avoid," Second International Workshop on Metrology for Aerospace (MetroAeroSpace 2015), Benevento, Italy, 2015.
6. Laliberte, A.S., Rango, A., and Herrick, J.E., "Unmanned Aerial Vehicles for Rangeland Mapping and Monitoring: A Comparison of Two Systems," Proceedings of the ASPRS Annual Conference, Tampa, Florida, USA, 2007.
7. Ramasamy, S., Sabatini, R., and Gardi, A., "Towards a Unified Approach to Cooperative and Non-Cooperative RPAS Detect-and-Avoid," Fourth Australasian Unmanned Systems Conference (ACUS 2014), Melbourne, Australia, December 2014. DOI: 10.13140/2.1.4841.3764
8. ICAO, "Unmanned Aircraft Systems (UAS)," Cir 328-AN/190, 2011.
9. Crespo, D.C., and Mendes de Leon, P., "Achieving the Single European Sky: Goals and Challenges," vol. 8, Kluwer Law International, 2011.
10. ICAO, "Roadmap for the Integration of Civil Remotely-Piloted Aircraft Systems into the European Aviation System," A Strategic R&D Plan for the Integration of Civil RPAS into the European Aviation System, 2013.
11. Gu, W., Mittu, R., Marble, J., Taylor, G., Sibley, C., et al., "Towards Modeling the Behavior of Autonomous Systems and Humans for Trusted Operations," The Intersection of Robust Intelligence and Trust in Autonomous Systems: Papers from the AAAI Spring Symposium, 2014.
12. ICAO, "Introduction to the Aviation System Block Upgrade (ASBU) Modules: Strategic Planning for ASBU Modules Implementation," 2013.
13. Baud, O., Gomord, P., Honoré, N., Lawrence, P., Ostorero, L., Paupiah, S., and Taupin, O., "Air Traffic Control Tracking Systems Performance Impacts with New Surveillance Technology Sensors," Aerospace Technologies Advancements, 2010. DOI: 10.5772/7164
14. Huihuan, Q., Xinyu, W., and Yangsheng, X., "Intelligent Surveillance Systems," vol. 51, Springer Science & Business Media, 2011.
15. Aggarwal, J.K., "Multisensor Fusion for Computer Vision," vol. 99, Springer Science & Business Media, 2013.
16. Cappello, F., Ramasamy, S., and Sabatini, R., "Multi-Sensor Data Fusion Techniques for RPAS Navigation and Guidance," AIAC 16: Multinational Aerospace Programs - Benefits and Challenges, Engineers Australia, 2015.
17. Sabatini, R., Cappello, F., Ramasamy, S., Gardi, A., and Clothier, R., "An Innovative Navigation and Guidance System for Small Unmanned Aircraft using Low-Cost Sensors," Journal of Aircraft Engineering and Aerospace Technology, 2015.
18. Dailey, D.J., Harn, P., and Lin, P-J., "ITS Data Fusion," Final Research Report, Research Project T9903, Task 9, ATIS/ATMS Regional IVHS Demonstration, 1996.
19. Gad, A., and Farooq, M., "Data Fusion Architecture for Maritime Surveillance," Proceedings of the Fifth International Conference on Information Fusion, vol. 1, IEEE, 2002.
20. Fasano, G., Accardo, D., Moccia, A., Carbone, C., Ciniglio, U., Corraro, F., and Luongo, S., "Multi-Sensor Based Fully Autonomous Non-Cooperative Collision Avoidance System for Unmanned Air Vehicles," Journal of Aerospace Computing, Information, and Communication, vol. 5, no. 10, pp. 338-360, 2008. DOI: 10.2514/1.35145
21. Rodriguez, L., Sabatini, R., Ramasamy, S., and Gardi, A., "A Novel System for Non-Cooperative UAV Sense-And-Avoid," European Navigation Conference (ENC 2013), Vienna, Austria, 2013.
22. Spriesterbach, T.P., Bruns, K.A., Baron, L.I., and Sohlke, J.E., "Unmanned Aircraft System Airspace Integration in the National Airspace Using a Ground-Based Sense and Avoid System," Johns Hopkins APL Technical Digest, vol. 32, no. 3, 2013.
23. Toet, A., and IJspeert, J.K., "Perceptual Evaluation of Different Image Fusion Schemes," Displays, vol. 24, no. 1, pp. 25-37, 2003.
24. Blum, R.S., and Liu, Z., "Multi-Sensor Image Fusion and Its Applications," CRC Press, 2005.
25. SESAR, "Modernising the European Sky," SESAR Consortium, Brussels, Belgium, 2011.
26. ICAO, "Doc 9750 – Global Air Navigation Capacity & Efficiency Plan 2013-2028," 4th ed., International Civil Aviation Organization (ICAO), Montreal, Quebec, Canada, 2014.
27. European Commission, "SESAR and the Environment," Brussels, Belgium, 2014; European Commission, "Clean Sky 2 - Joint Technical Programme," Brussels, Belgium, 2010.
28. Sabatini, R., Gardi, A., Ramasamy, S., Kistan, T., and Marino, M., "Novel ATM and Avionic Systems for Environmentally Sustainable Aviation," Practical Responses to Climate Change (PRCC) 2014, Engineers Australia Convention 2014, Melbourne, Australia, November 2014. DOI: 10.13140/2.1.1938.0808
29. Sabatini, R., "A Roadmap for Future Aviation Research in Australia: Improving Aviation Safety, Efficiency and Environmental Sustainability," First International Symposium on Sustainable Aviation (ISSA 2015), Istanbul, Turkey, 2015. DOI: 10.13140/RG.2.1.4833.2962
30. Ramasamy, S., Sabatini, R., and Gardi, A., "Novel Flight Management System for Improved Safety and Sustainability in the CNS+A Context," Integrated Communication, Navigation and Surveillance Conference (ICNS 2015), Herndon, VA, USA, 2015. DOI: 10.1109/ICNSURV.2015.7121225
31. Tagarev, T., and Ivanov, P., "Computational Intelligence in Multi-Source Data and Information Fusion," Information and Security, vol. 2, 1999.
32. Hall, D.L., "An Introduction to Multisensor Data Fusion," Proceedings of the IEEE, vol. 85, no. 1, 1997.
33. Cappello, F., Sabatini, R., Ramasamy, S., and Marino, M., "Particle Filter based Multi-sensor Data Fusion Techniques for RPAS Navigation and Guidance," Second International Workshop on Metrology for Aerospace (MetroAeroSpace 2015), Benevento, Italy, 2015.
34. Wan, E.A., and Van Der Merwe, R., "The Unscented Kalman Filter for Nonlinear Estimation," Adaptive Systems for Signal Processing, Communications, and Control Symposium (AS-SPCC), IEEE, 2000.
35. Van Der Merwe, R., "Sigma-Point Kalman Filters for Probabilistic Inference in Dynamic State-Space Models," Dissertation, Oregon Health & Science University, 2004.
36. Oh, S-M., "Nonlinear Estimation for Vision-Based Air-to-Air Tracking," Dissertation, Georgia Institute of Technology, 2007.
37. Challa, S., "Fundamentals of Object Tracking," Cambridge University Press, 2011.
38. Doucet, A., and Johansen, A.M., "A Tutorial on Particle Filtering and Smoothing: Fifteen Years Later," Handbook of Nonlinear Filtering, vol. 12, pp. 656-704, 2009.
39. Hussain, M.S., "Real-Coded Genetic Algorithm Particle Filters for High-Dimensional State Spaces," Doctoral dissertation, UCL (University College London), 2014.
40. Lai, C.K., Lone, M., Thomas, P., Whidborne, J., and Cooke, A., "On-Board Trajectory Generation for Collision Avoidance in Unmanned Aerial Vehicles," Proceedings of the IEEE Aerospace Conference, pp. 1-14, 2011. DOI: 10.1109/AERO.2011.5747526
41. Angelov, P., "Sense and Avoid in UAS: Research and Applications," John Wiley & Sons, 2012.
42. Cheng, Q., and Bondon, P., "A New Sampling Method in Particle Filter," 17th European Signal Processing Conference (EUSIPCO), 2009.
43. Parseh, M-J., Pashazadeh, S., and Seyedarabi, H., "Improved Particle Filtering Algorithm for Maneuvering Object Tracking Using Deformation Detection," 2nd International Conference on Computer and Knowledge Engineering (ICCKE), October 18-19, 2012.
44. Mohri, M., Rostamizadeh, A., and Talwalkar, A., "Foundations of Machine Learning," The MIT Press, ISBN: 9780262018258, 2012.
45. Bengio, Y., Courville, A., and Vincent, P., "Representation Learning: A Review and New Perspectives," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 8, pp. 1798-1828, August 2013. DOI: 10.1109/TPAMI.2013.50
46. Anthony, M.H.G., and Biggs, N., "Computational Learning Theory," Cambridge University Press, 1997.
47. Welling, M., "A First Encounter with Machine Learning," University of California, Irvine, CA, pp. 1-93, 2011.
48. Cosenzo, K.A., Fatkin, L.T., and Branscome, T.A., "Cognitive Uncertainty and Work Shifts in a Real-World Multi-Task Environment (No. ARL-TR-3515)," Army Research Laboratory, Human Research and Engineering Directorate, Aberdeen Proving Ground, MD, 2005.
49. Shahbazian, E., Rogova, G., and DeWeert, M.J., "Harbour Protection through Data Fusion Technologies," Springer Science & Business Media, 2009.
50. Konopko, K., and Janczak, D., "Maneuver Detection Algorithm Based on Probability Density Function Estimation with Use of RBF and HRBF Neural Network," Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments II, pp. 625-629, International Society for Optics and Photonics, 2004.
51. Cuadrado-Gallego, J.J., Braungarten, R., Dumke, R.R., and Abran, A. (Eds.), "Software Process and Product Measurement International Conference," IWSM–MENSURA 2007, Palma de Mallorca, Spain, November 5-8, 2007.
52. Demmel, S., Gruyer, D., and Rakotonirainy, A., "V2V/V2I Augmented Maps: State-Of-The-Art and Contribution to Real-Time Crash Risk Assessment," Proceedings of the 20th Canadian Multidisciplinary Road Safety Conference, The Canadian Association of Road Safety Professionals, Ontario, Canada, 2010.
53. Dontabhaktuni Jayakumar, D.P., and Arunakumari, D., "Satellite Image Fusion in Various Domains," International Journal of Advanced Engineering, Management and Science (IJAEMS), vol. 1, no. 3, June 2015.
54. Garzelli, A., Aiazzi, B., Baronti, S., Selva, M., and Alparone, L., "Hyperspectral Image Fusion," Proceedings of the Hyperspectral 2010 Workshop, pp. 17-19, 2010.
55. Pramanik, S., and Bhattacharjee, D., "Multi-Sensor Image Fusion Based on Moment Calculation," 2nd IEEE International Conference on Parallel Distributed and Grid Computing (PDGC), pp. 447-451, December 2012.
56. Toet, A., "Cognitive Image Fusion and Assessment," INTECH Open Access Publisher, 2011.
57. Toet, A., and Hogervorst, M., "Real-time Full Color Multiband Night Vision," INTECH Open Access Publisher, 2010.
58. Stathaki, T., "Image Fusion: Algorithms and Applications," Academic Press, 2011.