
A Dissertation

entitled

Data Fusion of Infrared, Radar, and Acoustics Based Monitoring System

by

Golrokh Mirzaei

Submitted to the Graduate Faculty as partial fulfillment of the requirements for the

Doctor of Philosophy Degree in Engineering

_________________________________________
Prof. Mohsin M. Jamali, Committee Chair

_________________________________________
Prof. Jackson Carvalho, Committee Member

_________________________________________
Prof. Mohammed Y. Niamat, Committee Member

_________________________________________
Prof. Richard Molyet, Committee Member

_________________________________________
Prof. Mehdi Pourazady, Committee Member

_________________________________________
Dr. Patricia R. Komuniecki, Dean
College of Graduate Studies

The University of Toledo

May 2014
Copyright 2014, Golrokh Mirzaei

This document is copyrighted material. Under copyright law, no parts of this document
may be reproduced without the expressed permission of the author.
An Abstract of

Data Fusion of Infrared, Radar, and Acoustics Based Monitoring System

by

Golrokh Mirzaei

Submitted to the Graduate Faculty as partial fulfillment of the requirements for the
Doctor of Philosophy Degree in Engineering

The University of Toledo

May 2014

Many bird and bat fatalities have been reported in the vicinity of wind farms. An

acoustic, infrared camera, and marine radar based system is developed to monitor the

nocturnal migration of birds and bats. The system is deployed and tested in an area of

potential wind farm development. The area is also a stopover for migrating birds and

bats.

Multi-sensory data fusion is developed based on acoustics, infrared camera (IR),

and radar. The diversity of the sensor technologies complicated its development.

Different signal processing techniques were developed for processing of various types of

data. Data fusion is then implemented from three diverse sensors in order to make

inferences about the targets. This approach leads to reduction of uncertainties and

provides a desired level of confidence and detailed information about the patterns. This

work is a unique, multifidelity, and multidisciplinary approach based on pattern

recognition, machine learning, signal processing, bio-inspired computing, probabilistic

methods, and fuzzy reasoning. Sensors were located in the western basin of Lake Erie in

Ohio and were used to collect data over the migration periods of 2011 and 2012.

Acoustic data were collected using acoustic detectors (SM2 and SM2BAT). Data

were preprocessed to convert the recorded files to standard wave format. Acoustic

processing was performed in two steps: feature extraction and classification. Acoustic

features of bat echolocation calls were extracted based on three different techniques:

Short Time Fourier Transform (STFT), Mel Frequency Cepstrum Coefficient (MFCC),

and Discrete Wavelet Transform (DWT). These features were fed into an Evolutionary

Neural Network (ENN) for classification at the species level.

Results from different feature extraction techniques were compared based on

classification accuracy. The technique can identify bats and will contribute towards

developing mitigation procedures for reducing bat fatalities.

Infrared videos were collected using a thermal IR camera (FLIR SR-19). Pre-

processing was performed to convert infrared videos to frames. Three different

background subtraction techniques were applied to detect moving objects in IR data.

Thresholding was performed for image binarization using the extended Otsu threshold.

Morphology was performed for noise suppression and filtering. Results of the three techniques were then compared. The selected technique (running average), followed by thresholding and filtering, was then used for tracking and information extraction. The Ant-based Clustering Algorithm (ACA) of Lumer and Faieta, in three variations (Standard ACA, Different Speed ACA, and Short Memory ACA), was applied to the extracted features, and the variations were compared in terms of the groups created for the detected avian data. Fuzzy C-Means (FCM) was implemented and used to group the targets.

Radar data were collected using a Furuno marine radar (XANK250) with a T-bar antenna and a parabolic dish. Target detection was performed using radR, an open source platform for recording and processing radar data. This platform was used to remove clutter and noise, detect possible targets as blips, and save the blip information. The tracking algorithm was developed independently of radR, based on estimation and data association. Estimation is performed using a Sequential Importance Sampling-based Particle Filter (SIS-PF), and data association is performed using the Nearest Neighbor (NN) method.

The data fusion was performed in a heterogeneous, dissimilar sensory environment. This is a challenging environment that requires substantial effort in both experimental setup and algorithmic development. Setting up the experiments included purchasing the required equipment, installing the systems, configuring them, and setting control parameters. The algorithmic development included developing algorithms and selecting the best available techniques for this specific application. Various trade-offs among time, accuracy, and cost were considered.

Data fusion of the acoustics/IR/radar is a hierarchical model with two levels: Level 1 and Level 2. Level 1 is a homogeneous dissimilar fusion based on feature-level fusion. Level 2 is a heterogeneous fusion based on decision-level fusion. The feature level is applied to the IR and radar data and combines the features of detected/tracked targets into a composite feature vector. The constructed feature vector is an end-to-end concatenation of the individual sensors' feature vectors and serves as the input to the next level. The second level is the decision level, which uses the feature vector from L1 and fuses it with the acoustic data. The fusion was developed based on a number of fusion functions. Data alignment, including temporal and spatial alignment, and target association were implemented. A fuzzy Bayesian fusion technique was developed for decision-level fusion, in which the fuzzy inference system provides the prior probability and Bayesian inference provides the posterior probability of the avian targets.

The data fusion results were used to process the spring and fall 2011 migration periods in the western basin of Lake Erie in Ohio. This landscape lies in the prevailing wind and is a putative site for wind turbine construction. The area is also a stopover for migrant birds/bats, and the presence of wind turbines may threaten their habitats and lives. The aim of this project is to provide an understanding of the activity and behavior of the biological targets by combining three different sensors and providing detailed and reliable information. This work can be extended to other applications such as military, industrial, medical, and traffic control systems.

To My Beloved Parents

Banoozaman and Aliakbar


Acknowledgements

First, I would like to thank my dear advisor, Prof. Mohsin Jamali, who has

supported me throughout this long journey with his knowledge, patience, and useful hints.

This work could not have been completed without his guidance, encouragement, and advice. I

would like also to thank Dr. Mohammad Niamat, Dr. Jackson Carvalho, Dr. Richard

Molyet, and Dr. Mehdi Pourazady for their time and serving as my committee members. I

must also show my gratitude to Dr. Jeremy D. Ross, Dr. Peter Gorsevski from the

Department of Geospatial Sciences at BGSU, and Dr. Verner Bingman from Department

of Psychology at BGSU for their guidance throughout this project, which was partially

funded by the Department of Energy (Contract #DE-FG36-06G086096). I would like

to express my appreciation to the Ottawa National Wildlife Refuge (ONWR) and the US Fish and Wildlife Service (USFWS) for allowing Dr. Jeremy D. Ross the use of their facilities for collection of data during the spring and fall migration periods of 2011 and 2012. I am also

grateful to my colleague Mohammad Majid for his contributions to this work.

I would like to thank my dear parents for their patience, love, and support, and also

my wonderful and beloved husband, Nima, for being beside me for the past four years;

without their support and endless love, none of this would have been possible.

Table of Contents

Abstract………………………………………………………………………………………………………………………………iii

Acknowledgements……………………………………………………………………………………………………………..viii

Table of Contents…………………………………………………………………………………………………………………ix

List of Tables…………………………………………………………………………………………………………………..…xiii

List of Figures……………………………………………………………………………………………………………………..xv

1 Introduction .............................................................................................................................. 1

1.1 Problem Statement ........................................................................................................... 1

1.2 Background and Previous Works ..................................................................................... 4

1.3 Multi-sensor Environment ............................................................................................. 18

1.3.1 Acoustics ................................................................................................................ 19

1.3.2 Infrared Camera ..................................................................................................... 20

1.3.3 Marine Radar ......................................................................................................... 21

1.4 Data Fusion .................................................................................................................... 22

1.5 Thesis Outline ................................................................................................................ 24

2 Ultrasound Acoustic Monitoring System (UAMS)................................................................ 26

2.1 Acoustic Monitoring ...................................................................................................... 26

2.2 Acoustic Data Acquisition ............................................................................................. 28

2.3 UAMS Feature Extraction ............................................................................................. 30

2.3.1 Fourier Transform (FT) Approach ......................................................................... 31

2.3.2 Mel Frequency Cepstrum Coefficient (MFCC) Approach .................................... 32

2.3.3 Discrete Wavelet Transform (DWT) Approach ..................................................... 35

2.4 UAMS Classification by Supervised Machine Learning Technique ............................. 38

2.4.1 Existing Classification Techniques ........................................................................ 45

2.5 Conclusion ..................................................................................................................... 49

3 Infrared Imaging Monitoring System (IIMS) ........................................................................ 51

3.1 IR camera Monitoring .................................................................................................... 51

3.2 IR Data Acquisition ....................................................................................................... 53

3.3 IIMS Detection of Moving Objects ............................................................................... 54

3.3.1 Background Modeling............................................................................................ 54

3.3.2 Thresholding .......................................................................................................... 55

3.3.3 Noise Suppression using Morphological Filters .................................................... 57

3.4 Target Tracking.............................................................................................................. 59

3.5 IIMS Feature Extraction ................................................................................................ 61

3.6 Imagery Group Clustering using Unsupervised Learning ............................................. 63

3.6.1 Ant Clustering Algorithm ...................................................................................... 63

3.6.2 Fuzzy Clustering .................................................................................................... 66

3.7 Conclusion ..................................................................................................................... 70

4 Radar Monitoring System (RMS) .......................................................................................... 71

4.1 Radar Monitoring ........................................................................................................... 71

4.2 Radar Data Acquisition .................................................................................................. 73

4.3 RMS Detection by Marine Radar................................................................................... 76

4.3.1 radR ........................................................................................................................ 77

4.3.2 Blips Detection via radR ........................................................................................ 78

4.4 RMS Target Tracking .................................................................................................... 81

4.4.1 Sequential Importance Sampling-based Particle Filter (SIS-PF) ........................... 84

4.4.2 Nearest Neighbor Data Association ....................................................................... 86

4.5 Conclusion ..................................................................................................................... 87

5 Data Fusion ............................................................................................................................ 89

5.1 Multi-Sensor Fusion....................................................................................................... 89

5.2 Multi-sensor Fusion Architecture .................................................................................. 91

5.3 Fusion Hierarchy............................................................................................................ 92

5.4 L1 Fusion ....................................................................................................................... 94

5.4.1 Data Alignment ...................................................................................................... 94

5.4.2 Data Association .................................................................................................... 95

5.5 Fusion Functions .......................................................................................................... 102

5.6 L2 Fusion ..................................................................................................................... 107

5.6.1 Data Association .................................................................................................. 107

5.6.2 Fuzzy Bayesian Fusion ........................................................................................ 111

5.6.3 Fuzzy System (FS) ............................................................................................... 112

5.6.4 Bayesian Inference ............................................................................................... 119

5.7 Conclusion ................................................................................................................... 130

6 Experiments and Results ...................................................................................................... 131

6.1 Experimental Setup ...................................................................................................... 131

6.2 Sensor Processing ........................................................................................................ 134

6.2.1 UAMS Experiments ............................................................................................. 135

6.2.2 RMS Experiments ................................................................................................ 138

6.2.3 IIMS Experiments ................................................................................................ 140

6.3 L1 Fusion ..................................................................................................................... 144

6.3.1 Common Coverage Area (CCA Function)........................................................... 147

6.3.2 Time Alignment (TA Function) ........................................................................... 151

6.3.3 Spatial Alignment (SA Function) ........................................................................ 153

6.4 L2 Fusion ..................................................................................................................... 156

6.5 Multi-sensor Fusion for Spring 2011 ........................................................................... 164

6.6 Multi-sensor Fusion for Fall 2011 ............................................................................... 179

7 Conclusion and Future Work ............................................................................................... 206

References……………………………………………………………………………………………………………………...…211

List of Tables

4.1: Radar Technical Details .......................................................................................................... 74

5.1: L1 Data Association Modules and Properties ......................................................................... 96

5.2: Notations and their Numerical Ranges ................................................................................. 117

5.3: Fuzzy Rules .......................................................................................................................... 118

5.4: Hypothesis or Event Space of Acoustics .............................................................................. 121

5.5: Avian Categories................................................................................................................... 121

5.6: A priori Probabilities ............................................................................................................ 123

5.7: A priori Probabilities ............................................................................................................ 124

6.1: Bird and Bat class and Species used in this Work ................................................................ 133

6.2: ENN Parameters ................................................................................................................... 136

6.3: Overall Classification Accuracy based on Feature ............................................................... 137

6.4: Classification Accuracy in Species Level based on .............................................................. 137

6.5: Input Data Frame of Targets ................................................................................................. 138

6.6: Particle Filter Parameters ...................................................................................................... 139

6.7: Total Detected Targets .......................................................................................................... 142

6.8: IR Extracted Features............................................................................................................ 142

6.9: Parameter Settings in ACA ................................................................................................... 142

6.10: Total Number of Clusters in Different Iterations in ACA .................................................. 143

6.11: FCM Clusters ...................................................................................................................... 144

6.12: Fusion Feature Vectors ....................................................................................................... 155

6.13: Definitions and Units of Feature Vectors ........................................................................... 157

6.14: Configuration Settings of the Fuzzy System ...................................................................... 160

6.15: Configuration Settings of the Inputs ................................................................................... 160

6.16: Parameters in Bayesian Technique ..................................................................................... 162

6.17: Sample L2 Vectors.............................................................................................................. 163

List of Figures

1-1: JDL Model Data Fusion [57] .................................................................................................. 10

1-2: DFIG Model [58] .................................................................................................................... 10

1-3: STDF Model [59] ................................................................................................................... 11

1-4: Thompoulos Model [65] ......................................................................................................... 11

1-5: Luo and Kay’s Model [66] ..................................................................................................... 12

1-6: Waterfall Model [67] .............................................................................................................. 13

1-7: Bahador’s Taxonomy of Data Fusion Methodologies [74] .................................................... 15

1-8: Block Diagram of Different Existing Applications for Data Fusion ...................................... 18

1-9: Acoustic Monitoring............................................................................................................... 19

1-10: Overall Process of IR Monitoring System ............................................................................ 20

1-11: Overall Process of Radar Monitoring System ...................................................................... 21

1-12: Data Fusion Disciplines ........................................................................................................ 22

1-13: Multi-sensory Fusion ............................................................................................................ 23

1-14: Multi-sensory Fusion ............................................................................................................ 24

2-1: Spectrogram of a Bat Echolocation Call ................................................................................ 27

2-2: (a) The AR125 from Binary Acoustics [24] (b) SM2BAT from Wildlife Acoustics [25] ..... 29

2-3: Acoustic Array in Scott Park, Ohio ........................................................................................ 29

2-4: Ohio State University’s Stone Lab, Put-in-bay, Ohio ............................................................ 30

2-5: Feature Extraction .................................................................................................................. 31

2-6: Twenty Filter Banks in MFCC ............................................................................................... 34

2-7: Block Diagram of MFCC ...................................................................................................... 35

2-8: Signal Decomposition by DWT ............................................................................................. 37

2-9: Block Diagram of Training and Testing Process in ENN ...................................................... 39

2-10: Feedforward Neural Network (FNN) ................................................................................... 40

2-11: Chromosome Scheme ........................................................................................................... 41

2-12: Crossover Operation (a) Single Point (b) Multiple Points.................................................... 43

2-13: Mutation Operation .............................................................................................................. 43

2-14: Classification Process ........................................................................................................... 45

2-15: Sonobat Spectrogram for Single and Series of Echolocation Calls ...................................... 48

2-16: Spectrogram of Echolocation call of Lasionycteris Noctivagans in ..................................... 49

3-1: Block Diagram of IIMS .......................................................................................................... 52

3-2: (a) The SR-19 IR Camera and (b) Direction of IR camera..................................................... 53

3-3: Experimental Setup ................................................................................................................ 54

3-4: Thresholding ........................................................................................................................... 57

3-5: (a) Original Image (b) Image after Otsu Thresholding(c) Image after EOtsu Thresholding .. 59

3-6: Result of BS techniques after applied EOtsu and Morphology .............................................. 59

3-7: Tracking Result ...................................................................................................................... 60

3-8: Tracking Algorithm ................................................................................................................ 61

3-9: Sample Results of Detection and Tracking Algorithm ........................................................... 61

4-1: Block Diagram of RMS .......................................................................................................... 72

4-2: Marine Radar (a) Parabolic Dish (b) T-bar Antenna (c) Trailer ............................................. 74

4-3: Radar Data Acquisition .......................................................................................................... 75

4-4: Block Diagram of XIR3000C ................................................................................................. 76

4-5: Blips Detection in radR before and after noise removal......................................................... 80

4-6: Entities in RMS ...................................................................................................................... 80

4-7: Land-Water and Land-Land PPI............................................................................................. 81

4-8: Tracking of Avian Radar Processing ...................................................................................... 83

4-9: PF-SIS .................................................................................................................................... 86

4-10: Nearest Neighbor (NN) ........................................................................................................ 87

5-1: Multi-Sensor Fusion ............................................................................................................... 90

5-2: Acoustics/IR/Radar Fusion Architecture ................................................................................ 92

5-3: Fusion Category...................................................................................................................... 93

5-4: Fusion Levels of Proposed AMS Architecture ....................................................................... 94

5-5: Block Diagram of Data Association Process .......................................................................... 96

5-6: Association Matrices ............................................................................................................ 100

5-7: Association Process of L1 Fusion ........................................................................................ 101

5-8: Fusion Functions .................................................................................................................. 105

5-9: Association Matrices in L2 ................................................................................................... 110

5-10: Fuzzy Bayesian Inference................................................................................................... 112

5-11: Fusion Scheme.................................................................................................................... 113

5-12: Fuzzy System...................................................................................................................... 116

5-13: L2 Feature Vector ............................................................................................................... 129

6-1: Project Area (Ottawa National Wildlife Refuge, Ohio) ....................................................... 132

6-2: Experimental Setup .............................................................................................................. 133

6-3: Block Diagram of UAMS ..................................................................................................... 136

6-4: Classification Comparison.................................................................................................... 138

6-5: Sample Real Tracked Target using Particle Filter ................................................................ 140

6-6: Block Diagram of IIMS ........................................................................................................ 141

6-7: Coverage Area of Marine Radar in Vertical Position ........................................................... 144

6-8: Coverage Area of IR Camera ............................................................................................... 145

6-9: Common Coverage Area of IR and Radar............................................................................ 145

6-10: Common Coverage Area of IR and Radar (zoomed) ......................................................... 146

6-11: Common Coverage Area of IR and Radar (zoomed) ......................................................... 146

6-12: Common Coverage Area of IR and Radar (zoomed) ......................................................... 147

6-13: Coverage Area outside IR and Radar (zoomed side views) ............................................... 147

6-14: Field of View of IR............................................................................................................. 149

6-15: IR Direction ........................................................................................................................ 150

6-16: IR Geometry ....................................................................................................................... 150

6-17: Spatial Alignment of IR and Radar .................................................................................... 154

6-18: Spatial Alignment ............................................................................................................... 154

6-19: IR/Radar Feature Vector..................................................................................................... 155

6-20: Multi-modal Feature Vectors ............................................................................................. 156

6-21: Membership Functions of the Input Sets ............................................................................ 159

6-22: Fuzzy Rule Viewer ............................................................................................................. 161

6-23: Bird Class Composition for Available Acoustic Data in Nightly Basis in Spring 2011 .... 164

6-24: Overall Bird Class Composition for Available Acoustic Data in Spring 2011 .................. 165

6-25: Total Number of Bird Flight Calls for Available Acoustic Data in Spring 2011 ............... 166

6-26: Bat Species Composition for Available Acoustic Data in Nightly Basis in Spring 2011 .. 166

6-27: Overall Bat Species Composition for Available Acoustic Data in Spring 2011 ................ 167

6-28: Total Number of Bat Passes for Available Acoustic Data in Spring 2011 ......................... 168

6-29: Flight Direction of Targets for Available IR Data in Spring 2011 ..................................... 168

6-30: Total Number of IR Tracks for Available IR Data in Spring 2011 .................................... 169

6-31: Total Number of Radar Tracks for Available Data in Spring 2011 (Vertical Mode) ......... 170

6-32: Range of Radar Tracks for Available Data on 4/22/2011 .................................................. 170

6-33: Range of Radar Tracks for Available Data on 4/24/2011 .................................................. 171

6-34: Range of Radar Tracks for Available Data in 4/26/2011 ................................................... 172

6-35: Total Number of Bird Flight Calls over Common Nights in Spring 2011 ......................... 172

6-36: Bird Class Composition over Common Nights in Spring 2011 ......................................... 173

6-37: Flight Direction of Targets over Common Nights in Spring 2011 ..................................... 173

6-38: Total Number of IR Targets over Common Nights in Spring 2011 ................................... 174

6-39: IR Direction in Night of (a) 05/04/2011 (b) 05/05/2011 .................................................... 174

6-40: IR Direction in Night of (a) 05/06/2011 (b) 05/08/2011 .................................................... 175

6-41: IR Direction in Night of 05/09/2011 .................................................................................. 175

6-42: Total Number of Radar Tracks over Common Nights ....................................................... 176

6-43: Direction of Different Bird Classes for Fusion Data in Spring 2011 ................................. 176

6-44: Overall Direction of Bird Classes for Fusion Data in Spring 2011 .................................... 177

6-45: Range of Warblers for Fusion Data in Spring 2011 ........................................................... 178

6-46: Range of Thrushes for Fusion Data in Spring 2011 ........................................................... 178

6-47: Range of Sparrows for Fusion Data in Spring 2011 ........................................................... 179

6-48: Bird Class Composition for Available Acoustic Data in Nightly....................................... 180

6-49: Overall Bird Class Composition for Available Acoustic Data in Fall 2011 ....................... 180

6-50: Total Number of Bird Flight Calls for Available Acoustic Data in Fall 2011 ................... 181

6-51: Bat passes of Different Species for Available Acoustic Data in ........................................ 182

6-52: Overall Bat Species Composition for Available Acoustic Data in Fall 2011 ..................... 183

6-53: Total Number of Bat Passes for Available Acoustic Data in Fall 2011 ............................. 184

6-54: Total Number of Radar Tracks for Available Data in Fall 2011 (Vertical Mode) ............. 184

6-55: Range of Radar Tracks for Available Data in 8/25/2011 ................................................... 185

6-56: Range of Radar Tracks for Available Data in 8/26/2011 ................................................... 185

6-57: Range of Radar Tracks for Available Data in 8/27/2011 ................................................... 186

6-58: Range of Radar Tracks for Available Data in 8/28/2011 ................................................... 186

6-59: Range of Radar Tracks for Available Data in 8/29/2011 ................................................... 187

6-60: Range of Radar Tracks for Available Data in 8/30/2011 ................................................... 187

6-61: Range of Radar Tracks for Available Data in 9/1/2011 ..................................................... 188

6-62: Range of Radar Tracks for Available Data in 9/3/2011 ..................................................... 188

6-63: Range of Radar Tracks for Available Data in 9/4/2011 ..................................................... 189

6-64: Range of Radar Tracks for Available Data in 9/5/2011 ..................................................... 190

6-65: Range of Radar Tracks for Available Data in 9/6/2011 ..................................................... 190

6-66: Range of Radar Tracks for Available Data in 9/9/2011 ..................................................... 191

6-67: Range of Radar Tracks for Available Data in 9/10/2011 ................................................... 191

6-68: Range of Radar Tracks for Available Data in 9/11/2011 ................................................... 192

6-69: Range of Radar Tracks for Available Data in 9/12/2011 ................................................... 192

6-70: Range of Radar Tracks for Available Data in 9/13/2011 ................................................... 193

6-71: Total Number of Bird Flight Calls over Common Nights in Fall 2011 .............................. 194

6-72: Bird Class Composition over Common Nights in Fall 2011 .............................................. 194

6-73: Total Number of Bat Passes over Common Nights in Fall 2011 ....................................... 195

6-74: Bat Species Composition over Common Nights in Fall 2011 ............................................ 195

6-75: Flight Direction of Targets over Common Nights in Fall 2011 ......................................... 196

6-76: Total Number of IR Tracks over Common Nights in Fall 2011......................................... 196

6-77: IR Direction in Night of (a) 08/29/2011 (b) 08/30/2011 .................................................... 197

6-78: IR Direction in Night of (a) 09/01/2011 (b) 09/03/2011 .................................................... 197

6-79: IR Direction in Night of (a) 09/04/2011 (b) 09/05/2011 .................................................... 198

6-80: IR Direction in Night of (a) 09/10/2011 (b) 09/11/2011 .................................................... 198

6-81: IR Direction in Night of (a) 09/12/2011 (b) 09/13/2011 .................................................... 199

6-82: IR Direction in Night of (a) 09/15/2011 (b) 09/16/2011 .................................................... 199

6-83: IR Direction in Night of (a) 09/26/2011 (b) 09/27/2011 .................................................... 200

6-84: Total Number of Radar Tracks over Common Nights in Fall 2011 ................................... 200

6-85: Direction of Different Bird Classes for Fusion Data in Fall 2011 ...................................... 201

6-86: Overall Direction of Bird Classes for Fusion Data in Fall 2011 ........................................ 202

6-87: Direction of Bat Passes for fusion Data in Fall 2011 ......................................................... 202

6-88: Direction of Bat Passes for Fusion Data in Fall 2011 ........................................................ 203

6-89: Range of Epfus for Fusion Data in Fall 2011 ..................................................................... 203

6-90: Range of Labos for fusion Data in Fall 2011 ..................................................................... 204

6-91: Range of Nyhu and Mylu for Fusion Data in Fall 2011 ..................................................... 204

6-92: Range of Warblers for Fusion Data in Fall 2011................................................................ 205

6-93: Range of Thrushes for Fusion Data in Fall 2011................................................................ 205

Chapter 1

1 Introduction

1.1 Problem Statement


Wind energy is promoted to reduce reliance on traditional energy sources.

Although there are many benefits and attractions to this source of energy, there are many environmental issues that deserve consideration. One of the concerns is the impact of wind energy deployment on avian life. Wind turbines may be hazardous structures for birds and bats, especially during nocturnal migration. Large numbers of bird/bat fatalities due to collisions or other factors near wind turbines have been reported [1][2][3]. Birds and bats may be attracted to the rotation of the turbine blades and be directly struck by the blades, or perish from barotrauma caused by a sudden drop in atmospheric pressure near the blades [4]. There is also a large number of bat fatalities due to White Nose Syndrome (WNS) [5]. These phenomena are reducing bird/bat populations and have become an important issue. Therefore, quantification and identification of birds and bats are critical in areas with potential for future construction of wind farms, to help their preservation. More consideration should be given if the birds/bats are on the endangered list.

In this work, the multi-sensory environment is used in the application of an Avian Monitoring System (AMS) at wind farms. The targets in our work are biological targets, namely birds and bats, and for ease of use we refer to them as “targets” in the rest of this document. The multi-sensor monitoring approaches in this study are intended to facilitate rapid but accurate biological assessments. The goal of this work is to monitor, quantify, and recognize birds and bats in the vicinity of wind turbines. It assists wildlife biologists in observing the behavior of avian targets and helps guide the mitigation process, which may reduce unwanted effects on birds/bats. This research may contribute towards their preservation if appropriate mitigation measures are employed.

Our approach provides information on local chiropteran species diversity and

nocturnal activity (potentially co-opted for avian monitoring) along with temporal

variation in these measures and the potential for wind turbines to negatively affect

breeding and migratory populations of bats (and birds). The specific threat posed by wind

turbines can be mitigated in two ways. First, turbines can be properly sited in areas of relatively low biological importance. The multi-sensor approach tested in this study provides the means to evaluate localized attributes of bird and bat foraging activity, especially within the potential rotor-swept zones of wind turbines. Second,

strike-risks between birds/bats and turbine blades can be reduced by feathering the

turbine (i.e., stopping rotation) during periods of high avian activity and/or elevated

species diversity (potentially indicative of elevated migration events).

Three different types of sensors are used in this work: acoustics, infrared camera

(IR), and radar. Each sensor provides specific information regarding the targets.

Acoustics help in identification of targets using frequency features in the acoustic signals.

The infrared camera captures images of the targets and offers quantification and passage rate information, as well as other valuable attributes such as straightness index, heat, size, velocity, and flight direction. Finally, the radar gives altitude information, which is not provided by the IR sensor. Moreover, information such as range, area, perimeter, intensity, and other statistics is provided by the radar.
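To make these complementary roles concrete, the following minimal Python sketch shows one possible way to represent the per-sensor information as simple records. The class and field names are hypothetical illustrations and are not the exact feature definitions used later in this dissertation.

from dataclasses import dataclass
from typing import Optional

# Hypothetical record types illustrating the complementary information
# provided by the three sensors; field names are illustrative only.

@dataclass
class AcousticDetection:
    timestamp: float        # time of the detected call (s)
    species_label: str      # species-level label from the acoustic classifier
    confidence: float       # classifier confidence in [0, 1]

@dataclass
class IRTrack:
    timestamp: float
    x: float                # image-plane position (pixels)
    y: float
    velocity: float         # apparent speed (pixels per frame)
    heading_deg: float      # flight direction in the image plane (degrees)
    size: float             # target area (pixels)
    heat: float             # mean target intensity

@dataclass
class RadarTrack:
    timestamp: float
    range_m: float          # range to the target (m)
    altitude_m: float       # altitude, available in vertical operating mode (m)
    area: float             # blip area
    intensity: float        # echo intensity

@dataclass
class FusedTarget:
    """Composite view of one target assembled from the three sensors."""
    acoustic: Optional[AcousticDetection]
    ir: Optional[IRTrack]
    radar: Optional[RadarTrack]

if __name__ == "__main__":
    # Example of assembling a composite record for a single hypothetical target.
    target = FusedTarget(
        acoustic=AcousticDetection(timestamp=1.0, species_label="unidentified bat", confidence=0.8),
        ir=IRTrack(timestamp=1.0, x=320.0, y=240.0, velocity=3.5, heading_deg=45.0, size=12.0, heat=200.0),
        radar=RadarTrack(timestamp=1.0, range_m=150.0, altitude_m=80.0, area=4.0, intensity=0.7),
    )
    print(target)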

Data fusion in a multi-sensory environment is very important and has received considerable attention in the past few years, mainly because of the wide range of its potential applications. Data fusion of the sensors provides complementary information for behavior analysis of the targets. In this environment, each sensor performs surveillance, makes independent measurements, and reports them to the sensor's central processing node. The central processing node of each sensor measures the parameters (target signature and target state parameters), processes the decision, and reports it to the fusion node. In the fusion node, the reports of all sensors are correlated and an overall decision is made based on one of the following hypotheses (a minimal illustrative sketch follows the list):

1. New target detection

2. Existing target set

3. False alarm
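As a purely illustrative sketch, and not the fusion logic developed in this work (which relies on the data association and fuzzy Bayesian techniques of Chapter 5), the three-way decision at the fusion node can be pictured as a simple gating rule in Python; the gate size, confidence threshold, and field names below are hypothetical.

import math

def fusion_node_decision(report, existing_tracks, gate=50.0, confidence_threshold=0.6):
    """Toy three-way decision: existing target, new target, or false alarm.

    `report` is a dict with 'x', 'y' (position in a common frame) and
    'confidence' in [0, 1]; `existing_tracks` is a list of dicts with 'x', 'y'.
    """
    # 1. Try to associate the report with an already-confirmed target.
    for track in existing_tracks:
        distance = math.hypot(report["x"] - track["x"], report["y"] - track["y"])
        if distance <= gate:
            return "existing target set"

    # 2. No association: a sufficiently confident report starts a new target.
    if report["confidence"] >= confidence_threshold:
        return "new target detection"

    # 3. Otherwise the report is treated as a false alarm.
    return "false alarm"

if __name__ == "__main__":
    tracks = [{"x": 120.0, "y": 40.0}]
    print(fusion_node_decision({"x": 118.0, "y": 42.0, "confidence": 0.9}, tracks))   # existing target set
    print(fusion_node_decision({"x": 400.0, "y": 300.0, "confidence": 0.8}, tracks))  # new target detection
    print(fusion_node_decision({"x": 400.0, "y": 300.0, "confidence": 0.2}, tracks))  # false alarm

Although the real association step is far richer, the sketch conveys the idea that a correlated report reinforces an existing target, an uncorrelated but confident report spawns a new target, and anything else is discarded as a false alarm.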

There are many advantages for multiple sensor systems [6]:

 Higher signal-to-noise ratio

 Increased robustness and reliability in the case of sensor failure

 Information obtained regarding independent features in the system

 Extended parameter coverage, rendering a more complete picture of the

system

 Increased dimensionality of the measurement

 Enhanced resolution

 Reduced ambiguity

 Increased assurance

 Increased hypothesis discrimination with the aid of additional information

arriving from multiple sensors

 Reduction in measurement time, and possibly costs; there is a trade-off to consider here, so an optimal number of sensors to extract the required information from a system should ideally be pursued.

The following section explains the background and related work on data fusion in monitoring systems using a multi-sensor environment.

1.2 Background and Previous Works


Data fusion is important in different applications such as target recognition and

tracking [7], traffic control [8], remote sensing [9], battlefield surveillance [10],

maintenance engineering [11], mine detection [12], robotics [13], and medical

applications [14]. Many studies and works have made contributions to the development

of different parts of this technology. However, with the emergence of new sensors, more

advanced sensor processing techniques are needed to provide reasonably accurate and

practical results.

Several bird/bat acoustic monitoring studies have been performed previously [15]

[16][17]. A research study conducted by Griffin [18] explained that echolocation calls

differ from species to species. However, the first publications devoted to acoustic bat recognition appeared 23 years later and were published by Ahlen [19] and Fenton and Bell

[20]. Further analysis of ultrasound signals was enabled by improved technology [21].

Allen et al. [22] performed acoustic monitoring of flight activity of bats over 34 nights

in Missouri using three different detectors: the zero-crossing Anabat [23] and two full-

spectrum detectors: AR-125 [24] and SM2 [25]. Comparisons were made over different

parameters such as memory consumption, total files collected, total bat passes, species

and species group identification, quality of call sequences, and reported call parameters.

They used two identification software packages: BCID [26] and Sonobat 3 NE

[27]. Results of this study show that full-spectrum detectors clearly collect more data and are more useful for calls from rare, quiet, or difficult species.

Several automated bird/bat call classification methods have been proposed based on

machine learning algorithms such as Support Vector Machine (SVM) [28], Artificial

Neural Network (ANN) [16][29][30], statistical techniques such as Discriminant Function

Analysis (DFA) [16][31][32], Ensembles of Neural Networks [28], Hidden Markov

Models (HMM) [33], and k-Nearest Neighbor (k-NN) [34].

According to Armitage and Ober [35], the machine learning algorithms have better

performance when compared with DFA. However, among the machine learning

algorithms, the ANN provides the highest correct classification rate of the training sets.

On the other hand, the Back Propagation Neural Network (BPNN) suffers from some

limitations associated with network training. Its back propagation process is very

computationally intensive. The reason is that the training in the BPNN, unlike the other

classification algorithms, is based on an iterative algorithm for the computation of

weights. Also BPNN cannot guarantee an optimal solution and might converge to a set of

sub-optimal weights [36].

Redgwell et al. [28] classified the echolocation calls from 14 species by SVM and

Ensembles of Neural Networks. According to them, both SVMs and Ensembles of Neural

Networks outperformed DFA for all of the species, while Ensembles of Neural Networks

outperformed the SVM with a 10% higher accuracy rate. However, the classification accuracy of the Ensembles of Neural Networks varied by species.

Selin et al. [37] used ANNs such as the unsupervised Self-Organizing Map (SOM) and the

supervised Multilayer Perceptron (MLP). Results of these studies were encouraging and

pointed toward the MLP being a better classifier than the SOM for recognition of bird

sounds. An acoustic monitoring study was performed by the Cornell Laboratory of

Ornithology [15] to assess and minimize the effect of wind turbines on night-migrating

birds in Nebraska and northern New York State. They used PZM microphones and

tapes for recording, and analyzed the data by ear and using their sound analysis software.

In a work by Bardeli et al. [38], a bird sound detection system was developed based on

detecting temporal patterns in a given frequency band typical for the species. Their

algorithm has been designed specifically for certain target endangered species in

Germany: the Eurasian Bittern (Botaurus stellaris) and Savi's Warbler (Locustella luscinioides).

According to a research study conducted by Sun et al. [39], calls were divided into

two categories: FM (Frequency Modulated)/CF (Constant Frequency)/FM and FM.

Sonograms were able to distinguish only one out of five species with FM/CF/FM

characteristics. Parameters such as starting frequency, ending frequency, peak frequency,

duration, longest inter-pulse interval, and short inter-pulse interval were extracted and

used as the features of the echolocation calls for the identification of the other four species.

They used mist netting and captured a number of bats. The sampled bats were then identified at the species level using pictures from three biological books. Calls from the collected

bats were recorded using the Petterson D980 bat detector and analyzed using Discriminant Function Analysis (DFA), with promising results. The authors recommended that DFA should be combined with an artificial neural network for better

accuracy.

S. Parsons et al. [16] identified echolocation calls from twelve bat species using DFA

and an artificial neural network. They recorded calls using an S-25 [40] detector. Temporal and spectral features were measured for echolocation call identification. They concluded

that the ANNs outperformed their equivalent DFA. Obrist et al. [41] in their work used a

Petterson D980 [42] bat detector for recording echolocation calls. Parameters such as

duration, highest frequency, lowest frequency, and frequency of main energy were

extracted and used for statistical identification using DFA.

Avian monitoring can also be conducted using imaging sensors such as

an infrared/thermal camera and/or radar [43][44][45][46]. Gauthreaux et al. [43] at Clemson

University used a thermal imaging camera (resolution of 640×482 pixels, 60 frames/second) and a fixed-beam marine radar (Pathfinder model 3400, Raytheon Inc.) for monitoring bird migration. They pointed the thermal camera and narrow radar beam upward,

aligning the top of the field of view towards the north. The thermal camera provides the

(x, y) coordinates of the target and the radar gives its altitude (z-coordinate). Gauthreaux

et al. [47] also equipped the radar with a Global Positioning System (GPS).

Their system provides the location information of the target including the latitude and

longitude. They correlated the results with the WSR 88D data. Jamali et al. [48] designed

a remote monitoring system to monitor avian activity for on-shore/off-shore wind

turbine applications using an IR camera and radar. The system is capable of synchronizing the IR camera and radar from a remote location via a 3G system. The data can be transferred over

the internet.

There is a commercially available mobile avian radar system [49] from Geo-Marine

that uses an X-band marine radar with a T-bar antenna operating in vertical and horizontal modes. Different information is obtained from the two radar modes. Altitude information of the target is available when the radar is operating in vertical mode. The range, speed, and direction of the target can be computed when the radar is in horizontal mode.

Radar can also be used for identification of species. DeTect Inc. [50] has designed software that identifies bird species using their wing beat frequency. When the radar beam detects a bird, fluctuations in the target echo are used to obtain the wing beat frequency, which is useful in distinguishing between birds and bats at the species level. Zaugg et al. [51] extracted wing flapping patterns and other variables related to signal intensity. A support vector classifier was then used to develop predictive models.

In 1998, the Clemson University Radar Ornithology Laboratory (CUROL) developed BIRDRAD [52], a bird detection system. The system consists of a Furuno FR-2155 X-band radar with a T-bar antenna and a GPS, attached to a trailer. One year later, they developed a version of BIRDRAD based on the FR-2155BB. The new system

used video signals that were sent to a computer monitor for further analysis.

The number of sensors used in the monitoring is an important factor for obtaining reliable results. Richardson et al. [53], in an analytical proof, showed that in most situations decisions based on multiple sensors were more reliable than those based on fewer sensors. However, the definition and calculation of the optimum number of sensors needed for a given system is challenging [54]. Even though there are books and other literature which theoretically explain the fundamentals of data fusion [55][56], there is less

exploration of practically implemented data fusion systems.

There are several data fusion models in the literature [57][58][59]. A generalized and dominant model within the fusion community is the Joint Directors of Laboratories (JDL) [57] Data Fusion Subgroup's Data Fusion Lexicon. The JDL defines

data fusion as “a process dealing with the association, correlation, and combination of

data and information from single and multiple sources to achieve refined position and

identity estimates, and complete and timely assessment of situations and threats, and their

significance”. This model is developed in multiple levels dealing with automatic

detection, association, correlation, and combination of data followed by situation and

threat assessment. There are different revisions of the JDL model [60][61][62]; Figure 1-1

shows a revised JDL model [60].

The Data Fusion Information Group (DFIG) [58] introduces a data fusion model by incorporating human decision making and resource management at the higher levels of the fusion model. It assigns machine-labor-based interfacing to the lower levels of the fusion model. Figure 1-2 illustrates the DFIG model.

Figure 1-1: JDL Model Data Fusion [57] (diagram showing Level 0 Process Assignment/Source Pre-processing, Level 1 Object Refinement, Level 2 Situation Refinement, Level 3 Threat Refinement/Impact Assessment, and Level 4 Process Refinement, together with Human Computer Interaction and a Database Management System)

Figure 1-2: DFIG Model [58] (diagram showing information fusion levels L0-L3 on the machine side, L5 decision making on the human side, L4 platform resource management, and L6 ground station mission management)

The State Transition Data Fusion (STDF) model [59][63][64] rejects the separation or division of levels of machine labor and human decision making. According to this model, the weakness of either machines or humans can be complemented by the strength of the other in a mixed-initiative approach. The STDF model is shown in Figure 1-3.

Figure 1-3: STDF Model [59] (diagram showing human situation awareness (sensation, perception, comprehension, projection) interfaced with machine sub-object, object, situation, and impact assessment at Levels 0-3)

The Thompoulos architecture [65] is a data fusion architecture containing three modules: signal-level, evidence-level, and dynamic-level fusion. A single level or a combination of levels can be used in different applications. This model is shown in Figure 1-4.

[Figure: sensors and a database feed the signal level, evidence level, and dynamic level fusion modules.]

Figure 1-4: Thompoulos Model [65]

Luo and Kay [66] proposed multi-sensor integration and differentiated between multi-sensor integration and fusion. According to them, multi-sensor integration refers to using multiple sensors to provide different aspects of information for one task, while data fusion is any step in the integration that combines data. Luo and Kay's architecture for data integration and fusion is shown in Figure 1-5. The data derived from the sensors are fused at different levels, and the data representation progresses from raw data to higher-level representations.

[Figure: sensor data S1-S3 are fused upward from the signal and pixel levels through the feature level to the symbol level of the information system, from low- to high-level representation.]

Figure 1-5: Luo and Kay’s Model [66]

Harris [67] introduced a hierarchical fusion model known as the waterfall model. This model is based on three levels: the first level deals with raw data pre-processing; the second level is the feature level, which deals with feature-related processing consisting of feature extraction and pattern recognition; and the third level integrates beliefs and their associated probabilities to make decisions and support human interaction. The waterfall model is shown in Figure 1-6.

[Figure: waterfall model: sensors and pre-processing (level 1), feature extraction and pattern processing (level 2), and situation assessment and decision making (level 3).]

Figure 1-6: Waterfall Model [67]

There are many definitions of data fusion in the literature. According to Abidi and Gonzalez [68], "Data fusion deals with the synergistic combination of information made available by various knowledge sources such as sensors, in order to provide better understanding of a given science". Hall [69] describes "Multisensor data fusion seeks to combine data from multiple sensors to perform inferences that may not be possible from a single sensor alone". According to DSTO [70], "Data fusion is a multilevel, multifaceted process dealing with the automatic detection, association, correlation, estimation, and combination of data and information from single and multiple sources".

Wald [71] states "Data fusion is a formal framework in which are expressed means and tools for the alliance of data originating from different sources. It aims at obtaining information of greater quality; the exact definition of greater quality will depend upon the application". Steinberg and Bowman [60] believe "Data fusion is the process of combining data to refine state estimates and predictions".

According to Llinas [72], “Information fusion is an Information Process dealing

with the association, correlation, and combination of data and information from single

and multiple sensors or sources to achieve refined estimates of parameters,

characteristics, events, and behaviors for observed entities in an observed field of view. It

is sometimes implemented as a fully automatic process or as a human-aiding process for

analysis and/or decision support.”

Challa and Gulres [73] believe "Multi-sensor data fusion is a core component of all networked sensing systems, which is used either to: a) join/combine complementary information produced by sensors to obtain a more complete picture or b) reduce/manage uncertainty by using sensor information from multiple sources". Khaleghi and Khamis [74] state "Multisensor data fusion is a technology to enable combining information from several sources in order to form a unified picture".

Bostrom and Andler [75] proposed a new definition: "Information fusion is the study of efficient methods for automatically or semi-automatically transforming information from different sources and different points in time into a representation that provides effective support for human or automated decision making". Jalobeanu and Gutierrez [76] state "The data fusion problem can be stated as the computation of the posterior pdf (probability distribution function) of the unknown single object given all observations". More existing definitions of data fusion are collected by Bostrom and Andler [75].
Khaleghi et al. [74] have provided a comprehensive review of existing fusion methodologies and techniques. According to them, the majority of the issues and problems in data fusion arise from imperfections of the data provided by sensors, data correlation, inconsistency, and disparateness, as shown in Figure 1-7. They have also categorized the existing techniques in each stage of their taxonomy.

[Figure: data-related fusion aspects: imperfection (uncertainty; imprecision, i.e., vagueness, ambiguity, incompleteness; granularity), correlation, inconsistency (conflict, outlier, disorder), and disparateness.]

Figure 1-7: Bahador’s Taxonomy of Data Fusion Methodologies [74]

Data fusion is used in different applications to tackle real-world problems. Several works are available for biometric applications [77][78][79][80]. In a study by Ross and Govindarajan [78], multiple biometric sources are used for person identification. Data fusion is performed at the feature level using face and hand geometry. Gao and Maggs [80] proposed a multi-modal and multi-view personal identification system. Multiple view features from different viewpoints and from different sensors are integrated at the feature level using their proposed similarity measure. Bokade and Sapkal [79] proposed feature level data fusion using face and palm-print features. They extracted the features using Principal Component Analysis (PCA), and normalized match scores were generated from the features. Euclidean distance and feature distance were used to make the final decision. In a bi-modality fusion approach, Rattani et al. [77] used feature level fusion to fuse face and fingerprint features. Dieckmann et al. [81] incorporated the lip motion feature into a face and speech identification system to improve system reliability.

In road obstacle detection, data fusion is used to warn or intervene in risky situations. Amdities et al. [82] performed data fusion of an infrared camera and radar in an automotive preventive safety system. Their approach is based on a Kalman filter: azimuth and lateral velocity information is provided by the IR camera, and range is given by the radar. In a similar work, Guilloux et al. [83] developed a system to detect obstacles and avoid collisions using IR and radar. Radar was used because of its accuracy in speed estimation and its capability at longer distances, even in poor weather conditions. IR was used because it shows the heat of objects, which may indicate human activity and a potential source of danger.

In medical imaging applications, data fusion plays an important role. Due to the sensitivity of medical data, reliability and detailed information are in high demand. Images can be taken from different sources, and fusion of the data makes the diagnosis more reliable. Vince and Adali [84] presented a framework to integrate different types of brain imaging. They developed their framework at the group level using the extracted features of each type of data.

Data fusion is used in mine detection applications. Ajith and Briana [85] developed feature level fusion for pre-detection of mines and decision level fusion for post-detection of mines. Non-linear optimization techniques were used in the locality of their region of interest. Cremer and Jong [86] developed feature level fusion to combine data from a polarimetric infrared imaging sensor and a video impulse GPR in a landmine detection application. They showed that their fusion method always outperforms single-sensor processing in accuracy.

Data fusion is used for detecting and tracking targets in surveillance applications [87][88]. Yang et al. [87] proposed a technique to make synergistic decisions from data provided by Infrared Search and Track (IRST) and intermittently working radar. They aimed to decrease the chance of the radar being locked by an adverse Electronic Support Measure (ESM). A distributed data fusion technique is provided by Akselrod et al. [88] for multi-sensor multi-target tracking. They presented a decision mechanism that provides the required data for the fusion process while reducing redundancy in the information. Jesus et al. [89] developed a surveillance system using acoustic radar and video capture modules. The radar system detects targets by acoustic procedures in the audio band (5 KHz - 20 KHz), estimates their position, and generates an acoustic detection alarm. A monitoring manager is associated with a database in which parameters and information are stored, such as files of captured images and sound, associated images, configuration parameters, etc.

In ornithology, Gauthreaux et al. [43] fused a fixed-beam radar with a thermal camera and combined the results of the two sensors, IR and radar. In their work, the camera and radar are aligned so that altitude information is obtained from the radar and locational information of the target from the IR camera. They manually compared the two sets of data. Figure 1-8 shows a block diagram of different existing applications where data fusion is being used.

[Figure: applications of data fusion: road obstacle detection, target tracking, biometrics, robotics, landmine detection, medical imaging, engine diagnosis, and ornithology.]

Figure 1-8: Block Diagram of Different Existing Applications for Data Fusion

1.3 Multi-sensor Environment


The multi-sensory environment in AMS is based on three different types of sensors: acoustics, infrared camera (IR), and marine radar. This is a heterogeneous environment, in which the sensors are of neither the same type nor the same technology. Radar and IR are imagery sensors, and the third sensor is an acoustic detector. Each sensor needs to be investigated, and appropriate processing techniques should be adopted.

Sensors were deployed in our project area in Ohio along the coastline of the western Lake Erie basin during the spring (May-July) and fall (Aug-Oct) migration seasons of 2011 and 2012. Recordings were made nightly from one hour after sunset to one hour before sunrise. The sensors in AMS, along with their overall processing, are described as follows:

1.3.1 Acoustics

Acoustic sensors are useful for identification and classification of targets. In our

work, the data were collected using recorders set up near the potential future wind farm in

the project area and were transferred to the lab for further processing. Different

techniques and algorithms based on signal processing and machine learning were

implemented and compared with existing methods and commercially available software.

The data were the echolocation calls of bats and nocturnal bird flight calls which

were recorded by two different types of recorders, one for birds and one for bats. Recording parameters were also set based on the type of signals of interest. These parameters include the high/low pass filter, sampling rate, channel, compression, division ratio, schedule, etc. Feature extraction and classification algorithms were developed for the recorded data [90][91]. The feature extraction techniques are Mel Frequency Cepstral Coefficients (MFCC) [34][92], Wavelet Transforms (WT) [93][94], and the Short Term Fourier Transform (STFT) [94], which provide the significant features of the calls and are representative of the signals while reducing their dimensionality. The extracted features were then used in the classification and identification process. An Evolutionary Neural Network (ENN) [95] was developed and used to identify the targets. The overall process of the acoustic system is shown in Figure 1-9.

[Figure: Data Collection → Feature Extraction → Classification.]

Figure 1-9: Acoustic Monitoring

1.3.2 Infrared Camera

IR camera is a useful tool for quantification purposes in AMS. IR cameras are

widely used because the images are independent of lighting conditions and monitoring is

performed during nights. It also provides the temperature information that can be used to

determine body mass. In our work, IR camera is used for detection and tracking of

targets which helps to provide information of the flight pattern, target’s behaviors and

activity, such as direction, velocity, heat, and straightness index. The infrared images

were collected using FLIR SR-19 model IR camera set up in the project area. The IR

camera was vertically pointing upwards. The data obtained from the camera were

processed using different image processing and machine learning techniques, such as background subtraction, thresholding, filtering, feature extraction, and clustering.

Targets are detected by applying background subtraction and connectivity of

components in tracking [96]. The background subtraction techniques implemented in this

work include Running Average (RA) [97], Running Gaussian Average (RGA) [97], and

Mixture of Gaussian (MOG) [97][98]. A thresholding method is applied for binarization of the images to detect possible blips. Filtering is applied using morphological techniques. Tracking and trajectory modeling are achieved based on labeling and component connectivity. Blips with a common area are given the same label, and dilation is used to construct the trajectory. The overall process of the IR monitoring system is shown in Figure 1-10.

[Figure: Data Collection → Target Detection → Trajectory/Tracking.]

Figure 1-10: Overall Process of IR Monitoring System

1.3.3 Marine Radar

Radar is a useful tool for detection and tracking of targets, especially at night and in situations with poor visibility due to fog or clouds. Radars are also beneficial for detecting targets over a wider range. A marine radar from Furuno (1500 Mark 3) [99] is used for radar tracking in this work. This is an X-band radar which can detect targets at a reasonably high range. Marine radars are widely used because of their high resolution,

commercial availability, ease of maintenance, and low cost, as well as providing range

and altitude information. However, target detection with marine radars is challenging due

to a high amount of clutter and noise. A protective shield around the radar beam may reduce the ground clutter to a certain degree. Elevating the antenna's mount also helps to remove part of the clutter.

Radar monitoring in AMS involves blip detection and target tracking. Blip detection is performed in radR [100], and tracking uses available and newly developed tracking algorithms. Tracking includes estimation and data association. Estimation is implemented

using Sequential Importance Sampling-based Particle Filter (SIS-PF) [101] and data

association is accomplished using the Nearest Neighbors (NN) [102]. Radar data provides

target information including area, range, perimeter, height, and location of the target. The

overall process of radar monitoring system is shown in Figure 1-11.

[Figure: Data Collection → Target Detection → Tracking (Estimation and Target Association).]

Figure 1-11: Overall Process of Radar Monitoring System


1.4 Data Fusion
Fusion of multisensory data provides significant advantages over single source data.

The main advantage of employing fusion is to produce a fused result that provides the

most detailed and reliable information possible. Fusing multiple information sources

together also produces a more efficient representation of the data. Techniques to combine

or fuse data are drawn from a diverse set of disciplines such as digital signal processing,

statistics, artificial intelligence, pattern recognition, numerical methods, and information

theory, as shown in Figure 1-12.


Figure 1-12: Data Fusion Disciplines

There are many applications of multisensory data fusion in military and nonmilitary

areas. Military applications include automatic target recognition, control for autonomous

vehicles, remote sensing, battlefield surveillance, etc. Non-military applications include

monitoring of engineering processes, condition-based maintenance of complex

equipment, robotics, and medical applications, etc.

The goal of AMS is to make a reliable and complete decision by combining and fusing the data drawn from three sensors, as shown in Figure 1-13. The radar provides the ability to accurately determine the target's range, but has a limited ability to determine the angular direction of a target. By contrast, the infrared imaging sensor can accurately determine the target's angular direction, but is unable to measure range. The acoustic data provide category classification, such as species in the case of avian targets; however, they do not provide quantification information. Thus the fusion of all of these data is complementary in nature and will provide more accurate and reliable results for AMS.

[Figure: acoustics, radar, and IR sensor data are fused into a joint identity declaration.]

Figure 1-13: Multi-sensory Fusion

Radar, an infrared camera, and an acoustic detector are used as the data collection sensors. The processed data from the single sensors are then fused together, as shown in Figure 1-14. The data from IR and radar are fused based on the F1 fusion level [103], and the result is fused with acoustics based on the F2 fusion level. F1 is a homogeneous fusion based on imagery fusion of IR/radar, while F2 is a heterogeneous fusion process. The final result is an overall inference of the data from the three sensors.


Figure 1-14: Multi-sensory Fusion

Data fusion is a complex task that requires knowledge of the sensors. It becomes more complex if the sensors are of different technologies and types (heterogeneous sensors), or if the sensors are not synchronized during data recording in a heterogeneous sensory environment.

1.5 Thesis Outline


This thesis is organized as follows:

 Chapter 1: This chapter is the introduction to the thesis. It discusses the

motivation behind the work. An overview of the thesis is provided to clarify the

aim of this work. The background and related work are provided.

 Chapter 2: Acoustic signal processing including feature extraction and

classification is explained. A new technique of classification based on bio-

inspired computing and neurocomputing called evolutionary neural network is

proposed in this application and compared with other existing classifiers. The

developed classifier is combined with different feature extraction techniques and

results are compared.

 Chapter 3: Infrared image processing, including data acquisition, target detection, and building target trajectories, is explained. Image processing algorithms including background subtraction and object detection, thresholding, and noise suppression are implemented. Feature extraction of tracks and bio-inspired clustering are

developed and described.

 Chapter 4: Marine radar data processing, including data acquisition and processing, is explained. A target detection and tracking technique is developed to detect the targets.

 Chapter 5: Data fusion of the acoustics, infrared camera, and radar is proposed.

Fusion architecture and fusion hierarchy in two levels are developed and

explained in details. Data alignment and association are described. Fuzzy

Bayesian based decision level fusion is proposed.

 Chapter 6: Data collection and processing are described. Simulations and results are shown.

 Chapter 7: Conclusions and future work on this topic are provided.

Chapter 2

2 Ultrasound Acoustic Monitoring System (UAMS)

2.1 Acoustic Monitoring


The avian acoustic monitoring system is an automated approach for recognition and identification of targets (birds and bats) using characteristics of their signals. All chiropterans have echolocation calls within the ultrasound frequency range (5 KHz-120 KHz), and birds have calls in the audio frequency range of 200 Hz - 15 KHz. This work is mainly focused on the processing of the ultrasound calls of bats; however, it can be applied to bird flight calls by accounting for the frequency difference.

Traditionally, the identification of bats is performed by morphometric

characteristics [17]. In automated classification, the conspicuousness of the echolocation signal is used as a means of distinguishing the calls of different species. The

echolocation calls have some specific features and various characteristics that can be used

to differentiate among different species. However, the structure of the echolocation calls

varies due to morphology, age, geographical variation, context and behavior. Figure 2-1

shows the spectrogram of an actual recorded bat call. Some of the features of the call are

shown in the spectrogram including maximum frequency (F_max), minimum frequency

(F_min), frequency of knee (Fk), frequency at maximum power (F_maxp), center

frequency (Fc), time from maximum to characteristic frequency (Tc), and duration (D).

Figure 2-1: Spectrogram of a Bat Echolocation Call

Ultrasound Acoustic Monitoring System (UAMS) of echolocation calls can

provide information such as species presence and relative abundance. As an index for

target activity, this system tallies the number of passes, defined as any sequence of more than two calls in a specified time. The pass measurement is not a direct measure of the number of individuals, as some bats' calls may be recorded multiple times if they loiter within range of the recorder or, alternatively, are not recorded at all as they pass. This is even more complex when groups of the same species are present in the area; in such a case, the calls from a group cannot be differentiated from a single target hovering several times. The numbers of detected echolocation calls from different species are not

necessarily reflective of each species’ relative abundance in the area since the calling rate

of different species may be different.

The UAMS is divided into three main steps: data collection, feature extraction, and classification. Data were collected in our project area at the Ottawa National Wildlife Refuge in Ohio during the migration periods of 2011 and 2012. Feature extraction was performed using different signal processing techniques to find the significant features of the signals; these techniques have been tested using real data. Species classification was performed
using a proposed bio-inspired learning based technique. It is then compared with existing

classification techniques. Data acquisition, feature extraction, and classification are

described in details in following sections.

2.2 Acoustic Data Acquisition


Frequencies of all chiropteran vocalizations are in the ultrasound range between 20 KHz and 120 KHz. Detectors with ultrasound recording capability were used to record the bats' echolocation calls. Recording first started with the Binary Acoustic Technology recorder (AR-125) [24], which was tested at several sites, as shown in Figures 2-2(a) and 2-3. Its use was discontinued due to challenges of weather, placement, and the need for continuous monitoring. The Wildlife Acoustics SM2BAT [25], shown in Figure 2-2(b), was then selected as the detector of choice; it detects the ultrasound frequencies of calls and translates them into an audible range. The SM2BAT is a full-spectrum detector with a 192-KHz sampling frequency (fs). It is weatherproof, easy to deploy, and can record data automatically for the duration of its battery life, approximately 9 or 10 nights. The SM2BAT recording parameters were set using the Song Meter Configuration Utility [25] with a division ratio of 16, WAC0 compression, a high pass filter (HPF) of fs/16, and all other parameters at default settings. The ultrasonic signals were recorded in WAC format, which is a compressed form of acoustic signal, and then converted to wave format using the WAC2WAV utility [25] from Wildlife Acoustics. Figures 2-3 and 2-4 show part of our experiments in Scott Park and Put-in-Bay, Ohio.

The SM2 package is employed for monitoring a diverse set of bird species with calls in the frequency range of 5-10 KHz. This package consists of the SM2 recorder platform and an SMX-NFC microphone. The SMX-NFC microphone is waterproof and specially designed to record distant night flight calls. The flat horizontal surface on which the microphone capsule is mounted creates a pressure zone at the surface.

Figure 2-2: (a) The AR125 from Binary Acoustics [24] (b) SM2BAT from Wildlife
Acoustics [25]


Figure 2-3: Acoustic Array in Scott Park, Ohio

Figure 2-4: Ohio State University’s Stone Lab, Put-in-bay, Ohio

2.3 UAMS Feature Extraction


The goal of feature extraction is to find a transformation from observation space

to feature space with fewer dimensions while keeping the most significant features, as

shown in Figure 2-5. These features retain the important information of the signal, which

can be effectively used as a representative of the signal. In this way, the computational

complexity would be effectively reduced while providing a reliable solution for the

classification of the calls.

The acoustic signals have specific characteristics and features. These

characteristics of signals are used to differentiate between species. To extract the best set

of informative features from the signals, three different techniques were tested: Fast

Fourier Transform (FFT) [94], Mel Frequency Cepstrum Coefficient (MFCC) [34][92],

and Discrete Wavelet Transform (DWT) [93]. These techniques are described in detail

as follows:


Figure 2-5: Feature Extraction

2.3.1 Fourier Transform (FT) Approach

The Fast Fourier Transform (FFT) [94] is a widely used technique for extracting

features. The FFT provides information in the frequency domain but does not give the time information of the signal. Bat calls are non-stationary, and the FFT is not appropriate for this type of signal, though the Short Term Fourier Transform (STFT) [94] can overcome this limitation. In the STFT, the signal is first partitioned using a Hamming window into several segments, each of which can be assumed stationary. Then the STFT operations

are performed using the following equations:

w(n) = 0.54 - 0.46\cos\!\left(\frac{2\pi n}{N-1}\right), \quad 0 \le n \le N-1 \qquad (2.1)

X(m,k) = \sum_{n=0}^{N-1} x(n+m)\, w(n)\, e^{-j 2\pi k n / N} \qquad (2.2)

S(m,k) = \left| X(m,k) \right|^{2} \qquad (2.3)

where w(n) is the Hamming window function, N represents the length of the window or data segment, k corresponds to the frequency bin, and x(n) is the input signal. An STFT window size of 20 ms with 50% window overlap is used.

Although the STFT provides a time-frequency representation of the signal, the main concern with this technique is the resolution problem, which is related to the Heisenberg uncertainty principle [94]. According to this principle, the exact time-frequency localization of the signal cannot be determined, as the STFT only provides information about the time interval in which a certain band of frequencies exists. A narrow window in the STFT provides good time resolution but poor frequency resolution; a larger window gives better frequency resolution but poor time resolution. Therefore, the selection of the window size is very critical.

A significant set of features was selected as the STFT coefficients of each echolocation call. Afterwards, each feature set was subjected to dimensionality reduction using Principal Component Analysis (PCA) [104], which prevents the classifier from being overwhelmed by the high number of features.
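A minimal sketch of this STFT-plus-PCA feature pipeline, assuming NumPy, SciPy, and scikit-learn, is given below. The zero-padding of calls to a common length and the number of retained principal components are illustrative assumptions, not the exact settings used in this work; the 20 ms Hamming window with 50% overlap follows the text.

```python
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import PCA

def stft_features(signal, fs, win_ms=20):
    """STFT magnitude features of one echolocation call (20 ms Hamming window, 50% overlap)."""
    nperseg = int(fs * win_ms / 1000)                  # samples per 20 ms segment
    _, _, Z = stft(signal, fs=fs, window='hamming',
                   nperseg=nperseg, noverlap=nperseg // 2)
    return np.abs(Z).flatten()                          # magnitude coefficients as a feature vector

def build_feature_matrix(calls, n_components=10):
    """calls: list of (signal, fs) tuples; returns PCA-reduced feature matrix.

    The zero-padding and n_components value are illustrative choices.
    """
    feats = [stft_features(sig, fs) for sig, fs in calls]
    width = max(len(f) for f in feats)
    X = np.array([np.pad(f, (0, width - len(f))) for f in feats])   # equalize lengths
    return PCA(n_components=n_components).fit_transform(X)
```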

2.3.2 Mel Frequency Cepstrum Coefficient (MFCC) Approach

A representation of call/song features using the Mel Frequency Cepstrum Coefficient (MFCC) [34][92][105] is provided by a set of cepstrum coefficients. These coefficients are generated by a cosine transform of the real logarithm of the short-term energy spectrum expressed on a Mel-frequency scale. The computation of the cepstrum coefficients for a typical signal involves computing the STFT of the windowed segments of the whole signal, followed by computation of the magnitudes of the spectrogram. A Hamming window with a size of 20 ms was used. The coefficients are then
mapped to the Mel scale using Mel filter banks given as:

E(m) = \sum_{k=0}^{N-1} \left| X(k) \right|^{2} H_m(k) \qquad (2.4)

where H_m(k) is the filter bank and M is the number of filter banks.

A Mel filter bank is a collection of triangular filters defined by the center frequencies f(m) and is defined as:

H_m(k) = \begin{cases} 0 & k < f(m-1) \\ \dfrac{k - f(m-1)}{f(m) - f(m-1)} & f(m-1) \le k \le f(m) \\ \dfrac{f(m+1) - k}{f(m+1) - f(m)} & f(m) \le k \le f(m+1) \\ 0 & k > f(m+1) \end{cases} \qquad (2.5)

where f denotes frequency in Hertz and f(m) is the center frequency of the mth filter in Hertz. Twenty filter banks are used and they are shown in Figure 2-6. The maximum and minimum frequencies of the filter banks in the Mel scale (f_{Mel}) are approximated as:

f_{Mel} = 2595\,\log_{10}\!\left(1 + \frac{f_{Hz}}{700}\right) \qquad (2.6)

Figure 2-6: Twenty Filter Banks in MFCC

The fixed frequency resolution in the Mel scale, also called the Mel frequency step, is then computed as:

\Delta f_{Mel} = \frac{f_{Mel,max} - f_{Mel,min}}{M + 1} \qquad (2.7)

The center Mel frequencies for each filter bank are calculated and are converted to the normal frequency domain using the following equation:

f_{Hz} = 700\left(10^{\,f_{Mel}/2595} - 1\right) \qquad (2.8)

The sub-band energy compression is performed as:

S(m) = \log\!\big(E(m)\big) \qquad (2.9)

Finally, the Discrete Cosine Transform (DCT) is applied to the filtered spectrogram. In this step, the spectral information is transformed into the cepstral domain and the cepstral coefficients are computed as:

c(k) = \sum_{m=1}^{M} S(m)\,\cos\!\left(\frac{\pi k\,(m - 0.5)}{M}\right) \qquad (2.10)

where c(k) is the kth MFCC. The resultant coefficients are used as the features of the signal. In order to reduce the dimensionality, PCA is applied. The 0th coefficient is usually ignored, as it is regarded as a collection of the average energies of the frequency bands in the corresponding signal. The block diagram of MFCC is shown in Figure 2-7.

[Figure: time-domain signal → STFT → Mel-scale filter banks → sub-band energy compression → DCT → significant coefficients.]

Figure 2-7: Block Diagram of MFCC
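The sketch below illustrates, with NumPy and SciPy, how the Mel filter bank and cepstral coefficients of Equations (2.4)-(2.10) can be computed. It assumes the squared STFT magnitudes are already available as `power_frames`; the filter-bank edge frequencies and the number of retained coefficients are illustrative assumptions rather than the values used in this work.

```python
import numpy as np
from scipy.fftpack import dct

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)          # Eq. (2.6)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)        # Eq. (2.8)

def mel_filter_bank(n_filters, n_fft, fs, f_low=0.0, f_high=None):
    """Triangular Mel filters H_m(k) of Eq. (2.5); edge frequencies are placeholders."""
    f_high = f_high if f_high is not None else fs / 2.0
    mel_edges = np.linspace(hz_to_mel(f_low), hz_to_mel(f_high), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_edges) / fs).astype(int)
    H = np.zeros((n_filters, n_fft // 2 + 1))
    for m in range(1, n_filters + 1):
        left, center, right = bins[m - 1], bins[m], bins[m + 1]
        for k in range(left, center):
            H[m - 1, k] = (k - left) / max(center - left, 1)    # rising edge
        for k in range(center, right):
            H[m - 1, k] = (right - k) / max(right - center, 1)  # falling edge
    return H

def mfcc(power_frames, fs, n_fft, n_filters=20, n_ceps=12):
    """power_frames: |STFT|^2 of the windowed segments, shape (n_frames, n_fft//2 + 1)."""
    H = mel_filter_bank(n_filters, n_fft, fs)
    E = power_frames @ H.T                              # Eq. (2.4): filter-bank energies
    S = np.log(E + 1e-12)                               # Eq. (2.9): sub-band energy compression
    C = dct(S, type=2, axis=1, norm='ortho')            # Eq. (2.10): cepstral coefficients
    return C[:, 1:n_ceps + 1]                           # drop the 0th coefficient, as in the text
```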

2.3.3 Discrete Wavelet Transform (DWT) Approach

The features of the calls/songs of bats/birds are provided by a set of detail coefficients of the Discrete Wavelet Transform (DWT) [93]. The DWT provides a time-frequency representation of the signal, giving a multi-resolution view of it. This resolution makes the wavelet superior to the STFT in this application. The continuous wavelet transform is based on the correlation of the time-domain signal with a family of basis wavelet functions, as [93]:

W(a,b) = \frac{1}{\sqrt{|a|}} \int x(t)\, \psi^{*}\!\left(\frac{t-b}{a}\right) dt \qquad (2.11)

where \psi is the set of wavelet basis functions referred to as a wavelet family. The wavelet family \psi_{a,b}(t) is obtained by shift (translation) and scale (dilation) of a "mother wavelet", as:

\psi_{a,b}(t) = \frac{1}{\sqrt{|a|}}\, \psi\!\left(\frac{t-b}{a}\right) \qquad (2.12)

where b and a are the translation and scale parameters, respectively, and \psi(t) is the "mother wavelet". By shifting and scaling the mother wavelet, different features of the signal are extracted. In order to reconstruct the signal, the mother wavelet is assumed to satisfy the admissibility condition:

C_{\psi} = \int \frac{\left| \hat{\psi}(\omega) \right|^{2}}{|\omega|}\, d\omega < \infty \qquad (2.13)

where \hat{\psi}(\omega) is the Fourier transform of \psi(t).

In the DWT, the scale and translation are discretized as a = 2^{j} and b = k\,2^{j}, respectively. So the DWT basis is expressed as:

\psi_{j,k}(t) = 2^{-j/2}\, \psi\!\left(2^{-j} t - k\right) \qquad (2.14)

where j and k are the discrete scale and translation indices, respectively.

The DWT analyzes the signal with filters of different cutoff frequencies at different scales. It provides different resolutions by decomposing the signal into approximation and detail data using low-pass and high-pass filters and then scaling by sub-sampling, as:

y_{high}(k) = \sum_{n} x(n)\, g(2k - n) \qquad (2.15)

y_{low}(k) = \sum_{n} x(n)\, h(2k - n) \qquad (2.16)

where x(n) is the signal with N samples, n = 1, \dots, N, and g and h are the high-pass and low-pass filters, respectively.

The procedure of decomposition of a signal using the high pass and low pass filters is

shown in Figure 2-8.

[Figure: the time-domain signal is passed through high-pass and low-pass filters and down-sampled by 2 to give detail and approximation coefficients; the approximation branch is decomposed further.]
Figure 2-8: Signal Decomposition by DWT

The features of the signal are selected as the wavelet detail coefficients, which reflect the similarity of the signal to the wavelet at each scale. Features are defined based on four parameters: the maximum, minimum, mean, and standard deviation of the detail coefficients in each sub-band. The Daubechies-2 (db2) mother wavelet is used, with different levels of decomposition depending on the wave file.
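A minimal sketch of this step, assuming the PyWavelets package, shows how the db2 detail-coefficient statistics described above can be assembled into a feature vector; the decomposition level shown is illustrative, since the text notes that it varies with the wave file.

```python
import numpy as np
import pywt

def dwt_features(signal, wavelet='db2', level=4):
    """Max, min, mean, and standard deviation of the detail coefficients
    in each sub-band of a db2 wavelet decomposition (level is illustrative)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
    features = []
    for detail in coeffs[1:]:                              # detail coefficients only
        features.extend([detail.max(), detail.min(),
                         detail.mean(), detail.std()])
    return np.array(features)
```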

2.4 UAMS Classification by Supervised Machine Learning


Technique
Supervised learning is based on learning with an external supervisor who provides knowledge of the environment. This knowledge is presented as inputs and outputs. The most common supervised learning method is the Neural Network (NN) [36]. In the traditional NN, the back propagation algorithm is used to train the network. Although back propagation neural networks are used for solving a variety of problems, they still have some limitations. First, back propagation cannot guarantee an optimal solution; in real-world applications, the back propagation algorithm might converge to a set of sub-optimal weights. Second, it is difficult to estimate the appropriate network topology [36].

Modern learning techniques based on bio-inspired computing (natural computing) can overcome some of the limitations of traditional learning techniques. Bio-inspired computing is the computational process of extracting ideas from nature to solve problems. One of the nature-inspired models of computation is evolutionary computing, inspired by the Darwinian evolution of species. The Genetic Algorithm (GA) [106] is a modern learning technique modeled on the process of evolutionary computation and is very effective at function optimization. It efficiently searches large and complex spaces to find the optimal solution to the problem. In this work, the GA is used to optimize the weight selection of the Neural Network (NN). The GA used within a neural network produces an Evolutionary Neural Network (ENN) [95].

The ENN consists of two main parts: training and testing. The block diagram of the training and testing of the ENN is shown in Figure 2-9. In the training process, the GA is utilized to optimize the selection of the network weights. The best chromosomes from the GA represent the optimized weights. The subsequent testing process incorporates the results of the training process (i.e., the best chromosomes). The neural network with the optimized weights is then used for classification.


Figure 2-9: Block Diagram of Training and Testing Process in ENN

A multilayer Feedforward Neural Network (FNN) is used in this work, as shown

in Figure 2-10. It is an interconnection of perceptrons in which data and calculations flow

in a single direction from the input data to the outputs. The number of layers in a neural

network specifies the number of layers of perceptrons. The network is assumed to have a

fixed number of layers and neurons in each layer. The inputs are the features of the calls

provided by feature extraction techniques (FFT, MFCC, and DWT). The output of the

network is the species to be classified. The number of neurons in the hidden layer is determined by experiments.

The problem space is represented as chromosomes. Chromosomes are generated randomly between -1 and 1. The best chromosomes from the GA are used as the optimized weights in the NN. A scheme of a chromosome used in this problem space is shown in Figure 2-11. It is assumed that m, n, and p are the number of neurons in the input, hidden, and output layers, respectively. One chromosome then contains m×n + n×p weights, which is the length of the chromosome. The computational steps of the ENN algorithm are as follows:


Figure 2-10: Feedforward Neural Network (FNN)

Initialization: The number of chromosomes/generations, crossover/mutation probability,

and number of calls for training /testing are defined. An initial random population of

chromosomes is also generated. The initial weights are set to small random numbers in

the range [-1, 1].

Evaluation: Individual chromosomes are applied to the FNN. Chromosomes are

evaluated based on the fitness function. The fitness value represents the quality of the

chromosome, and is used to grade and order the population. This function is specific to

the individual problem and is essentially a driving force for an effective evolutionary

search. The fitness function of each individual chromosome is calculated as:

f = \frac{1}{E} \qquad (2.17)

[Figure: a chromosome encoded as a vector of real-valued weights, e.g., 0.22, -0.1, ..., 0.77, -0.21.]

Figure 2-11: Chromosome Scheme

where E is the network error function and it is defined as:

E = \frac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{n} \left(d_{ij} - y_{ij}\right)^{2} \qquad (2.18)

where m is the number of samples in the training set, n is the number of neurons in the output layer, d_{ij} is the desired output of neuron j in the output layer, and y_{ij} is the actual output of neuron j in the output layer, which can be defined as:

y_{j} = \varphi\!\left(\sum_{i} x_{i}\, w_{ij} + b_{j}\right) \qquad (2.19)

where x_{i} is the input signal, w_{ij} is the weight for the connection between neuron i and neuron j, b_{j} is the bias of neuron j, and \varphi(\cdot) is the activation function. The Sigmoid function is used as the activation function in this work. It is a strictly increasing function that presents saturation and a graceful balance between linear and nonlinear behavior. It is described as:

\varphi(v) = \frac{1}{1 + e^{-v}} \qquad (2.20)

Selection: A pair of chromosomes must be selected for mating. The Roulette Wheel Selection (RWS) scheme [106] is used for selection from the current population. The parent chromosomes are selected with a probability relative to their fitness (i.e., chromosomes with higher fitness have a higher probability of being selected for mating). In the RWS, each chromosome of the population is assigned a portion of the wheel according to its size (i.e., fitness value). The probability of chromosome selection is given as:

p_{i} = \frac{f_{i}}{\sum_{j=1}^{N} f_{j}} \qquad (2.21)

where f_{i} and p_{i} denote the fitness and the selection probability of the individual chromosome, respectively, and N is the population size. Based on the number of parents needed, the wheel is then rolled to create the next generation, and each winning individual is selected and copied into the parent population.

Operation: Recombination and variation of chromosomes are provided using crossover and mutation operators. Crossover is the genetic recombination process in which different segments of two pairs of selected chromosomes are exchanged based on the crossover probability. In this way, the new offspring preserves common characteristics of the parent solutions, as well as other characteristics. New areas of the search space are explored while retaining optimal solution characteristics. Two different types of crossover are applied to a pair of chromosomes: single point and multiple points, as shown in Figure 2-12. The crossover point is randomly selected from the range [1, l-1], where l is the length of the chromosome.

Mutation is the genetic variation operation in which one gene is randomly

replaced by another to create a new chromosome, as shown in Figure 2-13. This is to

prevent solutions of each population from falling into a local optimum. The genes in the

chromosomes are replaced based on the mutation probability and mutation point.


Figure 2-12: Crossover Operation (a) Single Point (b) Multiple Points


Figure 2-13: Mutation Operation

Iteration: The Selection and Operation processes are repeated until the size of the new

chromosome population becomes equal to the size of the initial population. The initial

(parent) chromosome population is replaced with the new (offspring) population. The

training process is repeated until a specified number of generations have been considered

or the desired minimum error is achieved.
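To make these steps concrete, a compact sketch of the GA-based weight optimization is given below, assuming NumPy. The population size, crossover and mutation probabilities, the small epsilon added to the error in the fitness, and the omission of bias terms are simplifications for illustration rather than the exact settings used in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(chrom, X, m, n, p):
    """Feed-forward pass of an m-n-p network whose weights are encoded in one chromosome."""
    W1 = chrom[:m * n].reshape(m, n)                   # input-to-hidden weights
    W2 = chrom[m * n:].reshape(n, p)                   # hidden-to-output weights
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    return sigmoid(sigmoid(X @ W1) @ W2)

def fitness(chrom, X, D, m, n, p):
    """Reciprocal of the sum-of-squared-errors network error (Eqs. 2.17-2.18)."""
    E = 0.5 * np.sum((D - forward(chrom, X, m, n, p)) ** 2)
    return 1.0 / (E + 1e-9)                            # epsilon only for numerical safety

def roulette(pop, fits):
    """Roulette wheel selection: pick one parent with probability f_i / sum(f) (Eq. 2.21)."""
    return pop[rng.choice(len(pop), p=fits / fits.sum())]

def evolve(X, D, m, n, p, pop_size=30, generations=100, pc=0.8, pm=0.05):
    """Evolve FNN weight chromosomes; returns the best chromosome found."""
    length = m * n + n * p
    pop = rng.uniform(-1, 1, size=(pop_size, length))  # initial random population in [-1, 1]
    for _ in range(generations):
        fits = np.array([fitness(c, X, D, m, n, p) for c in pop])
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = roulette(pop, fits).copy(), roulette(pop, fits).copy()
            if rng.random() < pc:                      # single-point crossover
                cut = rng.integers(1, length)
                p1[cut:], p2[cut:] = p2[cut:].copy(), p1[cut:].copy()
            for child in (p1, p2):                     # mutation: replace random genes
                mask = rng.random(length) < pm
                child[mask] = rng.uniform(-1, 1, mask.sum())
            new_pop.extend([p1, p2])
        pop = np.array(new_pop[:pop_size])
    fits = np.array([fitness(c, X, D, m, n, p) for c in pop])
    return pop[np.argmax(fits)]                        # best chromosome = optimized weights
```

The best chromosome returned by evolve() would then be reshaped into the network weights and used, as in the testing stage of Figure 2-9, to classify unseen calls.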

The training was based on the available calls for each species in the Sonobat library [27]. The library contains calls from a variety of bat species documented in the Eastern United States. The result of the training process was subsequently used in the testing of recorded bat echolocation calls. The process of ENN classification is shown in Figure 2-14. The feature extraction techniques and the ENN algorithm are parallelized to improve the computational performance [107][108].

[Figure: training stage: initialization of random weight chromosomes from a-priori known training data, evaluation of the error E = (1/2) ΣΣ (d - y)², roulette wheel selection, crossover and mutation, iterated until the number of generations or the minimum error is reached; the best chromosomes supply the feed-forward neural network weights used in the testing stage to output species.]

Figure 2-14: Classification Process

2.4.1 Existing Classification Techniques

A) Discriminant Function Analysis (DFA): DFA [16][109] is a statistical technique

to determine which variables discriminate between groups. This method is performed in two steps: (1) testing the significance of a set of discriminant functions, and (2) classification. Testing is similar to Multivariate Analysis of Variance (MANOVA) in the sense that two matrices, consisting of (a) the total variances and covariances and (b) the pooled within-group variances and covariances, are compared using multivariate F tests. The comparison is made to determine whether there are significant differences between groups. Then discriminant functions are formulated from the best combinations of discriminant features. Discriminant Function Analysis was implemented using SPSS software [110].

B) Back Propagation ANN (BP-ANN): BP-ANN [36] is a supervised learning

technique which uses back propagation to train the network. This technique was implemented in MATLAB. In the first step, the weights and biases are initialized with random numbers, and then the network is trained using the backward error propagation technique. The sigmoid function is used as the activation function. The actual output is

calculated as:

y_{j} = \varphi\!\left(\sum_{i} x_{i}\, w_{ij} + b_{j}\right) \qquad (2.22)

where x_{i} is the input signal, w_{ij} is the weight for the connection between neuron i and neuron j, and y_{j} and b_{j} are the actual output and bias of neuron j, respectively. The weight training and weight correction are defined as:

w_{ij}(p+1) = w_{ij}(p) + \Delta w_{ij}(p) \qquad (2.23)

\Delta w_{ij}(p) = \alpha\, y_{i}(p)\, \delta_{j}(p) \qquad (2.24)

where \delta_{j}(p) is the error gradient for neuron j at iteration p and \alpha is the learning rate, a positive constant less than unity.

A three-layer neural network is used, with the number of features determining the number of neurons in the input layer and the number of species determining the number of neurons in the output layer. Several tests have been performed to find the best architecture. The momentum constant was varied between 0.1 and 0.9 in steps of 0.1.
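For comparison, a minimal stand-in for such a back-propagation network is sketched below using scikit-learn (the original was implemented in MATLAB). The random feature data, hidden-layer size, learning rate, and momentum value here are placeholders within the ranges discussed, not the settings of this work.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Illustrative stand-ins for call features (inputs) and species labels (outputs).
rng = np.random.default_rng(2)
X, y = rng.normal(size=(60, 10)), rng.integers(0, 3, 60)

# Three-layer network with sigmoid activation trained by back-propagation (SGD);
# hidden-layer size, learning rate, and momentum are placeholder values.
bp_ann = MLPClassifier(hidden_layer_sizes=(15,), activation='logistic',
                       solver='sgd', learning_rate_init=0.1, momentum=0.5,
                       max_iter=2000)
bp_ann.fit(X, y)
print(bp_ann.predict(X[:5]))
```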

C) Support Vector Machine (SVM): SVMs [111][112] are a set of related supervised learning methods used for classification and regression. The SVM is implemented in MATLAB. An SVM constructs a hyperplane or set of hyperplanes in a high- or infinite-dimensional space. A good separation is achieved by the hyperplane that has the largest distance to the nearest training data points of any class (the so-called functional margin); the larger the margin, the lower the generalization error of the classifier. As linear separation of the bat call classes was not possible, the kernel function in the support vector machine is based on the Radial Basis Function (RBF) to transform the feature space. This enables the fitting of a maximum-margin hyperplane. The RBF kernel nonlinearly maps samples into a higher dimensional space.

The RBF kernel is expressed as:

K(x_{i}, x_{j}) = \exp\!\left(-\gamma \left\| x_{i} - x_{j} \right\|^{2}\right) \qquad (2.25)

where \gamma is a kernel parameter. The RBF kernel adds a "bump" around each data point:

f(x) = \sum_{i} \alpha_{i} \exp\!\left(-\gamma \left\| x - x_{i} \right\|^{2}\right) \qquad (2.26)
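As a hedged illustration (the SVM of this work was implemented in MATLAB), an equivalent RBF-kernel classifier can be set up in Python with scikit-learn. The random feature data and the gamma value are placeholders.

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative stand-ins for the extracted call features and species labels.
rng = np.random.default_rng(1)
X_train, y_train = rng.normal(size=(40, 10)), rng.integers(0, 3, 40)
X_test = rng.normal(size=(5, 10))

# RBF-kernel SVM, K(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2); gamma is a placeholder.
clf = SVC(kernel='rbf', gamma=0.1)
clf.fit(X_train, y_train)
print(clf.predict(X_test))
```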

D) Sonobat: Sonobat [27] is a commercially available software tool for viewing and

analyzing full-spectrum sound data. It is especially designed to work with ultrasonic and pulsatile sounds such as those emitted by bats. The classification option in this

software is able to recognize and classify the bat species using an unspecified statistical

classification algorithm. Figure 2-15 shows single and series of echolocation calls in the

Sonobat.

Figure 2-15: Sonobat Spectrogram for Single and Series of Echolocation Calls

E) Song Scope: Song Scope [25] from Wildlife Acoustics is also commercially available software which automatically identifies animal species from their vocalizations. Its automatic processing of sound recordings is able to detect and identify species from their calls.

The Song Scope classification algorithm is based on Hidden Markov Models (HMM), and features are extracted using spectral feature vectors similar to Mel Frequency Cepstral Coefficients (MFCCs). Song Scope automatically segments training data into individual syllables using cues from the signal detection algorithms. States are then assigned sequentially to each syllable class in proportion to the mean duration of syllables in the class. A spectrogram of a series of echolocation calls of Lasionycteris noctivagans is shown in the bottom panel of Figure 2-16; the top panel shows the power of the signal.

Figure 2-16: Spectrogram of Echolocation Call of Lasionycteris Noctivagans in Song Scope

2.5 Conclusion
The acoustic monitoring system was developed and performed in three main steps consisting of data acquisition, feature extraction, and classification. Data acquisition was performed in the project area in the Ottawa National Wildlife Refuge (ONWR). Full-spectrum ultrasound detectors (AR125, SM2BAT) were used to record the echolocation calls. Feature extraction was performed based on three different techniques: Short Time Fourier Transform (STFT), Mel Frequency Cepstrum Coefficient (MFCC), and Discrete Wavelet Transform (DWT). The extracted features were subjected to dimension reduction using Principal Component Analysis (PCA) and were fed into the classification algorithm. A bio-inspired computing based classifier named Evolutionary Neural Network (ENN) was proposed in this application for classification at the species level using acoustic features. Different existing classifiers were tested, such as DFA, Back Propagation NN, SVM, and two commercially available software packages, Song Scope and Sonobat. Results from the different feature extraction techniques were compared based on classification accuracy. The acoustic identification approach can identify bats and will contribute towards developing mitigation procedures for reducing bat fatalities. The approach can also be extended to other applications.

Chapter 3

3 Infrared Imaging Monitoring System (IIMS)

3.1 IR Camera Monitoring


Thermal-IR cameras are useful tools for monitoring targets at night, as they can provide high-resolution images in the poor visibility of night. They also provide temperature information about the targets. Such cameras detect electromagnetic frequencies from approximately 1 to 400 THz, which includes most of the thermal radiation emitted by biological objects.

Thermal-IR videos can be recorded over specified periods of time (e.g., during

nocturnal migration periods) and they can be processed using video processing

techniques to quantify and characterize target tracks. That said, one must recognize that

thermal-IR images have a very low Signal to Noise Ratio (SNR), which can limit

information for target detection and tracking. However, classifying thermal-IR targets as

bats, birds, or insects can be challenging due to unavailability of pre-defined exemplars

for training and classification. Therefore, unsupervised learning via clustering algorithms

may be the best method to group bats/birds/insects in IR imaging.

Several methods exist for target detection and tracking in visual images [113][114], but there is a limited amount of work on target detection and tracking in IR imagery in the computer vision community [115][116]. Among the existing methods for IR target detection and tracking, most techniques are based on temperature changes, with the assumption that moving targets will appear as hot spots in the infrared focal plane. This is not effective in scenarios where the temperature of background objects is higher than that of the moving targets, which usually happens in summer.

The Infrared Imaging Monitoring System (IIMS) involves five main steps: data

acquisition, detection, tracking, feature extraction, and clustering as shown in Figure 3-1.

These techniques are described in detail in the following sections.

[Figure: image sequence → background subtraction → thresholding → noise suppression → tracking → feature extraction → clustering (ACA, FCM, SACA, SM-SACA, DS-ACA).]

Figure 3-1: Block Diagram of IIMS

3.2 IR Data Acquisition
Data were collected using the SR-19 IR camera from FLIR Systems [117], which provides very good night visibility and situational awareness. This model has a focal length of 19 mm (36° HFOV) and a standard-resolution Focal Plane Array (FPA) with 320(H) × 240(V) pixels. The recordings were captured using an AXIS 241Q video server. Its software enabled viewing of the video stream from the camera after it first passed through the Axis 4-channel server. The camera frame rate is 30 frames/sec (fps). The camera was static, pointed directly skyward, and oriented +19.75° from north, with the left side of the screen being approximately east and the right side approximately west. The SR-19 IR camera used in this work and its orientation are shown in Figure 3-2.


Figure 3-2: (a) The SR-19 IR Camera and (b) Direction of IR camera

Video feeds from the IR camera are stored on a hard drive for further data processing. Recordings were made nightly from one hour after sunset to one hour before sunrise, as nocturnal migration usually occurred over this period of time. The experimental setup is shown in Figure 3-3.

[Figure: FLIR SR-19 → AXIS 241Q Video Server → Laptop/PC.]

Figure 3-3: Experimental Setup

3.3 IIMS Detection of Moving Objects


The goals of the thermal-IR video analysis were to detect moving targets, characterize objects and their trajectories, and determine the total numbers in each of the different groups of moving targets. In this study, bats/birds/insects were detected as targets. The collected videos were processed and converted to frames before applying the detection analysis. The detection algorithm consists of several steps, as follows.

3.3.1 Background Modeling

Background Subtraction (BS) modeling is applied to the acquired frames to find the background reference and the foreground objects. The main goal of the modeling is to detect the moving targets in the videos, which correspond to bats, birds, or insects. Several background subtraction/modeling techniques have been proposed in the literature [119][120][121][122][123][124]. IR images are illumination-independent and do not involve shading or color. In this work, the Running Average (RA) [125] technique is utilized to dynamically update the background image and provide acceptable accuracy in IR imaging. In the RA method, the background is updated as:

B_{i+1} = \alpha I_{i} + (1 - \alpha) B_{i} \qquad (3.1)

where B_{i} is the background image at time i, I_{i} is the input image at time i, and \alpha is the

adaptive rate (a value of 0.05 is used). The RA method subtracts the moving biological

targets from the background model in collected thermal-IR images. Other techniques

such as Running Gaussian Average (RGA) [125] and Mixture of Gaussian [125][126]

have also been tested and compared to determine the most appropriate approach in terms

of accuracy and computational complexity.
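A minimal sketch of the Running Average update of Equation (3.1) is given below, assuming NumPy and grayscale frames; alpha = 0.05 follows the text, and the simple absolute difference used for the foreground is an illustrative choice.

```python
import numpy as np

def running_average_bs(frames, alpha=0.05):
    """Running Average background subtraction (Eq. 3.1).

    frames: iterable of grayscale images as float arrays of equal shape.
    Yields the absolute difference between each frame and the current
    background estimate, in which moving targets stand out.
    """
    background = None
    for frame in frames:
        if background is None:
            background = frame.astype(float)                     # initialize with first frame
        foreground = np.abs(frame - background)                  # candidate moving pixels
        background = alpha * frame + (1 - alpha) * background    # Eq. (3.1) update
        yield foreground
```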

3.3.2 Thresholding

An appropriate threshold value needs to be determined for separating moving objects from the background model. However, determining an optimum threshold value can be challenging. In some applications a static thresholding method may be appropriate, but it may not provide accurate results in all conditions. In the IIMS, the apparent size of the target varies with its distance from the camera; thus, a static threshold value would not be appropriate, as it may miss targets at greater distances. There are many adaptive thresholding methods available in the literature, though they may be computationally intensive [127]. Thresholding in thermal-IR images poses a tradeoff between accuracy

and computational efficiency.

In this work, the Extended Otsu threshold (EOtsu) technique is used as an adaptive thresholding mechanism. In the Otsu method [118], the image intensities are classified into two categories, foreground and background, and the optimum threshold is defined as the one that maximizes the between-class variance.

The measure of separation is defined as the ratio of the between-class variance (\sigma_B^2) to the total image intensity variance (\sigma_G^2). It is expressed as:

\eta(k) = \frac{\sigma_B^{2}(k)}{\sigma_G^{2}} \qquad (3.2)

The Otsu threshold is the value of k that maximizes Equation (3.2):

k^{*} = \arg\max_{k} \sigma_B^{2}(k) \qquad (3.3)

where k^{*} is the Otsu threshold [118].

The Otsu thresholding method is extended by incorporating an adaptive variable based on the mean of each frame and a certain constant value. The extended threshold can be achieved as:

T_{opt} = k^{*} + V \qquad (3.4)

where T_{opt} is the proposed optimum threshold and V is an adaptive variable defined as:

V = C \cdot \mathrm{mean}(I_{ij}) \qquad (3.5)

where C is a constant value of 0.5 and I_{ij} is the new frame with pixels (i, j).

A block diagram of overall thresholding is shown in Figure 3-4.

[Figure: compute the normalized histogram of the input image, its cumulative sums, cumulative means, and global intensity mean; compute the between-class variance and the Otsu threshold k* that maximizes it; then incorporate C times the mean of each frame to obtain the extended threshold.]

Figure 3-4: Thresholding
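A sketch of the extended Otsu binarization step is shown below, assuming OpenCV for the basic Otsu threshold and an 8-bit grayscale difference image; the combination of the Otsu value with C times the frame mean follows Equations (3.4)-(3.5) with C = 0.5.

```python
import cv2
import numpy as np

def eotsu_binarize(diff_image, C=0.5):
    """Binarize a background-subtracted frame with the extended Otsu threshold.

    diff_image: foreground (difference) image; clipped and cast to 8-bit here.
    The Otsu threshold k* is raised by C * mean(frame), Eqs. (3.4)-(3.5).
    """
    img = np.clip(diff_image, 0, 255).astype(np.uint8)
    k_star, _ = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    t_opt = k_star + C * img.mean()                 # extended (adaptive) threshold
    return (img > t_opt).astype(np.uint8) * 255     # binary image of candidate blips
```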

3.3.3 Noise Suppression using Morphological Filters

Noise suppression and filtering are used to suppress noise and its effects on the image. In this work, morphological filtering [118] based on an opening-closing approach was used.

Opening smooths the contour of an object by eliminating narrow protrusions. Closing smooths sections of contours by fusing narrow breaks and long thin gulfs, eliminating minor holes, and filling gaps in the contour.

The opening of image set A by structuring element B is defined as:

A \circ B = (A \ominus B) \oplus B \qquad (3.6)

where \circ is the opening and \ominus, \oplus are the erosion and dilation operations, respectively. Similarly, the closing of set A by structuring element B is defined as:

A \bullet B = (A \oplus B) \ominus B \qquad (3.7)

where \bullet is the closing operation.

Although erosion allows the separation, it is not an ideal filter because the retrieved objects are smaller than the original objects due to the size of the structuring

element. On the other hand, dilation yields the full connection of elements, though it

increases the size of the objects. Opening is the only operation able to discriminate the

objects and conserve their original size. Finally, by applying the closing, the full

connection of the elements is yielded while eliminating gaps between objects.
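A short sketch of this opening-closing filter, implementing Equations (3.6)-(3.7) with OpenCV on a binary blip image, is given below; the structuring-element size is a placeholder.

```python
import cv2
import numpy as np

def morphological_filter(binary_img, kernel_size=3):
    """Opening followed by closing to suppress noise in the binary blip image.

    Opening (Eq. 3.6): erosion then dilation, removes small spurious blips.
    Closing (Eq. 3.7): dilation then erosion, fills small gaps inside targets.
    The 3x3 structuring element is a placeholder value.
    """
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    opened = cv2.morphologyEx(binary_img, cv2.MORPH_OPEN, kernel)
    return cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)
```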

Figure 3-5 shows that thresholding and filtering were able to suppress noise and give a clearer image. Figure 3-6 shows the result of detection of a typical blip after background subtraction, thresholding, and morphological filtering based on collected data. It can be seen that the Running Average (RA) provides superior results.

Figure 3-5: (a) Original Image (b) Image after Otsu Thresholding (c) Image after EOtsu Thresholding

Figure 3-6: Result of BS techniques after applied EOtsu and Morphology

3.4 Target Tracking


Tracking is used to maintain temporal consistency of the foreground objects across frames. The tracking algorithm processes the detected pixel blips in each video frame to identify and track targets [128].

In this study, labeling-based component connectivity is applied to maintain the track of the targets. In this method, the detected blips obtained in the previous section are used for tracking. Each blip in a new frame is dilated by a fixed-size structuring element.

Dilated blips are called D-Blips. If D-Blips have a common area with previous blips, we assign them to the same target but at a different time. Even if blips are missing in between frames, the tracking algorithm is able to recognize whether a blip belongs to the same track. Blips of the same track are given the same label; otherwise, if the later blip belongs to a

separate track, a new target/track label is assigned. After detecting blips of the same track

in sequential frames, morphological operators are used to get the trajectory of the flight

path. To do this, the frames containing the target are isolated and stitched together to

generate the track of the target, as shown in Figure 3-7. The tracking algorithm is shown

in Figure 3-8. Figure 3-9 shows two samples of tracks based on real data.

Figure 3-7: Tracking Result

Tracking Algorithm:
(1) Convert collected data to frames
(2) Detect blips on each frame
(3) Create morphological blips
(4) Label the blips
(5) Check component connectivity
i) In case of connectivity, use the same label
ii) Otherwise use new labeling
(6) Check if the frame is the last frame containing blip with
same label
i) In case of last frame, go to step (7)
ii) Otherwise check the new frame
(7) Plot the target trajectory

Figure 3-8: Tracking Algorithm
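The following Python sketch illustrates the labeling-based connectivity idea of Figure 3-8, assuming binary blip masks per frame; the dilation size, helper names, and synthetic frames are illustrative assumptions rather than the dissertation's code.

import numpy as np
from scipy import ndimage

def track_blips(frames, dilate_size=5):
    """Assign persistent track labels to blips across binary frames.

    Each blip in a new frame is dilated (a D-Blip); if the D-Blip overlaps
    a previously labeled blip it inherits that track label, otherwise a
    new label is started.
    """
    structure = np.ones((dilate_size, dilate_size), dtype=bool)
    label_map = np.zeros(frames[0].shape, dtype=int)    # labels seen so far
    next_label, tracks = 1, {}

    for t, frame in enumerate(frames):
        blips, n = ndimage.label(frame)
        new_map = np.zeros_like(label_map)
        for b in range(1, n + 1):
            blip = blips == b
            d_blip = ndimage.binary_dilation(blip, structure=structure)
            overlap = label_map[d_blip]
            overlap = overlap[overlap > 0]
            if overlap.size:                             # same target, later time
                label = int(np.bincount(overlap).argmax())
            else:                                        # start a new track
                label, next_label = next_label, next_label + 1
            new_map[blip] = label
            tracks.setdefault(label, []).append((t, ndimage.center_of_mass(blip)))
        # Keep older labels so a track survives a frame with a missing blip.
        label_map = np.where(new_map > 0, new_map, label_map)
    return tracks

if __name__ == "__main__":
    frames = [np.zeros((60, 60), dtype=bool) for _ in range(5)]
    for t, f in enumerate(frames):                       # one target drifting right
        f[30:33, 10 + 5 * t:13 + 5 * t] = True
    print({k: len(v) for k, v in track_blips(frames).items()})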

Figure 3-9: Sample Results of Detection and Tracking Algorithm

3.5 IIMS Feature Extraction


Feature extraction is used to obtain salient characteristics of detected tracks.

Features useful to wildlife biologists are size, velocity, heat, straightness index, and

direction. These features are very effective in separating various species and are

described as follows:

Size: the average apparent size of the target. It can be computed as:

$Size(k) = \frac{1}{N_k} \sum_{i=1}^{N_k} s_i$ (3.8)

where $Size(k)$ is the size of the kth target, $s_i$ is the target size in frame i, and $N_k$ is the total number of frames in which the target is present.

Velocity: the apparent velocity of the target based on the distance traveled in camera

view. It can be determined as:


$Velocity(k) = \frac{1}{N_k} \sum_{i=1}^{N_k} \frac{d_i}{0.033}$ (3.9)

where $Velocity(k)$ is the velocity of the kth target and $d_i$ is the distance traveled by the target in frame i. The value of 0.033 represents the time (in seconds) between consecutive frames containing the target, based on a camera frame rate of 30 frames/second.

Heat: the average thermal-infrared intensity of the target. It can be defined as:

$Heat(k) = \frac{1}{N_k} \sum_{i=1}^{N_k} h_i$ (3.10)

where $Heat(k)$ is the heat of the kth target and $h_i$ is the heat of the target in frame i.

Straightness Index: a measure of the discrepancy between the actual track that is

traveled by the target and a perfectly straight segment. It is defined as:

$SI(k) = \frac{L_k}{D_k}$ (3.11)

where $SI(k)$ is the straightness index of the kth target, $L_k$ is the straight-line distance from the starting point of target k to its final location, and $D_k$ is the cumulative distance traveled along the track. Straightness index values range from 0 to 1.

Direction: the direction of the flight based on the first and last points of the target in the

camera’s field of view.
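A minimal sketch of how these five track features could be computed from per-frame blip measurements is shown below (Python, with pixel units, field layout, and per-frame averaging assumed for illustration; Equations 3.8-3.11 are referenced in the comments).

import numpy as np

FRAME_DT = 0.033  # seconds between consecutive frames at 30 frames/second

def track_features(sizes, heats, centroids):
    """Compute Size, Velocity, Heat, Straightness Index, and Direction
    for one tracked target from per-frame measurements.

    sizes, heats : per-frame blip size (pixels) and mean IR intensity
    centroids    : per-frame (x, y) blip centroids in image coordinates
    """
    centroids = np.asarray(centroids, dtype=float)
    steps = np.linalg.norm(np.diff(centroids, axis=0), axis=1)   # d_i per frame
    path_length = steps.sum()
    net_displacement = np.linalg.norm(centroids[-1] - centroids[0])
    straightness = net_displacement / path_length if path_length > 0 else 1.0  # Eq. 3.11
    return {
        "size": float(np.mean(sizes)),                    # Eq. 3.8
        "velocity": float(np.mean(steps / FRAME_DT)),     # Eq. 3.9 (pixels per second)
        "heat": float(np.mean(heats)),                    # Eq. 3.10
        "straightness": float(straightness),
        "direction_deg": float(np.degrees(np.arctan2(
            centroids[-1, 1] - centroids[0, 1],
            centroids[-1, 0] - centroids[0, 0]))),        # first-to-last flight direction
    }

if __name__ == "__main__":
    track = [(10, 10), (14, 11), (18, 13), (23, 14)]
    print(track_features([20, 22, 21, 23], [90, 95, 92, 96], track))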

3.6 Imagery Group Clustering using Unsupervised Learning
Unsupervised learning is an appropriate technique when dealing with unknown

images where training data is not available. Clustering algorithms can arrange the dataset

into different groups based on target features. The unsupervised learning method is used

for exploration of inter-relationships within the pool of collected data. This was an ideal

application for grouping our IIMS targets as birds/bats/insects since there were no prior

training sets available for these groups of targets.

3.6.1 Ant Clustering Algorithm

In this study, an ant-based clustering method is used for unsupervised learning [129][130][131][132]. The Ant Clustering Algorithm (ACA) [106] is a heuristic method for clustering data inspired by the behavior of ants, particularly suited to clustering unlabeled data sets. The ACA method was selected because it does not require the number of clusters to be assumed a priori, in contrast to other clustering methods [133][134][135]. In this method, data is projected from a multi-dimensional space onto a two-dimensional grid, in which ants move randomly, picking up and depositing items at specific locations.

The common feature of both real and artificial ants is the collection and dropping off of items. Real ants collect corpses, move larvae, and arrange them by size. Artificial ants

move data items that are laid out in an artificial environment and organize them in a

sorted fashion. The ACA has no prior information on the number and size of clusters.

Three different variations of the ACA of Lumer and Faieta (LF) model [136] were

implemented for thermal-IR target clustering. In this application features extracted from

tracks, including size, velocity, heat, and straightness index, are applied as attributes for

clustering.

Standard ACA (S-ACA): The general idea of S-ACA is to pick up an item and match it with similar items in different locations. The carried item is then deposited in the location that contains the most similar items. Some ants are solely responsible for carrying items and dropping them off.

The probability that an unloaded, randomly moving ant picks up an item is given by:

$P_{pick} = \left( \frac{k_1}{k_1 + f} \right)^2$ (3.12)

where $f$ is the perceived fraction of similar items in the neighborhood of the ant and $k_1$ is a threshold constant. For $f \ll k_1$, which means that there are not many similar items in the neighborhood, there is a high probability of picking up an item. If $f \gg k_1$, there is less probability of the ant removing an item from a dense area.

The probability that a loaded, randomly moving ant drops the item it carries at a certain location is given by:

$P_{drop} = \left( \frac{f}{k_2 + f} \right)^2$ (3.13)

where $k_2$ is another threshold constant. In this case, if $f \ll k_2$ then $P_{drop} \approx 0$; therefore, in non-dense areas there is less probability of an ant depositing the item. Alternatively, if $f \gg k_2$ then $P_{drop} \approx 1$, meaning the probability of item deposition is high when there are many items in a given location. This indicates that items have a tendency to be dropped off in an area where there are already many items.

The perceived (neighborhood) function proposed by LF is used to calculate the similarity of an object $o_i$ in the neighborhood of the site r at which it is situated:

$f(o_i) = \max\left( 0,\; \frac{1}{s^2} \sum_{o_j \in Neigh(r)} \left[ 1 - \frac{d(o_i, o_j)}{\alpha} \right] \right)$ (3.14)

where $s^2$ is the total number of sites in the local area of interest, $\alpha$ is a factor for scaling the dissimilarity $d(o_i, o_j)$, and $d(o_i, o_j)$ is the Euclidean distance between a pair of items.

The main steps for grouping data with S-ACA are: (1) projection from the attribute space onto the item grid, (2) calculation of the perceived function, and (3) picking up and dropping off items based on the probabilities, as sketched below.
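The sketch below illustrates the S-ACA pick-up and drop-off probabilities and the LF neighborhood function under assumed constants k1, k2, and alpha; it is an illustrative fragment, not the dissertation's implementation.

import numpy as np

K1, K2, ALPHA = 0.1, 0.15, 0.5   # illustrative threshold and scaling constants

def perceived_fraction(item, neighbours, s):
    """LF neighborhood function f(o_i) over an s x s local area (Eq. 3.14)."""
    if not neighbours:
        return 0.0
    total = sum(1.0 - np.linalg.norm(item - n) / ALPHA for n in neighbours)
    return max(0.0, total / (s * s))

def p_pick(f):
    """Probability an unloaded ant picks up an item (Eq. 3.12)."""
    return (K1 / (K1 + f)) ** 2

def p_drop(f):
    """Probability a loaded ant drops its item (Eq. 3.13)."""
    return (f / (K2 + f)) ** 2

if __name__ == "__main__":
    item = np.array([0.2, 0.5, 0.1, 0.8])            # e.g. size, velocity, heat, SI
    sparse = [np.array([0.9, 0.1, 0.9, 0.1])]        # dissimilar neighborhood
    dense = [item + 0.01 * k for k in range(8)]      # similar, crowded neighborhood
    for name, neigh in (("sparse", sparse), ("dense", dense)):
        f = perceived_fraction(item, neigh, s=3)
        print(f"{name}: f={f:.2f}  P_pick={p_pick(f):.2f}  P_drop={p_drop(f):.2f}")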

Different Speed ACA (DS-ACA): Ants move randomly in the item space with different speeds. The speed $v$ is distributed uniformly in the interval $[1, v_{max}]$, where $v_{max}$ is the maximum speed (a fixed constant in this study). This speed affects the probability of picking up and dropping off items by ants according to the following equation:

$f(o_i) = \max\left( 0,\; \frac{1}{s^2} \sum_{o_j \in Neigh(r)} \left[ 1 - \frac{d(o_i, o_j)}{\alpha \left( 1 + \frac{v - 1}{v_{max}} \right)} \right] \right)$ (3.15)

where $v$ corresponds to the number of grid units traveled by an ant per time unit. Thus, fast moving ants are not as selective as slow (relaxed) ants in their approximation of the average resemblance of an item to its neighbors.

Short-term Memory ACA (SM-ACA): Ants have a short memory of a fixed size of m.

They remember the last m dropped items and their locations. This memory helps ants in

dropping an item at a location which is the most similar to the carried item in the memory

list. In this case, the ant goes toward the location of the memorized element which is the

most similar to the one just collected. This behavior results in the creation of a smaller

number of statistically equivalent clusters, since identical items have a lower probability

of starting a new cluster.

3.6.2 Fuzzy Clustering

Clustering can be performed based on whether the subsets are fuzzy or crisp (hard). Hard clustering algorithms, which are based on classical set theory, require that an object either does or does not belong to a subset. That is, in a hard clustering of a database, the data are grouped into a number of mutually exclusive groups. In fuzzy clustering, on the other hand, objects are allowed to belong to more than one cluster with different degrees of membership.

Each type of clustering algorithm has advantages and disadvantages. In hard clustering, such as the Ant Clustering Algorithm (ACA), there is no degree of membership for each cluster; however, ACA does not require the number of clusters to be known a priori. In fuzzy clustering, on the other hand, there are membership values that show the degree to which each data point belongs to each cluster. An example of fuzzy clustering is Fuzzy C-Means (FCM) [137]. However, in FCM the number of clusters must be defined initially.

The optimal number of clusters cannot always be defined a-priori and a good

cluster validity criterion has to be found. The FCM is used in this work as a fuzzy

clustering algorithm to group the IR data. The membership function takes a value between 0 and 1, indicating the degree of partial membership of each data point in a cluster. Thus, a data set can be grouped into c fuzzy clusters.

A fuzzy partition of the dataset X is described by $U = [\mu_{ik}]$, which is a real c×N matrix, where $\mu_{ik}$ represents the degree of membership of the kth observation in the ith cluster ($1 \le i \le c$, $1 \le k \le N$). U is the matrix representation of the fuzzy partition, with the following conditions:

$\mu_{ik} \in [0, 1], \quad 1 \le i \le c, \; 1 \le k \le N$ (3.16)

$\sum_{i=1}^{c} \mu_{ik} = 1, \quad 1 \le k \le N$ (3.17)

$0 < \sum_{k=1}^{N} \mu_{ik} < N, \quad 1 \le i \le c$ (3.18)

The fuzzy partitioning space for a finite set X is then:

$M_{fc} = \left\{ U \in \mathbb{R}^{c \times N} \;\middle|\; \mu_{ik} \in [0,1],\ \forall i,k;\ \sum_{i=1}^{c} \mu_{ik} = 1,\ \forall k;\ 0 < \sum_{k=1}^{N} \mu_{ik} < N,\ \forall i \right\}$ (3.19)

Fuzzy C Means (FCM): This technique was originally introduced by Jim Bezdek [137]

as an improvement on earlier clustering methods. It provides a method that shows how to

group data points that populate some multidimensional space into a specific number of

different clusters. This function creates fuzzy c-partitions of a given data set. A fuzzy c-

partition of X is one which characterizes the membership of each sample point in all the

clusters by a membership function which ranges between zero and one. Additionally, the

sum of the memberships for each sample point must be unity.

The objective function minimized by a large family of fuzzy clustering algorithms, including fuzzy c-means, is given as:

$J(X; U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} (\mu_{ik})^m \, \| x_k - v_i \|_A^2$ (3.20)

where $V$ is the matrix containing the cluster centers, defined as:

$V = [v_1, v_2, \ldots, v_c]$ (3.21)

m is a weighting exponent which controls the relative weight placed on each squared distance, A is a positive definite (n×n) weight matrix, and $\| \cdot \|_A^2$ is the squared inner-product distance norm:

$\| x_k - v_i \|_A^2 = (x_k - v_i)^T A (x_k - v_i)$ (3.22)

The value of the objective function is a measure of the total weighted within-group squared error obtained by representing the c clusters by their prototypes $v_i$. In FCM, the objective function of Equation (3.20) is minimized by means of Lagrange multipliers:

$\bar{J}(X; U, V, \lambda) = \sum_{i=1}^{c} \sum_{k=1}^{N} (\mu_{ik})^m D_{ikA}^2 + \sum_{k=1}^{N} \lambda_k \left( \sum_{i=1}^{c} \mu_{ik} - 1 \right)$ (3.23)

If $D_{ikA}^2 > 0$ for all i, k and $m > 1$, then $(U, V)$ may minimize the objective function of Equation (3.23) only if

$\mu_{ik} = \frac{1}{\sum_{j=1}^{c} \left( D_{ikA} / D_{jkA} \right)^{2/(m-1)}}$ (3.24)

and

$v_i = \frac{\sum_{k=1}^{N} (\mu_{ik})^m x_k}{\sum_{k=1}^{N} (\mu_{ik})^m}$ (3.25)

The resulting cluster centers are the weighted means of the samples in each cluster, where the weights are the membership degrees of the samples in the corresponding clusters. This is the reason the algorithm is called fuzzy c-means.

The algorithm of FCM is summarized as follows:

1- Initialize the partition matrix $U^{(0)}$ randomly, such that the conditions of Equations (3.16)-(3.18) are satisfied.

Loop for l = 1, 2, … until $\| U^{(l)} - U^{(l-1)} \| < \epsilon$:

2- Cluster centers (means) computation:

$v_i^{(l)} = \frac{\sum_{k=1}^{N} \left( \mu_{ik}^{(l-1)} \right)^m x_k}{\sum_{k=1}^{N} \left( \mu_{ik}^{(l-1)} \right)^m}$ (3.26)

3- Distance computation:

$D_{ikA}^2 = (x_k - v_i^{(l)})^T A (x_k - v_i^{(l)})$ (3.27)

4- Partition matrix update:

$\mu_{ik}^{(l)} = \frac{1}{\sum_{j=1}^{c} \left( D_{ikA} / D_{jkA} \right)^{2/(m-1)}}$ (3.28)
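A compact Python sketch of this FCM loop is given below, taking A as the identity matrix so the distances are Euclidean; the parameter values and random initialization are illustrative assumptions.

import numpy as np

def fuzzy_c_means(X, c, m=2.0, eps=1e-5, max_iter=100, seed=0):
    """Fuzzy C-Means following the loop of Equations (3.26)-(3.28),
    with the norm-inducing matrix A taken as the identity (Euclidean)."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    U = rng.random((c, N))
    U /= U.sum(axis=0)                                   # each column sums to one
    for _ in range(max_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)     # Eq. 3.26: cluster centers
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2)  # Eq. 3.27 distances
        D = np.fmax(D, 1e-12)                            # avoid division by zero
        U_new = 1.0 / ((D[:, None, :] / D[None, :, :]) ** (2.0 / (m - 1))).sum(axis=1)  # Eq. 3.28
        if np.linalg.norm(U_new - U) < eps:              # stop when ||U(l) - U(l-1)|| < eps
            U = U_new
            break
        U = U_new
    return U, V

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (40, 2)), rng.normal(3, 0.3, (40, 2))])
    U, V = fuzzy_c_means(X, c=2)
    print("cluster centers:\n", V.round(2))
    print("memberships of first point:", U[:, 0].round(2))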

3.7 Conclusion
Three different background subtraction techniques are applied to detect moving objects. The Otsu thresholding method is then extended by incorporating an adaptive variable based on the mean of each frame and a constant value. Filtering using morphological operations is applied, and the results of the three techniques are then compared. The selected background subtraction technique (Running Average), followed by thresholding and filtering, is then used for tracking and feature extraction. The Ant Clustering Algorithm (ACA) based on the Lumer and Faieta model, with its three variations (Standard ACA, Different Speed ACA, and Short-term Memory ACA), is applied to the extracted features, and the variants are compared in terms of the groups created for the detected avian data. Fuzzy C-Means is also implemented and used to group the targets.

Chapter 4

4 Radar Monitoring System (RMS)

4.1 Radar Monitoring


Radar has been a useful tool in ornithological studies for monitoring bird activity for a few decades. Radar, as its name indicates, is an abbreviation of Radio Detection and Ranging and is mostly used for range estimation of targets. Radar operates by transmitting pulses of electromagnetic radiation (radio waves) and then receiving the waves reflected from an object. Radio waves travel close to the speed of light; the distance to the object is, thus, related to the time lapse between transmission and reception of the echo.

Several types of radars with different characteristics are used for different

purposes. Weather Surveillance Radar (WSR), also known as Next Generation Radar

(NEXRAD) is a network of high-resolution S-band Doppler weather radars operated by

the National Weather Service. NEXRAD detects precipitation and atmospheric

movement or wind. Tracking radars can be used for target detection and tracking. The

primary application of tracking radar is for weapon control and missile-range

instrumentation. It can be used in other applications such as tracking storms, birds, etc.

With tracking radar, a target of interest is identified and the radar is locked to that target.

Tracking radars are expensive and can only be obtained from military surplus or salvage

operations. However, tracking radar does not provide a broad view of migration over a given site, so it is not widely used in avian monitoring applications, especially at wind farm sites. Marine radar is a commercially available surveillance radar and is commonly used for avian monitoring and ornithological studies. Several factors make this type of radar superior to Doppler or tracking radars: marine radars are low in cost, high in resolution, portable, easy to operate, and require little maintenance. This work uses a Furuno 1500 Mark 3 marine radar.

The Radar Monitoring System (RMS) in this work involves three steps: data acquisition, blip detection, and target tracking, as shown in Figure 4-1. These steps are described in the following sections.

Figure 4-1: Block Diagram of RMS (avian radar system: data collection, blip detection via radR, and target tracking comprising estimation and data association)

4.2 Radar Data Acquisition
The data are collected using a Furuno 1500 Mark 3 marine radar [99]. The radar is a 25 kW Furuno X-band radar (frequency = 9410 MHz, wavelength = 3 cm, model # FAR2127BB, Furuno Electric Company, USA). The radar can be equipped with a T-bar antenna (XN20AF) or a parabolic dish. The T-bar antenna is a 6.5-foot array antenna that produces an electromagnetic beam 1.23° wide by 20° high and rotates in the vertical plane to extract altitude information. The parabolic antenna, with a 3.5° beamwidth and 15° angle of elevation, provides a conical beam. The radar operates at 9410 MHz ± 30 MHz (X-band). X-band radar is widely used in ornithology applications due to its ability to utilize smaller antennas and to provide better target resolution. Technical details of the marine radar are shown in Table 4.1. The array and parabolic antennas are shown in Figure 4-2(a) and (b). Monitoring equipment was housed in a trailer for easy deployment from site to site and is shown in Figure 4-2(c).

The vertical mode was used at sunrise and sunset so that the ascent and descent

rate of birds can be determined. This will allow estimation of how quickly birds pass

through potential turbine strike zones. The parabolic dish was used during the middle of

the night as there is fewer targets ascending or descending and so the dish would be able

to determine how continuing migrants are shifting their orientation as they approach the

lakeshore.

Table 4.1: Radar Technical Details

Specifications of Marine Radar


Frequency 9410 MHz ± 30 MHz (X-band)
Output Power 25 KW
IF 60 MHz, Logarithmic, BW 28/3 MHz
Noise Figure 6 dB
Range Accuracy 1% Maximum Range
Minimum Range 35 m
Bearing Accuracy ± 10
EPA 10 targets
ATA 20 targets

Figure 4-2: Marine Radar (a) Parabolic Dish (b) T-bar Antenna (c) Trailer

The marine radar is connected to a digitizing card (XIR3000B) from Russell

Technologies [138]. The received signal is digitized and transmitted to a laptop/PC

through the USB. The data acquisition system is shown in Figure 4-3.

Figure 4-3: Radar Data Acquisition (Furuno 1500 Mark 3 marine radar, digitizing card from RTI, data processing)

This digitizing card is capable of receiving different kinds of input signals. It

allows digitizing of raw data at the front end of this commercially available radar before

converting it into a video signal for display at the operator console. It operates in a slave mode; in this mode only the azimuth, heading marker, trigger, and video signals from the transceiver are captured. The digitizing card (XIR3000B) from Russell Technologies is also accompanied by a software development package.

XIR3000 is integrated with IntegRadar Software Developers Kit (SDK) which

provides flexible image generation, scaling, and display parameters for heading and

centering. SDK also provides superior clutter suppression while keeping potential targets.

RadarSample is part of the SDK and can be used to view radar images and offers

different configuration functions and processing capabilities. Radar images are saved in

.REC format and can be processed using the radR software [100] for further analysis. The block diagram of the XIR3000C is shown in Figure 4-4.

Figure 4-4: Block Diagram of XIR3000C (transceiver and antenna, control module, XIR3000C with USB/422 converter and TCP/IP connection to a laptop/PC, and power supplies)

4.3 RMS Detection by Marine Radar


Targets are tracked based on their detection on multiple scans of the radar. Radar

continuously generates electromagnetic pulses. The microwave energy is transmitted by a continuously rotating antenna. Part of the energy is absorbed by reflectors and the other part is scattered over a broad solid angle. The part which is returned to the antenna is described as an echo. The time between successive transmissions is the sweep time or Pulse Repetition Interval (PRI); the number of pulses per second (PPS) is the Pulse Repetition Frequency (PRF). The control system of the radar may vary these parameters depending on the application or need.

The directional antenna which radiates the pulse is called the scanner. In each scan, the position of every detected object in range and bearing is determined. Echoes are displayed on a Plan Position Indicator (PPI) for each scan. Over several scans, the track of the object is formed. Targets can be detected and tracked when the echo's power can be distinguished, with a specified certainty, from the noise and clutter in which it is buried. Recorded files from the digitizing card and its software need to be further processed by radR.

4.3.1 radR

radR [100] is open-source software used for studying biological targets, in particular for target recognition and tracking. The software was developed by Taylor et al. [100]. It is based on the R language and its interface is coded using tcl/tk; it works on the Windows and Linux operating systems. radR is used to process the data, remove noise, and detect

targets in terms of blips. It consists of various plugins and requires initialization of certain

parameters and setting of threshold values. It removes unwanted noise and produces

output data as blip movie. Target information is available in the blip movie or in the form

of an Excel sheet. There are several plugins that are available in radR. These plugins can

be classified into three categories of data sources, processing, and data storage. Their

description is as follows:

Data Sources

 antenna: load, edit, and save antenna characteristics

 seascan: acquire data from a Rutter’s Sigma S6 digitizer card via Seascan

server

 seascanarch: read raw radar data from .DAT files recorded by Rutter Seascan

software

 USRP: acquire data from an Ettus Research USRP-1 with LFRX

daughterboard and custom electrical front-end

 XIR3000: acquire data from a Russell Technologies’s XIR3000 USB video

processor board

 XIR3000arch: read raw radar data from folders of .REC files created by RTI

software

 video: process camera movies (e.g. radar display screenshots) in rectangular

coordinates using radR's algorithms and interface

 genblips: generate artificial data from a simple, customizable dynamic model

of target motion

Processing

 batch processing: process multiple blip movies to extract tracks

 custom blip filtering: write an R expression involving blip parameters

 remove persistent fluctuating clutter: the de-clutter plugin works on data

recorded as blip movies

 tracker: assemble blips into tracks

 zone: define regions where blips are forbidden or filtered differently

Data Storage

 blipmovie: read and write archives of blip data

 rawarch: read and write full scans of raw data

4.3.2 Blips Detection via radR

The radR statistical model for blip finding is based on the following steps [100]:

 Learning phase

 Calculating sample scores

 Classifying samples

 Finding and filtering patches

 Estimating current cells states

The digitizing card creates scan of data in the form of a matrix. Each column in

the matrix is the amount of power which is received by the antenna. It is in the form of

time series that is received after transmission of each pulse. Each value in the columns

demonstrates the amount of microwave energy which is echoed back as well as noise

from different sources. Rows in the matrix represent the amount of microwave energy

received from a certain “range cell”. Individual numbers in each row corresponds to

energy received while the radar was at a specific azimuth. Return echoes are digitized

into “samples”, where each sample is the intensity of specific range cell from a pulse.

The number of samples and the intensity resolution of the echoed signal depend on the settings and the digitizing card. In fact, the digitizing card has an important role in distinguishing targets at different ranges and providing an accurate distance measure.

As the number of samples is massive, radR can be configured to keep only those samples containing targets, or “blips”. Collections of these blips are stored in a “blipmovie” archive. The background is formed by the temporal mean and mean deviation of the strength of the radar echo from user-defined windows of samples and pulses across the entire scan. The intensity of each sample is computed as a z-score of the signal returned for that azimuth and range cell combination. Those samples which surpass a specific z-score threshold are considered “hot” and those below the threshold are called “cold” samples. A group of hot samples creates a “patch”. Blips are formed from the patches which satisfy filtering criteria such as the number of samples, PPI area, and angular and radial spans; patches meeting the specified criteria are considered blips. The background is updated by a learning pattern with data from subsequent scans based on a weighting scheme. Figure 4-5 shows a raw and a processed image from the marine radar.

Figure 4-5: Blips Detection in radR before and after noise removal
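The fragment below sketches the hot/cold z-score classification and patch filtering just described. It is not radR's code; the thresholds, array shapes, and synthetic scan are assumptions for illustration.

import numpy as np
from scipy import ndimage

def find_blips(scan, background_mean, background_dev, z_thresh=3.0, min_samples=4):
    """Classify samples as hot/cold by z-score against the learned background
    and group hot samples into patches; patches passing the sample-count
    filter are returned as blips (a sketch of the radR idea, not its code).

    scan, background_mean, background_dev : (pulses x range-cells) arrays
    """
    z = (scan - background_mean) / np.fmax(background_dev, 1e-9)
    hot = z > z_thresh                                  # "hot" samples
    patches, n = ndimage.label(hot)                     # contiguous hot samples
    blips = [p for p in range(1, n + 1) if np.sum(patches == p) >= min_samples]
    return patches, blips

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    bg_mean, bg_dev = 10.0 * np.ones((360, 512)), 2.0 * np.ones((360, 512))
    scan = rng.normal(bg_mean, bg_dev)
    scan[100:103, 200:204] += 25.0                      # an echo above the clutter
    _, blips = find_blips(scan, bg_mean, bg_dev)
    print("blips found:", len(blips))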

Detected blips in several scans of the rotating beam are associated together to form

tracks which are used to provide target information such as velocity and direction. A

block diagram of different entities in RMS is shown in Figure 4-6.

Figure 4-6: Entities in RMS (pulse, sample, hot sample, patch, blip, track)

Radar data were collected with the antenna rotating in vertical mode, giving a 0° to 360° scan of the space. Data were collected in two different positions, as recommended by the wildlife biologists for the purposes of this work. One data set is based on a Land-Water orientation, with the rotation axis at 0° from north. The second set is based on a Land-Land orientation, with the rotation axis at 90° from north. Figure 4-7 shows the radR scans in the Land-Water and Land-Land positions with the two different antenna rotation orientations.

Figure 4-7: Land-Water and Land-Land PPI (top: vertical process, right side land and left side water; bottom: vertical process, both sides land, rotation axis 90°)

4.4 RMS Target Tracking


Tracking can be defined as the problem of estimating the trajectory of a target in

the image plane as it moves around a scene. In a tracker system, consistent labels are

assigned to the targets in different frames. Additionally, depending on the tracking domain, a tracker can also provide object-centric information, such as the orientation, area, or shape of a target. Target tracking via RMS is useful to provide altitude information for the target, as this is not available from the IR camera or the acoustic detectors.

Target tracking consists of two parts, namely estimation and data association. Estimation is applied to predict the next position of the target, that is, the potential position of the “blip” in the next scan. Data association matches the estimated positions of the potential blips with the “old blips”. Data association has an important role in multi-target tracking for the following reasons:

 Random false alarm in the detection process

 Clutter due to other reflectors and scatterers

 Interference with other targets in multi target environments

There are two tracker models currently available in the radR [100]: Nearest

Neighbor (NN) and Multi-Frame Correspondence (MFC) algorithm. These techniques are

not appropriate for non-linear and non-Gaussian systems. The tracking in the NN

techniques is based on the distance of the old blips in existing targets and new blips. It

means that the closest blip in the neighborhood will be the best candidate to be a part of

the current track. For this reason, the distance for each pair (old blip and new blips in an

active target) will be calculated. Also, speed, turning angle, relative change in blip area,

and intensity are computed for each pair. Those pairs which satisfy the criteria are

considered as potential pairs, where among those, the pair with minimum distance is the

best candidate and is selected as part of the track.

Multi-Frame Correspondence is a non-iterative greedy algorithm for multi-frame

point correspondence. In MFC, two scans are compared and a velocity is assigned to the

potential pairs which are matched by NN. Then each subsequent scan is compared to the

previous two scans and the quality of the match is measured by a “gain” function. The

gain function is a weighted sum (log scale) of the proximity of the new blip to the next

predicted position by assuming constant target velocity; and the homogeneity of target

velocity when the new blip is added to the track. The final goal is to maximize the gain

with repeated execution of this algorithm for each subsequent scan.

According to the nature of the signals in RMS, a non-linear tracking model is required. A particle filter has been selected as an appropriate tracking technique and implemented in this work. Figure 4-8 shows the block diagram of the tracking algorithm.

Figure 4-8: Tracking of Avian Radar Processing (estimation via SIS particle filter with state prediction and weight update, followed by data association via Nearest Neighbor)

4.4.1 Sequential Importance Sampling-based Particle Filter (SIS-PF)

The Sequential Importance Sampling-based Particle Filter (SIS-PF) [101] is an estimation-model-based technique used to estimate position values for targets with positive and negative velocities. The idea behind the PF is to represent the required posterior density function by a set of random samples with associated weights; estimates are then computed from these samples and weights. A random measure $\{ x_{0:k}^i, w_k^i \}_{i=1}^{N_s}$ is assumed, which characterizes the posterior pdf $p(x_{0:k} \mid z_{1:k})$, where $x_{0:k} = \{ x_0, \ldots, x_k \}$ is the set of all states up to time k, $N_s$ is a user-defined parameter denoting the number of particles, and $w_k^i$ is the associated weight, with all weights normalized so that $\sum_i w_k^i = 1$.

The posterior density can be approximated as [101]:

$p(x_{0:k} \mid z_{1:k}) \approx \sum_{i=1}^{N_s} w_k^i \, \delta(x_{0:k} - x_{0:k}^i)$ (4.1)

The selection of the weights is based on the principle of Importance Sampling (IS). According to IS, properties of a particular distribution are estimated using only samples generated from a different distribution rather than the distribution of interest. Let the probability density of interest be proportional to $\pi(x)$, and let $q(x)$ be the importance density from which the samples $x^i$ are drawn. The weighted approximation to the density $\pi(x)$ is calculated as:

$p(x) \approx \sum_{i=1}^{N_s} w^i \, \delta(x - x^i)$ (4.2)

where $w^i$, the normalized weight of the ith particle, is proportional to:

$w^i \propto \frac{\pi(x^i)}{q(x^i)}$ (4.3)

So, if the samples $x_{0:k}^i$ are drawn from the importance density $q(x_{0:k} \mid z_{1:k})$, the weights in Equation 4.2 are defined as:

$w_k^i \propto \frac{p(x_{0:k}^i \mid z_{1:k})}{q(x_{0:k}^i \mid z_{1:k})}$ (4.4)

The weight update can be derived by factoring the importance density and the posterior:

$q(x_{0:k} \mid z_{1:k}) = q(x_k \mid x_{0:k-1}, z_{1:k}) \, q(x_{0:k-1} \mid z_{1:k-1})$ (4.5)

$p(x_{0:k} \mid z_{1:k}) = \frac{p(z_k \mid x_k) \, p(x_k \mid x_{k-1})}{p(z_k \mid z_{1:k-1})} \, p(x_{0:k-1} \mid z_{1:k-1}) \propto p(z_k \mid x_k) \, p(x_k \mid x_{k-1}) \, p(x_{0:k-1} \mid z_{1:k-1})$

Finally,

$w_k^i \propto \frac{p(z_k \mid x_k^i) \, p(x_k^i \mid x_{k-1}^i) \, p(x_{0:k-1}^i \mid z_{1:k-1})}{q(x_k^i \mid x_{0:k-1}^i, z_{1:k}) \, q(x_{0:k-1}^i \mid z_{1:k-1})} = w_{k-1}^i \, \frac{p(z_k \mid x_k^i) \, p(x_k^i \mid x_{k-1}^i)}{q(x_k^i \mid x_{0:k-1}^i, z_{1:k})}$ (4.6)

If the importance density depends only on $x_{k-1}$ and $z_k$, the modified weight is defined as:

$w_k^i \propto w_{k-1}^i \, \frac{p(z_k \mid x_k^i) \, p(x_k^i \mid x_{k-1}^i)}{q(x_k^i \mid x_{k-1}^i, z_k)}$ (4.7)

The posterior filtered density can then be approximated as:

$p(x_k \mid z_{1:k}) \approx \sum_{i=1}^{N_s} w_k^i \, \delta(x_k - x_k^i)$ (4.8)

The SIS algorithm is based on recursive propagation of the weights and support

points as each measurement is received sequentially. An overall computational process of

PF-SIS is shown in Figure 4-9.

Figure 4-9: PF-SIS (initialization, particle and weight generation, state vector prediction, importance sampling from the importance density, weight update, and state vector correction)
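The following Python sketch implements an SIS particle filter of the kind described above for a constant-velocity target, using the transition prior as the importance density so that the weight update of Equation (4.7) reduces to multiplication by the measurement likelihood. The motion model, noise levels, and particle count are illustrative assumptions.

import numpy as np

def sis_particle_filter(measurements, n_particles=500, dt=1.0,
                        process_std=0.5, meas_std=2.0, seed=0):
    """SIS particle filter with a constant-velocity state [x, z, vx, vz].
    With the transition prior as importance density, the weight update of
    Eq. (4.7) reduces to w_k ∝ w_{k-1} p(z_k | x_k)."""
    rng = np.random.default_rng(seed)
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]])
    particles = rng.normal(0.0, 5.0, size=(n_particles, 4))
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z in measurements:
        # State prediction: propagate each particle through the motion model.
        particles = particles @ F.T + rng.normal(0, process_std, particles.shape)
        # Weight update with the Gaussian measurement likelihood p(z_k | x_k).
        err = particles[:, :2] - z
        likelihood = np.exp(-0.5 * np.sum(err ** 2, axis=1) / meas_std ** 2)
        weights *= likelihood
        weights /= weights.sum()
        estimates.append(weights @ particles)            # posterior mean estimate
    return np.array(estimates)

if __name__ == "__main__":
    truth = np.array([[k * 2.0, k * 1.0] for k in range(20)])
    zs = truth + np.random.default_rng(3).normal(0, 2.0, truth.shape)
    est = sis_particle_filter(zs)
    print("final position estimate:", est[-1, :2].round(1), "truth:", truth[-1])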

4.4.2 Nearest Neighbor Data Association

For the tracking system to perform properly, the most likely measured potential

target location should be used to update the target's state estimator. This is generally

known as the data association problem. The probability of the given measurement being

correct is a distance function between the predicted state of the target and the measured

state.

The data association is performed based on Nearest Neighbor (NN). In NN, a

single pairing is determined with a previously recognized track; the goal is to minimize

an overall distance function that considers all observation-to-track pairings that satisfy a

preliminary gating test. In fact, this algorithm always updates the tracking filter with the

measurement closest to the predicted state. Euclidean distance is used as the measure of distance in this work. The NN approach is shown in Figure 4-10. It can be seen that E1 is the minimum distance between the target and the next position, compared with E2, E3, and E4. Thus, the blip corresponding to E1 is selected as the best candidate for the next position of the track.

Figure 4-10: Nearest Neighbor (NN) (candidate blips at distances E1 < E2 < E3 < E4 from the predicted position)
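A minimal sketch of this nearest-neighbor assignment with a preliminary gate is shown below; the gate size and data layout are assumptions for illustration.

import numpy as np

def nearest_neighbor_associate(predicted, blips, gate=10.0):
    """Assign each predicted track position the closest new blip (Euclidean
    distance), provided the pair passes the preliminary gating test.

    Returns a dict mapping track index -> blip index (or None if ungated).
    """
    predicted, blips = np.asarray(predicted, float), np.asarray(blips, float)
    assignments, used = {}, set()
    for t, p in enumerate(predicted):
        dists = np.linalg.norm(blips - p, axis=1)
        order = np.argsort(dists)                        # E1 < E2 < E3 < ...
        match = next((int(b) for b in order
                      if dists[b] <= gate and b not in used), None)
        assignments[t] = match
        if match is not None:
            used.add(match)
    return assignments

if __name__ == "__main__":
    predicted = [(10.0, 10.0), (40.0, 5.0)]
    blips = [(38.5, 6.0), (11.2, 9.1), (80.0, 80.0)]
    print(nearest_neighbor_associate(predicted, blips))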

4.5 Conclusion
Data were collected using Furuno 1500 Mark 3 marine radar, digitized using

XIR3000B from Russell Technologies, and transferred to laptop/PC for data processing.

The radR software was used to remove noise, detect possible targets in terms of blips, and save the blip information. Different plugins are used in radR, including XIR3000arch for reading data from the digitizing card, antenna for setting antenna control parameters, blipmovie for reading and writing archives of blip data, etc. Detected blips were tracked using the SIS-PF tracking algorithm, which was developed in two phases: estimation and data association. A particle filter was developed for estimation and Nearest Neighbor was used for data association.

Chapter 5

5 Data Fusion

5.1 Multi-Sensor Fusion


Multi-sensor fusion can be used to produce a combined result with the most detailed and reliable information possible, as well as an efficient representation of the data [139][140]. A fusion system acts like the human brain, which integrates sensory information such as sight, smell, taste, and touch. It makes inferences regarding different problems, such as the flavor of food, which is an integration of smell and taste, or the feeling of safety in an unknown area, which can be sensed by a combination of sight, sound, and touch.

In this work, the main goal of data fusion is to assess the position and

identification of biological targets to make inferences that may not be viable from a

single sensor alone. There are four types of uncertainty associated with a sensor's measurements. Random errors are caused by noise due to hardware or other sensor limitations. Systematic errors are calibration errors which are caused by linearization of the calibration process. Loading and environmental errors arise from the environment, which alters the measurement due to sensor intrusion. Spurious readings are non-systematic errors such as erroneous detections of targets. Due to these types of errors, a single sensor by itself cannot provide reliable inferences, whereas a reliable result becomes feasible using multiple sensors. Multi-sensor fusion systems have many benefits, such as robust operational performance, extended spatial coverage, extended temporal coverage, reduced ambiguity, improved detection, improved system reliability, and increased dimensionality.

In AMS, three different types of sensors are used for data fusion, as shown in

Figure 5-1: acoustics, Infrared camera (IR), and marine radar. The radar provides the

ability to accurately determine the target’s range, but has a limited ability to determine

the angular direction of the target. By contrast, the infrared imaging sensor can accurately

determine the target’s angular direction, but is unable to measure range. However, neither

of these two types of sensors provides identification and classification information. The

acoustic data in AMS provides the identity of the biological targets at species level;

however, it is not reliable for quantification purposes. Therefore, the fusion of these three

sensors is proposed to provide more accurate and reliable results.

Figure 5-1: Multi-Sensor Fusion (radar, IR, and acoustics fused into a joint identity declaration)

Data fusion sensors are categorized into different groups: homogeneous and heterogeneous sensors. Homogeneous sensors are sensors of a similar type, for example imagery sensors as opposed to acoustic sensors. They are further divided into two sub-groups: similar and dissimilar sensors. Homogeneous similar sensors are sensors of the same type and same technology, such as data fusion using two IR cameras or two radars. On the other hand, homogeneous dissimilar sensors are of the same type but use different technologies, such as IR and radar; this type of fusion is more challenging than the former. Heterogeneous sensors are sensors of different types and technologies. For example, data fusion of acoustic and imagery sensors is considered heterogeneous data fusion. It is the most challenging form of fusion, requiring the most effort, and its use is proposed in this work.

5.2 Multi-sensor Fusion Architecture


The proposed fusion architecture of AMS consists of pre-processing, detection

and tracking, classification, association, integration, and finally overall quantification and

identification. Figure 5-2 illustrates the proposed architecture of AMS fusion. Pre-

processing is the processing of raw data from a single sensor. Detection and tracking are

used in IR and radar to detect targets in the noisy environment and track the targets of

interest. They involve detection of blips, removing noise and clutter, estimation and

association, and forming the trajectory or tracks. Different detection and tracking

techniques are used for IR and radar due to diversity of their sensor technology.

Classification is used in acoustic processing to obtain significant features or target

information. Targets of interest in IR and radar data are integrated to form a single target

in the fusion node. Acoustics are incorporated into the result to obtain the second level of

fusion. The final fusion node provides the information on movement rates associated with

specific features and quantification of class level targets.

Figure 5-2: Acoustics/IR/Radar Fusion Architecture (radar and IR data undergo pre-processing, detection, and tracking and are combined in Fusion Node 1; acoustic data undergo pre-processing and classification and are combined with the Fusion Node 1 output in Fusion Node 2 for classification and quantification, with gating and control parameters fed back as sensor controls)

5.3 Fusion Hierarchy


Generally, a three-level fusion hierarchy is widely accepted in the fusion community. The hierarchy is a transformation from raw data, such as sensor signals, to an abstract form of data, which can be a symbol, a decision, or any concept or information. The three-level hierarchy is shown in Figure 5-3 and can be described as follows:

 Raw data level fusion: the combination of the raw data from multiple sensors

when they are measuring same physical phenomena. This is feasible only if

the sensor measures are commensurate. The data fusion of images from

optical sensors in terms of pixels can be considered as a raw or pixel level

fusion.

 Feature level fusion: extraction of features from different sensors when sensor

measures are not commensurate. Features are combined into composite

feature vectors, which represent the multi modal feature of physical

phenomenon [79][80].

 Decision level fusion: utilization of inferences made from different sensors

and combination of them to yield a final fused decision. An example of

decision level techniques can be classical and Bayesian inference or voting

methods [150][151].

Figure 5-3: Fusion Category (data fusion hierarchy: data level fusion, feature level fusion, decision level fusion)

Data fusion of the acoustics/IR/radar is performed in a hierarchical form and is

based on two levels; Level 1 (L1) which is a homogenous dissimilar fusion based on

feature level fusion, and Level 2 (L2) which is a heterogeneous fusion based on decision

level fusion as shown in Figure 5-4.

The feature level data fusion combines features of detected/tracked targets in the radar and IR data and produces a composite feature vector, named the L1 feature vector. The constructed feature vector is a concatenation of the individual sensors' feature vectors and serves as an input to the next level. The second level is a decision level, which takes the L1 feature vectors and fuses them with the acoustic data using fuzzy Bayesian inference. The acoustic data used in this level were previously analyzed, and the results of fusion give complementary information about the targets of interest. The acoustic data provide identification information, and the imagery data give quantification information. The final inference provides statistics on the quantification of class/species level targets.

Figure 5-4: Fusion Levels of Proposed AMS Architecture (Sensor 1: radar and Sensor 2: IR enter Level 1 feature-level fusion, a homogeneous dissimilar fusion producing the L1 feature vector; Sensor 3: acoustics joins in Level 2 decision-level fusion, a heterogeneous fusion yielding the joint identity declaration)

5.4 L1 Fusion
The L1 fusion process deals with the integration of IR and radar data at the feature level. It is implemented based on different fusion modules and procedures, which involve data alignment and data association [141]; they are explained in the following sections.

5.4.1 Data Alignment

The goal of data alignment is to provide a common representational format of data

from different sensors and it is a requirement for fusion systems. The conversion from

sensor measurement system to a common format makes the sensor observations

compatible for fusion. Data alignment is divided into two main alignment operations.

a) Spatial alignment is a transformation of the local spatial coordinate system, with a position of (x, y), to a common coordinate system, as (x, y) → (x′, y′), where (x′, y′) is the transformed position. The process involves geo-referencing the location and field of view of the IR camera and the radar. Experiments were performed to implement the alignment so that both sensors refer to the same target and track-to-track association is feasible. The alignment is based on the vertical mode of the radar and the field of view of the IR camera. The details of spatial alignment are described in Chapter 6.

b) Temporal alignment is a transformation of the local time t to a common time format, as t → t′. In AMS, the radar recording system is in Greenwich Mean Time (GMT) format, while the IR data are recorded in Eastern Standard Time (EST). In order to fuse the IR and radar data, the EST format is used as the time reference and all of the GMT-based data are converted to the EST time format. The resulting data are aligned to a common EST format. Details of the temporal alignment experiment are described in Chapter 6.

5.4.2 Data Association

Data association involves the use of certain metrics for comparison of tracks and

measurements that result from the IR and radar data. It is implemented based on

several modules: gating, association matrix, and assignment, as shown in Figure 5-5.

Figure 5-5: Block Diagram of Data Association Process (gating, creating the association matrix, and assignment logic)

Gating is applied to remove unlikely target-to-target pairs while keeping most

putative pairs [69]. The goal of such screening is to shrink the number of possible

combinations that need to be considered for assignment. An association matrix is created

to measure similarities between IR and radar target pairs. Assignment is performed to define track-to-track pairs. Timestamp and locational information are considered as the association properties. Data association modules are implemented for the association

properties as shown in Table 5.1.

Table 5.1: L1 Data Association Modules and Properties

Data Association — Properties: Timestamp, Locational Information; Modules: Gating, Association Matrix, Assignment

Temporal and locational gates are considered based on a user-defined threshold $\pm G_p$, where $G_p$ is the range of the gate for property p. The association matrix is generated for putative targets satisfying the gating criteria. Let the sets of properties of sensor A (IR) and sensor B (radar) be denoted as $P_A = \{ p_A^1, p_A^2, \ldots, p_A^m \}$ and $P_B = \{ p_B^1, p_B^2, \ldots, p_B^n \}$, respectively, where

- $p_A^i$ is the ith property of sensor A, which is the pair of temporal and locational properties $(t_A^i, (x_A^i, z_A^i))$
- $t_A^i$ and $(x_A^i, z_A^i)$ represent the timestamp and locational information of sensor A
- m is the total number of sample properties available for sensor A
- $p_B^j$ is the jth property of sensor B, which is the pair of temporal and locational properties $(t_B^j, (x_B^j, z_B^j))$
- $t_B^j$ and $(x_B^j, z_B^j)$ represent the timestamp and locational information of sensor B
- n is the total number of sample properties available for sensor B

The gating function $g_B^j(p_A^i)$ is defined to determine if the property $p_A^i$ falls within the gate of the property $p_B^j$. For each sample, if the criterion is satisfied, the gating function returns a value of “1” as the presence status and “0” as the absence status. Thus, an (n×m) Status Matrix (SM) is created with elements of presence and absence statuses for the n and m properties of sensors B and A, respectively. For properties with presence status, score measures are given based on local and global distances. The local distance measures similarity between the timestamp and locational information of the pair, while the global distance measures the overall properties between a pair of targets. The Euclidean distance is used as a score measure to evaluate the similarities as:

$d(t_A^i, t_B^j) = | t_A^i - t_B^j |$ (5.1)

$d\big( (x_A^i, z_A^i), (x_B^j, z_B^j) \big) = \big\| (x_A^i, z_A^i) - (x_B^j, z_B^j) \big\| = \sqrt{(x_A^i - x_B^j)^2 + (z_A^i - z_B^j)^2}$ (5.2)

The global distance between two properties is defined as:

$D(p_A^i, p_B^j) = \big\| p_A^i - p_B^j \big\|$ (5.3)

where

$p^i = \big( t^i, (x^i, z^i) \big)$ (5.4)

d(.) is the local distance to measure the similarities between the temporal and locational

information.

D(.) is the global distance to measure the similarities between overall properties.

The distance value of zero indicates the maximum similarity. The dissimilarity increases

with the increase in the distance. Index Score Matrix (ISM) is generated whose elements

have two values of index and score for presence statuses. It is an n×k matrix where k (k ≤

m) is the number of presence status values. Total number of sample properties for sensor

A and B are m and n, respectively.

There might be a possibility for a $p_B^j$ to have a presence status value for several $p_A^i$ (a many-to-many relation). The best candidate is the one with the minimum score value in the ISM matrix, and the other candidates are replaced with a null value. The Index Score Matrix (ISM) is transformed into an n×k Association Matrix (AM) whose elements have a unique presence status; this AM matrix represents a many-to-one relationship. The AM matrix is then modified by selecting the minimum of each row, which provides a one-to-one assignment between the $p_A^i$ and the presence status values.
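The sketch below walks through the gating, scoring, and row-minimum assignment just described for a handful of IR and radar properties p = (t, (x, z)); the gate thresholds and the choice of combined score are illustrative assumptions.

import numpy as np

def l1_associate(props_a, props_b, gate_t=2.0, gate_xz=15.0):
    """Track-to-track association of IR (sensor A) and radar (sensor B)
    properties p = (t, (x, z)) following the gating / status-matrix /
    score / assignment steps (thresholds here are illustrative).

    Returns a dict mapping each radar index j to its best IR index i.
    """
    m, n = len(props_a), len(props_b)
    score = np.full((n, m), np.inf)                    # ISM scores (inf = absent)
    for j, (tb, xb, zb) in enumerate(props_b):
        for i, (ta, xa, za) in enumerate(props_a):
            d_time = abs(ta - tb)                      # local distance, Eq. 5.1
            d_loc = np.hypot(xa - xb, za - zb)         # local distance, Eq. 5.2
            if d_time <= gate_t and d_loc <= gate_xz:  # gating -> presence status
                score[j, i] = np.hypot(d_time, d_loc)  # combined global score
    # Keep only the best radar row per IR column (many-to-many -> many-to-one).
    for i in range(m):
        col = score[:, i]
        if np.isfinite(col).any():
            best = int(np.argmin(col))
            col[np.arange(n) != best] = np.inf
    # One-to-one assignment: minimum of each remaining row.
    return {j: int(np.argmin(score[j])) for j in range(n)
            if np.isfinite(score[j]).any()}

if __name__ == "__main__":
    ir = [(100.0, 5.0, 20.0), (130.0, -8.0, 35.0)]     # (t, x, z) per IR target
    radar = [(101.2, 6.0, 22.0), (250.0, 0.0, 0.0)]    # (t, x, z) per radar target
    print(l1_associate(ir, radar))                     # expected: {0: 0}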

The SM, ISM, and AM matrices are shown in Figure 5-6. In this figure, the SM matrix contains the presence and absence statuses $s_{ij}$ from the ith property of sensor B to the jth property of sensor A. The ISM matrix contains pairs of index and score values $(i, c)$ derived from the elements of SM with present status for each element of sensor B. The AM matrix contains the elements of unique presence status, and each row is replaced with the minimum of its elements. The overall process of association is shown in Figure 5-7.

Figure 5-6: Association Matrices (the n×m Status Matrix SM of statuses $s_{ij}$ between the m IR features (sensor A) and the n radar features (sensor B), the Index Score Matrix ISM of index-score pairs $(i, c)$, and the Association Matrix AM with the minimum of each row selected)

Figure 5-7: Association Process of L1 Fusion (sensor A and sensor B properties pass through the gating function $g_B^j(p_A^i)$ to form the Status Matrix (SM), the Index Score Matrix (ISM), the Association Matrix (AM), and finally the Modified Association Matrix)

5.5 Fusion Functions
A number of functions are created to perform L1 fusion. These functions are

developed in MATLAB and are performed step-by-step in L1 fusion. A block diagram of

various L1 fusion functions is shown in Figure 5-8. They are described as follows:

radR

radR is an open source software for processing and analysis of radar data. It is used

to detect possible blips of avian targets. Blips are used for creating tracks and further

processing. Details of radR are described previously in Chapter 4.

Sort based on Scan_number (S-Scan)

The S-scan function is developed for sorting detected blips based on their scan

numbers. The scan number of each blip is the order of its detection in each radar scan.

This parameter is used for tracking and further processing.

Particle Filter (PF)

The PF is implemented for target tracking. It is an estimation technique for

predicting blips of targets in the next frames. Blips are used to build the trajectory of the

target. The detail of this function is described previously in Chapter 4.

Convert Blip to Target (BTT)

Blips belonging to the same track number are associated with one target.

According to this, BTT function first sorts the data based on the track number. The

unique_track vector is created by obtaining unique values of track numbers. Blips are

then matched with unique_track values. All blips with the same track number are

identified with a single entry as target_no. A dataset with a number of targets and their

information, such as number of samples (ns), area, intensity (int), max, angular span,

radial span, and perimeter (perim) is generated. These values are calculated based on the

average values of the blips. Coordinate values (x, y, z) are created as x_first, x_sec, y_first, y_sec, z_first, z_sec for each target. These coordinate values represent the x, y, z positions at which each individual target arrives and exits.

Temporal Alignment (TA)

All radar data generated by radR are in GMT, whereas the IR data are in EST format. In order to have a common time system, the radar data are converted into EST format. The radar time and date generated by radR are converted to EST using the following relations:

$T_{EST} = \mathrm{mod}\big( T_{GMT} - 5,\; 24 \big)$ (5.5)

$D_{EST} = \begin{cases} D_{GMT} - 1, & T_{GMT} < 5 \\ D_{GMT}, & \text{otherwise} \end{cases}$ (5.6)

where $T_{EST}$ is the time in EST, $T_{GMT}$ is the time in GMT, $D_{EST}$ is the date in EST, and $D_{GMT}$ is the date in GMT format.
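A small Python sketch of this GMT-to-EST conversion using a fixed -5 hour offset is shown below; the date/time string formats and function name are assumptions for illustration.

from datetime import datetime, timedelta

EST_OFFSET = timedelta(hours=-5)   # EST = GMT - 5 hours (fixed standard-time offset)

def gmt_to_est(gmt_date: str, gmt_time: str):
    """Convert a GMT date/time pair to EST, adjusting the date when the
    subtraction crosses midnight (string formats here are assumptions)."""
    gmt = datetime.strptime(f"{gmt_date} {gmt_time}", "%Y-%m-%d %H:%M:%S")
    est = gmt + EST_OFFSET
    return est.strftime("%Y-%m-%d"), est.strftime("%H:%M:%S")

if __name__ == "__main__":
    # 02:30 GMT on May 3 corresponds to 21:30 EST on May 2 (previous day).
    print(gmt_to_est("2012-05-03", "02:30:00"))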

Common Coverage Area (CCA)

The common coverage area of two sensors (IR and radar) needs to be defined. As

sensors are not synchronized during data collection, each sensor may have a different

coverage area. In AMS data fusion, we are interested in those targets which are present in

both sensors. This provides information from both sensors and helps in making an overall

decision. Therefore, the common coverage area of IR and radar is geometrically

determined and is verified using simulation. Those targets that satisfy the following criterion are in the common coverage area:

$TC = \{\, \text{targets whose positions } (x, z) \text{ lie within both the radar beam and the IR field of view} \,\}$ (5.7)

where TC is the set of targets in the common coverage area, and x and z are the target positions on the X axis and Z axis, respectively. The explicit geometric form and proof of Equation 5.7 are provided in Chapter 6.

Figure 5-8: Fusion Functions (radar data are processed by radR, sorted by scan number (S-Scan), tracked with the Particle Filter (PF), converted from blips to targets (BTT), converted to the EST time zone (TA), restricted to the common coverage area (CCA), sorted by time (S_Time), aligned to a common time window (CT), and finally spatially aligned (SA) with the IR target tracks)

Sorting based on time (S_Time)

The S_Time function sorts targets based on the EST format. Data are collected during the entire night and are divided for processing into before-midnight and after-midnight periods. Then, according to the EST format, the data are sorted into before- and after-midnight sets on a nightly basis.

Common Time (CT)

This function obtains targets in an aligned time window. The temporal range of all the data is defined for both sensors. The variable 'max' indicates the target that is observed at the beginning, and the variable 'min' indicates the target that is observed at the very end. The 'max' value found in one sensor is searched for in the time range of the other sensor, and the closest time to this value is considered the common starting time. A similar process is applied using the 'min' value to obtain the common ending time. Data in the range between the common starting and ending times are considered targets of interest, and the data outside this range in both sensors are ignored.

Spatial Alignment (SA)

Spatial alignment provides information on the common coverage area of both IR and radar. It uses the coordinate system of the radar as the reference. It then converts the coordinates of the IR targets into the reference coordinates using Equation 5.8; the proof of this equation is provided in Chapter 6.

$x_{new} = f(x, y)$ (5.8)

where x, y are the old position of the target in the IR frame and $x_{new}$ is the new (transformed) x position.

5.6 L2 Fusion
L2 fusion is the integration of the acoustic and L1 features, which are processed separately and are described in Chapter 2 and Section 5.4, respectively. The L1 features contain the fused IR/radar information about the targets, and the results of acoustic processing, named acoustic features, contain the identity of the targets at the species/class level. The acoustic and L1 features are fed into L2 fusion, which is a decision-level fusion. At this level, the inferences made regarding the targets consist of quantity, identity, and other information. As no locational information for the targets is available through acoustics, no spatial alignment is possible in L2 fusion. A timestamp is assigned to the targets and data association is performed for the acoustic/L1 targets. A fuzzy Bayesian technique is proposed to perform L2 fusion.

5.6.1 Data Association

Data association of acoustic and L1 features is similar to L1 fusion, in the sense

that association modules and properties are defined. The only association property

available in L2 is the target timestamp due to limitations of the sensor technology. Data

association modules are gating, association matrix, and assignment. Similar to L1 fusion,

gating is performed by considering a gate based on the user defined threshold. The

association matrix is created for putative vectors satisfying the gating criteria. The SM,

ISM, and AM matrices are generated and the best pairs are selected.

Temporal gates are considered based on a user-defined threshold $\pm G_t$, where $G_t$ is the range of the gate for the timestamp property. The association matrix is generated for putative targets satisfying the gating criteria. Let the set of properties of sensor A (the L1 feature vectors) be denoted as $P_A = \{ p_A^1, p_A^2, \ldots, p_A^m \}$ and the set of all sample properties of sensor B (acoustics) as $P_B = \{ p_B^1, p_B^2, \ldots, p_B^n \}$, where

- $p_A^i$ is the ith property of sensor A, which is the temporal property $(t_A^i)$
- $t_A^i$ represents the timestamp information of sensor A
- m is the total number of sample properties available for sensor A
- $p_B^j$ is the jth property of sensor B, which is the temporal property $(t_B^j)$
- $t_B^j$ represents the timestamp information of sensor B
- n is the total number of sample properties available for sensor B

The gating function $g_B^j(p_A^i)$ is defined to determine if the property $p_A^i$ falls within the gate of the property $p_B^j$. For each sample, if the criterion is satisfied then the gating function returns a value of:

 “1” as the presence status
 “0” as the absence status

Thus, an (n×m) Status Matrix (SM) is created with elements of presence and absence statuses for the n and m properties of sensors B and A, respectively. For properties with presence status, score measures are given based on local and global distances. The local distance measures similarity between the timestamp information, while the global distance measures the overall properties between the two sensors. The Euclidean distance is used as a score measure to evaluate the similarities between a pair as:

$d(t_A^i, t_B^j) = | t_A^i - t_B^j |$ (5.9)

$D(p_A^i, p_B^j) = \big\| p_A^i - p_B^j \big\| = d(t_A^i, t_B^j)$ (5.10)

where d(.) is the local distance to measure the similarities between the temporal

properties and D(.) is the global distance to measure the similarities between overall

properties. As there is no locational information included in the properties, the local

distance and global distances are the same in L2 fusion. The distance value of zero

indicates the maximum similarity. The dissimilarity increases with the increase in the

distance. Index Score Matrix (ISM) is generated whose elements have two values of

index and score for presence statuses. It is an n×k matrix where k (k ≤ m) is the number

of presence status values. Total number of sample properties for sensor A and B are m

and n, respectively.

As there might be a possibility for a $p_B^j$ to have a presence status value for several $p_A^i$ (a many-to-many relation), the best candidate is defined as the one with the minimum score value in the ISM matrix, and the other candidates are replaced with a null value. The Index Score

Matrix (ISM) is transformed into an n×k Association Matrix (AM) whose elements have a unique presence status; this AM matrix represents a many-to-one relationship. The AM matrix is modified by selecting the minimum of each row, which provides a one-to-one assignment between the $p_A^i$ and the presence status values. The SM, ISM, and AM matrices are shown in Figure 5-9. In this figure, the SM matrix contains the presence and absence statuses $s_{ij}$ from the ith property of sensor B to the jth property of sensor A. The ISM matrix contains pairs of index and score values $(i, c)$ derived from the elements of SM with present status for each element of sensor B. The AM matrix contains the elements of unique presence status, and each row is replaced with the minimum of its elements.

Figure 5-9: Association Matrices in L2 (the n×m Status Matrix SM of statuses $s_{ij}$ between the m L1 feature vectors (sensor A) and the n acoustic features (sensor B), the Index Score Matrix ISM of index-score pairs $(i, c)$, and the Association Matrix AM with the minimum of each row selected)

5.6.2 Fuzzy Bayesian Fusion

The theory of probability and statistics plays an important role in decision-making problems. Bayesian inference is a useful conditional approach for various decision problems. However, the traditional Bayesian technique is unable to deal with all the uncertainties in a problem, because a priori distributions are assumed based on classical probability distributions. In many applications, including avian monitoring, the a priori information about the targets is usually not an exact value and is more or less fuzzy. In these situations, it is more efficient to employ a fuzzy model to provide the a priori information [147][153][154].

Fuzzy rule-based systems can approximate prior and likelihood probabilities in Bayesian inference and thereby approximate posterior probabilities. This allows for the description of priors with fuzzy if-then rules rather than closed-form pdfs. The fuzzy Bayesian fusion in avian monitoring is developed in two steps: a) a fuzzy system providing the a priori probability, and b) Bayesian inference providing the a posteriori probability. In fact, the probability-based avian category (bird, bat, or insect) is determined using the fuzzy system and the result is fed to the Bayesian inference system to provide the posterior probability of the target signature, as shown in Figure 5-10. The overall fusion

that there are two fusion nodes used for two different levels of fusion. Feature vectors of

is resulted from the fusion of IR and radar features and it is fed to the fusion node 2.

In this level, the previously processed acoustic data in the form of acoustic features are

integrated with through a fuzzy Bayesian technique. The results are the L2 feature

vectors ( ) which are the data combination of three sensors. , , and

are three different types of data which are drawn from single sensor, combination of two

sensors, and combination of three sensors, respectively. , , and are single

sensor features of radar, IR, and acoustics respectively.

[Figure 5-10: Fuzzy Bayesian Inference. The Fuzzy System (FS) supplies the a priori probability of the avian category, and the Bayesian inference produces the posterior probability of the target signature.]

5.6.3 Fuzzy System (FS)

Lack of exact information in the perception of events and inaccuracy of measurements cause uncertainty in reasoning. There is no sharp, well-defined boundary that can be drawn between concepts (facts). Different avian categories share some common features to a greater or lesser degree [142][143]. For instance, the flapping rate of birds is generally higher than that of bats, which results in higher heat intensity for birds; however, in some situations they have the same range of heat intensity. Also, the flight straightness index values of birds are higher than those of bats, as birds fly in a straighter pattern while bats fly in a zigzag pattern. The degree of certainty in intensity values, ranges, straightness index, and other features is not well defined. This fuzzy nature of the facts requires a mechanism for representing linguistic constructs such as low intensity, high intensity, low straightness index, and high straightness index. The fuzzy system proposed by Lotfi Zadeh [143] is implemented and used in this case.

112
Fusion Node 1 DDual

DMono
Drad 𝐹𝑉𝑟𝑎𝑑
DMono 𝐹𝑉𝐿
DIR 𝐹𝑉𝐼𝑅

FVL1

𝐹𝑉𝑎𝑐
Fuzzy

Fusion Node 2
r=1, …,R
Dac

Bayesian
𝐻𝑘 𝑘 𝐾

FVL2
DMono Dtri DDual

Inference

Figure 5-11: Fusion Scheme

Fuzzy logic provides an approximate inference structure that allows human-like reasoning in the presence of uncertainty. A fuzzy system is proposed to define the avian categories based on the probability measures of fuzzy events.

Let V = {FV_1, FV_2, ..., FV_n} denote the set of L1 feature vectors, where n is the total number of L1 feature vectors. We define the set of straightness index values as S, the set of heat intensity values as T, the set of range values as G, and the set of velocity values as W. We are interested in the set C̃ with avian-category subsets C̃_1, ..., C̃_I, where I denotes the total number of avian categories. Suppose we have three different categories: birds, bats, and insects. Thus,

C̃_1 = { FV ∈ V | FV describes a bird }      (5.11)

C̃_2 = { FV ∈ V | FV describes a bat }      (5.12)

C̃_3 = { FV ∈ V | FV describes an insect }      (5.13)

In other words, C̃_1 is the set of feature vectors belonging to the L1 feature vector set V that describe a bird, C̃_2 is the set of feature vectors in V that describe a bat, and C̃_3 is the set of feature vectors in V that describe an insect.

To specify C̃_1, C̃_2, and C̃_3, we have to define precisely what is meant by a "bird", "bat", or "insect". This means we have to operationalize these terms in order to recognize which category a target belongs to. However, the boundaries between the members of C̃ are not crisp, which causes each C̃_i, as a member of the set C̃, to become a fuzzy set. Let X denote the set of inputs of the system with k inputs, X = [X̃_1, ..., X̃_k], where each X̃ is an input subset. The k = 4 fuzzy inputs are given as:

SI:       S̃ = { x ∈ S | x is the straightness index feature of an FV in V }      (5.14)

Heat:     T̃ = { x ∈ T | x is the heat intensity feature of an FV in V }      (5.15)

Range:    G̃ = { x ∈ G | x is the range feature of an FV in V }      (5.16)

Velocity: W̃ = { x ∈ W | x is the velocity feature of an FV in V }      (5.17)

where S̃, T̃, G̃, and W̃ are the fuzzy sets of straightness index, heat, range, and velocity, respectively. In other words, S̃ is the set of features belonging to the set S of all straightness index values, T̃ is the set of features belonging to the set T of all heat values, G̃ is the set of features belonging to the set G of all range values, and W̃ is the set of features belonging to the set W of all velocity values. The mapping from the inputs (straightness index, heat, range, and velocity) to the outputs (birds, bats, and insects) occurs through several concepts, explained as follows:

Membership functions in fuzzy theory provide a way of dealing with uncertainty when classifying targets into fuzzy sets [138][140]. The membership function μ_Ã(FV) represents the membership degree of the feature vector FV in the fuzzy set Ã. The membership function is the general form of an indicator function in classical set theory, and its value varies between 0 and 1.

 If a region of fusion features is characterized by full membership, its value is 1 (μ_Ã(FV) = 1). In this case, the elements are known as the elements of the core.

 If a region of fusion features is characterized by nonzero membership in Ã, it indicates the support of the membership function for the fuzzy set Ã. The support consists of the elements for which μ_Ã(FV) > 0.

 If a region has nonzero membership but not full membership, it defines the boundary of the membership function for the fuzzy set. The boundary consists of the elements for which 0 < μ_Ã(FV) < 1.
The membership functions also serve as the tool for defining operations on fuzzy sets. Consider fuzzy sets Ã and B̃ on the L1 feature set V with membership functions μ_Ã(FV) and μ_B̃(FV). The union, intersection, and complement operations are defined as:

μ_(Ã ∪ B̃)(FV) = max( μ_Ã(FV), μ_B̃(FV) )      (5.18)

μ_(Ã ∩ B̃)(FV) = min( μ_Ã(FV), μ_B̃(FV) )      (5.19)

μ_Ã'(FV) = 1 − μ_Ã(FV)      (5.20)

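These standard min/max operations can be written directly as element-wise MATLAB expressions; the membership values below are illustrative.

% Fuzzy set operations of Equations (5.18)-(5.20) on membership vectors.
muA = [0.2 0.7 1.0 0.4];          % example membership degrees in set A~
muB = [0.5 0.3 0.8 0.9];          % example membership degrees in set B~

muUnion        = max(muA, muB);   % A~ OR  B~  (Equation 5.18)
muIntersection = min(muA, muB);   % A~ AND B~  (Equation 5.19)
muComplementA  = 1 - muA;         % NOT A~     (Equation 5.20)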
The nonlinear mapping from the inputs to the output is performed through a set of fuzzy rules known as a rule-based system [139]. It is based on IF-THEN rules, given by IF antecedents and THEN consequents. The antecedents represent the inputs, which are based on the L1 feature vectors, and the consequents denote the outputs, which are the different categories of the target signature. The fuzzy system is shown in Figure 5-12.

[Figure 5-12: Fuzzy System. The inputs straightness index, heat, range, and velocity are mapped through the fuzzy rule base to the output categories Bird, Bat, and Insect.]

The notations of the input sets and their numerical ranges are shown in Table 5-2. The fuzzy rules are shown in Table 5-3. The rules are based on the author's consideration of the available facts regarding the activity and behavior of birds, bats, and insects. However, the rules are subject to change based on the biologists' interests. Several assumptions are considered in making these rules, including:

 Feeding is at medium or lower ranges

 Migrating is at high or medium ranges

 Insects have lower ranges than birds and bats

 Velocity is inversely proportional to range

 Heat is inversely proportional to range

 Straightness index of bat is lower than birds and insect

Table 5.2: Notations and their Numerical Ranges

Inputs               Notation   Level        Numerical Range
Straightness Index   S̃          Low (L)      [0, 0.8]
                                Medium (M)   [0.6, 0.90]
                                High (H)     [0.8, 1]
Heat                 T̃          Low (L)      [0, 100]
                                Medium (M)   [80, 200]
                                High (H)     [180, 250]
Range                G̃          Low (L)      [0, 180]
                                Medium (M)   [150, 350]
                                High (H)     [300, ∞]
Velocity             W̃          Low (L)      [0, 2000]
                                Medium (M)   [1500, 3500]
                                High (H)     [3000, ∞]
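A minimal MATLAB sketch of trapezoidal membership functions consistent with Table 5-2 is given below for the straightness-index input; the interior breakpoints and the slightly extended shoulders are assumptions, since the table only specifies the overall ranges.

% Scalar trapezoidal membership function with breakpoints a <= b <= c <= d.
trap = @(x, a, b, c, d) max(min([(x - a)/(b - a), 1, (d - x)/(d - c)]), 0);

si = 0.75;                                     % example straightness index value
mu_low    = trap(si, -0.1, 0,   0.6, 0.8);     % Low  spans [0, 0.8]  (left shoulder extended)
mu_medium = trap(si,  0.6, 0.7, 0.8, 0.9);     % Medium spans [0.6, 0.9] (assumed plateau)
mu_high   = trap(si,  0.8, 0.9, 1.0, 1.1);     % High spans [0.8, 1]  (right shoulder extended)
% For si = 0.75 this gives mu_low = 0.25, mu_medium = 1, mu_high = 0.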

Table 5.3: Fuzzy Rules

Inputs                      Outputs
SI   Ht   Rg   Vc    |   Bird   Bat   Insect
H    L    H    L     |    x
H    M    M    M     |    x
H    H    M    H     |    x
H    H    L    H     |    x
L    H    L    H     |           x
L    M    M    M     |           x
M    L    H    L     |           x
M    M    M    M     |           x
H    L    L    H     |                  x

These rules are based on multiple conjunctive antecedents of the form:

S̃ ∩ T̃ ∩ G̃ ∩ W̃      (5.21)

The membership function for this expression is defined using the minimum operator as:

μ_(S̃ ∩ T̃ ∩ G̃ ∩ W̃)(FV) = min( μ_S̃(FV), μ_T̃(FV), μ_G̃(FV), μ_W̃(FV) )      (5.22)

The list of rules based on the fuzzy inputs and outputs is given as:

If SI = H and Ht = L and Rg = H and Vc = L then target is Bird Migrating

If SI = H and Ht = M and Rg = M and Vc = M then target is Bird Migrating

If SI = H and Ht = H and Rg = M and Vc = H then target is Bird Feeding

If SI = H and Ht = H and Rg = L and Vc = H then target is Bird Feeding

If SI = L and Ht = H and Rg = L and Vc = H then target is Bat Feeding

If SI = L and Ht = M and Rg = M and Vc = M then target is Bat Feeding

If SI = M and Ht = L and Rg = H and Vc = L then target is Bat Migrating

If SI = M and Ht = M and Rg = M and Vc = M then target is Bat Migrating

If SI = H and Ht = L and Rg = L and Vc = H then target is Insect
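Under the minimum operator of Equation (5.22), the firing strength of one conjunctive rule can be sketched as follows; the membership values are placeholders supplied, in practice, by the input membership functions.

% Firing strength of the first rule:
% "If SI = H and Ht = L and Rg = H and Vc = L then the target is a migrating bird"
mu_SI_H = 0.9;  mu_Ht_L = 0.7;  mu_Rg_H = 0.8;  mu_Vc_L = 0.6;   % example degrees

w_rule1 = min([mu_SI_H, mu_Ht_L, mu_Rg_H, mu_Vc_L]);   % firing strength = 0.6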

Fuzziness helps in evaluating the fuzzy rules, but the final output of a fuzzy system has to be a crisp number. The defuzzification process produces the crisp output that best represents the aggregated fuzzy set. The input of defuzzification is the aggregated output fuzzy set, and the output is a single number. Defuzzification is performed based on the center of mass (centroid) [144][145]. This technique takes the output distribution and finds its center of mass to arrive at one crisp number. The crisp value given by the center of mass is computed as [144]:

z* = Σ_q z_q · μ_C̃(z_q) / Σ_q μ_C̃(z_q)      (5.23)

where z* is the center of mass and μ_C̃(z_q) is the membership in the output class C̃ at the value z_q.
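A small sketch of the centre-of-mass defuzzification of Equation (5.23) on a sampled output universe is shown below; the aggregated membership curve is illustrative.

% Centroid defuzzification on a sampled output axis.
z_axis = linspace(0, 1, 101);                  % output universe (category axis)
mu_agg = max(0, 1 - abs(z_axis - 0.4)*4);      % example aggregated output fuzzy set

z_crisp = sum(z_axis .* mu_agg) / sum(mu_agg); % crisp output (about 0.4 for this curve)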

5.6.4 Bayesian Inference

Bayesian inference is a statistical technique that gives the conditional probability of hypotheses for given prior probabilistic beliefs, based on Bayes' theorem [148][149]. Bayes' theorem provides the posterior conditional probability P(H | E), which is read as the probability of hypothesis H given that evidence E occurs. To simplify the expressions, the probability P(·) is written simply as (·) in the rest of this chapter. Bayes' theorem follows from the definition of conditional probability:

(H ∩ E) = (E | H)(H)      (5.24)

so that Bayes' theorem is:

(H | E) = (E | H)(H) / (E)      (5.25)

The generalized form of Bayes' rule is based on a partition of the event space. If the hypotheses {H_j} partition the event space (∪_j H_j = Ω), then for each hypothesis Bayes' theorem is extended to:

(H_j | E) = (E | H_j)(H_j) / Σ_k (E | H_k)(H_k)      (5.26)

Bayes' rule gives the best estimate of unknown parameters from a set of data through two types of estimation: (a) the maximum likelihood estimate, which is the value of x that maximizes (data | x), and (b) the maximum a posteriori estimate, which is the value of x that maximizes (x | data).
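The generalized rule of Equation (5.26) and the maximum a posteriori estimate can be sketched in a few lines of MATLAB; the prior and likelihood values below are illustrative only.

% Posterior over a partitioned event space:
% posterior(j) = (E|Hj)(Hj) / sum_k (E|Hk)(Hk)
prior      = ones(1, 10) / 10;                                        % equal priors over 10 hypotheses
likelihood = [0.6 0.1 0.05 0.02 0.02 0.05 0.04 0.04 0.04 0.04];       % example values of (E|Hj)

posterior  = (likelihood .* prior) / sum(likelihood .* prior);        % Equation (5.26)
[~, j_map] = max(posterior);                                          % maximum a posteriori hypothesis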

We have two types of data (observations): from the acoustic sensor (sensor 1) and from the IR/radar sensors (sensor 2). Let E1 and E2 denote the evidence or observations derived from sensor 1 and sensor 2, respectively. Also let H and C be the hypothesis or event spaces of the target signature from sensor 1 and sensor 2, respectively. The event space of sensor 1 is defined as the classes/species of the birds and bats under study in this work; these species are documented as occurring in Ohio. The list of hypotheses in the event space H is shown in Table 5-4.

Table 5.4: Hypothesis or Event Space of Acoustics

j     Hypothesis (H_j), j = 1, ..., 10
1     Warbler
2     Thrush
3     Sparrow
4     Labo
5     Lano
6     Epfu
7     Mylu
8     Pesu
9     Nyhu
10    Laci

The event space of the target signature from sensor 2 is defined as C. It is the category of the avian target, which can be "Bird", "Bat", or "Insect", as previously described for the fuzzy system in Section 5.6.3. The avian categories are shown in Table 5.5. Although the membership function of the fuzzy system is interpreted as the likelihood that the target belongs to a given avian category, for simplicity we treat it as a probability in this work.

Table 5.5: Avian Categories

i     Category (C_i), i = 1, ..., I
1     Bird
2     Bat
3     Insect

The species in the event space are mutually exclusive, and their union spans the entire sample space, so that

Σ_j (H_j | E1) = 1      (5.27)

Σ_i (C_i | E2) = 1      (5.28)

where m is the total number of bat species and n is the total number of bird classes in this work, which are seven and three, respectively.

The fusion node wishes to know the probabilities of the hypotheses H and C given the observations from the sensors (E1, E2), that is, (H, C | E1, E2). Since H and C are independent, the posterior probability is defined as:

(H, C | E1, E2) = (H | E1, E2)(C | E1, E2)      (5.29)

Also, the sensor measurements are independent; thus we have

(H | E1, E2) = (H | E1)(H | E2)      (5.30)

(C | E1, E2) = (C | E1)(C | E2)      (5.31)

Equations (5.30) and (5.31) can be expressed in terms of their constituents using Bayes' rule as:

(H | E1, E2) = (H | E1)(H | E2) = [(E1 | H)(H) / (E1)] (H | E2)      (5.32)

(C | E1, E2) = (C | E1)(C | E2) = (C | E1) [(E2 | C)(C) / (E2)]      (5.33)

where

- (H | E1) is the probability of the target being one of the 10 species/classes in Table 5.4, given the observation derived from sensor 1 (acoustic detector).

- (C | E1) is the probability of the target being one of the categories of bird, bat, or insect, given the observation derived from sensor 1 (acoustic detector).

- (H) is the probability of a target being one of the 10 species in Table 5.4.

- (C) is the probability of the target being one of the categories of bird, bat, or insect.

We assume the same a priori probability for every species/class; for example, the probability that the target is in the Warbler class is the same as the probability that it is in the Sparrow class. Similarly, we assume the same a priori probability for every avian category; for example, the probability that the target is a bird is the same as the probability that it is a bat. The a priori probabilities are shown in Table 5.6.

Table 5.6: A priori Probabilities

A Priori Probabilities
Bird/Bat Species/Class                               Avian Category
(H_j) = 1/10 for each of the 10 species/classes      (C_i) = 1/3 for each of the 3 categories

The probability (H | E1) is derived from the acoustic sensor, which is processed through the acoustic sensor processing chain. This probability is based on the output error ε of the neural network in Equation (2.18) and is defined as:

(H | E1) = 1 − ε      (5.34)

where ε is the network error. An increase in the error causes a decrease in the probability, and vice versa. The value of this term varies and depends on the target. The probability (C | E2) is derived from the L1 fusion (IR/radar), and its value is obtained from the fuzzy system. Table 5.7 shows the a priori probabilities for the species and categories derived from the acoustic sensor processing and the L1 fusion, respectively.

Table 5.7: A priori Probabilities

A Priori Probabilities
(H = Warbler | E1 = Acoustics)        (C = Bird | E2 = IR/radar)
(H = Thrush  | E1 = Acoustics)        (C = Bat | E2 = IR/radar)
(H = Sparrow | E1 = Acoustics)        (C = Insect | E2 = IR/radar)
(H = Labo | E1 = Acoustics)
(H = Lano | E1 = Acoustics)
(H = Epfu | E1 = Acoustics)
(H = Mylu | E1 = Acoustics)
(H = Pesu | E1 = Acoustics)
(H = Nyhu | E1 = Acoustics)
(H = Laci | E1 = Acoustics)

For example, a priori probabilities such as (H = Warbler | E1 = Acoustics) and (C = Bird | E2 = IR/radar) are derived from the acoustic sensor processing and the fuzzy system, respectively.

According to Equations (5.29), (5.32), and (5.33), the posterior probability (H, C | E1, E2) is expanded as:

(H, C | E1, E2) = (H | E1, E2) (C | E1, E2)
               = (H | E1)(H | E2) · (C | E1)(C | E2)
               = [(E1 | H)(H) / (E1)] (H | E2) · [(E2 | C)(C) / (E2)] (C | E1)

Since H and E2 are independent, (H | E2) = (H); similarly, since C and E1 are independent, (C | E1) = (C). Thus we have:

(H, C | E1, E2) = [(E1 | H)(H) / (E1)] (H) · [(E2 | C)(C) / (E2)] (C)
               = (H | E1) (H) (C) (C | E2)

Considering the limiting assumption (H | E1) = 1 and (C | E2) = 1, together with the a priori probabilities in Table 5.6, we have:

(H, C | E1, E2) = (H)(C) = (1/10)(1/3) ≈ 0.033

This indicates that the maximum posterior probability, which occurs when (H | E1) = 1 and (C | E2) = 1, is 0.033, whereas the standard maximum probability is supposed to be 1. The same expansion is then evaluated with the actual conditional probabilities (H | E1) and (C | E2) obtained from the acoustic classifier and the fuzzy system, and normalization is performed by dividing by the maximum value of 0.033 so that the largest attainable posterior maps to the standard maximum probability of 1. For the example observation, the normalized posterior shows that the probability of the target being a Warbler and a bird, given the observations made from the acoustics and the L1 (IR/radar) fusion, is 30.30%.
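A hedged sketch of this combination and normalization step is given below; the two conditional probabilities are placeholders rather than measured values, and the equal priors of 1/10 and 1/3 follow from the assumption of equal a priori probabilities over the 10 species and 3 categories.

% Joint posterior of Equation (5.29) after the independence substitutions:
% (H, C | E1, E2) = (H|E1) * (H) * (C) * (C|E2), rescaled so that its
% maximum possible value (1/10 * 1/3, about 0.033) maps to 1.
P_H = 1/10;  P_C = 1/3;                   % equal a priori probabilities (Table 5.6)
p_H_given_E1 = 0.55;                      % placeholder for the ENN-based (H | E1)
p_C_given_E2 = 0.55;                      % placeholder for the fuzzy-system (C | E2)

joint      = p_H_given_E1 * P_H * P_C * p_C_given_E2;   % un-normalized posterior
joint_norm = joint / (P_H * P_C);                       % normalized so the maximum is 1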

According to Bayes' rule, the probability that a target is a particular species, for example a Warbler, given that the target is a bird, is calculated as:

(Warbler | Bird) = (Bird | Warbler)(Warbler) / (Bird)      (5.35)

Equation (5.35) can be extended to the generalized form of Bayes' rule as:

(Warbler | Bird) = (Bird | Warbler)(Warbler) / Σ_j (Bird | H_j)(H_j)
                = (Bird | Warbler)(Warbler) / [ (Bird | Warbler)(Warbler) + (Bird | Thrush)(Thrush) + (Bird | Sparrow)(Sparrow) ]      (5.36)

Similarly, the posterior probabilities of the other bird classes in the event space are calculated as:

(Thrush | Bird) = (Bird | Thrush)(Thrush) / [ (Bird | Warbler)(Warbler) + (Bird | Thrush)(Thrush) + (Bird | Sparrow)(Sparrow) ]      (5.37)

(Sparrow | Bird) = (Bird | Sparrow)(Sparrow) / [ (Bird | Warbler)(Warbler) + (Bird | Thrush)(Thrush) + (Bird | Sparrow)(Sparrow) ]      (5.38)

Similarly, the posterior probabilities of the bat species are defined, one per species, as:

(S_b | Bat) = (Bat | S_b)(S_b) / Σ_{b'=1}^{7} (Bat | S_b')(S_b')      (5.39)-(5.45)

where S_b ∈ {Labo, Lano, Epfu, Mylu, Laci, Nyhu, Pesu}, and Equations (5.39) through (5.45) apply this form to each of the seven bat species in turn.

If the probability is less than a predefined threshold, the data fusion assignment is ignored, and it is assumed that the acoustic target does not have a match among the IR/radar targets. If the probability exceeds the predefined threshold, it is assumed that the acoustic target has a match in IR/radar, and an identity tag describing the class/species of the target is added to the L1 feature vector. This combined vector is named the L2 feature vector. The L2 feature vectors contain the information of the target signature from IR, radar, and acoustics. Figure 5-13 shows a scheme of the L2 feature vector, and a small sketch of this final assembly step follows the figure.

[Figure 5-13: L2 Feature Vector. IR features (straightness index, direction, heat intensity, size in pixels, distance, velocity in pix/s), radar features (range, area in PPI, height, radial span, angular span), and the acoustic identity (category bird/bat/insect and class/species).]
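The threshold test and tag assignment can be sketched as follows; the threshold value, field names, and example probability are assumptions for illustration.

% Final L2 assembly: append the acoustic identity to the L1 vector when the
% fused posterior clears a user-chosen threshold.
threshold = 0.5;                          % assumed value; the threshold is user-defined
prob      = 0.71;                         % example fused posterior for one acoustic/L1 pair
L1 = struct('SI', 0.9, 'heat', 125, 'range', 250, 'velocity', 2500);   % partial L1 vector

if prob >= threshold
    L2          = L1;
    L2.category = 'Bird';                 % category supplied by the fuzzy system
    L2.species  = 'Warbler';              % species tag supplied by the acoustic classifier
    L2.prob     = prob;
else
    L2 = [];                              % acoustic target has no IR/radar match
end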

5.7 Conclusion
Data fusion of the acoustics/IR/radar is modeled in hierarchical form of two levels.

Level one (L1) is a homogenous dissimilar fusion and is based on feature level fusion.

Level two (L2) is a heterogeneous fusion and is based on decision level fusion. The

feature level is employed based on the IR and radar data and combines features of

detected /tracked targets into a composite feature vector. The constructed feature vector is

an end-to-end individual sensors’ feature vector which serves as an input to the next

level. The second level is decision level, which uses the feature vector from L1 and fuses

with the acoustic data. The L1 fusion is developed based on various fusion functions such

as data alignment including time and spatial alignment. L2 fusion is developed based on

the fuzzy Bayesian technique and created L2 feature vectors. The fuzzy Bayesian

technique is performed to provide the posterior probability of the class/species of targets

for the L1 feature vectors. Fuzzy system provides a priori probability of the target

category (bird, bat, and insect). The Bayesian inference provides a posterior probability

of the species/class of the targets. If the probability exceeds a pre-defined threshold, the

identity tag including the species /class of the target is added to the L1 feature vector.

This combined vector is named L2 feature vector and is the result of the data fusion. The

L2 feature vectors or data fusion results contain multi-modal information of the targets

from three sensors, including identity, range, heat intensity, straightness index, etc.

Chapter 6

6 Experiments and Results

6.1 Experimental Setup


The multi-sensory monitoring system was deployed along the coastline of the western

Lake Erie basin in Ottawa National Wildlife Refuge (ONWR: N 41.633˚, W 83.200˚) and

at the University of Toledo [UToledo (acoustics only): N 41.656˚, W 83.606˚] during the

spring (May-July) and fall (Aug-Oct) migration seasons of 2011. The ONWR was

established in 1961 to serve as a protected habitat for migrating birds to offer nesting and

stopover opportunities during the migration period. The project area was defined by the

wildlife biologist collaborating on this project [160]. Three sensors including acoustic

detectors, thermal Infrared camera, and marine radar were used to collect multi-modal

information on the targets. Recordings were made nightly from one hour after sunset to one hour before sunrise. Two types of acoustic detectors were used to detect bird nocturnal flight calls and bat echolocation calls: the SM2, with a 22.05 kHz sampling frequency, and the SM2BAT, with a 192 kHz sampling frequency. The IR camera

used was the SR-19 FLIR system with Field of View (FOV) of 36° and resolution Focal

Plane Array (FPA) of 320(H) × 240(V). The marine radar used was the Furuno 1500

Mark 3 with T-bar and parabolic antennas. The T-bar antenna in vertical mode was used

at sunrise and sunset to detect the ascending and descending rates of birds, which helps to estimate how quickly birds pass through potential turbine strike zones. The parabolic dish was used during the middle of the night, which helps to determine how continuing migrants shifted their orientation as they approached the lakeshore. The IR camera

was pointing up and recorded only in the vertical direction. Therefore, only the vertical

radar data setup was used for fusion, as it has common area with IR. The IR camera was

located 25 meters from the radar. The sensors were run through the night and the data

were analyzed to study the movement and activity of migrants. Figure 6-1 shows the

project area in ONWR. Part of our experimental setup is shown in Figure 6-2.

Figure 6-1: Project Area (Ottawa National Wildlife Refuge, Ohio)

Figure 6-2: Experimental Setup

Two different libraries of calls, from Sonobat [27] and Cornell University, were used for the bat and bird studies, respectively. The Sonobat library contains calls from a variety of bat species documented in the Eastern United States; however, of the 11 species documented in Ohio, only seven were found in our project area. The Cornell University library contains predefined calls for different bird classes; three families commonly known in this area were selected for nocturnal flight call analysis. The lists of the birds and bats used in this work are provided in Table 6.1.

Table 6.1: Bird and Bat class and Species used in this Work

Bat Species (Scientific Name, Acronym)      Bird Class
Eptesicus fuscus (Epfu)                     Warbler
Lasiurus borealis (Labo)                    Thrush
Lasionycteris noctivagans (Lano)            Sparrow
Myotis lucifugus (Mylu)
Lasiurus cinereus (Laci)
Nycticeius humeralis (Nyhu)
Perimyotis subflavus (Pesu)
6.2 Sensor Processing
Data collected from three types of sensors have been processed separately using

different techniques due to varying sensor technology. Feature extraction and

classification algorithms were performed to process acoustic data as described in Chapter

2. Acoustic data were converted to wave form using WAC2WAV Utility from Wildlife

Acoustics [25] which is an appropriate format for acoustic signals. Features of signals

were extracted using MFCC, STFT, and DWT techniques and then were subjected to

dimension reduction using PCA [104]. Evolutionary Neural Network (ENN) was

developed for this application as an efficient classification technique based on Genetic

Algorithm (GA) and Feedforward Neural Network (FNN). Extracted features were the

inputs and species to be classified were the outputs of the network.

A detection and tracking algorithm was developed and used to process IR (video)

data, as described in Chapter 3. Different techniques including background subtraction

using RA, thresholding using an extended Otsu threshold, and noise suppression using

morphology were performed to detect the putative blips. A tracking algorithm was then

used to track the target and build the trajectory. Finally, features of targets including size,

velocity, heat, direction, and straightness index were calculated.

Blip detection and target tracking algorithms were used for processing of radar data,

as described in Chapter 4. Blip detection including filtering and detecting the putative

blips were performed using the radR platform and target tracking was implemented

independently in MATLAB. Tracking was performed in two phases of target estimation

and data association. Particle filter and Nearest Neighbor techniques were used for

estimation and data association, respectively. Parallel implementations of the sensor processing algorithms were also developed to improve temporal performance [155][156][157].

6.2.1 UAMS Experiments

A total of 459 nocturnal bird flight calls were collected during spring 2011. Bird calls

were processed separately from bat echolocation calls due to the difference in signal

characteristics and frequencies. Bird flight calls were processed using SIFS feature

extraction and different classifiers such as HMM, BP-NN, and ENN [158]. However, this

work is mainly focused in bat data processing. A total of 675 and 1107 sequences of bat

echolocation calls were collected during the 2011 spring and fall migration periods,

respectively. Features of calls were extracted using STFT, MFCC, and DWT. Features of

the wavelet are defined based on four parameters: maximum, minimum, mean, and

standard deviation of detail coefficients in each sub-band.

Features were then subjected to dimension reduction by PCA using 160 principal

components. Resultant features were employed in ENN classifier. Features of calls were

used as inputs to the neural network. Outputs were species to be classified. Training calls

were obtained from a-priori-known calls from the reference library of Sonobat [27]. This

collection contains acoustic data for 15 different species of Eastern US bats. The neural

network was trained using genetic algorithm based on reference calls and was then tested

using the collected data. Figure 6-3 shows the overall process of the UAMS. Table 6.2

shows various parameters used in the ENN algorithm.

[Figure 6-3: Block Diagram of UAMS. Collected calls and database calls pass through feature extraction (FFT, MFCC, DWT); the FFNN is trained with the best chromosomes produced by the GA, tested on the collected calls, and the classified species are displayed.]

Table 6.2: ENN Parameters

Parameter Value
Activation function Sigmoid
Number of chromosomes 800
Fitness function MSE
Neurons in hidden layer 100
Number of generations 5000
Neurons in output layer 5
Crossover probability 0.7
Mutation probability 0.1

Table 6.3 shows the overall classification accuracy using different feature

extraction techniques. Table 6.4 shows the classification accuracy at the species level using different feature extractors. The results show that DWT, when combined with ENN, gives promising results in species-level classification.

Table 6.3: Overall Classification Accuracy based on Feature
Extraction Techniques

Generation FFT MFCC DWT

2000 56.6% 68% 80%

3000 61.7% 74% 83.5%

4000 67.5% 78.3% 86%

5000 70.8% 81.6% 93.3%

Table 6.4: Classification Accuracy in Species Level based on


Feature Extraction Techniques

Bat Species (Scientific Name, Acronym)      STFT      MFCC      DWT
Eptesicus fuscus (Epfu)                     75%       83.33%    95.83%
Lasiurus borealis (Labo)                    79.16%    83.33%    100%
Lasionycteris noctivagans (Lano)            66.66%    79.16%    91.66%
Myotis lucifugus (Mylu)                     70.83%    87.5%     95.83%
Lasiurus cinereus (Laci)                    57.5%     63%       85.23%
Nycticeius humeralis (Nyhu)                 64.3%     66.11%    69.32%
Perimyotis subflavus (Pesu)                 69.35%    63.12%    73.15%

The ENN algorithm is compared with other existing classification techniques such

as Support Vector Machine (SVM), Discriminant Function Analysis (DFA), and Back

Propagation Neural Network (BP-NN) as shown in Figure 6-4. It can be seen that

ENN+DWT outperforms other techniques in accuracy. It can also be observed that ENN

provides higher accuracy when compared to existing classification approaches.

[Figure 6-4: Classification Comparison. Classification accuracy (percentage) of MFCC, DWT, and STFT features for the ENN, DFA, SVM, and MLP classifiers.]

6.2.2 RMS Experiments

The first part of RMS uses radR [100] for detection of blips in the migratory data set.

radR removes noise, detects possible targets in terms of blips, and saves the blips’

information. Target information is classified as either old blips in previous frames or

blips in the current frame. Table 6.5 gives the format of a data frame.

Table 6.5: Input Data Frame of Targets

x y z t ns area int max aspan rspan perim range scan

where

 ns is the number of samples in the patch

 area is the apparent area of the patch (in meters2)

 perim is the apparent perimeter of the patch (in meters)

 aspan is the angular span of the patch (in samples)

 rspan is the radial span of the patch (in samples)

 x, y, z, t are coordinates of the blip centroid in meters and seconds

 max is the maximum intensity of any sample in the patch (scaled to the range [0,

1])

 int is the mean intensity of samples in the patch (scaled to the range [0, 1])

The second part of RMS consists of target tracking from the detected blips. The tracking is performed using a Particle Filter, which is developed in MATLAB and is independent of radR. Nearest Neighbor association using Euclidean distance is used for data association. The parameters used in the Particle Filter are shown in Table 6.6.

Table 6.6: Particle Filter Parameters

Parameter Value

Scan time 2.5 s

Number of particles 50

Step size 1

Data association NN

Measurement metric Euclidean Distance

The Particle Filter algorithm provides two kinds of output: a polar plot for visual

observations and an Excel sheet containing target information. An example of a real track

and target information is shown in Figure 6-5.

Track ID area range perim intensity #samples x y z
2 617 288 101 6.97 134.5 121.8 -250.14 75

Figure 6-5: Sample Real Tracked Target using Particle Filter

6.2.3 IIMS Experiments

A previously developed detection and tracking algorithm was used to process

videos from IR camera [96][159]. A block diagram of IIMS is shown in Figure 6-6.

Among 150 visually detected targets for testing, 141 targets were detected by the

proposed algorithm. Detection error may be due to the targets’ smaller size, farther

distance from the camera, or a highly cluttered image. Table 6.7 shows results from the

detection and tracking algorithm. The proposed algorithm also extracts a number of

features such as size, velocity, heat, straightness index, and direction. Table 6.8 provides

a summary of extracted features.

An ant-based clustering algorithm was used to group birds, bats, and insects based

on their extracted features. Various parameters used in ACA are shown in Table 6.9.

Three variations of ACA were developed which are S-ACA, DS-ACA, and SM-ACA.

Number of clusters created with different numbers of iterations based on the three

variations of ACA is shown in Table 6.10.

[Figure 6-6: Block Diagram of IIMS. Detection (background subtraction, thresholding, filtering), tracking (connected-component labeling, morphology), feature extraction, clustering (S-ACA, DS-ACA, SM-ACA), and display of the number of tracks.]

Table 6.7: Total Detected Targets

Total Targets    Detected Targets    Missed Targets    Correct Detection    Missed Detection
150              141                 9                 94%                  6%

Table 6.8: IR Extracted Features

Blob # Direction Size Velocity Distance Heat Straightness Index


1 E 284.578 816.706 517.247 160.421 0.994
2 W 84.625 1866.164 497.643 120.125 0.999
3 W 73.625 1167.158 311.242 120 0.999
4 NW 196.125 2214.638 590.570 124.875 1
5 W 63.375 1783.415 475.577 122.75 0.999
6 N 111.031 447.230 477.045 142.062 0.996
7 N 210.411 1195.826 677.635 136.235 0.998
8 N 177.777 2034.260 610.278 131 0.998
9 W 134.3 1882.251 627.417 125.7 0.999
10 W 63.666 1576.658 157.665 138 0.999

Table 6.9: Parameter Settings in ACA

Parameter Value
Number of ants 50
Threshold constant 0.1
Threshold constant 0.15
Neighborhood size 4,8
Dissimilarity scale 1.05
Maximum velocity 6
Memory size 10

Table 6.10: Total Number of Clusters in Different Iterations in ACA

Iteration
Technique
10,000 100,000 1,000,000 10,000,000
S-ACA ≥ 20 9 6 4
DS-ACA ≥ 12 7 5 4
SM-ACA ≥ 14 6 4 4

It can be seen from Table 6.10 that the number of clusters decreases with an increase in the number of iterations. All three algorithms converge to four groups, with SM-ACA converging faster than DS-ACA and S-ACA. At least three clusters, belonging to insects, birds, and bats, are expected to be created. The fourth cluster is assumed to be unknown and may be due to system noise or algorithm error. The behavior of these three categories strongly depends on the target features. For example, bats are expected to show a zigzag pattern with a straightness index lower than that of birds, whereas the heat intensity of a bird is generally higher than that of insects and bats due to a higher flapping rate. The number of individuals in each cluster gives a clue to the populations of birds, bats, and insects. It is assumed that insects have higher populations than birds, and birds higher populations than bats.

A Fuzzy C-Means (FCM) algorithm was also used to cluster the targets of

interest. The number of clusters was set to three. Thus, three clusters were created and the

number of individuals in each cluster was obtained. Table 6.11 shows the information of

the created clusters using FCM.

Table 6.11: FCM Clusters

Technique    Total Number of Clusters    Targets in Cluster 1    Targets in Cluster 2    Targets in Cluster 3
FCM          3                           670                     50                      279

6.3 L1 Fusion
Individual sensors observe targets in their own coverage region. However, the data

fusion supports the targets observed by all three sensors to provide multimodal

information about the targets. Pre-processing, alignment, and correlation are performed to

obtain common targets. Coverage areas of IR and radar are shown in Figures 6-7 to 6-13

for visual purposes using 3DMax software.

[Figure 6-7: Coverage Area of Marine Radar in Vertical Position. Axes X, Y, and Z with the compass directions N, E, S, and W.]

Figure 6-8: Coverage Area of IR Camera

Figure 6-9: Common Coverage Area of IR and Radar

Figure 6-10: Common Coverage Area of IR and Radar (zoomed)

Figure 6-11: Common Coverage Area of IR and Radar (zoomed)

Figure 6-12: Common Coverage Area of IR and Radar (zoomed)

Figure 6-13: Coverage Area outside IR and Radar (zoomed side views)

6.3.1 Common Coverage Area (CCA Function)

The data in the common coverage area are derived from two sensors: IR and radar. The maximum detection ranges of the radar and the IR camera are 1.5 nautical miles and 475 meters, respectively. Most of the IR zone is covered by the radar area, and only a small part lies outside the radar coverage, which can be ignored. To extract the common area, filtering is applied to remove targets outside this boundary. The filter is based on two criteria, given in Equations (6.1) and (6.2), which restrict the targets to the common IR/radar volume.

Proof:

1) The first criterion follows from the maximum detection range of the IR camera: any target above that ceiling lies outside the common volume (Criteria 1).

2) The second criterion follows from the field of view of the camera, as shown in Figure 6-14. The 36° FOV gives a half-angle of 18° between the camera axis and the edge of the IR footprint, which fixes the horizontal half-extent of the footprint at the camera ceiling (Equations (6.3) and (6.4)). Treating the footprint as a square, the Pythagorean theorem in ∆(ABC) relates the half-side of the footprint to its half-diagonal (Equations (6.5) to (6.7)).

[Figure 6-14: Field of View of IR]

The IR footprint needs to be rotated 19.75° from north. This is accomplished in two steps. First the square is rotated clockwise by 45°, which aligns its maximum diameter, as shown in Figure 6-15.

[Figure 6-15: IR Direction. The square IR footprint is first rotated by 45° and then by θ = 19.75° from north.]

However, we are interested in a clockwise rotation of 19.75°. With the 45° rotation already applied, the residual rotation angle β is obtained in Equation (6.8), and, according to Figure 6-16, the corresponding horizontal projection is computed in Equation (6.9).

[Figure 6-16: IR Geometry]

Combining Equations (6.7), (6.8), and (6.9) gives Equation (6.10).
According to Figure 6-15 and Equations (6.10) and (6.11), the rotated half-extent of the footprint is obtained in Equation (6.12). Thus, the maximum value along the x-axis is 156.696 meters, and the maximum value in z is 457.2 meters (the maximum detection range of the camera).

The slope of the line passing through O and D is

m = (z_D − z_O) / (x_D − x_O)      (6.13)

where O is the origin at the camera and D is the far corner of the footprint, with coordinates

(x_O, z_O) = (0, 0),  (x_D, z_D) = (156.696, 457.2)      (6.14), (6.15)

According to Equations (6.14) and (6.15),

m = 457.2 / 156.696 ≈ 2.92      (6.16)

Finally, considering both the positive and negative sides of the x-axis, this relation is expressed as

z ≥ 2.92 |x|      (6.17)

This derivation satisfies Criteria 2.
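A hedged MATLAB sketch of the resulting common-coverage filter is given below; the way the two criteria are combined into a single test is an assumption of this sketch, while the 156.696 m bound, the 457.2 m ceiling, and the straight-line cone edge are taken from the derivation above.

% Keep only radar detections that fall inside the IR wedge.
xMax  = 156.696;                 % maximum horizontal extent of the IR footprint (m)
zMax  = 457.2;                   % maximum detection range of the IR camera (m)
slope = zMax / xMax;             % slope of the line through O and D

inside = @(x, z) abs(x) <= xMax & z <= zMax & z >= slope * abs(x);

% Example: filter a list of radar detections with columns [x z].
tracks = [ 50 300;  140 200;  10 450;  200 100 ];
common = tracks(inside(tracks(:,1), tracks(:,2)), :);   % keeps rows 1 and 3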

6.3.2 Time Alignment (TA Function)

The radar and IR data are time-stamped in GMT and EST, respectively. The EST format is taken as the reference, and all of the radar data are converted into EST. Since there is a five-hour difference between EST and GMT, the following equations are used to convert GMT to EST. Equation (6.18) adjusts the date:

Date_EST = Date_GMT − 1  if  T_GMT < 05:00,  otherwise  Date_EST = Date_GMT      (6.18)

where

Date_EST is the date in Eastern Standard Time (EST)
Date_GMT is the date in Greenwich Mean Time (GMT)
T_EST is the time in EST
T_GMT is the time in GMT

Equation (6.19) converts the time of a radar target to EST in the following fashion:

T_EST = T_GMT + 19 hours  if  T_GMT < 05:00,  otherwise  T_EST = T_GMT − 5 hours      (6.19)

where

T_EST is the Eastern Standard Time
T_GMT is the Greenwich Mean Time

For example:

02:00:00 (GMT) on 12/03/2013 = 21:00:00 EST of 12/02/2013.

06:00:00 (GMT) on 12/03/2013 = 01:00:00 EST of 12/03/2013.
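The conversion can be sketched as plain hour arithmetic with a fixed five-hour offset (no daylight-saving handling); dates are assumed to be MATLAB datenum values.

% GMT-to-EST conversion of Equations (6.18)-(6.19), e.g. 02:00 GMT on 12/03
% becomes 21:00 EST on 12/02, and 06:00 GMT becomes 01:00 EST the same day.
function [dateEST, hourEST] = gmt_to_est(dateGMT, hourGMT)
    if hourGMT < 5
        hourEST = hourGMT + 19;          % wrap into the previous day
        dateEST = dateGMT - 1;           % dates handled as datenum values
    else
        hourEST = hourGMT - 5;
        dateEST = dateGMT;
    end
end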

6.3.3 Spatial Alignment (SA Function)

The radar coordinate system is taken as the reference coordinate system in the L1 fusion. The spatial alignment is performed by converting the IR data to the radar coordinate system using Equation (5.8). The proof is provided as follows. The original and rotated IR coordinates are represented as (x, y) and (x′, y′), as shown in Figures 6-17 and 6-18, and for a rotation by the angle θ we have

x′ = x cos θ + y sin θ      (6.20)

y′ = −x sin θ + y cos θ      (6.21)

z′ = z      (6.22)

Since the radar data are acquired in vertical mode, the radar plane corresponds to

y = 0      (6.23)

so that, according to Equation (6.20),

x′ = x cos θ      (6.24)

[Figure 6-17: Spatial Alignment of IR and Radar. The IR axes (x, y) are rotated by θ about the Z axis into the radar frame.]

[Figure 6-18: Spatial Alignment. Relation between the original coordinate x, the rotated coordinate x′, and the rotation angle θ.]
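A minimal sketch of this planar rotation is shown below; the sign convention of the rotation and the example coordinates are assumptions, while the 19.75° offset from north is taken from the setup description.

% Planar rotation used for spatial alignment (Equations 6.20-6.21).
theta = 19.75 * pi/180;                  % rotation angle in radians
R = [cos(theta) -sin(theta);
     sin(theta)  cos(theta)];

p_ir    = [120; 35];                     % example target position in the IR frame (m)
p_radar = R * p_ir;                      % the same target expressed in the radar frame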

After the temporal and spatial alignment processes, the target-to-target association is performed, as described in Chapter 5. As the result of gating, association, and assignment, a feature vector is generated for each track, including the features from IR and radar, as shown in Figure 6-19. There are six features (I1, I2, ..., I6) derived from IR and five features (R1, R2, ..., R5) derived from radar, as shown in Figures 5-13 and 6-19. The fusion node generates one single feature vector combining the IR and radar features. Table 6.12 shows sample IR/radar feature vectors generated from the real collected data.

[Figure 6-19: IR/Radar Feature Vector. The IR features I1 to I6 and radar features R1 to R5 pass through data alignment, gating, and association in the data-level fusion node and are concatenated into a single feature vector.]

Table 6.12: Fusion Feature Vectors

IR Features Radar Features

I1 I2 I3 I4 I5 I6 R1 R2 R3 R4 R5

0.9712 Southeast 136.42 342.6 623 453 305 23.96 15.07 345 211
0.99 East 138.73 640.58 213 643 224 15.78 9.08 654 143
0.9999 Southwest 150.64 327.78 425 2231 590 10.54 18.10 324 260
0.9448 Northwest 136.96 156.01 343 756 454 28.94 16.48 758 294
0.643 South 128 346.2 535 547 310 11.21 17.23 4332 179

I1= Straightness Index, I2=Direction , I3= Heat , I4= Distance , I5=velocity , I6=Size

R1= Range , R2=Angular Span, R3=Radial Span , R4=Area , R5=Height

6.4 L2 Fusion
Data fusion in level 2 is performed based on the resultant feature vectors derived from the L1 fusion and the previously processed acoustic data. In L2 fusion, a single L1 feature vector is assigned to each acoustic target based on the association metric described in Chapter 5. As a result of the association, multimodal feature vectors are formed which convey acoustic, IR, and radar features, as shown in Figure 6-20. The definitions of the features are given in Table 6.13.

[Figure 6-20: Multi-modal Feature Vectors. Radar features (range, area, perimeter, angular span, radial span), IR features (straightness index, direction, angle), and the acoustic species label combined into one feature vector.]
Table 6.13: Definitions and Units of Feature Vectors

Feature                     Definition                                                               Unit      Sensor
Range                       Range or altitude of the target                                          m         Radar
Straightness Index (SI)     Straightness index of the flight pattern                                 -         IR
Direction                   Flight direction                                                         -         IR
Angle                       Angle of flight in the circular coordinate system, adjusted by 19.75°    degree    IR
                            from north
Area                        Average area of the target                                               cm2       Radar
Angular Span (Aspan)        Number of columns along the angular area in the radar Plan Position      -         Radar
                            Indicator (PPI)
Radial Span (Rspan)         Number of rows along the range in the PPI                                -         Radar
Perimeter (Perim)           Average perimeter of the target                                          cm        Radar
Species                     Species of the target                                                    -         Acoustics

Timestamp is used as an association property in the L2 fusion. No spatial alignment is

performed in this level. The timing format of the acoustics is EST and is consistent with

L1 feature vectors. Avian category for each vector is defined based on the Fuzzy System

(FS) using the MATLAB Fuzzy Logic Toolbox. The FS is developed as a Mamdani-style fuzzy system. In this method, the fuzzy rules are first determined, as explained in Chapter 5. Fuzzification maps the crisp input values, i.e. the features derived from the sensors, to fuzzy values using the membership functions. The fuzzified inputs are combined by the fuzzy rules to establish the rule strengths. The consequence of each rule is found by combining the rule strength with the output membership function. Defuzzification is then performed on the aggregated output distribution to convert it into a crisp value that indicates the putative category.

In practice, membership functions can take several different shapes, such as triangular, trapezoidal, Gaussian, bell-shaped, sigmoidal, and S-curve waveforms; the exact type depends on the application. The trapezoidal function is used for the input sets' membership functions in this work; however, this choice is adjustable by the user. The membership functions of the fuzzy input sets are shown in Figure 6-21. It can be seen that three membership functions are considered for each input, as explained in Chapter 5. There are four input sets: straightness index, heat, range, and velocity. The configuration settings of the fuzzy system and of the inputs and their membership functions are shown in Tables 6.14 and 6.15, respectively.

Figure 6-22 shows the rule viewer of the fuzzy system. The rule viewer is based on the fuzzy inference described earlier and displays a scheme of the overall fuzzy process. In the viewer, the five plots across the top of the figure represent the antecedents and consequent of the first rule. Each rule is a row of plots, and each column is a variable. The first four variables are the inputs (straightness index, heat, range, velocity) and the last one is the output (category). There are nine rules shown in Figure 6-22, as derived in Chapter 5. Each membership function in the set is associated with a particular rule and maps the input variables to the output through the set of nine rules. The degree of membership of the inputs is shown graphically by yellow highlights. Three membership functions are defined on the output: birds [0, 0.33], bats [0.34, 0.66], and insects [0.67, 1]. The last row of the output (category) column in Figure 6-22 shows the aggregated output.

 If the aggregated value is in the range [0, 0.33], the target is a bird.

 If the aggregated value is in the range [0.34, 0.66], the target is a bat.

 If the aggregated value is in the range [0.67, 1], the target is an insect.

For example, with SI = 0.5, heat = 125, range = 250, and velocity = 2500, the defuzzified output is 0.387, which falls in the range of bats.
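The mapping from the aggregated output value to a category can be sketched as a simple interval test:

% Category from the defuzzified output using the three output intervals above.
agg = 0.387;                 % e.g. SI = 0.5, heat = 125, range = 250, velocity = 2500
if agg <= 0.33
    category = 'Bird';
elseif agg <= 0.66
    category = 'Bat';        % 0.387 falls here
else
    category = 'Insect';
end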

Figure 6-21: Membership Functions of the Input Sets
Table 6.14: Configuration Settings of the Fuzzy System

Feature Value
Name AvianFIS
Type ‘mamdani’
Number of Inputs 4
Number of Outputs 1
Number of rules 9
AndMethod ‘min’
OrMethod ‘max’
ImpMethod ‘min’
AggMethod ‘max’
DefuzzMethod ‘centroid’

Table 6.15: Configuration Settings of the Inputs

Feature Value
Input 1
Name Straightness Index
Range [0,1]
Number of MF 3
MF1 L:trapmf
MF2 M:trapmf
MF3 H:trapmf
Input 2
Name Heat
Range [0,250]
Number of MF 3
MF1 L:trapmf
MF2 M:trapmf
MF3 H:trapmf
Input 3
Name Range
Range [0,500]
Number of MF 3
MF1 L:trapmf
MF2 M:trapmf
MF3 H:trapmf
Input 4
Name Velocity
Range [0,5000]
Number of MF 3
MF1 L:trapmf
MF2 M:trapmf
MF3 H:trapmf

Figure 6-22: Fuzzy Rule Viewer

The membership function values are the likelihoods that the target belongs to each avian category (bird, bat, insect). They are used as the input to the Bayesian inference, whose output is the probability that the target is one of the bird/bat classes/species of interest. The Bayesian inference is explained in detail in Chapter 5. Table 6.16 shows the a priori probabilities and hypotheses used in the avian Bayesian technique. The result of the Bayesian inference, which is also the result of the L2 fusion, is the set of L2 feature vectors and their associated probabilities. Table 6.17 shows a number of sample L2 vectors obtained from the real data.

Table 6.16: Parameters in Bayesian Technique

#     H | E1                          C | E2                          Categories of Targets (C)    H
1     Labo | Sensor1 (Acoustics)      Bird | Sensor2 (IR/radar)       Bird                         Labo
2     Lano | Sensor1 (Acoustics)      Bat | Sensor2 (IR/radar)        Bat                          Lano
3     Epfu | Sensor1 (Acoustics)      Insect | Sensor2 (IR/radar)     Insect                       Epfu
4     Mylu | Sensor1 (Acoustics)                                                                   Mylu
5     Laci | Sensor1 (Acoustics)                                                                   Laci
6     Nyhu | Sensor1 (Acoustics)                                                                   Nyhu
7     Pesu | Sensor1 (Acoustics)                                                                   Pesu
8     Warbler | Sensor1 (Acoustics)                                                                Warbler
9     Sparrow | Sensor1 (Acoustics)                                                                Sparrow
10    Thrush | Sensor1 (Acoustics)                                                                 Thrush

Table 6.17: Sample L2 Vectors

Target#  Direction  Straightness Index  Heat  Size      Velocity  Distance  Range  Height  Radial Span  Angular Span  Area  Category  Species  Probability
1        E          0.723               120   479.2341  645       342.6453  400    152     6.66         23.5          550   Bat       Epfu     0.96
2        W          0.847               111   143.8889  234       533.3103  580    123     4.12         24.22         654   Bat       Labo     0.68
3        W          0.9                 140   534.5     890       175.1987  450    238     8.43         22.13         325   Bird      Warbler  0.55
4        N          1                   178   1157.5    657       698.9297  671    385     4.02         23.12         678   Bird      Warbler  0.71
5        W          1                   123   68.6154   456       640.5802  501    226     6.69         21.32         445   Bird      Thrush   0.51
6        N          0.8                 102   81.4331   721       746.9654  320    319     10.12        18.3          558   Bat       Mylu     0.43
7        N          0.6                 96    122.6667  453       529.3387  394    396     7.51         21.11         435   Bat       Epfu     0.54
8        W          0.9                 123   159.2333  601       469.4246  412    256     6.99         15.2          356   Bat       Thrush   0.86
9        W          0.8                 131   193.1     455       683.2687  431    187     4.02         20.32         675   Bird      Sparrow  0.49
10       W          1                   95    59.4      674       163.4888  593    244     5.33         28.47         475   Bird      Thrush   0.38
6.5 Multi-sensor Fusion for Spring 2011
This section provides the results of sensor processing and data fusion in the spring

2011 migration period. Figures 6-23 to 6-34 show the results of single sensor processing

for all available data collected during the migration period. Figure 6-23 shows the

number of bird flight calls for three different classes. The data are shown on a nightly basis, and the class information is acquired by the acoustic detector. It can be seen that in 41% of the nights the number of Thrushes is higher than the numbers of Warblers and Sparrows, in 35% of the nights the number of Warblers exceeds both of the other two classes, and in 17% of the nights the number of Sparrows is higher than the numbers of Warblers and Thrushes. On some nights the number of calls for the different classes was observed to be the same.

[Figure 6-23: Bird Class Composition for Available Acoustic Data on a Nightly Basis in Spring 2011. Number of flight calls per night for Warbler, Thrush, and Sparrow.]

Figure 6-24 shows the overall bird class composition over the migration period in spring 2011. The class information is acquired by acoustics. It can be seen that Thrushes and Warblers account for 47% and 35% of all flight calls, respectively, while Sparrows make up only 18% of all flight calls.

[Figure 6-24: Overall Bird Class Composition for Available Acoustic Data in Spring 2011. Thrush 47%, Warbler 35%, Sparrow 18%.]

Figure 6-25 shows the total number of bird flight calls on a nightly basis for the available acoustic data over the spring 2011 migration period. At most 40 calls per night were recorded on 76% of the nights.

Figure 6-26 shows the number of bat passes of the different species based on the available data during the spring 2011 migration period. The species information and number of passes are acquired by the acoustic detector. The number of Epfus was higher than the numbers of the other species in approximately 50% of the nights, and the number of Labos exceeded the other species during 25% of the nights.

[Figure 6-25: Total Number of Bird Flight Calls for Available Acoustic Data in Spring 2011. Nightly totals from 5/4/2011 to 5/27/2011.]

[Figure 6-26: Bat Species Composition for Available Acoustic Data on a Nightly Basis in Spring 2011. Nightly number of passes for Lano, Nyhu, Laci, Epfu, Pesu, Labo, and Mylu.]

Figure 6-27 shows the bat species composition as acquired by the acoustic detector. Epfu is abundant, with 37% of all passes. Mylu and Labo each account for 18% of the passes and are considered common species. Laci, Lano, Nyhu, and Pesu make up a small fraction of the passes and are categorized as rare species.

[Figure 6-27: Overall Bat Species Composition for Available Acoustic Data in Spring 2011. Epfu 37%, Mylu 18%, Labo 18%, Laci 13%, Lano 10%, Pesu 3%, Nyhu 1%.]

Figure 6-28 shows the total number of bat passes on a nightly basis in spring 2011. On almost all nights the number of passes was lower than 60 per night, and the total exceeded 100 passes on only one night.

IR data provide directional information. Figure 6-29 shows the flight direction of the targets in spring 2011. It can be seen that 46% of the targets were flying towards the north, while flights towards the west, east, and south were observed for 15%, 20%, and 19% of the targets, respectively.

[Figure 6-28: Total Number of Bat Passes for Available Acoustic Data in Spring 2011. Nightly totals.]

[Figure 6-29: Flight Direction of Targets for Available IR Data in Spring 2011. N 46%, E 20%, S 19%, W 15%.]

Figure 6-30 shows the total number of IR tracks on a nightly basis in spring 2011. It can be seen that in 71% of the nights the total number of IR tracks was less than 300 tracks per night, while the total exceeded 600 tracks per night in 28% of the nights.

[Figure 6-30: Total Number of IR Tracks for Available IR Data in Spring 2011. Nightly totals.]

Figure 6-31 shows the total number of radar tracks on a nightly basis in spring 2011. The data were acquired by the radar in vertical mode. It can be seen that in 70% of the nights the number of tracks was less than 500 per night, while the number of radar tracks exceeded 500 per night in 30% of the nights.

Figures 6-32 to 6-34 show the range of radar tracks on selected sample nights in spring 2011. The range information is obtained from the radar in vertical mode. Figure 6-32 shows the range of targets during the night of 4/22/2011. It can be seen that 97% of the targets were recorded at ranges above 1000 meters and 2% of the targets at less than 1000 meters.

[Figure 6-31: Total Number of Radar Tracks for Available Data in Spring 2011 (Vertical Mode). Nightly totals.]

[Figure 6-32: Range of Radar Tracks for Available Data on 4/22/2011. Target ranges in meters.]

Figure 6-33 shows the range of targets during the night of 4/24/2011. The percentages of targets at less than 500 meters, between 500 and 1000 meters, and more than 1000 meters are 8%, 35%, and 55%, respectively.

[Figure 6-33: Range of Radar Tracks for Available Data on 4/24/2011. Target ranges in meters.]

Figure 6-34 shows the range of targets during the night of 4/26/2011. The percentages of targets below and above 1000 meters are 66% and 33%, respectively.

Common data are defined as the data from common nights, i.e. nights on which data were available from all three sensors. Acoustic data for bats were not available on the common nights of spring 2011. Figure 6-35 shows the number of bird flight calls over the common nights in spring 2011; these results were obtained using an acoustic detector. It can be seen that only 9% of all bird flight calls detected in spring 2011 were observed on common nights.

[Figure 6-34: Range of Radar Tracks for Available Data on 4/26/2011. Target ranges in meters.]

[Figure 6-35: Total Number of Bird Flight Calls over Common Nights in Spring 2011. Nightly totals from 5/4/2011 to 5/10/2011.]

Figure 6-36 shows the number of bird flight calls of the different classes over the common nights. The percentages of flight calls for Warblers, Thrushes, and Sparrows were 10%, 3%, and 20%, respectively.

[Figure 6-36: Bird Class Composition over Common Nights in Spring 2011. Nightly flight calls for Warbler, Thrush, and Sparrow.]

Figure 6-37 shows the flight direction of the targets over the common nights, as acquired by the IR camera. It can be seen that 52% of the targets were flying toward the north and 18% toward the south.

[Figure 6-37: Flight Direction of Targets over Common Nights in Spring 2011. N 52%, S 18%, E 15%, W 15%.]

Figure 6-38 shows the total number of IR tracks over the common nights in spring 2011. There were fewer than 300 targets on each of these nights.

[Figure 6-38: Total Number of IR Targets over Common Nights in Spring 2011. Nightly totals.]

Figures 6-39 to 6-41 show the direction of flight on the common nights using the IR camera. It can be seen that the direction of the targets in spring is mostly towards the north.

Figure 6-39: IR Direction in Night of (a) 05/04/2011 (b) 05/05/2011

Figure 6-40: IR Direction in Night of (a) 05/06/2011 (b) 05/08/2011

Figure 6-41: IR Direction in Night of 05/09/2011

Figure 6-42 shows the total number of radar tracks over the common nights in spring 2011. These data were obtained using the radar in vertical mode. It can be seen that about 11% of all the targets acquired by radar in spring 2011 were observed on common nights.

[Figure 6-42: Total Number of Radar Tracks over Common Nights. Nightly totals from 5/4/2011 to 5/10/2011.]

Direction and species information is obtained by combining the IR and acoustic data. Figure 6-43 shows the flight direction of the bird classes. The percentages of Warblers, Sparrows, and Thrushes flying towards the north were 61%, 23%, and 14%, respectively. It can also be observed that for each of the Warblers, Sparrows, and Thrushes, the number flying towards the north was higher than in the other three directions.

[Figure 6-43: Direction of Different Bird Classes for Fusion Data in Spring 2011. Number of Warbler, Sparrow, and Thrush targets flying N, E, S, and W.]

Figure 6-44 shows the direction of all the bird targets in the fusion data for spring 2011. The percentages of birds flying towards the north, east, south, and west were 54%, 20%, 20%, and 6%, respectively.

[Figure 6-44: Overall Direction of Bird Classes for Fusion Data in Spring 2011. N 54%, E 20%, S 20%, W 6%.]

The range and species information are obtained from the radar and the acoustics, respectively. Figures 6-45 to 6-47 show the range of the bird classes for the fusion data in spring 2011. Figure 6-45 shows the range of Warblers; it can be seen that 75% of the Warblers were detected in the range of 500 to 900 meters.


Figure 6-45: Range of Warblers for Fusion Data in Spring 2011

Thrushes flying in the range between 500 and 900 meters were about 57% of all Thrushes, as shown in Figure 6-46. Sparrows flying in the range between 500 and 900 meters were 50% of all Sparrows, as shown in Figure 6-47.


Figure 6-46: Range of Thrushes for Fusion Data in Spring 2011


Figure 6-47: Range of Sparrows for Fusion Data in Spring 2011

6.6 Multi-sensor Fusion for Fall 2011


This section provides the results of sensor processing and data fusion for the fall 2011 migration period. IR results were not obtained for the full set of available data in fall 2011. Figures 6-48 to 6-54 show the results of single-sensor processing for all available data collected during the migration period. Figure 6-48 shows the number of bird flight calls for the three different classes. The data are shown on a nightly basis. The species information is acquired by the acoustic detector.

It can be seen that in 58% of the nights, the number of Warblers was higher than the number of Sparrows and Thrushes. In 35% of the nights, the number of Thrushes exceeded the number of both other classes, and in 5% of the nights the number of Sparrows was higher than that of Warblers and Thrushes. On some of the nights, the number of calls was the same across classes.


Figure 6-48: Bird Class Composition for Available Acoustic Data on a Nightly Basis in Fall 2011

Figure 6-49 shows overall bird class composition over the migration period in fall

2011. The class information is acquired by acoustics. It can be seen that Warblers and

Thrushes were 56% and 38% of all flight calls, respectively. Sparrows were only 6%

of all flight calls.


Figure 6-49: Overall Bird Class Composition for Available Acoustic Data in Fall

2011

Figure 6-50 shows the total number of bird flight calls on a nightly basis for the available acoustic data over the migration period in fall 2011. On 88% of the nights, no more than 40 calls per night were recorded.

Figure 6-51 shows the number of bat passes of different species for the available data during the migration period in fall 2011. The acoustic detector provides the species information and the number of passes. Seven different species were observed over the recorded nights. The number of Epfu passes was higher than that of the other species in about 27% of the nights, and the number of Labo passes exceeded the other species in 32% of the nights.


Figure 6-50: Total Number of Bird Flight Calls for Available Acoustic Data in Fall

2011


Figure 6-51: Bat Passes of Different Species for Available Acoustic Data on a Nightly Basis in Fall 2011

Figure 6-52 shows the overall bat species composition as acquired by the acoustic detector. Epfu, Mylu, and Labo account for 26%, 27%, and 29% of all the passes, respectively. Nyhu, Lano, Laci, and Pesu, with 0.04%, 4%, 1%, and 13% of the passes respectively, represent a small share of all passes and are categorized as rare species.


Figure 6-52: Overall Bat Species Composition for Available Acoustic Data in Fall 2011

Figure 6-53 shows the total number of bat passes on a nightly basis in fall 2011. On almost all of the nights, fewer than 60 passes per night were observed.

Figure 6-54 shows the total number of radar tracks on a nightly basis in fall 2011. The data were acquired by radar in vertical mode. It can be seen that on 85% of the nights, fewer than 50 tracks per night were recorded.


Figure 6-53: Total Number of Bat Passes for Available Acoustic Data in Fall 2011


Figure 6-54: Total Number of Radar Tracks for Available Data in Fall 2011 (Vertical

Mode)

Figures 6-55 to 6-70 show the range of radar tracks on a nightly basis. The range

information is obtained from radar in vertical mode. Figure 6-55 shows the range of the

targets during the night of 8/25/2011. Percentages of targets recorded at less than 500

meters, between 500 and 1000 meters, and more than 1000 meters were 17%, 56%, and

17% respectively.


Figure 6-55: Range of Radar Tracks for Available Data in 8/25/2011

Figure 6-56 shows the range of the targets during the night of 8/26/2011. The percentage of targets at a range of about 2000 meters and higher is 96%.


Figure 6-56: Range of Radar Tracks for Available Data in 8/26/2011

Figure 6-57 shows the range of the targets during the night of 8/27/2011. All the targets were observed below 1000 meters, and 50% of them were recorded at less than 600 meters.


Figure 6-57: Range of Radar Tracks for Available Data in 8/27/2011

Figure 6-58 shows the range of the targets during the night of 8/28/2011. The targets

were observed at approximately 470 meters or less.


Figure 6-58: Range of Radar Tracks for Available Data in 8/28/2011

Figure 6-59 shows the range of the targets during the night of 8/29/2011. It can be

seen that 66% of the targets were observed at a range of 2000 meters or above.


Figure 6-59: Range of Radar Tracks for Available Data in 8/29/2011

Figure 6-60 shows the range of the targets during the night of 8/30/2011. The single target was observed at a range of about 2000 meters.


Figure 6-60: Range of Radar Tracks for Available Data in 8/30/2011

Figure 6-61 shows the range of the targets during the night of 9/1/2011.


Figure 6-61: Range of Radar Tracks for Available Data in 9/1/2011

Figure 6-62 shows the range of the targets during the night of 9/3/2011. The

percentage of targets observed at less than 600 meters is 87%.


Figure 6-62: Range of Radar Tracks for Available Data in 9/3/2011

Figure 6-63 shows the range of the targets during the night of 9/4/2011. It can be seen

that 66% of the targets were in the range of 600 to 800 meters.


Figure 6-63: Range of Radar Tracks for Available Data in 9/4/2011

Figure 6-64 shows the range of the targets during the night of 9/5/2011. Percentages

of targets between 100 and 200 meters and higher than 200 meters are 77% and

23% respectively. Figure 6-65 shows the range of the targets during the night of

9/6/2011. It can be seen that 97% of targets were at less than 500 meters. Figure 6-66

shows the range of the targets during the night of 9/9/2011. The percentage of the targets

between 350 and 550 meters was 87%.


Figure 6-64: Range of Radar Tracks for Available Data in 9/5/2011


Figure 6-65: Range of Radar Tracks for Available Data in 9/6/2011


Figure 6-66: Range of Radar Tracks for Available Data in 9/9/2011

Figure 6-67 shows the range of the targets during the night of 9/10/2011. The

percentages of the targets observed at less than 200 meters and between 450 and 650

meters were 60% and 40% respectively.


Figure 6-67: Range of Radar Tracks for Available Data in 9/10/2011

Figure 6-68 shows the range of the targets during the night of 9/11/2011. It can be seen

that 65% of the targets were observed between 350 and 600 meters.


Figure 6-68: Range of Radar Tracks for Available Data in 9/11/2011

Figure 6-69 shows the range of the targets during the night of 9/12/2011. The

percentages of the targets observed at less than 600 meters and above 2400 meters were

75% and 25% respectively.


Figure 6-69: Range of Radar Tracks for Available Data in 9/12/2011

Figure 6-70 shows the range of the targets during the night of 9/13/2011. It can be

seen that almost all of the targets were recorded at less than 1000 meters. The percentage

of the targets at less than 500 meters was 75%.


Figure 6-70: Range of Radar Tracks for Available Data in 9/13/2011

Common data are defined as data from common nights, that is, nights for which data were available from all three sensors. Figure 6-71 shows the number of bird flight calls over common nights in fall 2011. These results were obtained using the acoustic detector. It can be seen that 39% of all bird flight calls detected in fall 2011 were observed on the common nights. Figure 6-72 shows the number of bird flight calls of different species over common nights. The percentages of flight calls for Warblers, Thrushes, and Sparrows were 45%, 38%, and 9%, respectively. Figure 6-73 shows the total number of bat passes in fall 2011 over common nights. It can be seen that 32% of all passes detected in fall 2011 were observed on the common nights.
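A minimal sketch of how the common nights can be identified, assuming each sensor's coverage is represented as a set of dates (the dates below are hypothetical, not the actual coverage):

```python
from datetime import date

def common_nights(acoustic_nights, ir_nights, radar_nights):
    # Common nights are simply the intersection of the three coverage sets.
    return sorted(set(acoustic_nights) & set(ir_nights) & set(radar_nights))

# Hypothetical per-sensor coverage
acoustic = {date(2011, 8, 26), date(2011, 8, 27), date(2011, 9, 1)}
ir = {date(2011, 8, 26), date(2011, 9, 1), date(2011, 9, 2)}
radar = {date(2011, 8, 26), date(2011, 9, 1)}
print(common_nights(acoustic, ir, radar))  # the two nights shared by all three sensors
```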


Figure 6-71: Total Number of Bird Flight Calls over Common Nights in Fall 2011


Figure 6-72: Bird Class Composition over Common Nights in Fall 2011


Figure 6-73: Total Number of Bat Passes over Common Nights in Fall 2011

Figure 6-74 shows the number of bat passes of different species over common nights. The percentages of passes for Epfu, Labo, and Mylu were 40%, 26%, and 34%, respectively. The percentages of passes for the other species, Pesu, Laci, and Nyhu, were 25%, 42%, and 40%, respectively.


Figure 6-74: Bat Species Composition over Common Nights in Fall 2011

Figure 6-75 shows the flight direction of the targets over common nights as acquired by the IR camera. It can be seen that 42% of the targets were flying toward the south.

[Pie chart: Flight Direction over Common Nights in Fall 2011: S 42%, N 27%, E 20%, W 11%]

Figure 6-75: Flight Direction of Targets over Common Nights in Fall 2011

Figure 6-76 shows the total number of IR tracks over common nights in fall 2011. There

were fewer than 2000 targets on 78% of the nights.


Figure 6-76: Total Number of IR Tracks over Common Nights in Fall 2011

Figures 6-77 to 6-83 show the direction of the targets over the common nights, obtained by IR. It can be seen that on most of the fall nights, the direction of bird flight is towards the south.

Figure 6-77: IR Direction in Night of (a) 08/29/2011 (b) 08/30/2011

Figure 6-78: IR Direction in Night of (a) 09/01/2011 (b) 09/03/2011

Figure 6-79: IR Direction in Night of (a) 09/04/2011 (b) 09/05/2011

Figure 6-80: IR Direction in Night of (a) 09/10/2011 (b) 09/11/2011

Figure 6-81: IR Direction in Night of (a) 09/12/2011 (b) 09/13/2011

Figure 6-82: IR Direction in Night of (a) 09/15/2011 (b) 09/16/2011

Figure 6-83: IR Direction in Night of (a) 09/26/2011 (b) 09/27/2011

Figure 6-84 shows the total number of radar tracks over common nights in fall 2011.

These data are obtained using radar in vertical mode. It can be seen that 70% of all the

targets acquired by radar in fall 2011 were observed in common nights.


Figure 6-84: Total Number of Radar Tracks over Common Nights in Fall 2011

Direction and species information are obtained by combining IR and acoustic data. Figure 6-85 shows the flight direction of the bird classes. The percentages of Warblers and Thrushes flying towards the south were 66% and 33%, respectively; no Sparrows were found in the fusion data. It can also be observed that the number of Warblers and Thrushes flying towards the south was higher than in the other three directions.


Figure 6-85: Direction of Different Bird Classes for Fusion Data in Fall 2011

Figure 6-86 shows the flight direction of all the bird targets in the fusion data for fall 2011. The percentages of all birds flying towards the south, east, north, and west were 53%, 6%, 23%, and 18%, respectively.


Figure 6-86: Overall Direction of Bird Classes for Fusion Data in Fall 2011

Figure 6-87 shows the flight direction of bat passes by species for the fusion data in fall 2011. The percentages of Epfu and Labo flying towards the south were 66% and 33%, respectively.


Figure 6-87: Direction of Bat Passes for Fusion Data in Fall 2011

Figure 6-88 shows the overall direction of bat passes for the fusion data in fall 2011. The percentages of all bats flying towards the south, west, and north were 55%, 36%, and 9%, respectively. No bats were detected flying towards the east.


Figure 6-88: Direction of Bat Passes for Fusion Data in Fall 2011

The range and species information are obtained by the radar and the acoustic detector, respectively. Figures 6-89 to 6-93 show the range of the bird classes and bat species for fusion data in fall 2011. Figure 6-89 shows the range of Epfus. It can be seen that 85% of the Epfus were found in the range of 150 to 200 meters.


Figure 6-89: Range of Epfus for Fusion Data in Fall 2011


Figure 6-90 shows the range of Labos. It can be seen that their range was between 150 and 450 meters.


Figure 6-90: Range of Labos for Fusion Data in Fall 2011

Figure 6-91 shows the range of Nyhu and Mylu.


Figure 6-91: Range of Nyhu and Mylu for Fusion Data in Fall 2011

Figure 6-92 shows the range of Warblers for fusion data in fall 2011. It can be seen

that the range is between 200 and 450 meters.


Figure 6-92: Range of Warblers for Fusion Data in Fall 2011

Figure 6-93 shows the range of Thrushes for fusion data in fall 2011.


Figure 6-93: Range of Thrushes for Fusion Data in Fall 2011

Chapter 7

7 Conclusion and Future Work

A comprehensive framework of multi-sensory data fusion for avian monitoring is developed as a distributed system of autonomous modules. The multi-sensory environment consists of acoustics, IR, and marine radar. The sensors were deployed in the western basin of Lake Erie in Ohio. Acoustics is useful for target identification at the taxonomic level. Infrared sensing and IR video processing provide quantitative information and the x-y coordinates of the target. Marine radar gives the target altitude (z-coordinate) as well as other information.

Due to the diversity of the sensor technologies, the data were processed separately based on the type of data, and different techniques and algorithms were used to process the raw data. Acoustic data processing was implemented based on feature extraction of the calls followed by classification. Different feature extraction techniques, consisting of the Fourier transform, the discrete wavelet transform, and Mel frequency cepstrum coefficients, were used to extract the significant features. Classification was performed with the developed evolutionary neural network, which is a novel technique in the application of avian monitoring. This technique is a combination of a genetic algorithm and a neural network, where the genetic algorithm is used to optimize the selection of the neural network weights. The developed classification technique was compared with existing classification techniques such as the support vector machine, discriminant function analysis, and the back propagation neural network. The results show that the discrete wavelet transform with the evolutionary neural network classifier gives higher accuracy compared with the other techniques.
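A minimal sketch of the evolutionary neural network idea is given below (illustrative only, not the dissertation's implementation): a genetic algorithm with tournament selection, single-point crossover, and Gaussian mutation searches the weight space of a small one-hidden-layer network, whose inputs are assumed to be precomputed acoustic feature vectors (for example, DWT coefficients) and whose fitness is classification accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X, n_in, n_hid, n_out):
    # Decode a flat weight vector into a one-hidden-layer network and run it.
    w1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    off = n_in * n_hid + n_hid
    w2 = w[off:off + n_hid * n_out].reshape(n_hid, n_out)
    b2 = w[off + n_hid * n_out:]
    return np.tanh(X @ w1 + b1) @ w2 + b2          # raw class scores

def accuracy(w, X, y, n_in, n_hid, n_out):
    # Classification accuracy is used directly as the GA fitness.
    return (forward(w, X, n_in, n_hid, n_out).argmax(axis=1) == y).mean()

def evolve(X, y, n_hid=8, pop=40, gens=60, mut=0.1):
    n_in, n_out = X.shape[1], int(y.max()) + 1
    n_w = n_in * n_hid + n_hid + n_hid * n_out + n_out
    population = rng.normal(0.0, 1.0, (pop, n_w))
    for _ in range(gens):
        fit = np.array([accuracy(w, X, y, n_in, n_hid, n_out) for w in population])
        best = population[fit.argmax()].copy()
        # Tournament selection: each parent is the fittest of 3 random picks.
        idx = [max(rng.choice(pop, 3), key=lambda i: fit[i]) for _ in range(pop)]
        parents = population[idx]
        # Single-point crossover between parent i and parent pop-1-i.
        cut = rng.integers(1, n_w, pop)[:, None]
        children = np.where(np.arange(n_w) < cut, parents, parents[::-1])
        # Gaussian mutation, with elitism (the best individual is kept unchanged).
        population = children + mut * rng.normal(0.0, 1.0, children.shape)
        population[0] = best
    fit = np.array([accuracy(w, X, y, n_in, n_hid, n_out) for w in population])
    return population[fit.argmax()]

# Hypothetical data: 120 calls, 12-dimensional feature vectors, 3 species classes
X = rng.normal(0.0, 1.0, (120, 12))
y = rng.integers(0, 3, 120)
w_best = evolve(X, y)
print("training accuracy:", accuracy(w_best, X, y, 12, 8, 3))
```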

The infrared image processing was performed based on background subtraction, thresholding, morphology-based noise suppression, and finally the building of track trajectories. Features of the tracked targets, which consist of heat intensity, flight direction, straightness index, etc., were extracted. Clustering based on the ant clustering algorithm and fuzzy C-means was performed to group the targets based on their features. According to the results, 92% accuracy was achieved.
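The IR detection chain can be sketched as follows (an illustrative simplification, not the exact background subtraction technique used in this work): a running-average background model, thresholding of the difference image, morphological opening for noise suppression, and connected-component labeling to obtain target centroids that a tracker would then link into trajectories. The threshold factor, adaptation rate, and frames below are hypothetical.

```python
import numpy as np
from scipy import ndimage

def detect_targets(frames, alpha=0.05, k=3.0):
    # Running-average background, threshold, morphological opening, labeling.
    background = frames[0].astype(float)
    detections = []
    for frame in frames[1:]:
        frame = frame.astype(float)
        diff = np.abs(frame - background)
        # Threshold the difference image at k standard deviations above its mean.
        mask = diff > (diff.mean() + k * diff.std())
        # Morphological opening suppresses isolated hot pixels (noise).
        mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
        # Connected components become candidate bird/bat detections.
        labels, n = ndimage.label(mask)
        centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1)) if n else []
        detections.append(centroids)
        # Slowly adapt the background toward the current frame.
        background = (1 - alpha) * background + alpha * frame
    return detections

# Hypothetical stack of 8-bit IR frames (n_frames, height, width)
frames = np.random.randint(0, 30, (5, 64, 64), dtype=np.uint8)
frames[2, 30:33, 40:43] = 255                 # a bright synthetic target in frame 2
print(detect_targets(frames)[1])              # centroid(s) found in that frame
```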

The radar data were processed initially with the radR software to detect blips, and the results were then fed to the tracking algorithm, which is based on state estimation using the developed particle filter and data association by the nearest neighbor rule. The advantage of the particle filter is that it is effective in nonlinear and non-Gaussian systems, in contrast to the existing tracking algorithm in radR.
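A compact illustration of the tracking idea is shown below: a bootstrap particle filter with nearest-neighbor association for a single target's range. The noise parameters and scans are hypothetical, and the actual implementation operates on radR blip coordinates rather than raw ranges.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_track(measurement_sets, n_particles=500, q=5.0, r=20.0, dt=1.0):
    # State: [range, range-rate]; initialize particles around the first measurement.
    z0 = measurement_sets[0][0]
    particles = np.column_stack([z0 + rng.normal(0, r, n_particles),
                                 rng.normal(0, 5, n_particles)])
    track = [z0]
    for z_candidates in measurement_sets[1:]:
        # Predict: constant-velocity motion plus process noise.
        particles[:, 0] += particles[:, 1] * dt + rng.normal(0, q, n_particles)
        particles[:, 1] += rng.normal(0, q, n_particles)
        # Nearest-neighbor association to the predicted range.
        predicted = particles[:, 0].mean()
        z = min(z_candidates, key=lambda c: abs(c - predicted))
        # Update: weight particles by the Gaussian measurement likelihood.
        w = np.exp(-0.5 * ((z - particles[:, 0]) / r) ** 2)
        w /= w.sum()
        # Resample (multinomial resampling for simplicity).
        particles = particles[rng.choice(n_particles, n_particles, p=w)]
        track.append(particles[:, 0].mean())
    return track

# Hypothetical radar scans: each scan holds candidate blip ranges in meters
scans = [[500], [510, 900], [522, 880], [535], [548, 300]]
print([round(r) for r in particle_filter_track(scans)])
```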

Data fusion of the multi-sensory system was performed based on a distributed system of autonomous modules. The fusion is heterogeneous, combining dissimilar sensors of various technology types, and it is performed at two levels: the feature level and the decision level. The first level involves the integration of the IR and radar data, and a feature vector is generated based on the IR and radar features. Vertical radar data were used for fusion since they share a common coverage area with the IR camera. Spatial and temporal alignments were performed to provide a common time and spatial coordinate system. Association matrices were created containing the putative pairs of targets, and assignment was made based on the minimum error.
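The association and assignment step can be illustrated with the following sketch. The cost weights and track tuples are hypothetical, and the Hungarian algorithm is used here only as one way to obtain a minimum-total-error assignment after alignment.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(ir_tracks, radar_tracks, w_time=1.0, w_space=0.05):
    # Build an association (cost) matrix between IR and radar tracks from their
    # time and coordinate differences; each track is (time_s, x_m, y_m).
    cost = np.zeros((len(ir_tracks), len(radar_tracks)))
    for i, (ti, xi, yi) in enumerate(ir_tracks):
        for j, (tj, xj, yj) in enumerate(radar_tracks):
            cost[i, j] = w_time * abs(ti - tj) + w_space * np.hypot(xi - xj, yi - yj)
    # Minimum-total-error pairing of IR tracks to radar tracks.
    rows, cols = linear_sum_assignment(cost)
    return [(int(i), int(j), cost[i, j]) for i, j in zip(rows, cols)]

# Hypothetical aligned tracks: (seconds since midnight, x, y)
ir = [(100.0, 10.0, 5.0), (250.0, -40.0, 60.0)]
radar = [(101.5, 12.0, 4.0), (400.0, 200.0, -80.0), (249.0, -35.0, 58.0)]
print(associate(ir, radar))   # pairs IR 0 with radar 0 and IR 1 with radar 2
```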
The acoustic data were then combined with these results in the level 2 fusion.

Similarly, association and assignment were performed to select the best putative targets. The acoustic data contain the identity of the target, while the IR/radar vector contains the range, direction, intensity value, and some other features. A fuzzy Bayesian technique is developed in the level 2 fusion to provide the probability of the identity of the targets given the level 1 features. The fuzzy Bayesian fusion is divided into two steps: a fuzzy inference system that provides the prior probability of the avian category, and Bayesian inference that provides the posterior probability of the identity. Fuzzy rule-based systems can approximate prior and likelihood probabilities in Bayesian inference and thereby approximate posterior probabilities. A probability higher than 80% is considered the desired value. The combination of all three sensors provides multimodal information about the targets which can be used to make inferences about their activity and behavior in the region. This approach leads to a reduction of uncertainty and provides a desired level of confidence and detailed information about the observed patterns. This work is a unique, multi-fidelity, and multi-disciplinary approach.
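The two-step fuzzy Bayesian fusion can be illustrated numerically as follows (the membership functions, feature, and likelihood values are purely illustrative, not the dissertation's rule base): fuzzy memberships over a level 1 feature supply the prior over avian categories, and Bayes' rule combines this prior with the acoustic likelihood to yield the posterior identity, which is accepted when it exceeds the 80% threshold.

```python
def tri(x, a, b, c):
    # Triangular fuzzy membership function with support [a, c] and peak at b.
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

def fuzzy_prior(heat_intensity):
    # Fuzzy inference (illustrative): map a level 1 feature to a prior
    # over the avian categories {bird, bat}.
    m = {"bird": tri(heat_intensity, 0.3, 0.7, 1.0),
         "bat": tri(heat_intensity, 0.0, 0.3, 0.7)}
    total = sum(m.values()) or 1.0
    return {k: v / total for k, v in m.items()}

def posterior(prior, likelihood):
    # Bayesian update: P(class | evidence) is proportional to likelihood * prior.
    unnorm = {k: prior[k] * likelihood[k] for k in prior}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

# Hypothetical target: normalized IR heat intensity of 0.55 and acoustic
# classifier match scores used as class-conditional likelihoods.
prior = fuzzy_prior(0.55)
like = {"bird": 0.8, "bat": 0.1}
post = posterior(prior, like)
print(post)   # the identity is accepted if the winning posterior exceeds 0.8
```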

The developed system was used to process the spring and fall 2011 data collected in the Ottawa National Wildlife Refuge. The processing was performed in both single-sensor and fusion modes. The result of fusion provides complementary information about the migrant birds and bats during the migration periods. Our approach provides a detailed panorama of local avian species diversity and their nocturnal flight behavior while migrating. The research and system are potentially of enormous value to biologists and conservation decision makers for rapidly but effectively assessing bird and bat density, diversity, and, most importantly, behavior within natural areas or proposed wind development sites. The developed system could be deployed anywhere in the world.

Although data fusion has been a topic of interest in much research in recent years, there are many aspects of data fusion that still need to be understood or developed. The following are several issues in data fusion which need to be considered in order to improve the inferences drawn from data fusion in different applications.

Data fusion in the avian monitoring application is important for providing inferences regarding the targets' activity and behavior. The data fusion in the developed avian monitoring system is based on hard fusion, in which data are obtained using different hard sensors, including the acoustic detector, radar, and IR camera. The input data come directly from these physical sensors; however, the fusion would be more complete, although more challenging, if part of the input data were added to the system by expert biologists. Human inferences can provide observations, relations between entities, or other information that improves the performance of fusion in terms of accuracy.

The avian monitoring system is developed based on dedicated sensors: the acoustic detector is used to record the songs and calls of birds and bats, the IR camera is used to record videos, and the marine radar is deployed to collect data in the form of radar scans. Data are collected and stored on separate hard drives, which are then used by the sensor preprocessing techniques to produce the meta-data. There are many challenges in handling the capacity issues, and problems may occur during data transfer from sensors to drives and from drives to PCs or laptops. As the volume of data for a migration season is high and requires large storage, it would be more efficient to deploy a middleware system with a shared memory to link the sensors and efficiently store all the data at a desired site. However, such a model needs to support scalability, stability, and reliability. Also, a communication infrastructure such as the internet protocol can efficiently improve data transfer, allowing the end user to access the data resource or shared memory.

In contrast to homogeneous fusion systems, which rely on the integration of multiple sensors of the same type, a heterogeneous fusion system combines data from various types of sensors. The variety of sensors depends on the application and the desired accuracy. The data fusion of acoustics, IR, and radar is a heterogeneous fusion with sensors of different types, and fusing the features from these heterogeneous sensors required effort to associate the targets. The data alignment consists of temporal and spatial alignment, and it is performed to provide a common reference in the time and space dimensions. Although the architecture of the data fusion of acoustics, IR, and radar is well designed and implemented, more consideration is still needed for type-variant data fusion, especially if the number of sensors of each type increases. In these situations, it is more efficient to develop a multi-mode data manager that manages the data at two levels: 1) a homogeneous level, which handles data of the same nature coming from the same type of sensor, and 2) a heterogeneous level, which handles data of different natures coming from different sensors.
