
OpenATS COMPASS

User Manual
by Helmut Puhr
Version 0.7.0 Valiant Viscacha
Alpha Release
Short contents

Short contents · ii

Contents · iii

List of Figures · x

List of Tables · xv

Introduction · 1

Installation · 11

UI Overview · 16

Filters · 62

Evaluation · 70

View Points · 141

ListBox View · 157

Histogram View · 164

OSG View · 172

ScatterPlot View · 258

Live Mode · 264

Command Line Options · 268

Troubleshooting · 275

Appendix · 280

Contents

Short contents ii

Contents iii

List of Figures x

List of Tables xv

Introduction 1
User Manual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Feature Highlights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Application modes 3 , Import 3 , Analysis 3 , Visualization 3 , Evaluation 4 ,
Semi-automated Processing 4
General Aspects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Contributors 6 , Test Data 6
Key Concepts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

Installation 11
Prerequisites . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Operating System 11 , Recommended Hardware 11 , Graphics Cards &
Drivers 12
Using the AppImage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Building from Source Code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Running the Application . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Configuration Upgrade . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

UI Overview 16
Main Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Main Menubar 18 , Main Tab Area 18 , Main Statusbar 19
Main Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
File Menu 19 , Import Menu 21 , Configuration Menu 22 , Process Menu 23
Import Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Import ASTERIX Recording 23
Main Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Override Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27


Mappings Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Running . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
Import ASTERIX from Network 30 , Import GPS Trails 31
Main Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Config Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Running . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Import View Points 34
Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Configure Data Sources 35
Import/Export of Configuration Data Sources . . . . . . . . . . 39
Show Meta Variables 39 , Configure Sectors 40
Import Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Postprocess . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Calculate Radar Plot Positions 47 , Calculate Associations 49
Reference/Tracker UTN Creation . . . . . . . . . . . . . . . . . . 51
Sensor UTN Creation . . . . . . . . . . . . . . . . . . . . . . . . . 52
Dubious Targets . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Running . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Calculate Associations for ARTAS 54
Data Sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Views 61

Filters 62
Default Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Aircraft Address Filter 63 , Aircraft Identification Filter 63 , ADSB Quality Filter 64 , ADSB MOPS Filter 64 , ARTAS Hash Code Filter 64 , Detection Type Filter 65 , Position Filter 65 , Time of Day Filter 66 , Track Number Filter 66 , Mode 3/A Codes Filter 66 , Mode C Codes Filter 67 , Primary Only 67 , UTN Filter 67
Adding a New Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67

Evaluation 70
Pre-Requisites & Limitations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Target Report Associations 70 , Sector Altitude Filtering 70 , Reference Data 71
Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Main Tab 74
Data Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Standard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Sector Layer/Requirement Mapping . . . . . . . . . . . . . . . . 75
Targets Tab 76 , Filter Tab 79 , Standard Tab 80
Current Standard . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Results Tab 84
Running . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85

Load Data 85 , Filtering Targets 86 , Evaluation 90


Results Inspection & Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Overview 92 , Sector Details 94 , Per-Target Details 96
Generate Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
Detection 100
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
Dubious Targets 103
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Dubious Tracks 108
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
Extra Data 112
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
Extra Track 114
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
Identification Correct 116
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
Identification False 118
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
Mode 3/A False 120
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Mode 3/A Present 122
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Mode C False 124
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
Mode C Present 126
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Position Across 128
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
Position Along 130
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131

Position Distance 132


Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
Position Latency 135
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
Speed 138
Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
Result Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139

View Points 141


View Point . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
Toolbar . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144
Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
Re-Sorting Using Columns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
Showing/Hiding Columns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Filtering Based on Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Filtering Based on Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
Function Buttons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Stepping View Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Management Tab 149 , Data Selection 149 , ListBox View 150 , OSGView 151
, After Selection 151 , Assessment 152
Exporting View Points to PDF . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152

ListBox View 157


Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
Data Loading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
Usage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
Selection 160 , Variable Lists 160 , Variables 160 , Show Only Selected 161 ,
Use Presentation 161 , Exporting 161 , Reload 163

Histogram View 164


Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
Data Loading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
Usage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
Toolbar 167 , Config Tab 167 , Histogram 168
Zoom . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168
Selection Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168
Evaluation Result 170

OSG View 172


Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
Toolbar . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
Data Widget . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
Mouse/Keyboard Operations 177 , Status Information 179 , Data Operations 179
Data Labeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 179

Geometry Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180


Label Multiple . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
Distance Measurement . . . . . . . . . . . . . . . . . . . . . . . . 182
Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186
Time Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
Depth Check . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
Save View Point . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
Data Label Deletion . . . . . . . . . . . . . . . . . . . . . . . . . . 195
Measurement Deletion . . . . . . . . . . . . . . . . . . . . . . . . 196
Selection Color . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
Selection Invert . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
Selection Deletion . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
Overlay Text Color Invert . . . . . . . . . . . . . . . . . . . . . . 196
Switch Map Dimensions . . . . . . . . . . . . . . . . . . . . . . . 196
Zoom to Home . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
Zoom to Loaded Data . . . . . . . . . . . . . . . . . . . . . . . . 196
Configuration Panel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197
Layers Tab 198
Geometry Operations . . . . . . . . . . . . . . . . . . . . . . . . 200
Measurement Operations . . . . . . . . . . . . . . . . . . . . . . 202
Radars Operations . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Sectors Operations . . . . . . . . . . . . . . . . . . . . . . . . . . 204
Map Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206
Style Tab 219
Layer Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 220
Connect Last Layer & Connect None Height . . . . . . . . . . . 223
Style . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
Customized Styling . . . . . . . . . . . . . . . . . . . . . . . . . . 227
Style Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
Render Order . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
Labels Tab 242
Label Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . 243
Automatic Labeling . . . . . . . . . . . . . . . . . . . . . . . . . . 244
Label Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 247
Evaluation Tab 249 , Others Tab 251
Height . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
Ground Speed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Position Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
Radar Default Accuracies . . . . . . . . . . . . . . . . . . . . . . 256

ScatterPlot View 258


Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 259
Data Loading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260
Usage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
Toolbar 261 , Config Tab 261 , Scatterplot 261
General Zoom . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
Navigation Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . 261

Zoom to Rectangle Mode . . . . . . . . . . . . . . . . . . . . . . 262


Selection Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262

Live Mode 264


Data Sources & Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 266
OSG View . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 266

Command Line Options 268


Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 270
--create_db filename 270 , --open_db filename 270 , --import_data_sources_file filename 270 , --import_view_points filename 270 , --import_asterix_file filename 270 , --import_asterix_file_line arg 270 , --import_asterix_network 270 , --import_asterix_network_time_offset arg 271 , --import_asterix_network_max_lines arg 271 , --asterix_framing framing 271 , --asterix_decoder_cfg ’str’ 271 , --import_gps_trail filename 272 , --import_gps_parameters ’str’ 272 , --import_sectors_json filename 272 , --associate_data 272 , --load_data 272 , --export_view_points_report 273 , --evaluate 273 , --import_gps_parameters ’str’ 273 , --evaluate_run_filter 274 , --export_eval_report 274 , --no_cfg_save 274 , --quit 274

Troubleshooting 275
Known Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
CentOS Fuse Usermount Permissions 275 , Missing glibc Library Versions 275 , White OSGView & Shader Errors in Console Log 276 , Graphical Issues 277
Reporting Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
Already Reported Issues 278 , Collect Information 278
For Application Crashes . . . . . . . . . . . . . . . . . . . . . . . 278
Issue Reporting 279

Appendix 280
Appendix: Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 280
v0.6.0 and later 280 , Pre-v0.6.0 and older 280 , Configuration Folder 280 ,
Data Folder 281
Appendix: Data Sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
Current Format 281 , Deprecated Format 282
Appendix: View Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
File Content 283 , Custom Attributes 284 , Version 0.2 284
View Point Context . . . . . . . . . . . . . . . . . . . . . . . . . . 285
View Point . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 287
View Point Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . 289
Appendix: Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
Positions Accuracy Ellipses 291 , ADS-B 291
V0 Transponders . . . . . . . . . . . . . . . . . . . . . . . . . . . 292
V1 & V2 Transponders . . . . . . . . . . . . . . . . . . . . . . . . 292
MLAT 293 , Radar 294 , RefTraj 294 , Tracker 294
Appendix: Latex Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295

Ubuntu & Debian Variants 295 , CentOS & Fedora Variants 295 , Testing 295
Appendix: Utilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 296
jASTERIX 296 , SDDL 296 , ADS-B exchange 301
Appendix: Licensing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 301
Appendix: Disclaimer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302
Appendix: Used Libraries . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302
Appendix: ADS-B exchange . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 303
List of Figures

1 Main Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2 File Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3 Main Window After Opening a Database . . . . . . . . . . . . . . . . . . . . 21
4 File Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
5 Configuration Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
6 Process Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
7 Import ASTERIX data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
8 Task: Import ASTERIX data override . . . . . . . . . . . . . . . . . . . . . . . 27
9 Task: Import ASTERIX data mappings . . . . . . . . . . . . . . . . . . . . . . 28
10 Task: Import ASTERIX Data DBContent Variable Details . . . . . . . . . . . 29
11 Import ASTERIX data status . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
12 Import ASTERIX Data from network . . . . . . . . . . . . . . . . . . . . . . . 31
13 Import GPS Trail . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
14 Import GPS Trail Config . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
15 Import View Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
16 Configure Data Sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
17 Configure Data Sources: Radar details . . . . . . . . . . . . . . . . . . . . . . 38
18 Show Meta Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
19 Configure Sectors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
20 Configure Sectors with example file . . . . . . . . . . . . . . . . . . . . . . . 42
21 Configure Sectors import dialog . . . . . . . . . . . . . . . . . . . . . . . . . . 43
22 Configure Sectors Manage tab . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
23 Configure Sectors with example sector . . . . . . . . . . . . . . . . . . . . . . 46
24 Configure Sectors editing result . . . . . . . . . . . . . . . . . . . . . . . . . . 47
25 Calculate Radar Plot positions . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
26 Calculate Radar Plot positions done . . . . . . . . . . . . . . . . . . . . . . . 49
27 Associate Target Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
28 Calculate Associations from ARTAS . . . . . . . . . . . . . . . . . . . . . . . 55
29 Data Sources Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
30 Filters Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

31 Filters Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
32 Target Address filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
33 Callsign filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
34 ADSB quality filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
35 ADSB quality filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64


36 ARTAS Hash Code filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64


37 Detection Type filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
38 Position filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
39 Time of Day filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
40 Track Number filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
41 Mode 3/A Codes filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
42 Mode C Codes filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
43 Primary Only filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
44 UTN filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
45 Adding a filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

46 Evaluation tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
47 Evaluation Main tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
48 Evaluation Targets tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
49 Evaluation Targets tab with loaded data . . . . . . . . . . . . . . . . . . . . . 78
50 Evaluation Filter tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
51 Evaluation Standard tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
52 Evaluation Standard tab: Add requirement . . . . . . . . . . . . . . . . . . . 82
53 Evaluation Results tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
54 Evaluation: Post-processing after loading . . . . . . . . . . . . . . . . . . . . 85
55 Evaluation Targets tab after loading . . . . . . . . . . . . . . . . . . . . . . . 86
56 Evaluation Filter UTNs dialog . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
57 Evaluation Targets tab after filtering . . . . . . . . . . . . . . . . . . . . . . . 90
58 Evaluation: Running evaluation status . . . . . . . . . . . . . . . . . . . . . . 91
59 Evaluation results: Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
60 Evaluation results: Sector PD errors in OSGView . . . . . . . . . . . . . . . . 93
61 Evaluation results: Sector Detail Example . . . . . . . . . . . . . . . . . . . . 94
62 Evaluation results: Target PD Errors in OSGView . . . . . . . . . . . . . . . . 95
63 Evaluation results: Per-Target PD Detail Example . . . . . . . . . . . . . . . 96
64 Evaluation results: Target Single PD Error in OSGView . . . . . . . . . . . . 97
65 Evaluation results: Generate report dialog . . . . . . . . . . . . . . . . . . . . 98
66 Evaluation results: Generate report in progress . . . . . . . . . . . . . . . . . 99
67 Evaluation Detection requirement . . . . . . . . . . . . . . . . . . . . . . . . . 100
68 Evaluation Dubious Targets . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
69 Evaluation Dubious Tracks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
70 Evaluation Extra Data requirement . . . . . . . . . . . . . . . . . . . . . . . . 112
71 Evaluation Extra Track requirement . . . . . . . . . . . . . . . . . . . . . . . 114
72 Evaluation Identification Correct requirement . . . . . . . . . . . . . . . . . . 116
73 Evaluation Identification False requirement . . . . . . . . . . . . . . . . . . . 118
74 Evaluation Mode 3/A False requirement . . . . . . . . . . . . . . . . . . . . . 120
75 Evaluation Mode 3/A Present requirement . . . . . . . . . . . . . . . . . . . 122
76 Evaluation Mode C False requirement . . . . . . . . . . . . . . . . . . . . . . 124
77 Evaluation Mode C Present requirement . . . . . . . . . . . . . . . . . . . . . 126
78 Evaluation Position Across requirement . . . . . . . . . . . . . . . . . . . . . 128
79 Evaluation Position Along requirement . . . . . . . . . . . . . . . . . . . . . 130
80 Evaluation Position Distance requirement for correct positions . . . . . . . . 132
81 Evaluation Position Distance requirement for false positions . . . . . . . . . 133

82 Evaluation Position Latency requirement . . . . . . . . . . . . . . . . . . . . 135


83 Evaluation Speed requirement . . . . . . . . . . . . . . . . . . . . . . . . . . . 138

84 View Points Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142


85 View Points Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
86 View Points Table: Selected View Point . . . . . . . . . . . . . . . . . . . . . . 146
87 View Points: Edit Columns Menu . . . . . . . . . . . . . . . . . . . . . . . . . 147
88 View Points: Filter By Type Menu . . . . . . . . . . . . . . . . . . . . . . . . . 148
89 View Points: Filter By Status Menu . . . . . . . . . . . . . . . . . . . . . . . . 148
90 View Points ListBox View: Selected View Point . . . . . . . . . . . . . . . . . 150
91 View Points OSGView: Selected View Point . . . . . . . . . . . . . . . . . . . 151
92 View Points: Export PDF Dialog . . . . . . . . . . . . . . . . . . . . . . . . . . 153
93 View Points: Export PDF Dialog Status . . . . . . . . . . . . . . . . . . . . . . 154
94 View Points: Export PDF Dialog Status: pdflatex . . . . . . . . . . . . . . . . 155

95 Listbox View startup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158


96 Listbox View after loading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
97 Listbox View adding of variables . . . . . . . . . . . . . . . . . . . . . . . . . 161
98 Listbox View export . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
99 Listbox View export in LibreOffice Calc . . . . . . . . . . . . . . . . . . . . . 163

100 Histogram View startup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165


101 Histogram View after loading . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
102 Histogram View data selection . . . . . . . . . . . . . . . . . . . . . . . . . . 168
103 Histogram View data selected . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
104 Histogram View evaluation position correction result . . . . . . . . . . . . . 170
105 Histogram View Mode C present result . . . . . . . . . . . . . . . . . . . . . 171

106 OSG View overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173


107 OSG View overview after loading . . . . . . . . . . . . . . . . . . . . . . . . . 175
108 OSG View Labels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
109 OSG View geometry operations . . . . . . . . . . . . . . . . . . . . . . . . . . 181
110 OSG View Label Multiple error message . . . . . . . . . . . . . . . . . . . . . 182
111 OSG View measurement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
112 OSG View measurement done . . . . . . . . . . . . . . . . . . . . . . . . . . . 184
113 OSG View measurement with height information . . . . . . . . . . . . . . . . 185
114 OSG View selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186
115 OSG View selection done . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
116 OSG View selection with adding . . . . . . . . . . . . . . . . . . . . . . . . . 188
117 OSG View selection with adding done . . . . . . . . . . . . . . . . . . . . . . 189
118 OSG View selection with height . . . . . . . . . . . . . . . . . . . . . . . . . . 190
119 OSG View selection with height done . . . . . . . . . . . . . . . . . . . . . . 191
120 OSG View time filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
121 OSG View depth check . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
122 OSG View depth check bad rendering . . . . . . . . . . . . . . . . . . . . . . 195
123 OSG View configuration panel . . . . . . . . . . . . . . . . . . . . . . . . . . 197
124 OSG View layers tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199

125 OSG View layer context menu . . . . . . . . . . . . . . . . . . . . . . . . . . . 201


126 OSG View measurement layer operations . . . . . . . . . . . . . . . . . . . . 202
127 OSG View radars layer operations . . . . . . . . . . . . . . . . . . . . . . . . 204
128 OSG View with 2D Sector Examples . . . . . . . . . . . . . . . . . . . . . . . 205
129 OSG View with 3D Sector Examples . . . . . . . . . . . . . . . . . . . . . . . 206
130 OSG View Arcgis map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
131 OSG View minimal map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
132 OSG View OpenStreetMap . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
133 OSG View OpenStreetMap Vienna Airport . . . . . . . . . . . . . . . . . . . 211
134 OSG View OpenStreetMap German . . . . . . . . . . . . . . . . . . . . . . . . 212
135 OSG View ReadyMap . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
136 OSG View ReadyMap detailed elevation . . . . . . . . . . . . . . . . . . . . . 214
137 Maps Folder . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
138 OSG View Style tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
139 OSG View layer mode DBContent:DS ID . . . . . . . . . . . . . . . . . . . . . 222
140 OSG View layer mode Mode 3/A Code:DBContent:DS ID . . . . . . . . . . . 223
141 OSG View Data with lines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
142 OSG View Layer color per DS ID . . . . . . . . . . . . . . . . . . . . . . . . . 228
143 OSG View Layer color per Mode 3/A Code . . . . . . . . . . . . . . . . . . . 229
144 OSG View Layer color per Track Number . . . . . . . . . . . . . . . . . . . . 230
145 OSG View Layer color per UTN . . . . . . . . . . . . . . . . . . . . . . . . . . 231
146 OSG View Color by ADS-B MOPS version . . . . . . . . . . . . . . . . . . . . 232
147 OSG View Color by ADS-B position quality . . . . . . . . . . . . . . . . . . . 233
148 OSG View Color by Flight Level . . . . . . . . . . . . . . . . . . . . . . . . . . 234
149 OSG View Color by Detection type . . . . . . . . . . . . . . . . . . . . . . . . 235
150 OSG View Color by Speed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
151 OSG View Color by track angle . . . . . . . . . . . . . . . . . . . . . . . . . . 237
152 OSG View Color by track Mode 3/A age . . . . . . . . . . . . . . . . . . . . . 238
153 OSG View Color by track MLAT age . . . . . . . . . . . . . . . . . . . . . . . 239
154 OSG View Color by track ages . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
155 OSG View render order . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
156 OSG View Labels tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 242
157 OSG View Edit Label Contents . . . . . . . . . . . . . . . . . . . . . . . . . . 243
158 OSG View Auto-Labels LoD 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
159 OSG View Auto-Labels LoD 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . 246
160 OSG View Auto-Labels LoD 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . 247
161 OSG View Evaluation tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249
162 OSG View Others tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
163 OSG View use height . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
164 OSG View groundspeed vectors . . . . . . . . . . . . . . . . . . . . . . . . . . 254
165 OSG View position accuracy ellipses . . . . . . . . . . . . . . . . . . . . . . . 256

166 Scatterplot View startup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 259


167 Scatterplot View after loading data . . . . . . . . . . . . . . . . . . . . . . . . 260
168 Scatterplot View data selection . . . . . . . . . . . . . . . . . . . . . . . . . . 262
169 Scatterplot View data selected . . . . . . . . . . . . . . . . . . . . . . . . . . . 263

170 Main Window in Live Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265


171 OSG View in Live Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
List of Tables

1 SQL operators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69

2 Toolbar Actions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144

3 Toolbar mouse interaction modes . . . . . . . . . . . . . . . . . . . . . . . . . 167


4 Toolbar operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167

5 Toolbar mouse action modes . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176


6 Toolbar operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
7 Map widget view operations in Navigate mode . . . . . . . . . . . . . . . . . 178
8 Map widget view operations in Label mode . . . . . . . . . . . . . . . . . . . 178
9 Map widget view operations in Label Multiple mode . . . . . . . . . . . . . 178
10 Map widget view operations in Measure mode . . . . . . . . . . . . . . . . . 178
11 Map widget view operations in Select mode . . . . . . . . . . . . . . . . . . . 178
12 Layer operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200

13 Toolbar mouse interaction modes . . . . . . . . . . . . . . . . . . . . . . . . . 261


14 Toolbar actions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261

15 ADS-B Position Accuracy Variables . . . . . . . . . . . . . . . . . . . . . . . . 291


16 NUCp Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 292
17 NACp Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 292
18 MLAT Position Accuracy Variables . . . . . . . . . . . . . . . . . . . . . . . . 293
19 Tracker Position Accuracy Variables . . . . . . . . . . . . . . . . . . . . . . . 294
20 Library licenses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302

Introduction

The OpenATS COMPASS tool aims at providing a generalized framework for ATC surveillance data inspection and analysis.

Its name being an abbreviation of ’compliance assessment’, the OpenATS COMPASS tool allows air traffic surveillance recordings to be imported into a database for analysis, visualization and evaluation.

Many use-cases are supported, e.g. importing EUROCONTROL ASTERIX recordings into a database, textual & visual analysis, evaluation (calculation of performance indicators) and evaluation report document generation.

The application is highly configurable and quite complex, and is developed for usage
by air traffic surveillance professionals. To support new users, a user manual as well as
YouTube videos are supplied.

The C++ code is released under the GPL-3.0, while the Linux AppImage and the user
manual are released under CC BY 4.0.

OpenATS COMPASS is publicly available and free for anyone to use (including commercial usage), under the previously stated licenses.

User Manual
This document focuses on the interaction and working procedures required to make use of the existing functionality. In this introduction, feature highlights and acknowledgments are listed, followed by a brief summary of important aspects of COMPASS in Key Concepts.

In the section Installation, prerequisites are listed and installation instructions are
given.

The larger section UI Overview describes the steps required to run the application, create a database or access an existing one, import and process data, and load data into Views.

The filtering mechanism is described in detail in Filters.


The evaluation feature is presented in the section Evaluation, describing how requirement-based standards can be adapted/defined and how compliance with said standards can be assessed.

Re-visiting saved points of interest is described in section View Points. Inspection of loaded data using the existing Views is described in the sections ListBox View, Histogram View, OSG View and ScatterPlot View.

The application commonly uses the ’Offline’ mode. When ASTERIX data is imported
from the network, the application switches into ’Live’ mode, which is described in Live
Mode.

The automated execution of configured tasks is described in Command Line Options.

In the section Troubleshooting, details about reported issues are collected; it also contains instructions on how to report new issues.

In the last section Appendix, additional details are listed. The section Appendix: Utilities explains how data can be manually imported into COMPASS. In Appendix: Licensing, information is given about the conditions under which COMPASS can be used, and which libraries with which licenses are used in the background.

Feature Highlights
Application modes
• Offline: Recordings can be imported and larger amounts of data can be inspected
and evaluated
• Live: Data can be read directly from network interfaces and immediately displayed
to show the current airspace situation

Import
• Support of SQLite3 database system
• Dynamic ASTERIX import using jASTERIX
• Import of (D)GPS trails from NMEA files
• Import of polygons from GML, KML, ESRI Shapefiles
• Supported data source types
– Radar
– MLAT & WAM
– ADS-B
– System Tracker
– Reference Trajectory

Analysis
• High performance processing, low memory footprint
• Filtering for detailed analysis
• Simple custom filter generation
• General target report association (find unique targets)
• ARTAS track association (TRI) analysis

Visualization
• Textual data inspection using ListBox View
– Display of data as text tables
– Configurable loading of data of interest
– Export of data as CSV
• Data distribution inspection using Histogram View
– Display of any numeric variable or evaluation result as histogram
– Linear or logarithmic axis
• Graphical data inspection using OSG View
– Automatic labeling of data
– Customizable map/terrain display based on osgEarth


– Customizable display of ATC surveillance data using OpenSceneGraph
– High-speed time-filtered display
– Numerous features for analysis, e.g. data selection, labeling, distance measurements
– Configurable data layering and styling for detailed analysis
– Relatively low memory footprint (e.g. 16 million target reports in 8 GB RAM)
• 2D data distribution inspection using ScatterPlot View
– Display of any two numeric variables as X/Y plot
• Cross-view data selection and inspection
• Storable view points for efficient inspection

Evaluation
• Standard compliance evaluation
– Definition of standards based on configurable requirements
– Generalized comparison of test data vs. reference data
– Calculation of requirements/performance indicators
– Investigation/display of results at several levels of detail
– Automatic removal of targets (short tracks, VFR, . . . ) possible
– Manual removal of specific targets possible
– Export of results as report PDF

Semi-automated Processing
• Command line options for common processing features
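A run of this kind might look like the following minimal sketch (the file names are purely illustrative and the usual double-dash option prefix is assumed; the available options and their arguments are listed in the section Command Line Options):

./COMPASS-x86_64_release.AppImage --create_db example.db \
    --import_asterix_file recording.asterix --associate_data --quit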

General Aspects
COMPASS is a highly specialized surveillance data processing framework with a strong focus on high performance and a low memory footprint to enable the processing of large
quantities of data. Surveillance data is fetched from a database (limited by a filter system),
then processed and displayed using so-called Views (specific visualizations of the result
set).

As storage medium a database is used, in particular the SQLite3 system, which is a lightweight and efficient database and allows copying/sharing databases as single-file containers.

There are two application modes: Offline and Live. In Offline mode, an offline recording can be imported and larger amounts of data can be inspected and evaluated. In Live mode, data can be read directly from network interfaces, inserted into the database, and immediately displayed to show the current airspace situation.

After a previously generated database is opened, data can be loaded using a database
query. A filter configuration may restrict the loaded data to a result set. Such a result set is
displayed for analysis using the various existing Views.

Each View defines which contents of the database are required to fulfill its purpose,
and only such parts are loaded. During a loading process from the database, subsets of the
query result are immediately added to the current result set and all views are updated.

Acknowledgments
The following libraries are used in the project (list not exhaustive):
• Qt5
• Boost
• SQLite3
• GDAL
• Log4Cpp
• LibArchive
• Eigen3
• OpenSceneGraph
• OSGEarth
• nlohmann::json
• Intel Threading Building Blocks
• Catch2
• OpenSSL
• NemaTode

Additionally, map data from the following sources was used:


• Minimal map country borders thematicmapping.org
• Minimal detailed map
– Country borders https://ec.europa.eu/eurostat/web/gisco/geodata/reference-data/administrative-units-statistical-units/countries
– Rivers & lakes https://www.eea.europa.eu/data-and-maps/data/wise-large-rivers-and-large-lakes
• Airports https://ec.europa.eu/eurostat/web/gisco/geodata/reference-data/transport-networks

Many thanks to the developers of those. Your work is awesome.



Contributors
Also, several persons are contributing to the development and testing of COMPASS. Many thanks to them as well; if you would like to be named here, please contact the author.

Test Data
Special thanks are in order for Malta Air Traffic Services (https://www.maltats.com/)
for providing the project with a test dataset with which screenshots were made.

Special thanks are also in order for Austro Control (https://www.austrocontrol.at/) for providing the project with a test dataset with which screenshots were made.

Key Concepts
In this section, a few key concepts are introduced to convey a somewhat deeper understanding of COMPASS, and to allow the reader to understand some main design choices made by the author. This should also give indications about the strengths and drawbacks of the chosen approach.

Database Systems
A database allows for storage, retrieval and filtering of the data of interest. While SQLite3, being a stand-alone database, also has some drawbacks, it was chosen for its performance and ease of use (compared to e.g. NoSQL databases or MySQL variants).

SQLite3 encapsulates a database in a single file container, which is read from a storage medium (e.g. hard drive). Therefore, it does not require the installation of a database service, which is one of the advantages of the current solution.

• Single file container
– Can easily be copied, shared, archived, ...
• SQLite3 database
– Can be read/edited with other tools
* e.g. sqlitebrowser, from Python scripts, ... (see the example below)
– Nothing has to be additionally installed
• High-performance, light-weight
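Since the database is a plain SQLite3 file, it can also be inspected outside of COMPASS, e.g. using the sqlite3 command line shell (the file name is purely illustrative; the available tables depend on the imported data):

sqlite3 mydatabase.db ".tables"
sqlite3 mydatabase.db ".schema"

The first command lists the tables contained in the file, the second prints their column definitions.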

Configuration
At startup, numerous configuration files are loaded, and at shutdown the current configuration state of COMPASS is saved.

The configuration is not just a matter of storing simple parameters of components, but
also what components exist. To give an example: Each existing View is saved, and when
the program is started again, the previously active Views are restored. The same is true
for almost all components of COMPASS.

Using this configuration, a user can have a specific program configuration for a specific
usage situation, which can be instantly reused for a different dataset, using a specific View
or filter configuration, allowing for a high degree of flexibility and supporting numerous
use-cases.

• Stored in a home folder for current AppImage version
– e.g. ~/.compass/0.7.0 (see the example below)
• Read at application startup
• Written at application shutdown (if wanted)
• Not written at application crash
• If not existing for current version
– Default configuration copied by AppImage
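As an example, the configuration folder of the current version can be backed up, or removed to restore the default configuration at the next application start (assuming version 0.7.0 and the default path mentioned above):

cp -r ~/.compass/0.7.0 ~/.compass/0.7.0.backup
rm -rf ~/.compass/0.7.0

After the removal, the default configuration is copied again by the AppImage at the next start.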

Data Sources & Database Content


Data sources are (at the most basic level) defined by a name and a SAC/SIC, and have a certain data source type (DSType). If e.g. the data source ’Wuerzburg’ of DSType ’Radar’ is defined in the application, the respective ASTERIX CAT048 data is inserted into the database during an ASTERIX import and associated with the data source by SAC/SIC.

For each data source, up to 4 different lines can be used (L1, L2, L3, L4). Such lines can be used either to distinguish data recorded from different network lines or to distinguish different recordings of the same data source. For instance, different tracker runs can be imported and analyzed by importing them into different data source lines.

A certain Database Content (DBContent) is defined by a name and contains a collection of variables. For example, CAT048 and CAT062 are types of DBContent, and each has variables holding time, position, Mode 3/A code, Mode C height, and so on. If such a DBContent is present, target reports can be loaded from a database and displayed.

• Data Source: Single data source, e.g. a certain Radar
– Identified by name (DS ID), SAC/SIC, ...
• Data Source Type (DSType)
– Type of source technology, i.e.
* Radar, MLAT, ADS-B, Tracker, RefTraj, Other
• Line Identifier (Line ID)
– L1, L2, L3, L4
• Database Content (DBContent)
– Type of data, e.g. ASTERIX Formats CAT048, CAT020, CAT021, ...
– During import, DBContent is attributed to
* Data source, Line ID
– Contains DBContent Variables: Time of Day, Latitude, Longitude, ...

ASTERIX Data Import


If surveillance data is given in EUROCONTROL’s ASTERIX format, it can be decoded
using the jASTERIX library. This library allows adding new framings, categories and
editions based on configuration only.

The resulting JSON data is then mapped to DBContent variables stored in the database.
The mapping between JSON keys and DBContent variables is configurable, allowing for a
broad usage spectrum.

Meta Variables
To allow displaying data from different DBContent in the same system, so-called Meta
variables were introduced, which hold variables that are present in some or all DBContent
(with a possibly different name or unit).
For example, there is the meta variable ’Time of Day’, which collects, for each existing DBContent, the respective ’Time of Day’ variable as a sub-variable.

• Each DBContent has DBContent variables


• The same variables exist in multiple DBContents
• Meta variables
– Group DBContent variables of same content
– Allow easier usage
– Can be inspected using the Configuration menu

Unique Target Numbers


A Unique Target Number (UTN) groups together DBContent target reports. This information can be created either by a general association task or based on ARTAS TRI information (one UTN for each ARTAS track, with the associated sensor information).

• Calculated during ’Calculate Associations’ task


• Each unique target is attributed a unique target number (UTN)
• Each target report can have
– No UTN: Not associated
– A single UTN: Associated to 1 target
– Multiple UTNs: Special case e.g. for merged PSR plots
• Targets in the evaluation are identified by unique UTN
• UTN can group together multiple tracks / flight legs
• Allows comparison of data
– Find equivalent reference data for test data

Data Loading
In COMPASS, a unified data loading process was chosen, meaning that only exactly one
common dataset is loaded, which can be inspected using multiple Views.

When started, data is incrementally read from the database, stored in the resulting
dataset, and distributed to the active Views. Each time such a loading process is triggered,
all Views clear their dataset and gradually update.

This makes working with the data somewhat easier to understand, since only one
dataset exists, while on the other hand it does not allow several independent datasets (e.g.
with different filters) to be loaded at the same time.

• If a loading process is started, a dataset will be loaded into memory (RAM)
– DBContent as defined by data sources, filters
– Only contains DBContent variables that are needed at the time
• Only one dataset can exist at a time
• Dataset is distributed and displayed in all views
• If additional DBContent variables are needed
– e.g. by a View
– a reload is required

Live Mode
Commonly, COMPASS is used in the ’Offline’ mode, which allows inspection of larger
amounts of recorded ASTERIX data.

To analyze live data from the network, a ’Live’ mode exists which records data from UDP network streams, decodes the content, imports the data to the database, and immediately displays the data in the OSGView.

The most current data (up to 5 minutes) is stored in main memory (RAM), to inspect
cases of interest in the immediate past.

The ’Live’ mode can be quit, after which the application returns to the ’Offline’ mode,
to allow inspection of the full database as always.
Installation

COMPASS can either be built from source (which does not include the OSGView, but allows development of additional features), or downloaded as an AppImage (binary distributable which includes the OSGView). The recommended option for non-developers is the AppImage.

However, there are a few prerequisites that should be taken into consideration.

Prerequisites
Operating System
The binary is currently only provided for (somewhat current) Linux 64-bit operating systems. For the following distributions, the AppImage was reported to be working correctly:
• CentOS 7.3
• Ubuntu 14.04, 16.04, 17.04, 18.04, 19.04, 20.04
• Linux Mint 18.3

For the following distributions, the AppImage was reported to have issues:
• CentOS 6.*: Unfortunately the operating system’s glibc versions are too old.
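To check which glibc version is installed on a given system, the following command can be used (the first output line states the version):

ldd --version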

If you have tried operating systems other than the ones listed here, it would be appreciated if you could provide feedback about your experience to compass@openats.at.

Other operating systems (e.g. Windows or Mac) are currently not supported.

Recommended Hardware
The application will perform best on a workstation with at least the following minimum
requirements:
• CPU with at least 2 physical cores
• Dedicated NVidia or ATI graphics card

• 8GB of RAM or more

Depending on the loaded data size, more RAM or a better graphics card might be of advantage.

Please note that other graphics cards (Intel, Matrox) will in all probability also work, but are not supported.

An optimal setup could be similar to:


• Intel i5 or better (at least 2 physical cores)
• Decent NVidia or ATI graphics card, e.g.
– Nvidia Quadro, GT 740 or later, GTX 660 or later
– Or equivalent ATI GPU models
– Refer e.g. to http://gpuboss.com/ for benchmarks
• 16GB of RAM or more

Graphics Cards & Drivers


Since there exist hundreds of different graphics card types, and numerous possible drivers
for each of them, support will only be given for ATI or NVidia graphics cards, with their
native drivers.
To find out what graphics cards are available in the system, use lspci (might have to be
installed):
lspci | grep VGA
Output might look like this:
01:00.0 VGA compatible controller: NVIDIA Corporation GP104 [GeForce GTX 1080]
To find out which graphics card and driver are being used, the program glxinfo can be
used (might have to be installed). Please execute the following command:
glxinfo | grep OpenGL
The output might look something like this:
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 1080/PCIe/SSE2
OpenGL core profile version string: 4.5.0 NVIDIA 384.111
OpenGL core profile shading language version string: 4.50 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 384.111
OpenGL shading language version string: 4.50 NVIDIA


OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 384.111
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
OpenGL ES profile extensions:
This reveals several important points:
• OpenGL vendor string: This is the graphics card driver. Only NVidia or ATI ones are supported.
• OpenGL version string: This is the OpenGL version; 3.0 or later is recommended.
• OpenGL shading language version string: This is the OpenGL shader version; 3.0 or later is recommended.

Unsupported Graphics Cards or Drivers If different graphics cards or drivers are in use, the output of glxinfo might be similar to this:
OpenGL vendor string: nouveau
OpenGL renderer string: Gallium 0.4 on NVE6
OpenGL core profile version string: 3.1 (Core Profile) Mesa 9.2.5
OpenGL core profile shading language version string: 1.40
OpenGL core profile context flags: (none)
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 9.2.5
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
Or this:
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Desktop
OpenGL core profile version string: 3.3 (Core Profile) Mesa 11.0.6
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 11.0.6
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 11.0.6
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
OpenGL ES profile extensions:
In such cases, the following issues can be expected and will currently not be addressed:
• Application might not even start (OpenGL version error)


• Slow display performance (OpenGL emulation by CPU-based Mesa driver)
• Graphical display errors (wrong colors, artefacts, etc.)
Please note that such setups might also work, but no guarantees or support can be given at this moment.

If you encounter a white-only OSGView and shader errors are present in the console
log, please refer to White OSGView & Shader Errors in Console Log.

Using the AppImage


To summarize, an AppImage is a form of binary distribution in one complete package,
which does not require installation of any libraries or alteration of the operating system.
Please refer to https://appimage.org/ to get additional information.

To obtain the COMPASS AppImage, download the latest version from https://github.com/hpuhr/COMPASS/releases.

Before starting, it has to be made executable (once after download) using the following
command:
chmod +x COMPASS-x86_64_release.AppImage
This is it. The application can then be run using the following command:
./COMPASS-x86_64_release.AppImage
The following points should be considered:
• The AppImage should run under any Linux distribution of a similar date to Ubuntu 14.04 or later, but no guarantees can be made.
• To the author's knowledge, running in virtualized operating systems is possible in some solutions, but requires additional setup (GPU acceleration).
• OSGView rendering is performed according to the local graphics card and driver,
and might be limited by their capabilities.

Building from Source Code


Building from source is a somewhat difficult undertaking, and special care has to be taken
to install the correct libraries. Therefore it is recommended to use the AppImage whenever
possible.

If building from source is still a requirement, please contact the author for support
using compass@openats.at.

Running the Application


After an Installation, simply run the application.
For an AppImage, use e.g.:
./COMPASS-release_x86_64.AppImage
For a self-built application, use e.g.:
./build/bin/compass_client
It is recommended to run the application from a console terminal (for analysis of potential issues in the log text), but that is not mandatory.

Configuration Upgrade
If the used version of COMPASS has never been run on the workstation, the configuration
(and additional application data) will be copied into a subfolder of the home directory
(e.g. ~/.compass/v0.7.0/).

When switching from an older COMPASS version to a newer one, it is recommended to do
a data sources export (see Import/Export of Configuration Data Sources) in the old version,
and then an import in the new one.

For more details about the configuration, please refer to section Appendix: Configura-
tion.
UI Overview

In this section, the main GUI components are described, to give an overview and introduce
the different workflow options.

Main Window
When the application is started for the first time, the main window is shown as follows.


Figure 1: Main Window

In the simplest use case, the main menubar allows opening or creating a database and
importing data. After this step, the data of interest can be selected using the main tab area,
and loaded from the database using the ’Load’ button in the main statusbar.

The loaded data is then shown in Views located either in the main tab area or in other
windows.

The values set in the ’Data Sources’ and ’Filter’ tabs define the dataset loaded from the
database into memory (RAM). Only the data required by the application is loaded into
memory.

Only one dataset can exist at a time. At the beginning of a loading process, the old
dataset is cleared, and a new one is filled sequentially from the database. This single
dataset is then distributed to all existing views.

When new Views are added, or Views require additional data, a manual re-load has to
be performed by the user.

To close the application either the ’File’ menu can be used, or the close button in the
main window decoration.

Please note that using the close button in windows other than the main window only
removes the respective Views, but does not close the application.
Please note that depending on the application status some parts of the UI are inaccessible,
e.g. some are only available after a database has been opened.

Main Menubar
A common workflow is to open or create a database using the ’File’ menu. If new data is
to be imported, this can be done using the ’Import’ menu. After all data was imported, the
’Process’ menu can be used to post-process the data (if desired).

At the top exists a main menubar, which allows access to the following groups:
• File Menu: Open/close a database, quit application
• Import Menu: Import ASTERIX, NMEA data
• Configuration Menu: Configure data sources, sectors
• Process Menu: Various processing tasks for imported data

Main Tab Area


When a database was opened and data was imported into it, the main tab area allows
configuration of what data should be loaded using the ’Data Sources’ and ’Filter’ tabs.
Additional main features are also located here (e.g. ’Evaluation’ and ’View Points’). Mul-
tiple Views can be added as additional tabs.

The following tabs can exist in the main tab area:


• Data Sources: Select which data sources and lines should be loaded
– see Data Sources
• Filter: Filter which data should be loaded
– see Filters

• Evaluation: Allows adapting/defining requirement-based standards and compliance
assessment of said standards
– see Evaluation
• View Points: Show/edit specific cases saved as View Points
– see View Points
• Views: All views included in the main window, e.g. ListBoxView0

To add additional views, the ’Add View’ button can be used. Views can be added either to
the window in which the button was clicked (’Add Here’) or in a new window (’Add In New
Window’).

A single View can be removed by clicking the button in its tab (next to the View’s
name) and selecting ’Close’.

Please note that using the close button in windows other than the main window only
removes the respective Views, but does not close the application.

Main Statusbar
The main statusbar at the bottom shows general information and contains the ’Load’
button used to trigger a loading process (only visible when a database is opened).

The following information is shown:


• Database indicator: Shows ’No Database’ or currently opened database file name
• Application mode: ’Offline’ or ’Live’

Main Menu
File Menu
Databases can be created, opened and closed using the File menu.

Figure 2: File Menu

• New: Create new database file


• Open: Open existing database file
• Open Recent: Open recent existing database file
• Close: Close current database
• Save Config: Save current configuration
• Quit Without Saving Config: Quit application without saving configuration
• Quit: Quit application, saving the configuration

After a database was opened, the ’Import’ menu becomes available, and the main win-
dow looks as follows:

Figure 3: Main Window After Opening a Database

Import Menu
Data can be imported into the database using the Import menu. This menu is only acces-
sible if a database has been opened.

Figure 4: Import Menu

• ASTERIX Recording: Import ASTERIX recording file


– see Import ASTERIX Recording
• Recent ASTERIX Recording: Import recent ASTERIX recording file
– see Import ASTERIX Recording
• ASTERIX From Network: Import ASTERIX from network interfaces in Live mode
– see Import ASTERIX from Network
• GPS Trail: Import (D)GPS trail from NMEA file
– see Import GPS Trails
• View Points: Import View Points definition file
– see Import View Points

Configuration Menu
Data sources and sectors can be configured using the Configuration menu. Further, the
currently defined Meta Variables can be inspected.

Figure 5: Configuration Menu

• Data Sources: Configure data sources


– see Configure Data Sources
• Meta Variables: Display current Meta Variables
– see Show Meta Variables
• Sectors: Configure sectors in the database
– see Configure Sectors

Process Menu
Post-processing tasks can be performed using the Process menu. This menu is only acces-
sible if a database was opened.

Figure 6: Process Menu

• Calculate Radar Plot Positions: (Re-)Calculate Radar plot position information


– see Calculate Radar Plot Positions
• Calculate Associations: Find unique targets and associate target reports
– see Calculate Associations
• Calculate Associations from ARTAS: Find targets based on ARTAS tracks and asso-
ciate target reports based on ARTAS TRI information
– see Calculate Associations for ARTAS

Import Data
Import ASTERIX Recording
This task allows importing of ASTERIX data recording files into the opened database.

Figure 7: Import ASTERIX data

There exist three tabs:


• Main: File label, input line, jASTERIX decoder settings
• Override: Data override settings
• Mappings: Definition of created database content based on decoded data

Please note that the following framings are currently supported:


• None: Raw, netto, unframed ASTERIX data blocks
• IOSS: IOSS Final Format
• IOSS_SEQ: IOSS Final Format with sequence numbers
• RFF: Comsoft RFF format

Please note that the following ASTERIX categories, editions, reserved expansion fields
and special purpose fields are currently supported:

CAT Editions REFs SPFs


001 1.1
002 1.0
010 0.24 Sensis, 0.31
019 1.2, 1.3
020 1.5, 1.8 1.3
021 2.1, 2.4
023 1.2
034 1.26
048 1.15, 1.23 1.9
062 1.12, 1.16, 1.18 1.2 ARTAS TRIs
063 1.0, 1.1
065 1.2, 1.3

Please note that sensor status messages can be decoded, but are not inserted into the
database. Decoding of ASTERIX CAT002 is recommended if CAT001 data is imported,
since the timestamps are derived from it (if not available in CAT001).

Main Tab
At the top, the recording file to be imported is shown. Below, there exists a combobox
defining in which line the data should be written.

Framing Using the ’Framing’ drop-down menu, the current framing can be selected. Us-
ing the ’Edit’ button, the current framing definition is opened in a text editor.

Categories For each category, a number of elements exist:


• Category checkbox: Number of category and checkbox defining if it should be de-
coded
• Edition: Drop-down menu to select the edition number to be used

• Edition Edit Button : Opens the current edition definition in a text editor
• REF: Drop-down menu to select the Reserved Expansion Field definition (if avail-
able)

• REF Edit Button : Opens the current REF definition in a text editor
• SPF: Drop-down menu to select the Special Purpose Field definition (if available)

• SPF Edit Button : Opens the current SPF definition in a text editor

Additional Using the ’Edit Data Block’ button, the ASTERIX data block definition is
opened in a text editor.

Using the ’Edit Categories’ button, the ASTERIX definitions file is opened in a text
editor.

Using the ’Refresh’ button, all of the jASTERIX definitions are re-loaded from hard disk
(e.g. to update after file changes were made).

Running Using the ’Debug in Console’ checkbox, additional debugging information


is output to the console, and the ASTERIX decoding is switched to single-threading for
easier investigation.

The ’Cancel’ button aborts the import. Using the ’Test Import’ button, the import
process can be tested without inserting the data into the database. The ’Import’ button
triggers the import of the selected file into the database with the given options. This
function can be run multiple times.

Override Tab

Figure 8: Task: Import ASTERIX data override

This feature should only be used in very specific circumstances. When activated, the
function adds a Time of Day offset to all imported data, to compensate for time offsets (e.g.
when importing data from a replay).

• ’Override Time of Day Active’ checkbox: Determines if the override is active. Dis-
abled by default.
• Time of Day Offset: Positive or negative offset to be added, in seconds. If the result-
ing value is out of bounds, it is adjusted to the [0, 86400] interval.

Mappings Tab

Figure 9: Task: Import ASTERIX data mappings

At the top, the GUI elements can be used to show/add/remove ASTERIX JSON parsers.
Below, the currently selected ASTERIX JSON parser is shown and can be configured.

An ASTERIX JSON parser in this context is the function that parses the JSON content
created by the jASTERIX parser, and creates DBContent from it. For each ASTERIX cate-
gory a dedicated parser defines the mapping from JSON to a DBContent.

For the common user, interaction here is normally not recommended, but sometimes it might
be interesting to know what DBContent is created from which ASTERIX data.

Top Elements Using the drop-down menu, the to-be-shown parser can be selected. The
buttons allow for adding and removing ASTERIX JSON object parsers.

Parser GUI Elements The exact definition of how the parsing works is out of scope for
this document, so only a short summary is given here. For more information please contact
the author.
In the mappings list, the following columns are given:
• Active checkbox: Defines if the specific mapping is used
• JSON Key: JSON location and name of the data to be mapped, commonly in ’Data
Item Number’.’Variable’ or ’Data Item Number’.’Sub Item’.’Variable’ format
• DBContent Variable: Target variable to which this data is mapped

Whenever a mapping is selected, the details are shown on the right hand side, pro-
viding information about the ASTERIX datum (based on the jASTERIX specifications) and
the DBContent variable it is mapped to. Using the ’Show DBContent Variable’ button the
variable can be inspected in detail.

Figure 10: Task: Import ASTERIX Data DBContent Variable Details
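To illustrate the concept only (the record shape, the mapping format and the variable names below are invented for this example and are not the actual COMPASS configuration), such a mapping conceptually resolves a dotted JSON key in the decoded record and assigns the found value to a DBContent variable:

# Hypothetical decoded ASTERIX record (the actual shape depends on the jASTERIX definitions)
record = {"010": {"SAC": 20, "SIC": 2}, "140": {"Time-of-Day": 12345.5}}

# Hypothetical mapping of 'JSON Key' to 'DBContent Variable'
mapping = {"010.SAC": "ds_sac", "010.SIC": "ds_sic", "140.Time-of-Day": "tod"}

def apply_mapping(rec, mapping):
    # Resolve each dotted JSON key and assign the found value to the target variable
    out = {}
    for json_key, var in mapping.items():
        node = rec
        for part in json_key.split("."):
            if not isinstance(node, dict) or part not in node:
                node = None
                break
            node = node[part]
        if node is not None:
            out[var] = node
    return out

print(apply_mapping(record, mapping))  # {'ds_sac': 20, 'ds_sic': 2, 'tod': 12345.5}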

Running
Using the ’Import’ button the import task can be performed. During import a status
indication will be shown.

Figure 11: Import ASTERIX data status

If a decoding error occurs, a brief message box is shown, after which the application
has to be closed. Please make sure that the correct framing and edition versions are se-
lected, or contact the author for support if this does not resolve the issue.

Comments The time needed for the import strongly depends on the available CPU per-
formance (multi-threading being very beneficial), but an import of 5 million target reports
takes about 3:30 minutes on the author’s hardware.

The (truncated) timestamps of CAT001 are calculated in a simple algorithm based on


the CAT002 messages from the same sensor, so their timestamp data is a bit unreliable, but
exact enough for e.g. time window filtering.

This task can be run several times, e.g. if multiple ASTERIX recordings from different
data sources are to be imported.

Please note that currently not all data fields (as shown in the JSON object parsers)
are imported.

Import ASTERIX from Network


This task allows importing ASTERIX data from the network into the opened database, and
switches from ’Offline’ to ’Live’ mode.

Figure 12: Import ASTERIX Data from network

Except for the differing ASTERIX source at the top, everything works the same as
described in Import ASTERIX Recording.

When the import is started, data is continuously read from the network (according to
the data source network lines specification) and processed according to Live Mode.

Import GPS Trails


This task allows importing (D)GPS trails into the opened database, by parsing an NMEA
file and writing the position updates into the ’RefTraj’ DBContent.

Figure 13: Import GPS Trail

There exist 2 tabs:


• Main: File path and information text
• Config: Configuration of data source and secondary information

Main Tab
At the top, a label exists indicating the file to be imported.

Below, a text field is given, which after selection of an NMEA file displays the content
information and/or error messages.

Config Tab

Figure 14: Import GPS Trail Config

In the configuration tab, several configuration parameters can be set:


• SAC: System area code of the reference data source
• SIC: System identification code of the reference data source
• Name: Name of the reference data source

• Time of Day Offset: Time correction factor, in seconds. Set to 0 to disable.


• Mode 3/A Code (octal): Mode 3/A code to be set. Uncheck checkbox to disable.
• Target Address (hexadecimal): Mode S address to be set. Uncheck checkbox to dis-
able.
• Callsign: Target identification to be set. Uncheck checkbox to disable.
• Line ID: Line to be used during import

Running
After selecting an NMEA file, the task can be performed using the ’Import’ button.
Please note that reference trajectory updates are skipped in the following two cases:
• The time is the same as the previous update
• The quality is set to 0 (invalid position).
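As a sketch of these two rules (illustrative Python only, not the actual import code; the field indices follow the standard NMEA GGA layout, where field 1 is the UTC time and field 6 the fix quality):

def keep_update(prev_time, gga_fields):
    # gga_fields: the comma-separated fields of a $GPGGA sentence
    time_str = gga_fields[1]
    quality = int(gga_fields[6] or 0)
    if time_str == prev_time:
        return False, prev_time  # same time as the previous update: skip
    if quality == 0:
        return False, prev_time  # fix quality 0 (invalid position): skip
    return True, time_str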

Import View Points


This task allows importing of view points and respective ASTERIX data recording files
into the opened database.

Figure 15: Import View Points



After selection of a view point file, a text field is shown, which displays the view point
context information or an error message.

At the bottom, an ’Import’ button exists, which is only enabled if a valid file was
selected.

Notes
For a detailed specification of a view point file, please refer to Appendix: View Points.

For each of the datasets, an ’Import ASTERIX Recording’ task is started, using the
previously set configuration. Please make sure that the import configuration is valid for
such processing.

For the filename in each dataset, the absolute path is searched first. If no such file is
found, a file with the same name in the directory of the view point file is searched for next.
If the referenced file cannot be found, an error message is displayed.

It is not recommended to import multiple view point files, since such a use case is not yet
fully supported.
Using the ’Import’ button a process is started to import the view points and ASTERIX
data recordings.

Configuration
Configure Data Sources
This dialog allows management of data sources, as stored in the configuration as well as
in the database.

Figure 16: Configure Data Sources

If data is imported from ASTERIX, data source information might be missing, so this
information must either be edited manually, or loaded from a previous configuration (or a
previous database containing the same information).

A data source is always stored in the configuration, which means it is persistent and
can be used in all new databases. It is useful to define all existing data sources in the
configuration, since they are immediately used during import of data.

A data source is stored in the database only if data from said source is imported. Dur-
ing the import, if a data source can be found (matching SAC/SIC) in the configuration, it
is automatically added to the database.

Please note that currently the position of data sources is only required for Radar data
sources (in the plot position calculation), for all other data sources it would suffice to have
SAC/SIC and name information for display purposes.

Data Sources Table Content


• Name: Name of the data source
• Short Name: Short name of the data source (optional)
• DSType: Data source type, e.g. Radar, MLAT, ADS-B, ...
• SAC: System Area Code, number between [0,255]
• SIC: System Identification Code, number between [0,255]
• In DB: Indicator whether the data source is stored in the database
• In Config: Indicator whether the data source is stored in the configuration

When a data source is selected in the table, additional details are shown on the right
side, where editing is also possible.

Please note that all changes to a data source are always written to the database as well
as to the configuration.

Depending on the DSType, additional information can be set in a source. For a non-
Radar source, the following information is given:
• ID: Numeric identifier (unique)
• Network Line information
– 4 different lines are possible, each with an ’IP:Port’ syntax, e.g.
* ’1.2.3.4:5’

For sources of DSType ’Radar’, the following additional information should be
provided:
• Latitude: Source center position as WGS-84 latitude, as floating point number in
degrees, e.g. 42.0001
• Longitude: Source center position as WGS-84 longitude, as floating point number in
degrees, e.g. 17.01
• Altitude: Source center altitude above MSL, in meters

Additional optional information can be provided:



Figure 17: Configure Data Sources: Radar details

• PSR Minimum: PSR minimum range, in nautical miles


• PSR Maximum: PSR maximum range, in nautical miles
• SSR Minimum: SSR minimum range, in nautical miles
• SSR Maximum: SSR maximum range, in nautical miles
• Mode S Minimum: Mode S minimum range, in nautical miles
• Mode S Maximum: Mode S maximum range, in nautical miles
• PSR Azimuth StdDev: PSR azimuth standard deviation, in degrees
• PSR Range StdDev: PSR range standard deviation, in meters
• SSR Azimuth StdDev: SSR azimuth standard deviation, in degrees
• SSR Range StdDev: SSR range standard deviation, in meters
• Mode S Azimuth StdDev: Mode S Radar azimuth standard deviation, in degrees
• Mode S Range StdDev: Mode S Radar range standard deviation, in meters

Import/Export of Configuration Data Sources


Using the 4 buttons on the bottom the following functions can be used:
• Export All: Export all configuration data sources as JSON file
• Clear All: Delete all configuration data sources
• Import: Import configuration data sources from JSON file
• Auto Sync All to DB: Automatically synchronize all configuration data sources to
database

There are two versions of the data sources JSON file used for import/export. Please
refer to Appendix: Data Sources.

Using these functions, the configuration data sources can be changed for sensor context
switches, or e.g. exported before a COMPASS version upgrade.

Show Meta Variables


This dialog allows the display of the Meta Variables stored in the configuration. Editing is
only possible in expert mode, but the dialog can be shown to inform users what DBCon-
tent variables are grouped into Meta Variables.

Figure 18: Show Meta Variables

Configure Sectors
This dialog allows management of sectors (as 2.5D polygons) stored in the database.

This step is recommended if the sectors are to be used for evaluation purposes and/or
if the altitude information is of use. For 2D display-only polygons it is recommended to
add such information to the used map files, as described in Adding/Changing Map Files.

Please note that importing at least one sector is required for using the evaluation fea-
ture.

Figure 19: Configure Sectors

There exist 2 tabs:


• Import: For importing sectors from files
• Manage: Management of imported sectors stored in the database

Import Tab
In the ’File Selection’ list, a list of available files is provided. Entries can be added using
the ’Add’ button, or removed using either the ’Remove’ or ’Remove All’ buttons.

Files are imported using the GDAL library, which can read a number of GIS files, and all
encapsulated polygons or multi-polygons are written to the database with unique names.
Supported common file-formats are e.g.:

• ESRI Shapefile
• GML
• KML

Note that only polygonal information is added, and all information is assumed to be
stored in the WGS84 coordinate system.
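If it is unclear whether a file contains usable polygon data, it can be inspected beforehand, for example with the GDAL Python bindings (a small sketch, assuming the osgeo package is installed and using a hypothetical file name):

from osgeo import ogr

ds = ogr.Open("sectors.kml")  # hypothetical file name
for i in range(ds.GetLayerCount()):
    layer = ds.GetLayer(i)
    # Print layer name, geometry type and feature count to check for polygons
    print(layer.GetName(), ogr.GeometryTypeToName(layer.GetGeomType()),
          layer.GetFeatureCount())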

Below a text field is given which, after selection of a file, displays the content informa-
tion and/or error messages.

Usage After adding an example file, the ’Import’ tab looks as follows:

Figure 20: Configure Sectors with example file



To import a sector, select the desired file in the file list and press the ’Import’ button,
after which an import dialog is shown.

Figure 21: Configure Sectors import dialog

The following options exist:


• Sector Layer: Define the sector layer in which the new sectors will be grouped
• Exclude: Define whether the imported sectors will be imported as ’Exclude’ sectors
• Color: Color to be used

After successful import, a confirmation message is shown.



Figure 22: Configure Sectors Manage tab

Manage Tab In the ’Manage’ tab, a table listing the existing sectors is given, which
allows editing/deleting of the information stored in the database.

The following columns exist in the table:


• ID: Unique identifier number, read-only
• Sector Name: Name of the sector, editable
• Layer Name: Name of the layer the sector will be grouped in, editable
• Exclude: Whether the sector is an ’Exclude’ sector
• Num Points: Number of points in polygon, read-only
• Altitude Minimum: Minimum altitude in feet, editable, can be empty (gray)
• Altitude Maximum: Maximum altitude in feet, editable, can be empty (gray)
• Color: Color in display, editable

• Delete: Delete button

The ’Exclude’ sector is a special case where e.g. a sector layer ’Example’ has at least one
normal sector ’Area’, giving the 2.5D polygon in which surveillance data of interest exists.
Now, to exclude target reports in a specific sector inside the bigger ’Area’ sector, one or
several smaller sectors can be imported (again into sector layer ’Example’) and marked as
’Exclude’ sectors, ideally using a different color.

At the bottom, 3 buttons exist:


• Export All: Saves all stored sectors as JSON file
• Clear All: Deletes all stored sectors
• Import: Imports a previously exported sector JSON file

Usage After importing sectors, the ’Manage’ tab looks as follows:



Figure 23: Configure Sectors with example sector

The attributes of the respective sector can be changed as wanted, by double-clicking on
the text value or clicking on the color. Each change is stored immediately in the database.
The result might look as follows:

Figure 24: Configure Sectors editing result

It is recommended to export all sectors after creation, for later usage (e.g. in another
database).

Postprocess
Calculate Radar Plot Positions
This dialog allows (re-)calculation of Radar plot latitude/longitude position information
based on the defined data sources.

Figure 25: Calculate Radar Plot positions

Please note that for this step the Radar data source positions have to be set in the database
(see Configure Data Sources), otherwise no plot position can be calculated.

There are two projection methods (radar polar coordinates to WGS-84 coordinates)
available. The RS2G projection is the currently recommended option.

OGR Projection The EPSG code for the projection has to be chosen according to your
needs, please refer to http://spatialreference.org/ref/epsg/ for a list of possible
codes.
The WGS84 latitude/longitude coordinates are then calculated using the radar positions
in the database, the range and the azimuth. Please note that currently there will be
offsets in the projected coordinates compared to e.g. the ARTAS projection. The reason
for this is under investigation.

RS2G Projection For this projection, no additional attributes need to be given. Please note
that this projection is based on a common ’radar slant to geodesic transformation’ and should
be equivalent to the ARTAS projection. A verification is still needed, please contact the
author if you would be willing to support this.
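To illustrate the general idea of such a projection (a heavily simplified sketch assuming the pyproj package; the OGR and RS2G implementations in COMPASS are considerably more elaborate, e.g. regarding earth curvature and antenna height):

from math import sqrt
from pyproj import Geod

geod = Geod(ellps="WGS84")

def radar_polar_to_wgs84(radar_lat, radar_lon, radar_alt_m,
                         slant_range_m, azimuth_deg, target_alt_m=0.0):
    # Reduce the slant range to a ground range via Pythagoras, then solve the
    # geodesic forward problem from the radar position along the azimuth
    dh = max(target_alt_m - radar_alt_m, 0.0)
    ground_range_m = sqrt(max(slant_range_m ** 2 - dh ** 2, 0.0))
    lon, lat, _ = geod.fwd(radar_lon, radar_lat, azimuth_deg, ground_range_m)
    return lat, lon

# Example: a plot at 50 NM slant range and azimuth 90 deg from a radar at 42.0/17.0
print(radar_polar_to_wgs84(42.0, 17.0, 200.0, 50 * 1852.0, 90.0, 3000.0))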

Usage Using the ’OK’ button the task can be performed. During the calculation a status
indication will be shown:

Figure 26: Calculate Radar Plot positions done

Please note that this task can be re-run with different projections if wanted.

Calculate Associations
This task allows creation of UTNs (Unique Target Numbers) and target report association
based on Mode S Addresses, Mode A/C codes and position information.

Each target found is identified by a UTN, which groups together all reference/tracker/sensor
target reports that can be associated to this target.
Please note that if usage of UTNs is not needed, this step does not have to be performed.
If usage of the evaluation feature is wanted, running this task is required.

Figure 27: Associate Target Reports

The configuration of the task requires knowledge of the internals of how the associa-
tions are made. As a brief description, the following steps are performed:
• Delete all previously existing associations
• Load all target reports, per data source
• Add UTNs based on Reference (RefTraj) data sources
• Add UTNs based on Tracker data sources
• Add UTNs based on remaining sensor data sources

Common parameters:

Parameter                    Default  Description
Associate Non-Mode S Data    true     Whether Mode A/C code & position based association should be performed
Clean Dubious UTNs           true     Whether UTNs with dubious movement should have non-Mode S target reports removed
Mark Dubious UTNs Unused     false    Whether UTNs with dubious movement should be marked as such
Comment Dubious UTNs         true     Whether UTNs with dubious movement should be commented as such

The other parameters are discussed in following sub-sections.

Reference/Tracker UTN Creation


Track/Track Association Parameters:

Parameter                                       Default  Used In
Maximum Comparison Time Difference [s]          15.0     All (maximum time delta for any code/position comparison)
Maximum Quit Distance [m]                       18520    Target/target association
Maximum Dubious Distance [m]                    5556     Target/target association
Maximum Dubious Positions [1]                   5        Target/target association
Maximum Acceptable Distance [m]                 926      Target/target association
Maximum Altitude Difference [ft]                300      Target/target association
Minimum Updates [1]                             2        Target/target association
Minimum Time Overlap Probability [0-1]          0.5      Target/target association
Max Speed [kts]                                 100000   Dubious target detection
Maximum Continuation Time Difference [s]        30       Track continuation
Maximum Acceptable Continuation Distance [m]    1852     Track continuation

The following steps are performed for each Reference/Tracker data source:
• Per-source target creation
– Create new list of targets based on

* Track number
* Mode S address
* Mode A/C, time difference, position difference (track continuation)
– Find dubious targets (dubious target detection)
– Clean dubious targets (if configured)
– Per-source target to common target association (target/target association)
– Score-based approach
– Associates new per-source targets to existing targets based on
* Mode S address
* Time overlap
* Mode A code(s) similarity (only if minimum time overlap is given)
* Mode C code(s) similarity (only if Mode A similarity is given)
* Position similarity (only if Mode A/C similarity is given)

After creating targets for all Reference/Tracker data sources:


• Find dubious targets
• Clean dubious targets (if configured)
• Mark/comment dubious targets (if configured)

The exact method discussion will be added at a later time, since the algorithms will be
improved to include position accuracy information (error standard deviation etc.) in the
near future.

After these steps, each target created by the simple score-based association method
is used to additionally associate sensor target reports (non-Reference/Tracker target
reports), as discussed in the next section.

Sensor UTN Creation


Sensor/Track Association Parameters:

Parameter                                 Default  Description
Maximum Comparison Time Difference [s]    15       Maximum time delta for any code/position comparison
Maximum Acceptable Distance [m]           3704     Maximum position difference
Maximum Altitude Difference [ft]          300      Maximum altitude difference

The following steps are performed for each sensor data source:
• Find possible target in existing target list based on
– Mode S address

– Mode A code similarity (if present)
– Mode C code similarity (if present)
– Position similarity
• If a matching target is found, the best match is used for association
• If no matching target is found
– If the target report has Mode S address, a new target is created
– Otherwise no association is made

Dubious Targets
The dubious target check is solely based on a simple maximum speed given all associated
Reference/Tracker target reports. It should be used to detect/mark possibly wrong associ-
ations, to investigate such targets later and, if possible, resolve such issues using different
parameter values.

Please note that in a special case where target reports from different Reference/Tracker
sources are very close in time, this check can generate a large number of false positives
and should be disabled. For this reason the default value was set to a disproportionately
high value.
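As a conceptual sketch of this check (assuming the pyproj package; this is not the actual COMPASS code), the implied speed between consecutive associated target reports can be compared against the configured maximum:

from pyproj import Geod

geod = Geod(ellps="WGS84")
MS_TO_KTS = 1.943844  # metres per second to knots

def is_dubious(reports, max_speed_kts=100000.0):
    # 'reports' is a time-sorted list of (time_s, lat_deg, lon_deg) tuples;
    # the target is flagged if any implied speed exceeds the maximum
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(reports, reports[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        _, _, dist_m = geod.inv(lon0, lat0, lon1, lat1)
        if dist_m / dt * MS_TO_KTS > max_speed_kts:
            return True
    return False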

Discussion
The user should be aware that, while this association feature is quite an improvement over
the previous method, it is still somewhat limited. It strongly depends on the correctness of
Mode S addresses, as well as the Reference/Tracker information (track number, secondary
information and position information). If the mentioned information is erroneous, the
resulting associations will be sub-optimal or plainly wrong.

For the Sensor UTN Creation, the correctness of the associations strongly depends
again on the Mode S address as well as on the quality of the associations made in the
Reference/Tracker UTN Creation.

Also, for association of non-Mode S sensor target reports a trade-off has to be made
in the ’Maximum Acceptable Distance’ parameter, especially if they are primary-only. It
should be set within the limits of Reference/Tracker error plus maximum sensor error
(which can still include radar biases) and the used target separation minima. This is of
course not well-suited for strongly different sensor accuracies and separations (e.g. when
mixing ground and air surveillance data).

Running
To run the task, click the ’Run’ button. After the associations are saved, the task is done:

Calculate Associations for ARTAS


This task allows creation of UTNs and target report association based on ARTAS tracks
and the TRI information.

Please note that if no ARTAS TRI SPF information exists in the database, these steps do
not have to be performed.

Figure 28: Calculate Associations from ARTAS

In this task, the ARTAS association information stored in system track updates can be
used to create UTNs which associate each system track update with the used sensor target
reports.

The following configuration options exist:


• Tracker Data Source: Name of tracker data source from which associations shall be
created.
• Data Variables: Definition of variables to be used in processing. Does not have to be
changed.
• End Track Time (s): Track update gap time (in seconds) after which a new UTN will
be created (even if no track begin/end flag is set).

• Association Time Past (s): Time window length (in seconds) into the past where sen-
sor target reports are considered for association.
• Association Time Future (s): Time window length (in seconds) into the future where
sensor target reports are considered for association.
• Acceptable Time for Misses (s): Time window length at the beginning and end of the
recording (in seconds) where misses (not found hash codes) are acceptable.
• Dubious Association Distant Time (s): Maximum age of made associations (in sec-
onds), if older they are considered as dubious associations.
• Dubious Association Close Time Past (s): Time window length (in seconds) into the
past where made associations are considered as dubious if multiple sensor hashes
exist.
• Dubious Association Close Time Future (s): Time window length (in seconds) into
the future where made associations are considered as dubious if multiple sensor
hashes exist.
• Ignore Track End Associations: If set, no associations are created for system track
updates where the track end flag is set.
• Mark Track End Associations Dubious: If set, associations for system track updates
where the track end flag is set are counted as dubious.
• Ignore Track Coasting Associations: If set, no associations are created for system track
updates where the track coasted flag is set.
• Mark Track Coasting Associations Dubious: If set, associations for system track
updates where the track coasted flag is set are counted as dubious.

UTN Creation from System Track Numbers and End Track Time:
The task should create a unique target number (UTN) for every ARTAS track. The track
begin/end flags are therefore used to create new UTNs or finalize existing ones. To cover
the case when such information is wrong or missing, the ’End Track Time’ is used to check
the time between track updates from one track number. If the gap is larger than the defined
time, a new UTN is created even if no track begin/end flag was set.

Association Time Window Consider the following figure: In a timeline, a system track
update exists at time 0, while referenced sensor target reports exist at the times 1,2,3. In
this description, it is assumed that all sensor target reports have the same referenced ARTAS
MD5 hash value.

The time window defined by [Association Time Past, Association Time Future] defines
which referenced sensor target reports are considered for association. The ’Past’ time is
the time difference into the past (default 20s), the ’Future’ time is the time difference into
the future (default 2s).

In this example, 1 and 2 are considered, while 3 is disregarded. Since 1 is closer in time
to the system track update, the association between (0,1) is made.

The time defined by Dubious Association Distant Time is used to mark associations which
are older than the defined time as dubious. The associations are still created, but counted
as dubious. Since 2 is still in the association time window, the association (0,2) is made but
counted as dubious.

The time window defined by [Dubious Association Close Time Past, Dubious Association
Close Time Future] defines when associations are counted as dubious if multiple referenced
sensor target reports exist in it. In this case, 1 and 2 are considered for association, and
since 1 is closer in time to the system track update, the association between (0,1) is made.
But, since 2 also falls into this time window, the association of (0,1) is counted as dubious
association.
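The selection and dubiousness rules described above can be summarized in the following simplified Python sketch (the function and parameter names are illustrative only and do not reproduce the actual implementation):

def associate_tri(track_tod, hash_tods, past_s=20.0, future_s=2.0,
                  distant_s=None, close_past_s=None, close_future_s=None):
    # Consider only referenced reports inside [track_tod - past_s, track_tod + future_s]
    candidates = [t for t in hash_tods
                  if track_tod - past_s <= t <= track_tod + future_s]
    if not candidates:
        return None, False
    # The referenced report closest in time to the system track update is associated
    best = min(candidates, key=lambda t: abs(track_tod - t))
    dubious = False
    # Older than the 'Dubious Association Distant Time': still associated, but dubious
    if distant_s is not None and track_tod - best > distant_s:
        dubious = True
    # Multiple referenced reports inside the 'close' time window: counted as dubious
    if close_past_s is not None and close_future_s is not None:
        close = [t for t in candidates
                 if track_tod - close_past_s <= t <= track_tod + close_future_s]
        if len(close) > 1:
            dubious = True
    return best, dubious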

Ignore Track End/Coasting Associations Currently in ARTAS (according to the author’s
information), track end or coasted updates contain wrong TRI information (from previous
updates). If the respective checkboxes are set, this TRI information is disregarded. If they
are not checked, the associations are made and can be investigated.

Running Using the ’Run’ button the task can be performed.



Data Sources

Figure 29: Data Sources Overview

In this tab, the data sources existing in the database are shown. Data sources are added to
the database if data was associated to the respective data source (and the respective data
source line) during the import process.

Data sources are grouped by DSType (data source type, e.g. Radar, MLAT, ...), and can
have up to 4 active lines (L1-L4). Each line for which data exists in the database is shown
as a button.

Loading of the respective data can be changed on 3 levels:


• By DSType (using checkbox)
• By data source (using checkbox)
• By data source line (using button, strong border means active)

At the bottom, the ’Associations’ label indicates if association information exists, and
from which data source it was generated.

Filters

Figure 30: Filters Overview

At the top, the ’Use Filters’ checkbox defines whether filtering is active.

Each filter consists of a checkbox, defining if a filter is active (contributes to the search
query), a triangle-button (to show/hide the filter configuration elements), a unique name,
and a manage button (activates a context menu).

Please note that the filter configuration will be saved at program shutdown, which is
also true for new filters. At startup, all filters from the configuration are generated and

restored to their previous state.

Please also note that active filters, at the moment, are always combined with a logical
AND. Therefore, when two filters are active, only the intersection of data which both filters
allow is loaded.
As an example, the ’Time of Day’ filter limits the loaded data to a specific time window,
to load only time slices of the dataset. The ’Mode 3/A Codes’ filter restricts to a list of
(comma-separated) Mode 3/A codes, to single out specific flights.

For more information about filtering, please refer to section Filters.

Views

The ’Add View’ button on the top right in each window allows adding views to the
current window or in a new window.

Each View is contained in a tab within a parent window. At startup, per default only
the main window exists, which also holds a ListBox view. If the main window is closed,
the COMPASS client shuts down. New Views can be added using the ’Add View’ button,
which opens a pull-down menu. Each View can either be added to the main window
(’Add Here’) or into a new window (’Add in New Window’). When added, a new tab
exists in the containing window.

New Views can be added either to currently existing windows as new tabs, or to a
newly opened window. A window can be closed using the close button in the window
decoration, which discards all Views contained within the window.

To close a single View, one can use the button in the tab header, which frees up all
its allocated resources.

Each View adds its required variables to the loading list for the database. During a
loading process, the loading status of a View is shown in the management tab.

Currently, the following Views exist:


• Histogram View
• ListBox View
• OSG View
• ScatterPlot View
Filters

Default Filters

Figure 31: Filters Overview


Aircraft Address Filter

Figure 32: Target Address filter

When active, this filter forces loading of data with the given Mode S address(es),
so it is possible to give multiple values (in hexadecimal notation, irrespective of up-
per or lower case characters, separated by commas). E.g. ’FEFE10’ is possible, or
’FEFE10,FEFE11,FEFE12’. Target reports without a given Mode S address will not be
loaded unless the value ’NULL’ is (also) given.

Aircraft Identification Filter

Figure 33: Callsign filter

When active, this filter forces loading of data only from aircraft identifications matching
the given expression. The percent operator denotes a ’any characters’ placeholder. So e.g.
’%TEST%’ will match ’TEST123’ or ’TEST123 ’ (with spaces) or ’MYTEST’. Target reports
without a given aircraft identification are not restricted by this filter.
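The matching behaves like SQL LIKE pattern matching (see also the LIKE operator in Table 1); as a rough illustration (a Python sketch, not the actual query code), ’%’ matches any number of characters and ’_’ a single character:

import re

def like_match(pattern, value):
    # Translate the LIKE pattern into a regular expression and test the value
    regex = "".join(".*" if c == "%" else "." if c == "_" else re.escape(c)
                    for c in pattern)
    return re.fullmatch(regex, value) is not None

for callsign in ["TEST123", "TEST123  ", "MYTEST", "TEXT"]:
    print(callsign, like_match("%TEST%", callsign))  # True, True, True, False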

ADSB Quality Filter

Figure 34: ADSB quality filter

When active, this filter restricts the loaded ADS-B data based on the transponder MOPS
version and various quality indicators.

ADSB MOPS Filter

Figure 35: ADSB MOPS filter

When active, this filter restricts the loaded ADS-B data based on the transponder MOPS
version. Multiple values can also be given, e.g. ’0’, ’0,1’, etc.

ARTAS Hash Code Filter

Figure 36: ARTAS Hash Code filter



When active, this filter forces loading of data only from target reports with a specific AR-
TAS MD5 hash code, or system track updates referencing this hash code (in their TRI in-
formation). If no hash information is available (e.g. in SASS-C Verif databases or when
this information was not present in the ASTERIX data), this filter should not be used.

Detection Type Filter

Figure 37: Detection Type filter

When active, this filter forces loading of Radar and Tracker data with the given detection
type, so it is possible to give multiple values (separated by commas). E.g. ’1’ is possible,
or ’1,2,3’. Tracker target reports without a given detection type will not be loaded.

The following detection types exist:


• 0: No detection/unknown
• 1: PSR
• 2: SSR
• 3: Combined (PSR+SSR)
• 5: Mode S
• 7: Mode S Combined (PSR+Mode S)

Please note that for CAT62 data the detection type reflects the most recent detection
type used to update the track (last measured detection type).

Position Filter

Figure 38: Position filter

When active, this filter forces loading of data with latitude/longitude inside the given
thresholds (in degrees).

Time of Day Filter

Figure 39: Time of Day filter

When active, this filter forces loading of data with the time-of-day inside the given thresh-
olds (in HH:MM:SS.SSS).

Track Number Filter

Figure 40: Track Number filter

When active, this filter forces loading of data with the given track numbers, so it is possible
to give multiple values (separated by commas). E.g. ’1’ is possible, or ’1,2,3’. Target reports
without a given track number will not be loaded unless the value ’NULL’ is (also) given.

Please note that ADS-B target reports can also contain a track number in ASTERIX, but
since the information can not currently be mapped to the database (missing in schema),
this filter does not influence ADS-B data loading.

Mode 3/A Codes Filter

Figure 41: Mode 3/A Codes filter

When active, this filter forces loading of data with the given Mode A code(s), so it is
possible to give multiple values (in octal notation, separated by commas). E.g. ’7000’ is
possible, or ’7000,7777’. Target reports without a given Mode A will not be loaded unless
the value ’NULL’ is (also) given.

Please note that ADS-B target reports can also contain Mode 3/A code information.

Mode C Codes Filter

Figure 42: Mode C Codes filter

When active, this filter forces loading of data with a barometric altitude inside the given
thresholds (in feet). Whether target reports without a barometric altitude should be loaded
can be set using the ’NULL Values’ checkbox.

Primary Only

Figure 43: Primary Only filter

When active, this filter forces loading of data without any secondary attributes.

UTN Filter

Figure 44: UTN filter

This filter is only available if target report associations have been generated (see Calculate
Associations).

When active, this filter forces loading of data with the given unique target numbers
(UTNs), so it is possible to give multiple values (separated by commas). E.g. ’1’ is possible,
or ’1,2,3’. Target reports without an associated UTN will not be loaded.

Adding a New Filter

A new filter can be added by clicking the button in the filter tab.

Figure 45: Adding a filter

First, one has to give the filter a new (unique) name. Then, conditions have to be
defined and added. A condition consists of a DBContent variable, an operator, a value,
and a reset value.

When the triangular button is clicked, a sub-menu is opened, where one can choose
a DBContent variable. The selected variable restricts data of all DBContents if it is of
type ’Meta’, or just data from one DBContent if it is not. Additionally, the mathematical
operator ’ABS’ can be selected. If so, not the value of the variable but the absolute value of
the variable is used: ’ABS(var)>value’ is equivalent to ’var>value OR var<-value’.

An operator can be chosen with the drop-down menu, the supplied operators are com-
mon SQL operators.

Operator Description
= Equal
!= Not equal
> Greater than
>= Greater than or equal
< Less than
<= Less than or equal
IN Matches a value in a comma-separated list
LIKE Pattern matching with % and _
IS Value NULL: No value exists
IS NOT Value NULL: Value exists
Table 1: SQL operators

A reset value also has to be supplied, which can be the chosen value or a minimum/-
maximum value set from the database. Whenever a database different from the previous
one is opened, all filters are reset, since previous values may have become invalid.

After a condition is defined, it has to be added using the ’Add condition’ button.
Existing conditions are shown in the ’Current conditions’ list. Please note that for now
added conditions can not be removed.

Now the described process can be repeated until a usable filter emerges, which is added
using the ’Add’ button. The process of adding a new filter can be canceled by using the
’Cancel’ button, which discards all settings. When added, a new filter shows up immedi-
ately in the filter list and is saved to the configuration for persistence.
Evaluation

The ’Evaluation’ tab allows adapting/defining requirement-based standards and compliance
assessment of said standards.

Pre-Requisites & Limitations


• Target report associations must be set (using Calculate Associations)
• At least 1 sector has to be defined (using Configure Sectors)
• Usable reference data must exist
• Usable test data must exist

While it is possible to manually remove single targets from the evaluation, the usage
of correct reference data is paramount for the significance of the evaluation results.

Please note that the evaluation feature should not be used as a sole basis for
decision making - especially not without manually verifying the evaluation results.

There will be improvements in the next releases, and further verification of the results
by the author and other users.

Target Report Associations


Since the task only makes use of the Mode S address, non-Mode S data is not evaluated
and may even show up as gaps/misses in detection.

Sector Altitude Filtering


If sectors with altitude limits are used, please be aware that target reports without a Mode
C code can not be filtered by the set limit. Therefore such target reports are assumed to be
inside all sectors whose defined 2D polygons contain them.

The ’inside-sector’ check is always performed on the reference data only, therefore it is
of importance to only use reference data with existing Mode C code data.
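The described behaviour can be thought of as follows (a minimal sketch assuming the shapely package for the 2D polygon test; not the actual COMPASS implementation):

from shapely.geometry import Point, Polygon

def inside_sector(lat, lon, mode_c_ft, polygon, alt_min_ft=None, alt_max_ft=None):
    # The 2D polygon check is always applied (on the reference position)
    if not polygon.contains(Point(lon, lat)):
        return False
    # Without a Mode C code the altitude limits cannot be applied:
    # the report is assumed to be inside the sector
    if mode_c_ft is None:
        return True
    if alt_min_ft is not None and mode_c_ft < alt_min_ft:
        return False
    if alt_max_ft is not None and mode_c_ft > alt_max_ft:
        return False
    return True

# Example: a rectangular sector between 1000 and 10000 ft
sector = Polygon([(16.0, 47.0), (17.0, 47.0), (17.0, 48.0), (16.0, 48.0)])
print(inside_sector(47.5, 16.5, None, sector, 1000, 10000))  # True (no Mode C code)
print(inside_sector(47.5, 16.5, 500, sector, 1000, 10000))   # False (below minimum)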


Reference Data
The assumption used in the tool is that the reference data is always correct. Therefore,
sub-optimal reference data can cause errors, which will be attributed to the test data in the
evaluation.

To give a few examples what this could mean:


• Wrong ’inside-sector’ check results: This might remove valid test data from the eval-
uation and/or attribute errors to the wrong sector
• Missing target data in reference: This will remove the test data from evaluation for
the time-period of the missing reference data
• Wrong position in reference: This will cause wrong ’inside-sector’ check results
and/or cause wrong horizontal position accuracy results
• Wrong/missing Mode C code in reference: This will cause wrong ’inside-sector’
check results
• Wrong/missing identification in reference: This will cause wrong results in identifi-
cation requirements

Also, since target secondary attributes (currently only Mode S address) are also used in
the ’Target Report Association’ task, errors in these attributes might also lead to imperfect
data association. This would result in wrong evaluation results in almost all requirements.

Overview

Figure 46: Evaluation tab

At the top, 6 tabs exist:


• Main: Main configuration
• Targets: Table of existing targets (filled after data was loaded)
• Filter: Filter which data is loaded, by time and/or ADS-B quality indicators

• Standard: Definition of standards, selection of current standard, configuration of


requirements to be checked
• Results Config: Additional configuration for the result generation
• Results: Evaluation results (created after data was evaluated)

Below 3 buttons exist:


• Load Data: Loads the reference/test data
• Evaluate: Runs the evaluation of the current standard (available after data was
loaded)
• Generate Report: Generates a report PDF (available when evaluation result exists)

Configuration
Main Tab

Figure 47: Evaluation Main tab

In the main tab, the main configuration settings can be set.

Data Selection
At the top, the ’Data Selection’ can be performed, by selecting:
• Reference Data:
– DBContent: Any DBContent existing in the database
– Data source checkboxes: Which data sources to use

– Line ID: Which line to use from the data sources


• Test Data:
– DBContent: Any DBContent existing in the database
– Data source checkboxes: Which data sources to use
– Line ID: Which line to use from the data sources

As noted before, usage of appropriate reference data is of paramount importance.

Since ’any’ type of data can be selected for evaluation, this allows for the following
use-cases:
• Tracker as reference, sensor as test data: Evaluation of sensor
• Tracker as reference, tracker as test data: Evaluation/comparison of different track-
ers/tracker runs

Of course it is also possible to use e.g. an imported GPS trail as reference (see Import
GPS Trails), although this is currently not tested for lack of test data. If you might be able
to provide such test data, please contact the author.

Standard
In the center, using the ’Standard’ drop-down menu, the current standard can be selected.
To create/configure the standard please use the ’Standard’ tab.

Sector Layer/Requirement Mapping


Below that, the ’Sector Layers: Requirement Groups Usage’ allows defining which
requirements should be verified for which sector layer.

On the left, all existing sector layers are listed, in the shown example:
• fir_cut_sim: DOI altitude limitation

For each sector layer, the requirement groups (defined in the Standard tab) can be ac-
tive/disabled. In the shown example, the existing requirement group ’Mandatory’ is active
in the sector layer.

Targets Tab

Figure 48: Evaluation Targets tab

Before the data was loaded, the table is empty. Each target (defined by the UTN) is shown
in a dedicated row.

The following columns exist:


• Use: Checkbox defining if the target should be used in the evaluation
• UTN: Unique Target Number
• Begin: First timestamp of UTN
• End: Last timestamp of UTN
• #All: Sum number of target reports

• #Ref: Number of target reports in reference data


• #Tst: Number of target reports in test data
• Callsign: Target identification(s)
• TA: Target address (hexadecimal)
• M3/A: Mode 3/A code(s) (octal)
• MC Min: Mode C code minimum [ft]
• MC Max: Mode C code maximum [ft]

Unless otherwise specified, the column content reflects the values from both reference
and test data.

After loading the data, the table can look as follows:



Figure 49: Evaluation Targets tab with loaded data



Filter Tab

Figure 50: Evaluation Filter tab

In the Filter tab, the data loaded for evaluation can be filtered.

Below, the following elements exist:


• ’Use Load Filter’: Toggles usage of the load filter
• ’Use Time Filter’: Toggles usage of the time filter
• ’Use ADS-B Filter’: Toggles usage of the ADS-B filter
• ’Use v0’: Toggles usage of MOPS Version 0
• ’Use Min NUCp’: Toggles usage of the minimum NUCp value filter
• ’Use Max NUCp’: Toggles usage of the maximum NUCp value filter
• ’Use v1’: Toggles usage of MOPS Version 1

• ’Use v2’: Toggles usage of MOPS Version 2
• ’Use Min NIC’: Toggles usage of the minimum NIC value filter
• ’Use Max NIC’: Toggles usage of the maximum NIC value filter
• ’Use Min NACp’: Toggles usage of the minimum NACp value filter
• ’Use Max NACp’: Toggles usage of the maximum NACp value filter
• ’Use Min SIL v1’: Toggles usage of the minimum SIL v1 value filter
• ’Use Max SIL v1’: Toggles usage of the maximum SIL v1 value filter
• ’Use Min SIL v2’: Toggles usage of the minimum SIL v2 value filter
• ’Use Max SIL v2’: Toggles usage of the maximum SIL v2 value filter

Standard Tab

Figure 51: Evaluation Standard tab



In the Standard tab, at the top the current standard can be selected.

Below, the following buttons exist:


• Add: Add a new standard with a unique name
• Rename: Rename the current standard
• Copy: Copy the current standard to a new one
• Remove: Delete current standard

At the bottom, the ’Reference Maximum Time Difference [s]’ field can be edited to
adjust the maximum time difference between reference updates. This value is used to find
time-adjacent reference target reports for test target reports. Please adjust this value to e.g.
the reference update period plus 1 second.

Currently, the following standards are (partly) supported:


• EUROCAE ED-116 (Link)
• EUROCAE ED-117/A (Link)
• EUROCAE ED-87C (Link)

Please note that, while the EUROCAE ED-117A has been tested using simulator data,
the following limitations exist:
• In the Stands area, the position accuracy should be calculated using a 5 second po-
sition averaging. This is currently not performed, since the averaging method is not
specified, and is currently being discussed with users.
• Some MLAT sensors use a different update rate for non-moving targets. This is
currently not considered, since what constitutes non-movement is not (generally)
specified, and will lower the probability of detection.

Please note that these limitations will be corrected in the near future.

The other standards have not yet been tested to the fullest degree, which is work in
progress.

Current Standard
Below that, the current standard is shown. On the left side, a tree-view exists showing:
• Standard name
– Requirement Group(s)
* Requirement(s)

Figure 52: Evaluation Standard tab: Add requirement

When clicking on the standard name, a menu is shown allowing adding new require-
ment groups (’Add Group’).

When clicking on a requirement group, a menu is shown allowing the following func-
tions:
• Delete Group: Delete the selected requirement group
• Add Requirement: Add a requirement of the selected type
• Delete Requirement: Delete the selected requirement

The following requirements exist:



Type                    Description
Detection               Calculates probability of detection
Dubious Targets         Calculates probability of dubious targets based on physical movement (based on test data only)
Dubious Tracks          Calculates probability of dubious tracks from Trackers (based on test data only)
Extra Data              Calculates probability of unused test data (based on test data only)
Extra Track             Calculates probability of undetected tracks (based on reference data only)
Identification Correct  Calculates probability of correct secondary identification
Identification False    Calculates probability of false secondary identification
Mode 3/A False          Calculates probability of false Mode 3/A code
Mode 3/A Present        Calculates probability of Mode 3/A code present
Mode C False            Calculates probability of false Mode C code
Mode C Present          Calculates probability of Mode C code present
Position Across         Calculates the probability of the position across error being within a threshold
Position Along          Calculates the probability of the position along error being within a threshold
Position Distance       Calculates the probability of the position error being within or outside a threshold
Position Latency        Calculates the probability of the position latency being within a threshold
Speed                   Calculates the probability of the speed error being within or outside a threshold or percentage

If a requirement is clicked, its configuration widget is shown on the right-hand side.

Each requirement has the following common attributes:


• Type: Type of the requirement
• Name: Name of the requirement
• Short Name: Abbreviated name of the requirement
• Comment: Any text comment, e.g. reference to source document
• Probability [1]: Probability threshold, as [0,1]
• Probability Check Type: Probability comparison type for calculated probability
against probability threshold

For detailed information about each requirement please refer to section Requirements.

Results Tab

Figure 53: Evaluation Results tab

Before the data has been evaluated, the results are empty.

There are several levels of detail for the results, and each sub-result is shown in a
tree-view on the left side, grouped into sections. Using this tree-view, the results can be
"navigated", and the currently selected results contents are shown on the right side.

More details will be described in the following section Results Inspection & Analysis.

Running
Load Data
After the desired configuration (in the Main tab) has been set, the ’Load Data’ button
can be clicked. This results in the reference/test data being loaded, after which a post-
processing step is performed.

Please note that the post-processing step uses all available cores on the CPU.

Figure 54: Evaluation: Post-processing after loading

The post-processing pre-calculates only which reference target reports can be used for
direct comparison with specific test target reports.

Therefore, please note that re-loading the data is only required when changes to the
reference/test data settings in the Main tab have been made. Changing requirements or
removing targets from evaluation does not require re-loading.

After the loading and the post-processing have been performed, all targets are shown
in the Targets tab.

Figure 55: Evaluation Targets tab after loading

Filtering Targets
The Targets tab is useful for removing certain targets from the evaluation (’Use’ checkbox)
and inspecting already removed ones.

Single rows can be selected by clicking on them, which triggers a loading process
showing this exact target (with all associated data) in the available Views. Please note that
this does not require re-loading the evaluation data, but can be used at all times during
the evaluation.

The ’Change Usage’ button can be used for the following actions:
• Use All: Enable usage for all UTNs
• Use None: Disable usage for all UTNs
• Clear Comments: Clear comments of all UTNs
• Filter: Start ’Filter UTNs’ dialog

The ’Filter UTNs’ dialog can be used to dynamically filter UTNs based on the configured
values:

Figure 56: Evaluation Filter UTNs dialog


• Remove Short Targets: Removes targets with a small number of target reports or
duration
• Remove Primary-Only Targets: Removes primary-only targets (w/o secondary at-
tributes)
• Remove Mode A/C Code onlys: Removes targets without Mode S attributes
• Remove by Mode A Code: Removes targets having a Mode A code given in the list
• Mode A Code Blacklist: Whether the Mode A codes should be used as blacklist or
whitelist
• Remove by Mode C Code: Removes targets having a Mode C code smaller than the
given value
• Remove by Target Address: Removes targets having a Mode S target address given
in the list
• Target Address Blacklist: Whether the Target Addresses should be used as blacklist
or whitelist
• Remove by Non-Detection of DBContent: Removes targets not being detected by a
given DBContent

When the ’Run’ button is clicked, all (enabled) targets are checked and are disabled if
any of the selected filters apply.

Figure 57: Evaluation Targets tab after filtering

Please note that the ’Change Usage’ button can also be used after the ’Evaluate’ button
has been used, which automatically updates the evaluation results.

Evaluation
After the data has been loaded, the configuration relating to the current standard, requirements
and sector/requirement group usage can be adapted. After that, the evaluation can be
(re-)run using the ’Evaluate’ button.

This will trigger evaluation of the requirements in all sectors (as configured). The
requirement values will be calculated for each target (whether flagged to be used or not). Then,
for each requirement and sector, the results are summed up as a per-sector average (including
only targets flagged to be used).

Please note that this processing step uses all available cores of the CPU.

Figure 58: Evaluation: Running evaluation status

The results are then shown in the Results tab.

Results Inspection & Analysis


As described before, there are several levels of detail in which sub-results can be inspected.

The uppermost is the ’Requirements→Overview’, giving the sector sums for all re-
quirements.

The next level of detail is the sector sum details, located in ’Sectors→Sector Layer
Name→Requirement Group Name→Requirement Name’.

The lowest level is the per-target details, located in ’Targets→UTN’ and the respective
per-target results located in ’Targets→UTN→Sector Layer Name→Requirement Group
Name→Requirement Name’.

By default, when single-clicking a row in a table the respective results are shown in the
existing Views. When double-clicking, a step into the next level of detail is performed (if
available).

Navigation can be made more efficient by returning to the previous sub-result using the
’Back’ button at the top left.

Overview

Figure 59: Evaluation results: Overview

Please note that the results are given as examples only and are not indicative of the
performance of any system currently in operation.

When single-clicking a row, the respective result errors are shown in the existing Views.

Figure 60: Evaluation results: Sector PD errors in OSGView

When double-clicking a row, a step into the respective sector sum details is performed.

Sector Details

Figure 61: Evaluation results: Sector Detail Example

On the left side, the current position in the results sections is shown. On the right, the
current results are shown. At the top, there is an overview table giving the details of the
calculation results in the respective sector layer and requirement.

At the bottom, further result details are listed per-target, sorted in this example by the
Probability of Detection (PD).

When single-clicking a row, the respective target data and result errors are shown in
the existing OSGViews.

Figure 62: Evaluation results: Target PD Errors in OSGView

When double-clicking a row, a step into the respective target details is performed.

Per-Target Details

Figure 63: Evaluation results: Per-Target PD Detail Example

On the left side, the current position in the results sections is shown. On the right, the
current results are shown. At the top, there is an overview table giving the details of the
calculation results for the target in the respective sector layer and requirement.

At the bottom, further result details are listed per-target-report, sorted in this example
by time.

When single-clicking a row, the respective target data and the respective single result
error are shown in the existing OSGViews.

Figure 64: Evaluation results: Target Single PD Error in OSGView

Generate Report
Using the "Export PDF" button, a PDF can be generated. A PDF can only be generated
if a Latex environment is installed on the workstation, as described in Appendix: Latex
Installation.

Figure 65: Evaluation results: Generate report dialog

At the top, the following configuration options exist:


• Report Path: Directory to which the report will be written.
• Report Filename: Filename of the report, to be created in the report path. File exten-
sion must be ’.tex’.
• Change Location: Button to set the report path and filename.
• Author: Author string, is added to the first page of the report.
• Abstract: Abstract string, is added to the first page of the report.
• Include Per-Target Details: Whether to include the per-target details.
• Include Per-Target Target Report Details: Whether to include the per-target target
report details.
• Wait on Map Loading: When OSGView screenshots are created, some maps require
downloading terrain from the Internet. This option enables waiting for the completion
of such activities, to generate high-quality screenshots. Disable it only when operating
on cached maps without an Internet connection.
• Run PDFLatex: Automatically runs the pdflatex compile process, immediately creat-
ing a PDF after finished export. Is disabled if command could not be found.
• Open Created PDF: Automatically opens the created PDF. Is disabled if pdflatex com-
mand could not be found.

Please note that the two ’Include ... Details’ options can produce very large PDF reports
(10,000+ pages), and may even overload the Latex sub-system (resulting in a ’TeX capacity
exceeded, sorry’ error). It is therefore recommended to only activate these options for
small datasets with very few sector layers.

The ’Run’ button starts the export process. At the bottom, status information and a
cancel button exist.

To run the export process, click the ’Run’ button.

Figure 66: Evaluation results: Generate report in progress


If the export process was successful, the dialog is closed automatically. The report Latex
file is written into the report directory, with screenshots in the ’screenshots’ subfolder.
If the respective options were set, a PDF is automatically generated and opened using
the default PDF application.

If a Latex error occurred, a message box with the error message is shown. If the ’TeX
capacity exceeded, sorry’ error is shown, disable one or both of the ’Include ... Details’
options.

Please note that the generated report can of course be edited by the user and re-
generated using pdflatex, which allows for better customization (e.g. adding details,
corporate identity etc.).
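
For example, an edited report could be re-compiled manually along the following lines (a minimal Python sketch; the paths are placeholders only, and pdflatex is typically run twice so that tables of contents and references resolve):

import subprocess
from pathlib import Path

report_dir = Path("/home/user/compass_report")   # placeholder 'Report Path'
report_tex = "report.tex"                        # placeholder 'Report Filename'

# Run pdflatex twice on the exported .tex file so cross-references resolve.
for _ in range(2):
    subprocess.run(["pdflatex", "-interaction=nonstopmode", report_tex],
                   cwd=report_dir, check=True)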

Requirements
Please note that the exact requirement calculation methods are quite complex and will be
added at a later point.

Detection
Configuration

Figure 67: Evaluation Detection requirement


The ’Detection’ requirement can be used to check whether targets are detected at all.
For each target existing in the reference data (within the current sector), the test data must
contain a detection within each test update interval (or another given interval). Missed update
intervals are counted as either misses or gaps, which are used to calculate a probability of
detection that has to fulfill a given threshold.

• Probability [1]: Probability of detection


• Probability Check Type: ≥
• Update Interval [s]: Update interval of the test data
• Use Minimum Gap Length: Checkbox if minimum gap length should be used
• Minimum Gap Length [s]: Minimum gap length to be considered
• Use Maximum Gap Length: Checkbox if maximum gap length should be used
• Maximum Gap Length [s]: Maximum gap length to be considered
• Use Miss Tolerance: Checkbox if miss tolerance should be used
• Miss Tolerance [s]: Acceptable time delta for miss detection

Calculation
As a summary, the reference is used to calculate the number of expected update intervals
inside the sector layer (#EUI). Then, for the test data, if the reference exists at the time,
time differences between target reports are checked and the number of misses/gaps are
calculated as number of missed update intervals (#MUI).

Gaps are, if a minimum or maximum gap length is used, only counted if the detected
gap fulfills the thresholds.

The ratio of #MUI to #EUI gives the probability of a missed update interval; the
counter-probability gives the Probability of Detection (PD). The PD must be greater than
or equal to the defined ’Probability’ for the requirement to pass.
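
As an illustration only, a minimal Python sketch of this calculation, using the example values from the sector table below (names are illustrative and not part of the application):

def probability_of_detection(num_euis, num_muis):
    # PD = 1 - (#MUI / #EUI), returned in percent
    return 100.0 * (1.0 - num_muis / num_euis) if num_euis else 0.0

pd = probability_of_detection(7960, 2221)   # example sector values -> ~72.10 %
requirement_passed = pd >= 90.0             # 'Condition >= 90.00' -> Failed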

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Detection
Num Results Total number of results 728
Num Usable Results Number of usable results 417
Num Unusable Results Number of unusable results 311
#Updates/#EUIs [1] Total number of update intervals 7960
#MUIs [1] Number of missed update intervals 2221
PD [%] Probability of Detection 72.10
Condition >= 90.00
Condition Fulfilled Failed

Also, a table is given for all single targets, sorted by PD.

Single Target

Name Description Example


Use To be used in results true
#EUIs [1] Expected Update Intervals 6
#MUIs [1] Missed Update Intervals 5
PD [%] Probability of Detection 16.67
Reference Period 0 Time inside sector [15:47:22.680,15:47:45.828]
Reference Period 1 Time inside sector [15:47:53.844,15:47:57.844]
Condition >= 90.00
Condition Fulfilled Failed

Dubious Targets
Configuration

Figure 68: Evaluation Dubious Targets

The ’Dubious Targets’ requirement can be used to check for dubious movement reported
by a data source. This requirement is based on test data only, so the reference data is of
no importance.

For each track number (existing in an UTN) a number of checks are performed, and a
probability of dubious target report is calculated. Which checks are used can be defined
as follows, but are focused on short tracks or physically dubious movement.

• Probability [1]: Probability of dubious target


• Probability Check Type: ≥
• Minimum Comparison Time [s]: Skip movement checks if time between updates is
smaller than the defined time
• Maximum Comparison Time [s]: Skip movement checks if time between updates is
larger than the defined time
• Mark Primary-Only: Checkbox if all primary-only tracks should be counted as dubi-
ous
• Use Minimum Updates: Checkbox if tracks with less than the defined number of
updates should be counted as dubious
• Minimum Updates [1]: Minimum number of updates
• Use Minimum Duration: Checkbox if tracks with a duration less than the defined
time should be counted as dubious
• Minimum Duration [s]: Minimum duration
• Use Maximum Groundspeed: Checkbox if maximum groundspeed should be used
• Maximum Groundspeed [kts]: Maximum groundspeed to be considered
• Use Maximum Acceleration: Checkbox if maximum acceleration should be used
• Maximum Acceleration [m/s2 ]: Maximum acceleration to be considered
• Use Maximum Turnrate: Checkbox if maximum turnrate should be used
• Maximum Turnrate [deg/s]: Maximum turnrate to be considered
• Use Maximum ROCD: Checkbox if maximum rate of climb/descent should be used
• Maximum ROCD [ft/s]: Maximum rate of climb/descent to be considered
• Dubious Probability [1]: Probability of dubious target report to classify a track as
dubious

Calculation
As a summary, the test data is used to calculate the number of dubious targets in relation
to the total number of targets, which gives the probability of dubious target (PDT). If this
probability is larger than the required one, the requirement is failed.

A track is ignored (in the dubious detection) if the last-updating sensor parameter is
active and the track was updated at any point by any other sensor.

All track updates are marked as dubious if any of the following cases hold:
• ’Mark Primary-Only’ is used, and the track is always primary-only (no secondary
attributes)
• ’Minimum Updates’ are used, and the number of track updates is smaller than the
required threshold
• ’Minimum Duration’ is used, and the duration of the track is smaller than the re-
quired threshold

For the movement checks, for each track update the movement is checked. If the Mini-
mum/Maximum Comparison Time check fails, the respective track update is skipped (not
checked).

The following checks can be performed:


• ’Maximum Groundspeed’: The track update’s groundspeed (as output from the data
source, as well as calculated from position differences) is checked against the thresh-
old
• ’Maximum Acceleration’: The track update’s groundspeed derivative (difference to
the previous value divided by time delta) is checked against the threshold
• ’Maximum Turnrate’: The track update’s track-angle derivative (difference to the
previous value divided by time delta) is checked against the threshold
• ’Maximum ROCD’: The track update’s Mode C derivative (difference to the previous
value divided by time delta) is checked against the threshold

For each track, the number of target reports failing the movement checks is calculated,
and its ratio to the total number of track updates gives the probability of dubious update
(PDU). If this ratio is larger than the ’Dubious Probability’, the track is marked as dubious.
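
The following minimal Python sketch illustrates the per-track logic described above; the field names, units and default thresholds are assumptions for illustration only, not the application's internals:

def count_dubious_updates(updates, max_speed_kts=600.0, max_accel_mps2=15.0,
                          max_turnrate_degps=10.0, max_rocd_ftps=100.0,
                          min_dt=0.5, max_dt=30.0):
    """updates: time-sorted dicts with 'time' [s], 'speed' [kts], 'track_angle'
    [deg] and 'mode_c' [ft]; returns the number of updates failing a movement
    check (track-angle wrap-around ignored for brevity)."""
    dubious = 0
    for prev, curr in zip(updates, updates[1:]):
        dt = curr['time'] - prev['time']
        if dt < min_dt or dt > max_dt:   # comparison-time window -> skip update
            continue
        accel = abs(curr['speed'] - prev['speed']) * 0.514444 / dt   # kts -> m/s
        turnrate = abs(curr['track_angle'] - prev['track_angle']) / dt
        rocd = abs(curr['mode_c'] - prev['mode_c']) / dt
        if (curr['speed'] > max_speed_kts or accel > max_accel_mps2
                or turnrate > max_turnrate_degps or rocd > max_rocd_ftps):
            dubious += 1
    return dubious

def is_dubious_track(updates, dubious_probability=0.05, **thresholds):
    """A track is dubious if its ratio of dubious updates (PDU) exceeds the
    configured 'Dubious Probability'."""
    if not updates:
        return False
    pdu = count_dubious_updates(updates, **thresholds) / len(updates)
    return pdu > dubious_probability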

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_body_cut
Requirement Group Name of the requirement group Required
Requirement Name of the requirement Dubious
Num Results Total number of results 690
Num Usable Results Number of usable results 504
Num Unusable Results Number of unusable results 186
Use To be used in results true
#Pos [1] Number of updates 44697
#PosInside [1] Number of updates inside sector 33498
#PosOutside [1] Number of updates outside sector 11199
#DU [1] Number of dubious updates inside sector 2691
PDU [%] Probability of dubious update 8.03
#T [1] Number of targets 594
#DT [1] Number of dubious targets 394
Duration [s] Duration of all targets 129462.93
Duration Dubious [s] Duration of dubious targets 14551.83
Duration Non-Dubious [s] Duration of non-dubious targets 114911.10
Average Duration Dubious [s] Average duration of dubious targets 36.93
Duration Ratio Dubious [%] Duration ratio of dubious targets 11.24
Duration Ratio Non-Dubious [%] Duration ratio of non-dubious targets 88.76
PDT [%] Probability of dubious target 66.33
Condition <= 90.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PDT.

Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 354
#PosInside [1] Number of updates inside sector 354
#PosOutside [1] Number of updates outside sector 0
#DU [1] Number of dubious updates inside sector 33
PDU [%] Probability of dubious update 9.32
#T [1] Number of targets 7
#DT [1] Number of dubious targets 4
Duration [s] Duration of all targets 1339.86
Duration Dubious [s] Duration of dubious targets 92.50
Duration Non-Dubious [s] Duration of non-dubious targets 1247.36
Average Duration Dubious [s] Average duration of dubious targets 23.13
Duration Dubious Ratio [%] Duration ratio of dubious targets 6.9
Duration Non-Dubious Ratio [%] Duration ratio of non-dubious targets 93.1
PDT [%] Probability of dubious target
Condition <= 90.00
Condition Fulfilled Failed

Dubious Tracks
Configuration

Figure 69: Evaluation Dubious Tracks

The ’Dubious Tracks’ requirement can be used to check for dubious tracks generated by a
Tracker data source (using the attributed track number). This requirement is based on test
data only, so the reference data is of no importance.

For each track number (existing in an UTN) a number of checks are performed, and a
probability of dubious target report is calculated. Which checks are used can be defined
as follows, but are focused on short tracks or physically dubious movement.

• Probability [1]: Probability of dubious track


• Probability Check Type: ≥
• Use only from L.U. Sensor: Checkbox if only data from a single last-updating sensor
id should be considered
• L.U. Sensor [1]: Last-updating sensor id (number, as in the ’Manage Data-Sources’
task)
• Minimum Comparison Time [s]: Skip movement checks if time between updates is
smaller than the defined time
• Maximum Comparison Time [s]: Skip movement checks if time between updates is
larger than the defined time
• Mark Primary-Only: Checkbox if all primary-only tracks should be counted as dubi-
ous
• Use Minimum Updates: Checkbox if tracks with less than the defined number of
updates should be counted as dubious
• Minimum Updates [1]: Minimum number of updates
• Use Minimum Duration: Checkbox if tracks with a duration less than the defined
time should be counted as dubious
• Minimum Duration [s]: Minimum duration
• Use Maximum Groundspeed: Checkbox if maximum groundspeed should be used
• Maximum Groundspeed [kts]: Maximum groundspeed to be considered
• Use Maximum Acceleration: Checkbox if maximum acceleration should be used
• Maximum Acceleration [m/s2 ]: Maximum acceleration to be considered
• Use Maximum Turnrate: Checkbox if maximum turnrate should be used
• Maximum Turnrate [deg/s]: Maximum turnrate to be considered
• Use Maximum ROCD: Checkbox if maximum rate of climb/descent should be used
• Maximum ROCD [ft/s]: Maximum rate of climb/descent to be considered
• Dubious Probability [1]: Probability of dubious target report to classify a track as
dubious

Calculation
As a summary, the test data is used to calculate the number of dubious tracks in relation
to the total number of tracks, which gives the probability of dubious track (PDT). If this
probability is larger than the required one, the requirement is failed.

A track is ignored (in the dubious detection) if the last-updating sensor parameter is
active and the track was updated at any point by any other sensor.

All track updates are marked as dubious if any of the following cases hold:

• ’Mark Primary-Only’ is used, and the track is always primary-only (no secondary
attributes)
• ’Minimum Updates’ are used, and the number of track updates is smaller than the
required threshold
• ’Minimum Duration’ is used, and the duration of the track is smaller than the re-
quired threshold

For the movement checks, for each track update the movement is checked. If the Mini-
mum/Maximum Comparison Time check fails, the respective track update is skipped (not
checked).

The following checks can be performed:


• ’Maximum Groundspeed’: The track update’s groundspeed (as output from the
Tracker) is checked against the threshold
• ’Maximum Acceleration’: The track update’s groundspeed derivative (difference to
the previous value divided by time delta) is checked against the threshold
• ’Maximum Turnrate’: The track update’s track-angle derivative (difference to the
previous value divided by time delta) is checked against the threshold
• ’Maximum ROCD’: The track update’s Mode C derivative (difference to the previous
value divided by time delta) is checked against the threshold

For each track, the number of target reports failing the movement checks is calculated,
and its ratio to the total number of track updates gives the probability of dubious update
(PDU). If this ratio is larger than the ’Dubious Probability’, the track is marked as dubious.
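
For the sector-level summary, a minimal Python sketch of the PDT ratio, using the example values from the sector table below (names are illustrative):

def probability_of_dubious_track(num_dubious_tracks, num_tracks):
    # PDT = #DT / #T, returned in percent
    return 100.0 * num_dubious_tracks / num_tracks if num_tracks else 0.0

pdt = probability_of_dubious_track(394, 594)   # example sector values -> ~66.33 %
requirement_passed = pdt <= 90.0               # 'Condition <= 90.00' -> Passed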

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_body_cut
Requirement Group Name of the requirement group Required
Requirement Name of the requirement Dubious
Num Results Total number of results 690
Num Usable Results Number of usable results 504
Num Unusable Results Number of unusable results 186
Use To be used in results true
#Pos [1] Number of updates 44697
#PosInside [1] Number of updates inside sector 33498
#PosOutside [1] Number of updates outside sector 11199
#DU [1] Number of dubious updates inside sector 2691
PDU [%] Probability of dubious update 8.03
#T [1] Number of tracks 594
#DT [1] Number of dubious tracks 394
Duration [s] Duration of all tracks 129462.93
Duration Dubious [s] Duration of dubious tracks 14551.83
Duration Non-Dubious [s] Duration of non-dubious tracks 114911.10
Average Duration Dubious [s] Average duration of dubious tracks 36.93
Duration Ratio Dubious [%] Duration ratio of dubious tracks 11.24
Duration Ratio Non-Dubious [%] Duration ratio of non-dubious tracks 88.76
PDT [%] Probability of dubious track 66.33
Condition <= 90.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PDT.

Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 354
#PosInside [1] Number of updates inside sector 354
#PosOutside [1] Number of updates outside sector 0
#DU [1] Number of dubious updates inside sector 33
PDU [%] Probability of dubious update 9.32
#T [1] Number of tracks 7
#DT [1] Number of dubious tracks 4
Duration [s] Duration of all tracks 1339.86
Duration Dubious [s] Duration of dubious tracks 92.50
Duration Non-Dubious [s] Duration of non-dubious tracks 1247.36
Average Duration Dubious [s] Average duration of dubious tracks 23.13
Duration Dubious Ratio [%] Duration ratio of dubious tracks 6.9
Duration Non-Dubious Ratio [%] Duration ratio of non-dubious tracks 93.1
PDT [%] Probability of dubious track
Condition <= 90.00
Condition Fulfilled Failed

Extra Data
Configuration

Figure 70: Evaluation Extra Data requirement


The ’Extra Data’ requirement is a recommended addition to the ’Detection’ requirement,


while not mandated by any standard known to the author.

While the ’Detection’ requirement detects "missing" test data, it ignores test data for
which no reference exists, which might indicate issues in the reference data that might
be of interest in the evaluation.

The ’Extra Data’ requirement detects "extra" test data, i.e. test data for which no ref-
erence exists (and fulfills possible constraints), and calculates the number of extra target
reports. Based on the number of target reports which are extra, and the number of target
reports which are also detected in the reference, the Probability of Extra (PEx) data is cal-
culated. The PEx must be less than or equal to the defined ’Probability’ for the requirement
to pass.

• Probability [1]: Probability of extra data


• Probability Check Type: ≤
• Minimum Duration [s]: Minimum track duration, requirement result is ignored if
less
• Minimum Number of Updates [1]: Minimum number of extra target reports, require-
ment result is ignored if less
• Ignore Primary Only: Requirement result is ignored if target is primary only (has no
secondary attributes, also not in reference)
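
As an illustration, a minimal Python sketch of the PEx ratio, using the example values from the sector table below (names are illustrative):

def probability_of_extra(num_extra, num_checked):
    # PEx = #Extra / #Check., returned in percent
    return 100.0 * num_extra / num_checked if num_checked else 0.0

pex = probability_of_extra(21721, 131343)   # example sector values -> ~16.54 %
requirement_passed = pex <= 3.0             # 'Condition <= 3.00' -> Failed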

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Extra Data
Num Results Total number of results 728
Num Usable Results Number of usable results 250
Num Unusable Results Number of unusable results 478
#Check. Number of checked test updates 131343
#OK. Number of OK test updates 109622
#Extra Number of extra test updates 21721
PEx [%] Probability of extra test update 16.54
Condition <= 3.00
Condition Fulfilled Failed

Also, a table is given for all single targets, sorted by PEx.


Single Target

Name Description Example


Use To be used in results true
#Ref [1] Number of reference updates 157
#Tst [1] Number of test updates 614
Ign. Ignore target false
#Check. Number of checked test updates 590
#OK. Number of OK test updates 329
#Extra Number of extra test updates 261
PEx [%] Probability of update with extra data 44.24
Condition <= 3.00
Condition Fulfilled Failed

Extra Track
Configuration

Figure 71: Evaluation Extra Track requirement

The ’Extra Track’ requirement is useful for Tracker evaluation, and detects if more than
one test track exists for a target.

First the time period of each track (by occurrence of the track number, with a maximum
time difference of 5 minutes) is calculated. Then, for each test target report, it is checked if
multiple track number periods match, and the report is counted as an extra update if more
than one matches. Based on the number of target reports which are extra, and the number
of target reports which are also detected in the reference, the Probability of Extra (PEx) data
is calculated. The PEx must be less than or equal to the defined ’Probability’ for the
requirement to pass.
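
The following Python sketch illustrates the period-building and overlap check described above; the field names and data layout are assumptions for illustration only:

def track_periods(reports, max_gap=300.0):
    """Group reports by track number and split into (begin, end) time periods
    whenever the gap between consecutive reports exceeds max_gap (5 minutes)."""
    by_track = {}
    for r in sorted(reports, key=lambda r: r['time']):
        by_track.setdefault(r['track_num'], []).append(r['time'])
    periods = []
    for times in by_track.values():
        begin = prev = times[0]
        for t in times[1:]:
            if t - prev > max_gap:
                periods.append((begin, prev))
                begin = t
            prev = t
        periods.append((begin, prev))
    return periods

def count_extra_updates(reports):
    """A test report counts as 'extra' if more than one track-number period
    covers its time of detection."""
    periods = track_periods(reports)
    return sum(1 for r in reports
               if sum(b <= r['time'] <= e for b, e in periods) > 1)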

• Probability [1]: Probability of extra data


• Probability Check Type: ≤
• Minimum Duration [s]: Minimum track duration, requirement result is ignored if
less
• Minimum Number of Updates [1]: Minimum number of extra target reports, require-
ment result is ignored if less
• Ignore Primary Only: Requirement result is ignored if target is primary only (has no
secondary attributes, also not in reference)

Please note that currently, in the case of multiple tracks existing at the same time, the
requirement does not decide which track is correct and which one is extra; therefore all
target reports are counted as being extra.

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Extra Track
Num Results Total number of results 728
Num Usable Results Number of usable results 110
Num Unusable Results Number of unusable results 618
#Check. Number of checked test track updates 56106
#OK. Number of OK test track updates 56106
#Extra Number of extra test track updates 0
PEx [%] Probability of update with extra track 0.00
Condition <= 0.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PEx.

Single Target

Name Description Example


Use To be used in results true
#Tst [1] Number of test updates 566
Ign. Ignore target false
#Check. Number of checked test track updates 327
#OK. Number of OK test track updates 327
#Extra Number of extra test track updates 0
PEx [%] Probability of update with extra track 0
Condition <= 0.00
Condition Fulfilled Passed

Identification Correct
Configuration

Figure 72: Evaluation Identification Correct requirement

The ’Identification Correct’ requirement is used to calculate the probability of a target


report having a correct (secondary) identification. Correct in this context means there is
identification data available, and it is the same as in the reference.

• Probability [1]: Probability of correct identification


• Probability Check Type: ≥
• Require correctness of All: If checked, all available secondary attributes must match
the reference. If not checked, a single matching secondary attribute is enough.
• Use Mode 3/A Code: If the Mode 3/A code should be checked
• Use Mode S Target Address: If the Mode S target address should be checked
• Use Mode S Target Identification: If the Mode S target identification should be
checked
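
The following minimal Python sketch illustrates the 'all vs. any' matching logic described by these options; attribute names are illustrative assumptions, not the application's internal fields:

def identification_correct(tst, ref, require_all=True,
                           use_m3a=True, use_address=True, use_ident=True):
    """Compare the available secondary attributes of a test report against the
    time-adjacent reference; 'require_all' mirrors 'Require correctness of All'."""
    checks = []
    if use_m3a and tst.get('mode_3a') is not None:
        checks.append(tst['mode_3a'] == ref.get('mode_3a'))
    if use_address and tst.get('address') is not None:
        checks.append(tst['address'] == ref.get('address'))
    if use_ident and tst.get('ident') is not None:
        checks.append(tst['ident'] == ref.get('ident'))
    if not checks:   # no identification data available in the test report
        return False
    return all(checks) if require_all else any(checks)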

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Identification Correct
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
#Updates Total number of target reports 101685
#NoRef [1] Number of updates w/o reference position or identification 7359
#NoRefPos [1] Number of updates w/o reference position 7359
#NoRef [1] Number of updates w/o reference identification 0
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
#CID [1] Number of updates with correct identification 53997
#NCID [1] Number of updates with no correct identification 0
POK [%] Probability of correct identification 100.00
Condition >= 90.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by POK.

Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 892
#NoRef [1] Number of updates w/o reference position or identification 84
#NoRefPos [1] Number of updates w/o reference position 84
#NoRef [1] Number of updates w/o reference identification 0
#PosInside [1] Number of updates inside sector 563
#PosOutside [1] Number of updates outside sector 245
#CID [1] Number of updates with correct identification 563
#NCID [1] Number of updates with no correct identification 0
POK [%] Probability of correct identification 100
Condition >= 90.00
Condition Fulfilled Passed

Identification False
Configuration

Figure 73: Evaluation Identification False requirement

The ’Identification False’ requirement is used to calculate the probability of a target report
having a false (secondary) identification. False in this context means there is identification
data available, and it is not the same as in the reference.

• Probability [1]: Probability of false identification


• Probability Check Type: ≤


• Require All False: If checked, all available secondary attributes must be different from
the reference to count. If not checked, a single wrong secondary attribute is enough.
• Use Mode 3/A Code: If the Mode 3/A code should be checked
• Use Mode S Target Address: If the Mode S target address should be checked
• Use Mode S Target Identification: If the Mode S target identification should be
checked

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Identification False
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Up [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference position or identification 7359
#NoRefPos [1] Number of updates w/o reference position 7359
#NoRef [1] Number of updates w/o reference identification 0
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
#Unknown [1] Number of updates with unknown identification 0
#Correct [1] Number of updates with correct identification 53997
#False [1] Number of updates with false identification 0
PF [%] Probability of false identification 0
Condition <= 1.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PF.


Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 566
#NoRef [1] Number of updates w/o reference position or identification 51
#NoRefPos [1] Number of updates w/o reference position 51
#NoRef [1] Number of updates w/o reference identification 0
#PosInside [1] Number of updates inside sector 292
#PosOutside [1] Number of updates outside sector 223
#Unknown [1] Number of updates with unknown identification 0
#Correct [1] Number of updates with correct identification 292
#False [1] Number of updates with false identification 0
PF [%] Probability of false identification 0
Condition <= 1.00
Condition Fulfilled Passed

Mode 3/A False


Configuration

Figure 74: Evaluation Mode 3/A False requirement

The ’Mode 3/A False’ requirement is used to calculate the probability of a target report
having a false Mode 3/A code. False in this context means there is Mode 3/A information
available, and it is not the same as in the reference. Only values which are valid and
not garbled are used.

• Probability [1]: Probability of false Mode 3/A code


• Probability Check Type: ≤

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Mode A False
Num Results Total number of results 728
Num Usable Results Number of usable results 106
Num Unusable Results Number of unusable results 622
Use To be used in results true
#Up [1] Number of updates 101669
#NoRef [1] Number of updates w/o reference position or code 7353
#NoRefPos [1] Number of updates w/o reference position 7353
#NoRef [1] Number of updates w/o reference code 0
#PosInside [1] Number of updates inside sector 53987
#PosOutside [1] Number of updates outside sector 40329
#Unknown [1] Number of updates unknown code 198
#Correct [1] Number of updates with correct code 53788
#False [1] Number of updates with false code 1
PF [%] Probability of Mode 3/A false 0
Condition <= 1.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PF.

Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 732
#NoRef [1] Number of updates w/o reference position or code 35
#NoRefPos [1] Number of updates w/o reference position 35
#NoRef [1] Number of updates w/o reference code 0
#PosInside [1] Number of updates inside sector 542
#PosOutside [1] Number of updates outside sector 155
#Unknown [1] Number of updates unknown code 5
#Correct [1] Number of updates with correct code 536
#False [1] Number of updates with false code 1
PF [%] Probability of Mode 3/A false 0.19
Condition <= 1.00
Condition Fulfilled Passed

Mode 3/A Present


Configuration

Figure 75: Evaluation Mode 3/A Present requirement

The ’Mode 3/A Present’ requirement is used to calculate the probability of a target report
having any Mode 3/A code. Present in this context means there is Mode 3/A information
available, irrespective of whether it is correct or not.

• Probability [1]: Probability of Mode 3/A code present


• Probability Check Type: ≥

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Mode A Present
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Up [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference position 7359
#NoRefPos [1] Number of updates w/o reference position 7359
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
#NoRefId [1] Number of updates without reference code 10
#Present [1] Number of updates with present tst code 53789
#Missing [1] Number of updates with missing tst code 198
PP [%] Probability of Mode 3/A present 99.63
Condition >= 98.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PP.

Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 956
#NoRef [1] Number of updates w/o reference position 60
#NoRefPos [1] Number of updates w/o reference position 60
#PosInside [1] Number of updates inside sector 467
#PosOutside [1] Number of updates outside sector 429
#NoRefId [1] Number of updates without reference code 0
#Present [1] Number of updates with present tst code 456
#Missing [1] Number of updates with missing tst code 11
PP [%] Probability of Mode 3/A present 97.64
Condition >= 98.00
Condition Fulfilled Failed

Mode C False
Configuration

Figure 76: Evaluation Mode C False requirement

The ’Mode C False’ requirement is used to calculate the probability of a target report
having a false Mode C code. False in this context means there is Mode C information
available, and the absolute difference between the test and the reference is larger than the
given threshold.

• Probability [1]: Probability of false Mode C code


• Probability Check Type: ≤
• Maximum Difference [ft]: Maximum altitude difference between the test and the
reference, in feet

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Mode C False
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Up [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference position or code 7363
#NoRefPos [1] Number of updates w/o reference position 7359
#NoRef [1] Number of updates w/o reference code 4
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
#Unknown [1] Number of updates unknown code 40
#Correct [1] Number of updates with correct code 53935
#False [1] Number of updates with false code 18
PF [%] Probability of Mode C false 0.03
Condition <= 3.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PF.

Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 709
#NoRef [1] Number of updates w/o reference position or code 30
#NoRefPos [1] Number of updates w/o reference position 30
#NoRef [1] Number of updates w/o reference code 0
#PosInside [1] Number of updates inside sector 591
#PosOutside [1] Number of updates outside sector 88
#Unknown [1] Number of updates unknown code 0
#Correct [1] Number of updates with correct code 585
#False [1] Number of updates with false code 6
PF [%] Probability of Mode C false 1.02
Condition <= 3.00
Condition Fulfilled Passed

Mode C Present
Configuration

Figure 77: Evaluation Mode C Present requirement

The ’Mode C Present’ requirement is used to calculate the probability of a target report
having any Mode C code. Present in this context means there is Mode C information
available, irrespective of whether it is correct or not.

• Probability [1]: Probability of Mode C code present


• Probability Check Type: ≥

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Mode C Present
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Up [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference position 7359
#NoRefPos [1] Number of updates w/o reference position 7359
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
#NoRefC [1] Number of updates without reference code 4
#Present [1] Number of updates with present tst code 53953
#Missing [1] Number of updates with missing tst code 40
PP [%] Probability of Mode C present 99.93
Condition >= 97.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PP.

Single Target

Name Description Example


Use To be used in results true
#Up [1] Number of updates 1041
#NoRef [1] Number of updates w/o reference position 92
#NoRefPos [1] Number of updates w/o reference position 92
#PosInside [1] Number of updates inside sector 406
#PosOutside [1] Number of updates outside sector 543
#NoRefC [1] Number of updates without reference code 0
#Present [1] Number of updates with present tst code 398
#Missing [1] Number of updates with missing tst code 8
PP [%] Probability of Mode C present 98.03
Condition >= 97.00
Condition Fulfilled Passed

Position Across
Configuration

Figure 78: Evaluation Position Across requirement

The ’Position Across’ requirement is used to calculate the probability of a target report
having an across-track error smaller than a defined threshold. The offset of the position
(test vs. linearly interpolated reference position) is used to calculate the error component
across the track angle of the reference at the time. If the absolute value of this across-track
position error is smaller than or equal to the defined threshold, the target report is counted
for the calculated probability PACOK. The PACOK must be greater than or equal to the
defined ’Probability’ for the requirement to pass.
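
The following sketch illustrates one way to decompose the position offset into along- and across-track components relative to the reference track angle; it uses a local flat-earth approximation and illustrative names, and is not the application's exact implementation. The same decomposition applies to the ’Position Along’ requirement described below.

import math

def along_across_error(dx_east_m, dy_north_m, ref_track_angle_deg):
    """Split the offset (test minus interpolated reference, in meters) into
    along- and across-track components; the reference track angle is assumed
    to be degrees clockwise from north."""
    a = math.radians(ref_track_angle_deg)
    along = dx_east_m * math.sin(a) + dy_north_m * math.cos(a)    # along the track
    across = dx_east_m * math.cos(a) - dy_north_m * math.sin(a)   # perpendicular
    return along, across

def across_ok(across_m, max_abs_value_m):
    return abs(across_m) <= max_abs_value_m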

• Probability [1]: Probability of acceptable across-track position


• Probability Check Type: ≥


• Maximum Absolute Value [m]: Maximum absolute across-track position difference
between the test and the reference, in meters

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Across
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Pos [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference positions 7359
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
ACMin [m] Minimum of across-track error -444.23
ACMax [m] Maximum of across-track error 463.83
ACAvg [m] Average of across-track error 4.19
ACSDev [m] Standard Deviation of across-track error 24.81
ACVar [m2 ] Variance of across-track error 615.60
#ACOK [1] Number of updates with acceptable across-track error 52961
#ACNOK [1] Number of updates with unacceptable across-track error 1036
PACOK [%] Probability of acceptable across-track error 98.08
Condition Across >= 90.00
Condition Across Fulfilled Passed

Also, a table is given for all single targets, sorted by PACOK.

Single Target

Name Description Example


Use To be used in results true
#Pos [1] Number of updates 515
#NoRef [1] Number of updates w/o reference positions 45
#PosInside [1] Number of updates inside sector 186
#PosOutside [1] Number of updates outside sector 284
ACMin [m] Minimum of across-track error -28.77
ACMax [m] Maximum of across-track error 77.88
ACAvg [m] Average of across-track error 1.85
ACSDev [m] Standard Deviation of across-track error 12.83
ACVar [m2 ] Variance of across-track error 164.55
#ACOK [1] Number of updates with acceptable across-track error 185
#ACNOK [1] Number of updates with unacceptable across-track error 1
PACOK [%] Probability of acceptable across-track error 99.46
Condition Across >= 90.00
Condition Across Fulfilled Passed

Position Along
Configuration

Figure 79: Evaluation Position Along requirement

The ’Position Along’ requirement is used to calculate the probability of a target report
having an along-track error smaller than a defined threshold. The offset of the position
(test vs. linearly interpolated reference position) is used to calculate the error component
along the track angle of the reference at the time. If the absolute value of this along-track
position error is smaller than or equal to the defined threshold, the target report is counted
for the calculated probability PALOK. The PALOK must be greater than or equal to the
defined ’Probability’ for the requirement to pass.

• Probability [1]: Probability of acceptable along-track position


• Probability Check Type: ≥
• Maximum Absolute Value [m]: Maximum absolute along-track position difference
between the test and the reference, in meters

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Along
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Pos [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference positions 7359
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
ALMin [m] Minimum of along-track error -1094.27
ALMax [m] Maximum of along-track error 828.16
ALAvg [m] Average of along-track error 18.37
ALSDev [m] Standard Deviation of along-track error 30.47
ALVar [m2 ] Variance of along-track error 928.43
#ALOK [1] Number of updates with acceptable along-track error 51854
#ALNOK [1] Number of updates with unacceptable along-track error 2143
PALOK [%] Probability of acceptable along-track error 96.03
Condition Along >= 90.00
Condition Along Fulfilled Passed

Also, a table is given for all single targets, sorted by PALOK.

Single Target

Name Description Example


Use To be used in results true
#Pos [1] Number of updates 865
#NoRef [1] Number of updates w/o reference positions 59
#PosInside [1] Number of updates inside sector 482
#PosOutside [1] Number of updates outside sector 324
ALMin [m] Minimum of along-track error -91.41
ALMax [m] Maximum of along-track error 100.38
ALAvg [m] Average of along-track error 42.52
ALSDev [m] Standard Deviation of along-track error 18.89
ALVar [m2 ] Variance of along-track error 356.75
#ALOK [1] Number of updates with acceptable along-track error 453
#ALNOK [1] Number of updates with unacceptable along-track error 29
PALOK [%] Probability of acceptable along-track error 93.98
Condition Along >= 90.00
Condition Along Fulfilled Passed

Position Distance
Configuration
This requirement can be used in two variations:

Figure 80: Evaluation Position Distance requirement for correct positions


Figure 81: Evaluation Position Distance requirement for false positions

The ’Position Distance’ requirement is used to calculate the probability of a target


report having a position error smaller or larger than a defined threshold. These two
possibilities allow checking whether a position is correct or false (e.g. outside a 5σ range
of an expected error distribution).

The offset of the position (test vs. linearly interpolated reference position) is used to
calculate the Euclidean error distance. If the value of this position error fails the defined
comparison threshold, the target report is counted for the calculated probability PCP
(probability of check passed). The PCP must in turn pass the probability check for the
requirement to pass.

The ’Failed Values are of Interest’ checkbox defines whether the target reports passing or
failing the check are of interest.
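
As an illustration, a minimal Python sketch of the PCP computation for both variations; names are illustrative, and the distances are the Euclidean test-vs-reference offsets:

def probability_check_passed(distances_m, threshold_m, check_type="<="):
    """Fraction (in percent) of distances passing the configured comparison."""
    if not distances_m:
        return 0.0
    passed = sum((d <= threshold_m) if check_type == "<=" else (d >= threshold_m)
                 for d in distances_m)
    return 100.0 * passed / len(distances_m)

# 'Position Correct' variation: distances must be <= threshold, PCP checked with >=.
# 'Position False' variation:   distances must be >= threshold, PCP checked with <=.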

Position Correct Variation


• Probability [1]: Probability of correct position
• Probability Check Type: ≥
• Threshold Value [m]: Maximum allowed distance from test target report to reference
• Threshold Value Check Type: ≤, distance must be less than or equal to the given threshold
• Failed Values are of Interest: Checked, the distances of interest are the ones not pass-
ing the check

Position False Variation


• Probability [1]: Probability of false position
• Probability Check Type: ≤


• Threshold Value [m]: Minimum distance from test target report to reference
• Threshold Value Check Type: ≥, distance must be greater than or equal to the given
threshold
• Failed Values are of Interest: Not checked, the distances of interest are the ones pass-
ing the check

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Position Correct
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Pos [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference positions 7359
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
DMin [m] Minimum of distance 0.09
DMax [m] Maximum of distance 1115.68
DAvg [m] Average of distance 34.06
DSDev [m] Standard Deviation of distance 27.19
DVar [m2 ] Variance of distance 739.18
#CF [1] Number of updates with failed comparison 291
#CP [1] Number of updates with passed comparison 53706
PCP [%] Probability of passed comparison 99.46
Condition >= 90.00
Condition Fulfilled Passed

Also, a table is given for all single targets, sorted by PCP.


Single Target

Name Description Example


Use To be used in results true
#Pos [1] Number of updates 927
#NoRef [1] Number of updates w/o reference positions 52
#PosInside [1] Number of updates inside sector 646
#PosOutside [1] Number of updates outside sector 229
DMin [m] Minimum of distance 4.14
DMax [m] Maximum of distance 284.87
DAvg [m] Average of distance 81.87
DSDev [m] Standard Deviation of distance 34.43
DVar [m2 ] Variance of distance 1185.18
#CF [1] Number of updates with failed comparison 25
#CP [1] Number of updates with passed comparison 621
PCP [%] Probability of passed comparison 96.13
Condition >= 90.00
Condition Fulfilled Passed

Position Latency
Configuration

Figure 82: Evaluation Position Latency requirement

The ’Position Latency’ requirement is used to calculate the probability of a target report
having a time-latency error smaller than a defined threshold. The offset of the position
(test vs. linearly interpolated reference position) is used to calculate the error component
along the track angle of the reference at the time, which divided by the negative speed
gives the latency. If the absolute value of this latency is smaller than or equal to the defined
threshold, the target report is counted for the calculated probability PLTOK. The PLTOK
must be greater than or equal to the defined ’Probability’ for the requirement to pass.
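
A minimal sketch of this latency estimate (illustrative names; a stationary reference is simply skipped here):

def position_latency(along_error_m, ref_speed_mps):
    """Latency = along-track error divided by the negative reference groundspeed."""
    if ref_speed_mps == 0.0:
        return 0.0   # undefined for a stationary reference; skipped in this sketch
    return along_error_m / (-ref_speed_mps)

def latency_ok(latency_s, max_abs_value_s):
    return abs(latency_s) <= max_abs_value_s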

• Probability [1]: Probability of acceptable position latency


• Probability Check Type: ≥
• Maximum Absolute Value [s]: Maximum absolute latency, in seconds

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Latency
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Pos [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference positions 7359
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
LTMin [s] Minimum of latency -00:00:02.758
LTMax [s] Maximum of latency 00:00:04.790
LTAvg [s] Average of latency -00:00:00.081
LTSDev [s] Standard Deviation of latency 00:00:00.146
LTVar [s2 ] Variance of latency 00:00:00.021
#LTOK [1] Number of updates with acceptable latency 44216
#LTNOK [1] Number of updates with unacceptable latency 9781
PLTOK [%] Probability of acceptable latency 81.89
Condition Latency >= 90.00
Condition Latency Fulfilled Failed

Also, a table is given for all single targets, sorted by PLTOK.


Single Target

Name Description Example


Use To be used in results true
#Pos [1] Number of updates 1729
#NoRef [1] Number of updates w/o reference positions 142
#PosInside [1] Number of updates inside sector 1469
#PosOutside [1] Number of updates outside sector 118
LTMin [s] Minimum of latency -00:00:00.590
LTMax [s] Maximum of latency 00:00:00.431
LTAvg [s] Average of latency -00:00:00.092
LTSDev [s] Standard Deviation of latency 00:00:00.085
LTVar [s2 ] Variance of latency 00:00:00.007
#LTOK [1] Number of updates with acceptable latency 1324
#LTNOK [1] Number of updates with unacceptable latency 145
PLTOK [%] Probability of acceptable latency 90.13
Condition Latency >= 90.00
Condition Latency Fulfilled Passed

Speed
Configuration

Figure 83: Evaluation Speed requirement

The ’Speed’ requirement is used to calculate the probability of a target report having a
speed error smaller than a defined threshold. The difference of the speed (test speed vs.
speed based on reference positions) is calculated, and if its absolute value is smaller than or
equal to the defined threshold, the target report is counted for the calculated probability
PCP (probability of check passed). The PCP must be greater than or equal to the defined
’Probability’ for the requirement to pass.

In another variation, if the ’Use Percent Threshold if Higher’ checkbox is set, the check
is changed for faster speeds. If the ’Threshold Percent’ value times the calculated speed
is larger than or equal to the defined threshold value, the threshold value is changed to the
’Threshold Percent’ value times the calculated speed. This means that for higher
speeds, an accuracy within the given percentage is required.
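
The following sketch illustrates the threshold selection described above (illustrative names and an example percent value):

def speed_ok(tst_speed_mps, ref_speed_mps, offset_value_mps,
             use_percent_if_higher=False, threshold_percent=10.0):
    """Check the speed difference against the configured offset, switching to a
    percent-based threshold for faster targets when that threshold is higher."""
    threshold = offset_value_mps
    percent_threshold = (threshold_percent / 100.0) * ref_speed_mps
    if use_percent_if_higher and percent_threshold >= offset_value_mps:
        threshold = percent_threshold   # relative accuracy for higher speeds
    return abs(tst_speed_mps - ref_speed_mps) <= threshold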

• Probability [1]: Probability of acceptable speed


• Probability Check Type: ≥
• Speed Offset Value [m/s]: Maximum absolute speed difference between the test and
the reference, in meters per second
• Use Percent Threshold if Higher: Defines if the percent-based accuracy should be
used for higher speeds
• Threshold Percent [%]: Percent threshold
• Threshold Value Check Type: ≤, speed difference must be less than or equal to the given
threshold
• Failed Values are of Interest: Checked, the speed values of interest are the ones not
passing the check

Result Values
Sector

Name Description Example


Sector Layer Name of the sector layer fir_cut_sim
Requirement Group Name of the requirement group Mandatory
Requirement Name of the requirement Speed
Num Results Total number of results 728
Num Usable Results Number of usable results 107
Num Unusable Results Number of unusable results 621
Use To be used in results true
#Pos [1] Number of updates 101685
#NoRef [1] Number of updates w/o reference speeds 7359
#PosInside [1] Number of updates inside sector 53997
#PosOutside [1] Number of updates outside sector 40329
#NoTstData [1] Number of updates without tst speed data 0
OMin [m/s] Minimum of speed offset 0.00
OMax [m/s] Maximum of speed offset 1114.58
OAvg [m/s] Average of speed offset 3.59
OSDev [m/s] Standard Deviation of speed offset 7.62
OVar [m²/s²] Variance of speed offset 58.01
#CF [1] Number of updates with failed comparison 38
#CP [1] Number of updates with passed comparison 53959
PCP [%] Probability of passed comparison 99.93
Condition >= 90.00
Condition Fulfilled Passed
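
Analogously, the probability of passed comparison follows from the comparison counters; with the example values above:

PCP = #CP / (#CP + #CF) = 53959 / (53959 + 38) = 53959 / 53997 ≈ 99.93 %

which is above the required 90.00 %, so the condition is passed.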

Also, a table is given for all single targets, sorted by PCP.

Single Target

Name Description Example


Use To be used in results true
#Pos [1] Number of updates 1795
#NoRef [1] Number of updates w/o reference speeds 131
#PosInside [1] Number of updates inside sector 949
#PosOutside [1] Number of updates outside sector 715
#NoTstData [1] Number of updates without tst speed data 0
OMin [m/s] Minimum of speed offset 0.00
OMax [m/s] Maximum of speed offset 56.07
OAvg [m/s] Average of speed offset 5.23
OSDev [m/s] Standard Deviation of speed offset 5.60
OVar [m²/s²] Variance of speed offset 31.34
#CF [1] Number of updates with failed comparison 2
#CP [1] Number of updates with passed comparison 947
PCP [%] Probability of passed comparison 99.79
Condition >= 90.00
Condition Fulfilled Passed
View Points

The ’View Points’ tab displays existing view points, allows selection, stepping and editing
of view points. Additionally, view points can be imported, removed and exported.


Figure 84: View Points Tab

At the top, a toolbar is shown. In the middle, a table lists all existing view points. At the bottom, general function buttons are located.

View Point
A view point is a point of interest in the data persisted in the database. It can have the
following attributes:

Key Value Description


id Identifier, as number
type Type, e.g. ’Short track’, ’Extra track’, ’Content deviation
X’
status Editable status, e.g. ’open’, ’closed’, ’todo’
comment Editable user comment
text Description text
position_latitude Center position WGS-84 latitude
position_longitude Center position WGS-84 longitude
position_window_latitude Geographic window size in WGS-84 latitude
position_window_longitude Geographic window size in WGS-84 longitude
time Center time
time_window Time window size
data_sources Data sources and lines to be loaded
data_sources_types Data source types to be loaded
filters Filter configuration
context_variables Context DBContent variables to be shown

Not all attributes are shown in the table, since some are more processing-related than relevant to the user.

If additional attributes exist in the view point information, they are automatically shown in the table. For further information please refer to Custom Attributes.
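
To make this more tangible, a single view point carrying some of these attributes could look roughly as follows in a view point file. This is only an illustrative sketch assuming a JSON-style representation; all values are hypothetical and the exact file schema of imported view point files may differ:

{
    "id": 1,
    "type": "Short track",
    "status": "open",
    "comment": "",
    "text": "Track with unusually few updates",
    "position_latitude": 47.5,
    "position_longitude": 14.2,
    "position_window_latitude": 0.05,
    "position_window_longitude": 0.05,
    "time": 3600.0,
    "time_window": 300.0
}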

When a view point is selected, the dataset defined by the view point is loaded auto-
matically and the active Views show the relevant data. Using the elements described in
the following sections a user can quickly step through view points, assess the information
shown and change status and comment information.

Please note that changes to view points are saved immediately to the database and no
undo function exists.

Toolbar

Icon Text Description


Select Previous [Up] Steps to the previous view point
Set Selected Status Open [O] Sets the current view point to status ’open’
Set Selected Status Closed [C] Sets the current view point to status ’closed’
Set Selected Status ToDo [T] Sets the current view point to status ’todo’
Edit Comment [E] Edits the current view point’s comment
Select Next [Down] Steps to the next view point
Edit Columns Open menu for hiding/showing columns
Filter By Type Opens menu for filtering based on type
Filter By Status Opens menu for filtering based on status
Table 2: Toolbar Actions

All of the actions can be triggered using the listed keyboard shortcut in the square
brackets. Up/Down refers to the keyboard arrows.

Table

Figure 85: View Points Table

In the view points table, all view points are listed, each one in a separate row. Columns can be used for ordering (simply click on the column name), and resized as wanted.

A click on a view point selects it, which highlights the row.



Figure 86: View Points Table: Selected View Point

This automatically triggers loading of the data and display in the existing Views.

The status can only be set using the toolbar buttons or keyboard shortcuts, while edit-
ing the comment can also be triggered by a double-click on the respective cell.

Re-Sorting Using Columns


By clicking on a column, the View Points Table is sorted on the information in the column.
Ascending/descending can be changed by clicking again on a column already used for
sorting.

Per default, the table is sorted by the ’id’ column.



Showing/Hiding Columns
Using the ’Edit Columns’ button in the toolbar, unwanted columns can be hidden. Click
on the button to activate the following menu:

Figure 87: View Points: Edit Columns Menu

The following entries exist:


• All Column names: With a checkbox to select/deselect which columns should be
shown
• Show Only Main: Only the first 4 columns are shown
• Show All: All columns are shown
• Show None: No columns are shown

Filtering Based on Type


Using the ’Filter By Type’ button in the toolbar, unwanted view points can be hidden. Click
on the button to activate the following menu:

Figure 88: View Points: Filter By Type Menu

The following entries exist:


• All existing types: With a checkbox to select/deselect which should be shown
• Show All: All types are shown
• Show None: No types are shown

Filtering Based on Status


Using the ’Filter By Status’ button in the toolbar, unwanted view points can be hidden.
Click on the button to activate the following menu:

Figure 89: View Points: Filter By Status Menu

The following entries exist:


• All existing statuses: With a checkbox to select/deselect which should be shown
• Show All: All statuses are shown
• Show None: No statuses are shown

Function Buttons
The following buttons for general functions exist:
• Import: Imports a view point file selected by the user (only recommended if no view
points are already defined)
• Delete All: Deletes all existing view points
• Export: Exports all existing view points as a view point file
• Export PDF: Exports view points as a PDF file

Stepping View Points


Management Tab
When selecting a new view point, the information set in the ’data_sources’ and ’data_sources_types’ attributes is used to select which DBContent is loaded. The ’filters’ attribute is used to set the filter active flags and respective conditions. After that, a loading process is triggered.

Data Selection
When the loading process is finished, data is automatically selected using the ’time’ and
’time_window’ attributes.

ListBox View

Figure 90: View Points ListBox View: Selected View Point

In the ListBox View, the variables set in the ’context_variables’ attribute are added temporarily to the list of variables. Loaded data is presented as always. When the loading process is finished, selected data is highlighted.

OSGView

Figure 91: View Points OSGView: Selected View Point

Loaded data is presented as always and can be adapted to a user’s needs. After loading, the presented data is centered/zoomed according to the ’position_latitude’, ’position_longitude’, ’position_window_latitude’, ’position_window_longitude’ attributes. If these are not set, the center/zoom is adapted to encompass all of the loaded data.

After Selection
After data loading, the application can be used as in any other situation, therefore changing
filters, adapting the OSGView style or re-loading the dataset is possible.

Assessment
In the ’View Points’ tab, after assessing the view point, a user can add a comment and
change the status to annotate the view point with additional information.

After this e.g. the next view point can be selected.

Exporting View Points to PDF


Using the "Export PDF" button, a PDF can be generated. A PDF can only be generated
if a Latex environment is installed on the workstation, as described in Appendix: Latex
Installation.

Please note that:


• Commonly the shown (filtered) View Points are exported, in the configured order
• A Latex report file is generated, as well as View screenshots
• To compile the Latex report into PDF a Latex environment (including the pdflatex
application) must be installed, e.g. TeX Live (used by the author) and MikTeX (see
Link)
• All active views are included, so they should be configured in a suitable manner
– A ListBoxView is reported as the selected data, or the first 30 rows (if no data is
selected), in a table.
– An OSGView is reported as a screenshot figure.

Figure 92: View Points: Export PDF Dialog

At the top, the following configuration options exist:


• Report Path: Directory to which the report will be written.
• Report Filename: Filename of the report, to be created in the report path. File exten-
sion must be ’.tex’.
• Change Location: Button to set the report path and filename.
• Author: Author string, is added to the first page of the report.
• Abstract: Abstract string, is added to the first page of the report.
• Export All (Sorted by id): Export all view points, sorted by id, ignoring the filtering
and ordering in the View Point tab.
• Group by Type: Creates sub-section for each type of View Point.
• Add Overview Table: Creates table with an overview of all View Points in the begin-
ning of the document.
• Wait on Map Loading: When OSGView screenshots are created, some maps require downloading terrain from the Internet. This option enables waiting for the completion of such activities, to generate high-quality screenshots. Disable only when operating on cached maps without an Internet connection.
• Add Overview Screenshot: Adds a preceding screenshot per OSGView with a larger map overview and a data marker.
• Run PDFLatex: Automatically runs the pdflatex compile process, immediately creating a PDF after the export has finished. Is disabled if the pdflatex command could not be found.
• Open Created PDF: Automatically opens the created PDF. Is disabled if the pdflatex command could not be found.

The ’Run’ button starts the export process. At the bottom, status information and a cancel button exist.

To run the export process, click the ’Run’ button.

Figure 93: View Points: Export PDF Dialog Status



If pdflatex is to be run, this is indicated in the ’Status’ text; pdflatex might have to be run several times. For large documents this might take several minutes, and this time is not included in the ’Remaining Time’ estimate.

Figure 94: View Points: Export PDF Dialog Status: pdflatex

The export speed of course depends on the View Points, the number of Views, hardware etc. However, even exporting a (somewhat unreasonable) 1500 View Points takes only about 15 minutes on the author’s hardware, so it should be adequate for any reasonable use case.

If the export process was successful, a message is shown and the dialog is closed automatically. The report Latex file is written into the report directory, with screenshots in the ’screenshots’ subfolder. If the respective options were set, a PDF is automatically generated and opened using the default PDF application.

If a Latex error occurred, a message box with the error message is shown.

Please note that the generated report can of course be edited by the user and re-
generated using pdflatex, which allows for better customization options (adding e.g. de-
tails, corporate identity etc.).
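
As a minimal example (assuming the report file is named ’report.tex’), the re-generation can be done from a terminal in the report directory, running the command more than once if references or the overview table need a second pass:

pdflatex report.tex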
ListBox View

A ListBox View displays DBContent data as text in tables to allow textual data inspection.
When started, it presents itself in the following manner.


Figure 95: Listbox View startup

Layout
On the left side a number of tabs exist, one for each type of DBContent and an additional
’All’ tab, each of which contains a table.

On the right side resides the configuration area, which allows configuring what data is
loaded and how it is displayed. The ’Reload’ button on the bottom can be used to trigger
a reload of the view’s data.

Both areas can be resized and hidden if wanted.



Data Loading
To load the data the ’Reload’ button or the mechanism described in Section UI Overview
can be used. To filter the dataset, the mechanism described in Section Filters can be used.

Figure 96: Listbox View after loading

Once updated, the tables are filled with text representing the values of the chosen
DBContent variables. If a value is undefined its cell remains empty. For each type of
DBContent a dedicated table is shown, as well as the ’All’ table, where data from all
DBContent types is shown collectively.

Please note that since a specific variable might only exist in certain DBContents, the
number of columns in the various tables might differ.

Usage
Selection
In the first column of each table checkboxes are shown, indicating whether that target re-
port is currently selected. Selection may be changed by selecting/de-selecting the respec-
tive checkboxes, or by altering the selection in other views (cross-selection). If the selection
is changed in one of the other views, this view is updated automatically.

Variable Lists
In the ’Variable Lists’ section of the ’Config’ tab, a variable list preset can be selected via a
combo box. Further, custom variable lists can be added, copied and removed by the user.

The following presets exist:


• Default: Common variables. Can not be renamed or removed.
• Mode A/C Info: Includes the Mode A/C valid/garbled/smoothed flags.
• Track Lifetime: Include the track begin/confirmed/coasted/end flags.
• ADS-B Position Quality: Includes the ADS-B MOPS version, NACp/NUCp/NIC/SIL
information.
• Horizontal Movement: Includes horizontal movement modes and derivatives.

Variables
For the selected variable list, all DBContent variables which are loaded from the database
are shown in the ’Variables’ list. This list is ordered and, like all configuration elements, persistent. Ordering in the list can be changed by selecting a certain variable and using
the up/down buttons to move it in the respective direction.

When pressing the ’Remove’ button, a selected variable is removed. Pressing the ’Add’
button allows appending a variable to the list using a context-menu.

If Meta variables are used, they are displayed for all DBContents they exist in. If a
DBContent variable is used, it is only displayed in its native content.

Figure 97: Listbox View adding of variables

After adding a variable, the dataset has to be reloaded to include the additional data; therefore the ’Reload’ button becomes active.

Show Only Selected


When this checkbox is checked, only selected target reports are shown in the tables. In this
mode, de-selecting a target report removes it from the shown data.

Use Presentation
When this checkbox is checked, the so-called presentation mode is used. In the database, the variables might have different units or a data representation which is not easy to read. For this purpose, a presentation mode was introduced to e.g. show a Mode A code in octal, or a Time of Day not in seconds since midnight but in HH:MM:SS.SS format.

When the ’Use Presentation’ checkbox is not checked, the original database values are
presented (and exported).

Exporting
The data from the current table can be exported to a comma-separated value (CSV) text,
either as file or copied to the clipboard.

For this example a Mode 3/A code filter was used to load only target reports and
system track updates from a single target.

To copy to the clipboard, select data to be copied in the table using the Shift key and
the left mouse button, then press Ctrl-C. The CSV text data can then be pasted in other
applications.
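
As an illustration, the exported or copied CSV text could look roughly as follows. The columns correspond to the configured variables; the variable names and values shown here are purely hypothetical examples:

Time of Day,Mode 3/A Code,Aircraft Identification,Flight Level
08:01:12.50,7421,AUA123,350
08:01:17.25,7421,AUA123,350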

To export the complete loaded dataset, click the ’Export’ button, so that a dialog is
opened.

Figure 98: Listbox View export

Choose a filename, and press ’Save’ to save the data. If the ’Overwrite Exported File’
checkbox was checked, an existing file is automatically overwritten. Please note that
exporting might take some time for larger datasets, and currently no status indication is
given.

After export, a dialog is shown indicating that the export was completed.

The exported file can be opened in any editor, or for example imported into LibreOffice
Calc.

Figure 99: Listbox View export in LibreOffice Calc

Reload
Additionally, if a change is made that requires re-loading of the data (e.g. additional data
should be displayed) the ’Reload’ button becomes available, and can be used to trigger a
loading process as described in Section UI Overview.
Histogram View

A Histogram View displays the distribution of one numerical variable as a histogram, approximating the statistical distribution of said variable. When started, it presents itself in the following manner.


Figure 100: Histogram View startup

Layout
On the left side resides the plot area in which the histogram is shown (if data has been
loaded). The tool bar at the top shows the currently selected mouse interaction mode and
the available actions.

On the right side resides the configuration area, which allows configuring what data is
loaded and how it is displayed. The ’Reload’ button on the bottom can be used to trigger
a reload of the view’s data.

Both areas can be resized and hidden if wanted.



Data Loading
To load the data the mechanism described in Section UI Overview or the ’Reload’ button
can be used. To filter the dataset, the mechanism described in Section Filters can be used.

Figure 101: Histogram View after loading

On the x-axis the selected variable’s data range is discretized into 20 bins. An additional
bin represents all NULL values.
On the y-axis the bin sizes per DBContent are shown, either in linear or logarithmic
scale.

Below the histogram a legend is shown, giving the total counts of all data points.

In the current example the meta-variable ’Time of Day’ is used, showing the overall
data rate per DBContent.

Usage
Toolbar
The first tool buttons can be used to switch between mouse interaction modes, currently
only one mode is available.

Icon Text Description


Select Allows data selection & de-selection
Table 3: Toolbar mouse interaction modes

The others provide general operations (shortcut refers to keyboard shortcut):

Icon Shortcut Text Description


Invert Selection Selects all de-selected & vice versa
Delete Selection De-selects all target reports
Zoom to Home Pans/zooms to show all existing data
Table 4: Toolbar operations

Config Tab
The elements on the top define which data is visualized in the histogram.

Any numerical variable can be visualized by checking the ’Show Variable Data’ box
and selecting a variable in the selection control below. A reload operation might be re-
quired for the selection to take effect.

Result data can be visualized by checking the ’Show Evaluation Result Data’ box. In this case the ’Requirement’ and ’Result’ fields indicate which evaluation result is presented.

The ’Logarithmic Y Scale’ checkbox can be used to switch between linear and logarith-
mic scale of the y-axis.

Histogram
Zoom
The mouse wheel can be used to zoom in or out of the presented data. In the current presentation this is only useful in limited circumstances. The space key can be used to reset to the default zoom level (equivalent to the ’Zoom to Home’ action).

Selection Mode
In ’Select’ mode, data can be selected. The first left mouse-button click starts selection
(showing a red rectangle), the second click finalizes the selection. The data contained in all
intersected bins is selected.

Figure 102: Histogram View data selection



The selected data is then presented in an extra ’Selected’ entry in the legend, showing
the count of all selected data points.

Figure 103: Histogram View data selected

This enables selection of parts of the data based on the presented variable, allowing
deeper analysis e.g. of dubious data.

The ’Invert Selection’ or ’Delete Selection’ actions allow for easier selection of
the wanted target reports.

By pressing the ’Control’ key during the second click, the newly selected data is added
to any previous selection. This can be used to select data incrementally, making more
complex selections possible.

Evaluation Result
If - using the Evaluation feature - a requirement result is presented, the respective data is
shown in the histogram.

Figure 104: Histogram View evaluation position correction result



Figure 105: Histogram View Mode C present result

Please note that currently evaluation result data can not be selected. This will be im-
proved in one of the next versions.
OSG View

The OSG View allows a graphical representation of target reports from the DBContent.
After creation, it displays a world map in the map widget on the left, and a configuration
panel on the right hand side.


Layout

Figure 106: OSG View overview

The map window will automatically move to the center of the data in the current database and later pan/zoom to the loaded data (on the first load only).

There exist 3 main components:


• Toolbar (top left): Selects the mouse action or sets display options

• Data Widget (lower left): Displays map and geometry data


• Configuration panel (right): Allows adapting what geometry data is displayed and how

Please note that the Data Widget and the Configuration panel can be resized and
hidden if wanted.

To load the data, the mechanism described in Section UI Overview can be used.
To filter the dataset, the mechanism described in Section Filters can be used.

After loading, the target reports will be shown.



Figure 107: OSG View overview after loading

How the data is presented is defined in the Configuration panel, please refer to Configuration Panel for details.

For each DSType, a default display configuration is set:


• Tracker: White circle (default)
• Radar: Green cross (default)
• MLAT: Red triangle (default)
• ADS-B: Blue square (default)

• RefTraj: Orange plus-sign (default)

There exist two display modes, 2D (default) and 3D, which are set based on the used map:
• 2D: Displays map in a 2D projection and (per default) disables usage of height in
geometry data drawing
• 3D: Displays map as a globe and (per default) enables usage of height in geometry
data drawing

Please refer to Map Operations for details.

Toolbar
The first 4 symbols switch between the mouse action modes, the others provide specific
options.

Icon Text Description


Navigate Allows navigation of the map only
Label Allows labeling, group menu and navigation of the map
Label Multiple Allows labeling of data in a rectangular area and navigation of the map
Measure Allows map distance measuring and navigation of the map
Select Allows data selection & de-selection

Table 5: Toolbar mouse action modes

The others provide general display modes or operations (shortcut refers to keyboard
shortcut):

Icon Shortcut Text Description


Toggle Time Filter Enable/disable time filter
Toggle Depth Check Enable/disable depth check. Disabled in 2d mode.
Save View Point Save current view as new view point
Deletes All Labels Deletes all existing labels
Deletes All Measurements Deletes all existing measurements
Edit Selection Color Set selection color
Invert Selection Selects all de-selected & vice versa
Delete Selection De-selects all target reports
Overlay Text Color Invert Changes Overlay Text Color
or D Switch Map Dimensions Changes between 3D/2D display
Zoom to Home Pans/zooms to show all existing data
Zoom to Loaded Data Pans/zooms to show currently loaded data

Table 6: Toolbar operations

Data Widget
Mouse/Keyboard Operations
In the data widget, several mouse and key operations are supported. The following terms
are used:
• LMB: Left mouse button
• MMB: Middle mouse button
• RMB: Right mouse button
The following operations exist, depending on the mouse action modes (set in the tool-
bar):

Mouse Action Key Action Description


LMB click - -
LMB click & drag - Traverse map
LMB double click - Zoom to clicked location
- Arrows Traverse map
MMB click & drag - Rotate map
MMB scroll & drag - Zoom map
RMB click - -
RMB click & drag - Zoom map
RMB double click & drag - Zoom away from clicked location
- Space bar Return to home position
Table 7: Map widget view operations in Navigate mode

Mouse Action Key Action Description


LMB click - Create label
LMB click & drag - Create label & traverse map
RMB click - Open group menu
... ... Other operations are the same as in Navigate mode
Table 8: Map widget view operations in Label mode

Mouse Action Key Action Description


LMB click - Start/end labeling box
LMB click & drag - Traverse map
... ... Other operations are the same as in Navigate mode
Table 9: Map widget view operations in Label Multiple mode

Mouse Action Key Action Description


LMB click - Start/end measurement
LMB click & drag - Traverse map
... ... Other operations are the same as in Navigate mode
Table 10: Map widget view operations in Measure mode

Mouse Action Key Action Description


LMB click - Start/end selection box
LMB click & drag - Traverse map
... ... Other operations are the same as in Navigate mode
Table 11: Map widget view operations in Select mode

Status Information
In the lower-left corner, the data status information is given:
• Status
– Idle: Nothing to do
– Loading: Loading in progress
– Done: Loading/redraw done
– Redrawing: Redraw in progress
• Time Begin: First timestamp in the data, in HH:MM:SS
• Time End: last timestamp in the data, in HH:MM:SS
• Loaded: Number of loaded target reports (from the database)
• Skipped: Number of not-drawn target reports
• Drawn: Number of drawn target reports
• Selected: Number of selected target reports

In the upper-right corner, the COMPASS version is shown. In the lower-right corner,
the current coordinates (map coordinates under the mouse cursor) are shown.

Data Operations
Data Labeling

Labels can be shown for target reports, when the Label mode is active. Simply do a
LMB click on a target report symbol.

Figure 108: OSG View Labels

Geometry Menu
Several data-related operations can be also performed in the data widget when the Label
mode is active, by a RMB click on a target report symbol.

Figure 109: OSG View geometry operations

The data context menu allows the following operations:


• Toggle Label: Toggles label display of a single target report
• Group: Same operations as on the item group’s layer menu
• Parents: Same operations as on the item’s parent layer menu
Depending on the grouping mode in the ’Config’ tab, different parent layer items are shown. For each of these parents the same menu entries exist as if they were clicked in the ’Layers’ tab. Please refer to Geometry Operations for details.

Label Multiple

Multiple target reports can be selected when the Label Multiple mode is active.

The usage of this mode works exactly like the Selection mode, only that the target reports are not selected but each one is labeled. If more than 100 target reports are labeled, only the first 100 are labeled and an error message is shown.

Figure 110: OSG View Label Multiple error message

For details about the rectangular target report labeling please refer to Selection.

Distance Measurement

Distance measurements can be made when the Measure mode is active. If ’Use height’
(section Height) is not checked, measurements are done as follows:
Simply do a LMB click on a target report or a position on the map to start the measure-
ment.

Figure 111: OSG View measurement

During the measurement, the following information is shown in the top-left corner:
• Distance (km): Great-circle distance using the haversine formula, in meters or kilo-
meters.
• Distance (nm): Great-circle distance using the haversine formula, in nautical miles.
• Bearing (deg): Bearing from point 1 to point 2, in degrees from true north.
Then do a second LMB click on the map on a point of interest to finish the measurement.
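
For reference, the displayed great-circle distances follow the standard haversine formula, with φ1, φ2 the latitudes and λ1, λ2 the longitudes of the two points, Δφ = φ2 − φ1, Δλ = λ2 − λ1, and R the Earth radius:

d = 2 · R · arcsin( sqrt( sin²(Δφ/2) + cos(φ1) · cos(φ2) · sin²(Δλ/2) ) )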

Figure 112: OSG View measurement done

If ’Use height’ (section Height) is checked, measurements are calculated in the same
way as previously (ground distance), but for each target report with height a connecting
line to the respective ground position is displayed.

For a measurement between two target reports with height information, the measure-
ment will be displayed as follows:

Figure 113: OSG View measurement with height information

The measurement is shown in the Layers tab, and is identified using a number. For details about the measurement layer operations please refer to Measurement Operations.

Please note that the measurement number background color can be set as with the label background color.

Selection

Target reports can be selected when the Select mode is active. If ’Use height’ is not checked, selection is done as follows:
Simply do a LMB click on a target report or a position on the map to start the selection.
Move the mouse to another location to create a red selection rectangle.

Figure 114: OSG View selection

With a second LMB click the selection is finalized and all target reports in the created latitude/longitude rectangle are selected. This is shown by a different color (yellow by default), and the ’Selected’ counter in the lower-left corner shows the current selection size.

Figure 115: OSG View selection done

If another selection is done, the previous one is cleared by default. If another selection
should be added to the current one, hold down the Control key when doing the second
LMB click.

Figure 116: OSG View selection with adding

This adds the target reports within the new red rectangle to the current selection.

Figure 117: OSG View selection with adding done

If ’Use height’ is checked, selection can be done with height information, which is
shown as a box. Simply do a LMB click on the map for a target report, and move the
cursor to another target report or map location.

If both locations have zero height (map location or no height information in target
report), again a rectangle is shown. If one or both locations have a non-zero height, a 3D
box is displayed.

Figure 118: OSG View selection with height



Figure 119: OSG View selection with height done

Time Filter

Once activated using the symbol in the toolbar, the time filter ensures that only target reports within a specific time window are shown.

Figure 120: OSG View time filter

The following items exist:


• Current: The start of the time window
• Window length (s): Duration of the display time window
• Current End: The end of the time window
• Play button: Start/Stop the auto-play mode
• Update interval [s]: Auto-play mode update interval
• Update step [s]: Auto-play mode update step

• Use Opacity: Checkbox and slider to set opacity (older target reports are made transparent)
• Scrollbar: Manual time-scrolling

During usage of the time filter, most changes in the Configuration panel are not avail-
able (except for changes in the Layers tab and current style).

Please note that the time filter is automatically deactivated if a re-load is triggered.

Depth Check

Once activated using the symbol in the toolbar, it is checked during drawing whether data is occluded in the depicted scene. E.g. if target reports are ’below ground’, they are
not shown.

Figure 121: OSG View depth check

Please note that if the depth check is activated for geometry without height display,
bad rendering can occur. For this reason, this mode is only recommended for geometry
display with height and is disabled in 2D display mode.

Figure 122: OSG View depth check bad rendering

Save View Point


The current filter configuration and viewed position can be saved as a new view point
using the symbol in the toolbar. When clicked, a name of the view point has to be
entered by the user.

Data Label Deletion

All existing labels can be deleted using the symbol in the toolbar.

Measurement Deletion

All existing measurements can be deleted using the symbol in the toolbar.

Selection Color

The color for selection highlighting can be configured using the symbol in the toolbar.

Selection Invert

The selection can be inverted using the symbol in the toolbar.

Selection Deletion

The selection can be erased using the symbol in the toolbar.

Overlay Text Color Invert


The overlay text color can be toggled between black on white or white on black using the
symbol in the toolbar. Depending on the map background one or the other is more
suitable.

Switch Map Dimensions

The display modes can be switched using the respective symbol in the toolbar, or by pressing the D key.

Zoom to Home
The currently viewed area can be set to encompass all data in the database using the symbol in the toolbar.

Zoom to Loaded Data


The currently viewed area can be set to encompass all currently loaded data using the
symbol in the toolbar.

Configuration Panel

Figure 123: OSG View configuration panel

There exist 5 tabs:



• Layers: Allows configuring what data is shown, and access to the map and measurement functions
• Style: Allows configuration of how data is shown
• Labels: Allows configuration of how data is labeled
• Evaluations: Allows configuration of how evaluation results are colored
• Others: Allows configuration of the height usage in the shown geometry

Additionally, at the bottom the ’Update’ button allows redraw/reloading of the geom-
etry data. This button becomes available if changes in the Style or Height tabs require a
redraw or a reload of the data.

Layers Tab
In the ’Layers’ tab, a tree view is given to configure the display of the existing elements.

Figure 124: OSG View layers tab



The following main tree elements exist:

• Geometry: Shows the currently loaded DBContent


• Measurements: Shows the current distance measurements
• Radars: Shows the existing radars
• Sectors: Shows the existing sectors
• Map: Shows current map layers

How the geometry is layered is defined by the Layer mode in the Style tab, please refer
to Layer Mode for details.

For all geometry layers, a second column is shown, giving the number of target reports in the (sub-)layer (as loaded from the database).

In the Layers widget, a number of operations are possible for each tree item.

Operation Trigger Description


View sub-items Triangle Opens or closes view of the sub-items
Display item Checkbox Enables or disables display of items (and all sub-items)
Display context menu Click on symbol Opens the items context menu

Table 12: Layer operations

The context menu allows several actions to be performed on an item. If an item has
sub-items, the same action will automatically be performed on the child items.

Geometry Operations
The geometry menu is triggered by clicking on the respective symbol of the layer.

Figure 125: OSG View layer context menu

Common Layer Operations


• Hide: Disable display of this item
• Show: Enable display of this item
• Hide Children: Disable display of all children
• Show Children: Enable display of all children
• Hide Children Containing: Disable display of all children containing a specific DB-
Content
• Show Children Containing: Enable display of all children containing a specific DB-
Content
• Hide Siblings: Disable display of all sibling items (children of parent except this one)
• Show Siblings: Enable display of all sibling items (children of parent except this one)
• Clear Labels: Removes all labels

Selection Operations
• Select Data: Select (only) target reports in the group
• Add Data to Selection: Add target reports in the group to selection
• Remove Data from Selection: Remove target reports in the group from selection

Ground Lines Operations If height information is used, the following operations exist:
• Show Ground Lines: Shows the connection lines to the ground for all target reports
in the group.
• Clear Ground Lines: Clears the connection lines to the ground for all target reports
in the group.

Ground Speed Vector Operations If ground speed information is loaded (see Ground
Speed), the following operations exist:
• Show Ground Speed Vectors: Shows ground speed vectors for all target reports in
the group.
• Clear Ground Speed Vectors: Clears ground speed vectors for all target reports in the group.

Position Accuracy Ellipses Operations If position accuracy information is loaded (see


Position Accuracy), the following operations exist:
• Show Position Accuracy Ellipses: Shows position accuracy ellipses for all target re-
ports in the group.
• Clear Position Accuracy Ellipses: Clears position accuracy ellipses for all target re-
ports in the group.

Special Root Geometry Layer Operations This layer has special operations present:
• Reset All Styles: Sets all style information to default (including making all data visible)
• Clear All Associations: Removes display of all shown associations of all data

Measurement Operations

To access the measurement operations, click the measurement symbol, then the following operations are available:
• Copy All Texts: Copies measurement data as text to the clipboard
• Delete All: Removes all measurements

Figure 126: OSG View measurement layer operations

A copied measurement text looks as follows:



1
Distance (m): 188.59
Distance (nm): 0.1018
Bearing (deg): 26.366
Point1: 36.56370614, 15.74786596
Point2: 36.56522404, 15.74880269

2
Distance (m): 242.09
Distance (nm): 0.1307
Bearing (deg): 141.243
Point1: 36.55986876, 15.72751885
Point2: 36.55817286, 15.72921376

Radars Operations
How to define radar attributes is described in Configure Data Sources.

For each defined radar, the following information can be shown:


• Label text at center position
• Range rings (if defined)
– Minimum/maximum range
– Colored for PSR in brown, SSR in blue, Mode S in purple

The radars layer display configuration is stored in the configuration and restored upon
startup.

To access the radar operations, click the (top) radars symbol, then the following operations are available:
• Show All: Shows all radars
• Hide All: Hides all radars
• Toggle Labels: Shows/hides labels for all radars
• Toggle Range Rings: Shows/hides all range rings for all radars

Figure 127: OSG View radars layer operations

Sectors Operations
How to define sector attributes is described in Configure Sectors.

For each defined sector, a colored polygon is shown, grouped by its layer. Such a
structure could look as follows:
• Layer A
– Sector A1
– Sector A2
• Layer B
– Sector B1
Each layer or sector can be shown/hidden. The sectors layer display configuration is
stored in the configuration and restored upon startup.

For the examples shown in the respective task, the display could look as follows:

Figure 128: OSG View with 2D Sector Examples



Figure 129: OSG View with 3D Sector Examples

Map Operations

To access the map operations, click the globe symbol . Please note that only the main
Map item has a context menu, and only allows setting of map files and changing of the
opacity.
• Map File: Change the background map
• Opacity: Change the background map opacity

Changing the Background Map Please note that, while the default background map is supplied with COMPASS, the other background map types are downloaded from public Internet sources and therefore require an Internet connection. They are then cached locally to facilitate faster access.

For each map layer defined in the background map, a checkbox is shown to disable the
layer, and by clicking on the map layer symbol its opacity can be changed.
To change the background map, click the globe symbol in the root Map layer to access
the map selection.

The following maps are commonly available:


• arcgis.earth*
• minimal.earth*
• openstreetmap.earth*
• openstreetmap_german.earth*
• readymap.earth
• readymap-detailed.earth

Please note that for each map marked with * a 3D version (as listed) and a 2D version (with filename suffix ’_2d’) exists. Each of them contains similar content, but changes the
display mode to 3D/2D upon selection.

The map loading and display is based on the osgEarth library (http://osgearth.org/), as are the map file definitions.

The maps which can be set using this dialog are simply the files listed in the folder ’~/.compass/data/maps’. So, changes can be made to the supplied ones, or custom user maps can be added to this folder.
Please refer to section Adding/Changing Map Files for further details.

ArcGIS Map As supplied in the osgEarth example files, this map data is obtained
from ArcGIS Online (https://doc.arcgis.com/en/arcgis-online/reference/
what-is-agol.htm). It shows satellite imagery, supplied with elevation data from
ReadyMap.

Figure 130: OSG View Arcgis map



Minimal Map This minimal map shows national borders based on an ESRI shapefile,
provided by Bjorn Sandvik on thematicmapping.org and European airports, provided
by https://ec.europa.eu/eurostat/web/gisco/geodata/reference-data/
transport-networks.

Figure 131: OSG View minimal map



Open Street Map This very useful map shows map data from https://www.
openstreetmap.org/.

Figure 132: OSG View OpenStreetMap

It is possible to zoom in to a very high level of detail, to even inspect airport layouts.

Figure 133: OSG View OpenStreetMap Vienna Airport



Open Street Map German This very useful map shows map data from https://www.
openstreetmap.de/.

Figure 134: OSG View OpenStreetMap German

It is possible to zoom in to a very high level of detail, to even inspect airport layouts.

ReadyMap & ReadyMap Detailed This map also shows satellite data, from http://
web.pelicanmapping.com/readymap-tiles/.

Figure 135: OSG View ReadyMap

This detailed version shows the same data as ReadyMap, but to a higher detail level.

Please note that this map includes an elevation layer, so mountains are modeled in 3D.

Figure 136: OSG View ReadyMap detailed elevation

Adding/Changing Map Files As with all configuration, a local version is kept in the home folder of the user. The map files are loaded from the hidden folder ’~/.compass/data/maps’ (’~’ is the user’s home directory, like /home/user).

Figure 137: Maps Folder

In this folder, the previously discussed maps exist as osgEarth ’.earth’ files. If new .earth
files are added, or the content of such files is changed, they can be used from the OSGView
after a restart.
The easiest example is the ’minimal_new.earth’ file, which uses an ESRI shapefile from
the subfolder ’shapefiles’ to display the national borders.
<map name="Wordwide Line Vectors" type="geocentric">

<options>
<lighting>false</lighting>
<terrain>
<min_tile_range_factor>8</min_tile_range_factor>
<color>#000000FF</color>
</terrain>

</options>

<feature_source name="world-data" driver="ogr">


<url>shapefiles/TM_WORLD_BORDERS-0.3.shp</url>
<convert type="line"/>
OSG VIEW 216

</feature_source>

<feature_model name="world_boundaries" feature_source="world-data


,→ ">

<layout tile_size="100000" paged="true">


<level max_range="1e10"/>
</layout>

<styles>
<style type="text/css">
world {
stroke: #ffff00;
stroke-width: 2px;
stroke-tessellation-size: 1km;
render-lighting: false;
altitude-clamping: none;
render-depth-test: false;
}
</style>
</styles>

</feature_model>

</map>
The world background colour is set using the ’terrain color’ tag. The name of the shapefile is given using the ’url’ tag, and the line colour and width are set in the style below. Basically a user can add their own files to the ’shapefiles’ folder, and simply duplicate the ’feature_model’ part with their own ESRI shapefiles. This could then look like this:
<map name="Wordwide Line Vectors" type="geocentric">

<options>
<lighting>false</lighting>
<terrain color="#101010ff"/>
</options>

<feature_source name="world-data" driver="ogr">


<url>shapefiles/TM_WORLD_BORDERS-0.3.shp</url>
<convert type="line"/>
</feature_source>

<feature_model name="world_boundaries" feature_source="world-data


,→ ">

<layout tile_size="100000" paged="true">


OSG VIEW 217

<level max_range="1e10"/>
</layout>

<styles>
<style type="text/css">
world {
stroke: #ffff00;
stroke-width: 2px;
stroke-tessellation-size: 1km;
render-lighting: false;
altitude-clamping: none;
render-depth-test: false;
}
</style>
</styles>

</feature_model>

<feature_model name="doi">
<features name="wolrd" driver="ogr">
<url>shapefiles/doi.shp</url>
<build_spatial_index>true</build_spatial_index>
<ogr_driver>ESRI Shapefile</ogr_driver>
<convert type="line"/>
</features>

<layout tile_size="100000">
<level max_range="1e10"/>
</layout>

<styles>
<style type="text/css">
states {
stroke: #00ff00;
stroke-width: 2px;
render-depth-test: false;
}
</style>
</styles>
</feature_model>
</map>
While a number of formats are supported, to add a KML file, add the following part to
a ’map’ (as previously):
<feature_model name="wam_area">
<features name="wam_area" driver="ogr">
<url>shapefiles/wam_area.kml</url>
OSG VIEW 218

<ogr_driver>LIBKML</ogr_driver>
<build_spatial_index>true</build_spatial_index>
</features>

<styles>
<style type="text/css">
states {
stroke: #0000ff;
stroke-width: 2px;
render-depth-test: false;
}
</style>
</styles>
</feature_model>
To add a GML file, add the following part to a ’map’ (as previously):
<model driver="feature_geom" name="gml" cache_enabled="false">
<features driver="ogr">
<ogr_driver>GML</ogr_driver>
<url>shapefiles/example.gml</url>
<caching_policy usage="no_cache"/>
</features>
</model>
To add a GeoTIFF file, add the following part to a ’map’ (as previously):
<image driver="gdal" name="tiff" cache_enabled="false" visible="
,→ false">
<url>/usr/share/osgearth/data/world.tif</url>
<caching_policy usage="no_cache"/>
</image>
To add a graticule (latitude/longitude grid), add the following part to a ’map’ (as pre-
viously):
<geodetic_graticule name="Graticule" visible="true">
    <color>#ffff007f</color>
    <label_color>#ffffffff</label_color>
    <grid_lines>20</grid_lines>
    <resolutions>10 5.0 2.0 1.0 0.5 0.25 0.125 0.0625 0.03125</resolutions>
</geodetic_graticule>
For further information please refer to the osgEarth user manual https://
buildmedia.readthedocs.org/media/pdf/osgearth/latest/osgearth.pdf, e.g.
in Section Features & Symbology.

Style Tab

Figure 138: OSG View Style tab

In the ’Style’ tab, several elements exist:



• Layer Mode: Defines how layers are generated. Please refer to Section Layer Mode
for details.
• Connect Last Layer: Whether grouped target reports in the last layer should be con-
nected using lines
• Connect None Height: If groups are connected, whether target reports with no height information should be connected
• Blend Mode: Defines the drawing blend mode, allowing clearer symbols (contour)
or colors (src-over).
• Style: Defines how geometry is styled. Please refer to Section Style for details.
• Render Order: Defines the drawing order of DBContents. The bottom one is drawn first, the top one last (over all others)
• Update button: Triggers a redraw or reload of the geometry, becomes available after
a change if needed.

Layer Mode
In this selection the way layers are generated can be changed.

The following items can be present in the list:

• DBContent: DBContent type, e.g. Radar, MLAT, ...


• DS ID: Data source identifier, e.g. Radar1, Radar2, ARTAS
• Aircraft Identification: Mode S Target Identification
• Aircraft Address: Mode S Target Address
• Track Number: Track number (local or system)
• Mode 3/A Code: Mode 3/A code
• Line ID: Line ID from import
• UTN: Unique Target Number, only available if association information is present

The layer mode defines what layers are generated, e.g. for ’A’ only layers for all values
of ’A’ are created, for ’A:B’ layers for all values of ’A’ are created, each with sub-layers for
all values of ’B’. In this case, ’A’ is the parent, ’B’ is the child.

If no values exist in the data for a layer, this data is grouped in the layer ’None’.

The following modes exist:

• DBContent:DS ID

• DBContent:DS ID:Line ID
• DBContent:DS ID:Line ID:Aircraft Address
• DBContent:DS ID:Line ID:Track Number
• DBContent:DS ID:Aircraft Identification
• DBContent:DS ID:Aircraft Address
• DBContent:DS ID:Track Number
• DBContent:DS ID:Mode 3/A Code
• UTN:DBContent:DS ID
• UTN:DBContent:DS ID:Line ID
• UTN:DBContent:DS ID:Aircraft Identification
• UTN:DBContent:DS ID:Aircraft Address
• UTN:DBContent:DS ID:Track Number
• UTN:DBContent:DS ID:Mode 3/A Code
• Aircraft Identification:DBContent:DS ID
• Mode 3/A Code:DBContent:DS ID
• Aircraft Address:DBContent:DS ID
• Aircraft Address:DBContent:DS ID:Line ID

Please note that the UTN layer modes only exist when association information is
present.

Please also note that after a change in the Layer mode a redraw has to be triggered
before the changes take effect.

As examples, a few values for the Layer mode are listed.

DBContent:DS ID In this layer mode the DBContent name is used to create the first
layer, with sub-layers for each data source.

Figure 139: OSG View layer mode DBContent:DS ID

Mode 3/A Code:DBContent:DS ID In this layer mode the Mode 3/A code is used to
create the first layer, with sub-layers for each DBContent and data source.

Figure 140: OSG View layer mode Mode 3/A Code:DBContent:DS ID

UTN:DBContent:DS ID In this layer mode the UTN is used to create the first layer, with
sub-layers for each DBContent. All target reports without a UTN (not used by ARTAS) are
grouped into layer ’None’.

An example of this mode is shown in the styling section.

Connect Last Layer & Connect None Height


If the ’Connect Last Layer’ checkbox is set, connection lines between all target reports in
the last layer are drawn, except for the ’None’ layer.

Please note that connection lines for target reports with a time-of-day difference larger
than 5 minutes will be omitted.

If this is activated for a Layer mode in which the last layer is not target-specific,
this will lead to a sub-optimal representation.

This mode (normally) makes sense in one of the following Layer modes:
• DBContent:DS ID:Line ID:Aircraft Address
• DBContent:DS ID:Line ID:Track Number
• DBContent:DS ID:Aircraft Identification
• DBContent:DS ID:Aircraft Address
• DBContent:DS ID:Track Number
• UTN:DBContent:DS ID:Aircraft Identification
• UTN:DBContent:DS ID:Aircraft Address
• UTN:DBContent:DS ID:Track Number
• Aircraft Identification:DBContent:DS ID
• Aircraft Address:DBContent:DS ID
• Aircraft Address:DBContent:DS ID:Line ID

Figure 141: OSG View Data with lines

If the height information is used (3D view) and target reports without height information are connected, the lines clutter the display. The ’Connect None Height’ checkbox allows setting this behaviour.

Please note that changing either of these values requires a manual redraw using the ’Update’ button.

Style
There exist 3 main elements for styling:

• Preset drop-down menu: Selects currently active style


• Style configuration area: Area below the preset menu, allows configuration of the
current style
• Reset Styles button: Clears all style presets to their default values

Each style can be composed of 3 elements:


• Default Style: Base style which sets the basic styling for all data
• Rule Generator: Generates e.g. layer-specific rules
• Generated Rules: Rules which override the Default Style

Please note that some style presets (e.g. layer style per ACID, ACAD, Track Num-
ber, Mode 3/A Code) generate lots of different (persistent) styling rules, which decreases
startup speed. After using such styles it is possible to reset the styles using the ’Reset
Styles’ button to increase application startup speed.

There are two types of style presets:


• Layer-based presets: Perform styling per layer
• Target report based presets: Set symbol color per data variable value

Layer-based Style Presets The following layer-based style presets exist:


• Default: All data is shown in the same style
• Layer Color per DBContent: Style is defined by DBContent type
• Layer Color per UTN: Style is defined by UTN value
• Layer Color per ACID: Style is defined by Mode S Target Identification value
• Layer Color per DS ID: Style is defined by data source
• Layer Color per Mode 3/A Code: Style is defined by Mode 3/A code
• Layer Color per ACAD: Style is defined by Mode S Target Address
• Layer Color per Track Number: Style is defined by track number

Please note that after changing the style to one of these values a redraw has to be
triggered.

Please also note that for such presets the data from which the style is derived has
to be present in the Layer mode, otherwise the layer is styled with a common base style.

Target Report based Style Presets The following Target report based style presets exist:
• Color by ADS-B MOPS: Color is defined by the ADS-B MOPS version of the
transponder
• Color by ADS-B Position Quality: Color is defined by the ADS-B NUCp/NACp
value of the target report
• Color by Flight Level: Color is defined by the Mode C code/Flight level
• Color by Speed: Color is defined by groundspeed value
• Color by Detection Type: Color is defined by Radar detection type (PSR, SSR,
PSR+SSR, ...)
• Color by Track Angle: Color is defined by direction of movement value
• Color by Track Age Type: Color is defined by track age of variable

Please note that after changing the style to one of these values a reload has to be triggered.

Customized Styling
While the presets come with (mostly) reasonable default values, adaptation can be per-
formed by clicking on the value that should be changed, either in the Default Style or the
Generated rules. Any changes are applied immediately to the geometry.

Style Examples
To give a few examples, some interesting Layer modes and Style preset combinations are
given:

Figure 142: OSG View Layer color per DS ID



Figure 143: OSG View Layer color per Mode 3/A Code

Figure 144: OSG View Layer color per Track Number



Figure 145: OSG View Layer color per UTN



Figure 146: OSG View Color by ADS-B MOPS version



Figure 147: OSG View Color by ADS-B position quality



Figure 148: OSG View Color by Flight Level



Figure 149: OSG View Color by Detection type



Figure 150: OSG View Color by Speed



Figure 151: OSG View Color by track angle



Figure 152: OSG View Color by track Mode 3/A age



Figure 153: OSG View Color by track MLAT age

Please note that in this style one can change the following parameters (by clicking the respective value):
• Variable used (naming as in ASTERIX)
• Time Interval used
• Colors for the 4 intervals (+ None = not set color)

Figure 154: OSG View Color by track ages



Render Order
In the render order widget, the drawing order of the drawn geometry is specified. The
one at the top is drawn last (over all others), so it is useful to move the most important
DBContent (for the current inspection) to the top.

Figure 155: OSG View render order

Using the ’Order By’ checkbox, the drawing order can be defined based on:
• DBContent: Data source type (e.g. Tracker, Radar, ...)
• ds_id: Data source (e.g. Tracker1, Tracker2, Radar3, ...)

To change the drawing order click on the item to move and use the buttons on the right
side. Please note that no redraw is required and that the drawing order is persisted in the
configuration.

• Move to Top: Move item to top position

• Move Up: Move item one position up

• Move Down: Move item one position down

• Move to Bottom: Move item to the bottom position



Labels Tab

Figure 156: OSG View Labels tab

In the ’Labels’ tab, several elements exist:



• Edit Label Contents button: Edits the label contents


• Auto Label Checkbox: Toggles auto-label feature
• Level of Detail: What level of detail should be used for labels
• Data Sources: Selects which data sources to label and label direction
• Label Filters: Filters which targets are labeled

When the auto-label feature is active and one or more data sources are selected for
labeling, automatic labels are generated in the OSG View according to the content and
label filter settings.

Label Contents

When clicking the ’Edit Label Contents’ button, the label contents specific to a DBContent can be selected.

The label contents are organized in a 3x3 matrix; the contents of the first 2 rows and columns are fixed, while row 3 and column 3 can be edited.

Figure 157: OSG View Edit Label Contents

Multiple level of details (LoDs) are defined (as indicated with border lines in the edit
dialog):
• LoD 1: 1x1 matrix with the best available identification
• LoD 2: 2x2 matrix, LoD 1 with the most important secondary information
• LoD 3: 3x3 matrix, LoD 2 with some additional information

• LoD 1

– Row 1, column 1: Best available identification in bold text, in the following order
* Aircraft Identification, Aircraft Address, Mode 3/A Code, or ’?’ if not available
• LoD 2
– Row 1, column 2: Aircraft Identification
– Row 2, column 1: Mode 3/A Code
– Row 2, column 2: Mode C Code in flight levels
• LoD 3
– Row 1, column 3: Aircraft Address
– Row 2, column 3: Data source ID
– Row 3, column 1: Ground Speed
– Row 3, column 2: empty by default
– Row 3, column 3: Time of Day
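To illustrate (with purely hypothetical values), a LoD 3 label for a target identified as ’ABC123’ could therefore contain:

ABC123     ABC123     3FA2B7
7421       350        12850
450        -          08:15:30.125

with the best identification, aircraft identification and aircraft address in the first row, the Mode 3/A code, Mode C code (in flight levels) and data source ID in the second, and the ground speed, the (empty) default cell and the Time of Day in the third.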

Automatic Labeling
If a data source to be labeled is activated, automatic labels are generated (in the configured direction) and updated once per second.

Please note that when automatic labeling is active, manually created labels are removed
once per second. This will be improved in the future.

Figure 158: OSG View Auto-Labels LoD 1

For each target (as in the last layer item according to Layer Mode) a single label is created for the latest shown target report.

The Level of Detail can have the following values:


• Auto: Automatic label size depending on number of visible labels
• LoD 1: 1x1 matrix
• LoD 2: 2x2 matrix
• LoD 3: 3x3 matrix

In ’Auto’ mode, the LoD is chosen depending on the number of visible labels: if the number of visible labels is reduced (e.g. by zoom or time window filter operations), a higher LoD is shown.

Figure 159: OSG View Auto-Labels LoD 2

The automatic labeling can be most useful when choosing a target-specific layer mode (e.g. based on unique secondary identification or UTN) and activating the time window filter (or in Live mode).

Figure 160: OSG View Auto-Labels LoD 3

Label Filters
When automatic labeling is activated, labeling of targets can be restricted using the label filters. Each filter can be activated and its filter value set to label only targets with specific attributes. The filters work in a similar way to the ones defined in Filters.

Mode 3/A Codes When active, this filter forces labeling of data with the given Mode A code(s), so it is possible to give multiple values (in octal notation, separated by commas). E.g. ’7000’ is possible, or ’7000,7777’. Target reports without a given Mode A code will not be labeled unless the value ’NULL’ is (also) given.

Mode C When active, based on the Mode C Min and Mode C Max values, target reports are only labeled if their flight level lies within the specified thresholds. Target reports without a Mode C Code will not be labeled unless the NULL checkbox is checked.

Aircraft Identification When active, this filter forces labeling of data only from aircraft identifications matching the given expression. The percent operator denotes an ’any characters’ placeholder, so e.g. ’%TEST%’ will match ’TEST123’ or ’TEST123 ’ (with spaces) or ’MYTEST’. Target reports without a given aircraft identification will not be labeled unless the value ’NULL’ is (also) given.

Aircraft Address When active, this filter forces labeling of the given Mode S address(es),
so it is possible to give multiple values (in hexadecimal notation, irrespective of up-
per or lower case characters, separated by commas). E.g. ’FEFE10’ is possible, or
’FEFE10,FEFE11,FEFE12’. Target reports without a given Mode S address will not be la-
beled unless the value ’NULL’ is (also) given.

Evaluation Tab

Figure 161: OSG View Evaluation tab

In the Evaluation tab, labels exist to denote which evaluation results are currently shown, and colors can be set for the specific requirement results.



Others Tab

Figure 162: OSG View Others tab



In the ’Others’ tab, several elements exist:


• Height: Defines if and how height information is used.
• Ground speed: Defines if and how ground speed information is used.
• Position Accuracy: Defines if and how position accuracy information is used.
• Radar Default Accuracies: Defines Radar default accuracies.
• Update button: Triggers a redraw or reload of the geometry, becomes available after
a change if needed.

Height
By default, a target report's height is not used for display, which is common in current air-traffic displays. However, in certain situations a true 3D display might be of interest to a user, and therefore several options were incorporated:
• Use Height: Use the height based on Mode C hc, transformed to meters
• Offset Factor: If height information is used, this factor ho is added to the height
• Scale Factor: If height information is used, this factor hs is used to multiply the height
• Null Offset: If height information is used, this factor hn is used for target reports without height information

Generally, if no height information is given (no Mode C code), the height is either 0 or the height offset (if used). That means that those target reports appear to be on the ground. If connection lines are drawn between target reports in the air and those on the ground, a lot of distracting lines are shown.

The formula to calculate the height (if existing) is as follows:

h = ho + hs · hc [m]
If no height information is given:

h = hn [m]
Please note that upon changes to the height usage, a manual redraw has to be per-
formed using the ’Redraw’ button.
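As an illustration of this logic, a minimal sketch in Python (not part of the application; the flight-level-to-meter conversion and the parameter names are assumptions made for the example):

FT_PER_FL = 100.0     # one flight level corresponds to 100 ft
M_PER_FT = 0.3048     # feet to meters

def display_height(mode_c_fl, h_o=0.0, h_s=1.0, h_n=0.0):
    # height used for display: h = h_o + h_s * h_c, or h = h_n without Mode C
    if mode_c_fl is None:                        # no Mode C code available
        return h_n
    h_c = mode_c_fl * FT_PER_FL * M_PER_FT       # Mode C transformed to meters
    return h_o + h_s * h_c

print(display_height(350))              # FL350 -> 10668.0 m
print(display_height(None, h_n=50.0))   # no Mode C code -> 50.0 m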

Figure 163: OSG View use height

Ground Speed
• Load Ground Speed Variables: Defines if ground speed variables should be loaded
from the database. Must be enabled to enable display of ground speed vectors.
• Ground Speed Period [s]: Length of ground speed vectors, in seconds.

If loading is enabled (possible re-load required) the ground speed vectors can be shown
using Geometry Operations:

Figure 164: OSG View groundspeed vectors

Position Accuracy
• Load Position Accuracy Variables: Defines if position accuracy variables should be
loaded from the database. Must be enabled to enable display of position accuracy
ellipses.
• Position Accuracy Scale [sigma]: Size of accuracy ellipses, defined in standard devi-
ations:
– 1.0 σ: 68.268 %
– 1.5 σ: 86.638 %

– 2.0 σ: 95.44 %
– 2.5 σ: 98.75 %
– 3.0 σ: 99.73 %
– 3.5 σ: 99.95 %
– 4.0 σ: 99.9936 %
– 4.5 σ: 99.999320 %
– 5.0 σ: 99.99994266 %
• Accuracy Ellipse Num Points Factor [1]: Multiplication factor to calculate number of
ellipse points based on size
• Accuracy Ellipse Max. Num Points [1]: Maximum number of ellipse points

If loading is enabled (possible re-load required) the position accuracy ellipses can be
shown using Geometry Operations:

Figure 165: OSG View position accuracy ellipses

How the position accuracy ellipses are generated is defined in Positions Accuracy El-
lipses.
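The percentages listed for the ’Position Accuracy Scale’ correspond to the one-dimensional Gaussian containment probability for a given scale factor. As a small verification sketch in Python (purely illustrative, not part of the application):

import math

def containment(k_sigma):
    # probability of a 1-D Gaussian value lying within k standard deviations
    return math.erf(k_sigma / math.sqrt(2.0))

for k in (1.0, 2.0, 3.0, 4.0):
    print(k, round(containment(k) * 100.0, 3))
# prints approximately 68.269, 95.45, 99.73 and 99.994 percent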

Radar Default Accuracies


In the following items, the default Radar accuracies can be set:
• Primary Azimuth StdDev [deg]: PSR azimuth standard deviation, in degrees
• Primary Range StdDev [m]: PSR range standard deviation, in meters
• Secondary Azimuth StdDev [deg]: SSR azimuth standard deviation, in degrees

• Secondary Range StdDev [m]: SSR range standard deviation, in meters


• Mode S Azimuth StdDev [deg]: Mode S Radar azimuth standard deviation, in de-
grees
• Mode S Range StdDev [m]: Mode S Radar range standard deviation, in meters
• Use Radar Minimum StdDev: Defines if minimum or maximum standard deviation
should be used for combined plots

These values are only used if the associated data source does not have custom accuracy
values set (see Configure Data Sources).

How the Radar default accuracies are used is defined in Positions Accuracy Ellipses.
ScatterPlot View

A ScatterPlot View displays the distribution of two numerical variables as points. When
started, it presents itself in the following manner.


Figure 166: Scatterplot View startup

Layout
On the left side resides the plot area in which the data is visualized (if data has been
loaded). The tool bar at the top shows the currently selected mouse interaction mode and
the available actions.

On the right side resides the configuration area, which allows configuring what data is
loaded and how it is displayed. The ’Reload’ button on the bottom can be used to trigger
a reload of the view’s data.

Both areas can be resized and hidden if desired.



Data Loading
To load the data the mechanism described in Section UI Overview or the ’Reload’ button
can be used. To filter the dataset, the mechanism described in Section Filters can be used.

Figure 167: Scatterplot View after loading data

The values of the selected variables are used for positioning on the x-axis and y-axis.
For each value pair a data point is generated. In the current example the WGS-84 meta-
variables ’Longitude’ and ’Latitude’ are used, showing a similar view as the OSG View.
The color of each data point is defined by the DBContent it belongs to.

On the bottom of the plot a legend is shown, giving the total counts of all data points.

Usage
Toolbar
The first tool buttons can be used to switch between the various mouse interaction modes:

Icon Text Description


Navigate Allows navigation of the data
Zoom to Rectangle Allows zooming to the selected rectangle
Select Allows data selection & de-selection
Table 13: Toolbar mouse interaction modes

The others provide general actions by which the view can be modified (shortcut refers
to keyboard shortcut):

Icon Shortcut Text Description


Invert Selection Selects all de-selected & vice versa
Delete Selection De-selects all target reports
Zoom to Home Pans/zooms to show all existing data
Table 14: Toolbar actions

Config Tab
The selection controls on the top define which data variables are used to generate data
points for the x/y-axis. Such a variable can be any numerical variable. A reload operation
might be required for the selection to take effect.

Please note that visualization of evaluation result data is currently not implemented.

Scatterplot
General Zoom
The mouse wheel can be used to zoom in or out of the presented data; the space key can be used to reset to the default zoom level (equivalent to the ’Zoom to Home’ action).

Navigation Mode
In ’Navigate’ mode, the left mouse-button can be used to pan the shown data.

Zoom to Rectangle Mode


In ’Zoom to Rectangle’ mode, the left mouse-button can be used to select a rectangular
region, to which a zoom operation is performed.

Selection Mode
In ’Select’ mode, data can be selected. The first left mouse-button click starts selection
(showing a red rectangle), the second click finalizes the selection. All data points inside
the rectangular area are selected.

Figure 168: Scatterplot View data selection

The selected data is then presented in an extra ’Selected’ entry in the legend, showing
the count of all selected data points.

Figure 169: Scatterplot View data selected

This enables selection of parts of the data based on the presented variables, allowing
deeper analysis e.g. of dubious data.

The ’Invert Selection’ or ’Delete Selection’ actions allow for easier selection of
the wanted target reports.

By pressing the ’Control’ key during the second click, the newly selected data is added
to any previous selection. This can be used to select data incrementally, making more
complex selections possible.
Live Mode

The application switches into Live mode when ASTERIX data is imported from the network, as described in Import ASTERIX from Network.

The data sources' network lines have to be defined as described in Data Sources Table Content.

When Live mode is enabled and the correct network lines are set up (and active), the main window is shown as follows.


Figure 170: Main Window in Live Mode

In Live mode, most application components are the same as in Offline mode (although
some are deactivated), but in the main status bar the Live mode is indicated and a ’Stop’
button allows stopping the network recording and returning to Offline mode.

Data Sources & Processing


For each data source, the defined lines are shown (as L1 ... L4 buttons). If no data is received (on a defined line), the respective button is disabled (greyed out). If data is received over a line, it becomes active and the received data is stored in the database.

The line buttons can also be toggled - if a bold border is shown, the data from the respective line is stored not only in the database but also in main memory (RAM). The most recent 5 minutes of data are kept in main memory (RAM), and can be visualized in the existing Views.

Please note that currently only the OSG View displays data in Live mode; for performance reasons, the other Views are inactive.

OSG View
In Live mode, the OSG View automatically shows the same elements as with the time filter,
and the main components are the same as in Offline mode (although some are deactivated).

Figure 171: OSG View in Live Mode

In the time filter elements, at most 5 minutes of past data are available. Every second, newly received network data is shown and the labels are updated (if automatic labeling is enabled).

Using the time scrollbar, past data can be inspected, which is kept until it becomes outdated (older than 5 minutes) or the scrollbar is moved to the rightmost position. In this position, the displayed time window will again follow the most recent time.
Command Line Options

Several command line options have been added to allow for semi-automated running of tasks or convenient usage. This also allows for (limited) automated batch processing of data.

Please note that configuration of the application still has to be performed using the GUI, therefore it is required to set up the application correctly before command line options can be used successfully.

Please also note that error or warning messages (or related confirmations) will still halt the automatic running of tasks, to ensure that the user is always aware of occurring issues.

To get a list of available command line options use e.g.:


./COMPASS-release_x86_64.AppImage --help

Allowed options:
  --help                                 produce help message
  -r [ --reset ]                         reset user configuration and data
  --expert_mode                          set expert mode
  --create_db arg                        creates and opens new SQLite3 database
                                         with given filename, e.g. '/data/file1.db'
  --open_db arg                          opens existing SQLite3 database with
                                         given filename, e.g. '/data/file1.db'
  --import_data_sources_file arg         imports data sources JSON file with
                                         given filename, e.g. '/data/ds1.json'
  --import_view_points arg               imports view points JSON file with
                                         given filename, e.g. '/data/file1.json'
  --import_asterix_file arg              imports ASTERIX file with given
                                         filename, e.g. '/data/file1.ff'
  --import_asterix_file_line arg         imports ASTERIX file with given line,
                                         e.g. 'L2'
  --import_asterix_network               imports ASTERIX from defined network
                                         UDP streams
  --import_asterix_network_time_offset arg
                                         used time offset during ASTERIX network
                                         import, in HH:MM:SS.ZZZ
  --import_asterix_network_max_lines arg
                                         maximum number of lines per data source
                                         during ASTERIX network import, 1..4
  --asterix_framing arg                  sets ASTERIX framing, e.g. 'none',
                                         'ioss', 'ioss_seq', 'rff'
  --asterix_decoder_cfg arg              sets ASTERIX decoder config using JSON
                                         string, e.g. ''{"10":{"edition":"0.31"}}''
                                         (including one pair of single quotes)
  --import_gps_trail arg                 imports gps trail NMEA with given
                                         filename, e.g. '/data/file2.txt'
  --import_gps_parameters arg            import GPS parameters as JSON string,
                                         e.g. ''{"callsign": "ENTRPRSE",
                                         "ds_name": "GPS Trail", "ds_sac": 0,
                                         "ds_sic": 0, "mode_3a_code": 961,
                                         "set_callsign": true,
                                         "set_mode_3a_code": true,
                                         "set_target_address": true,
                                         "target_address": 16702992,
                                         "tod_offset": 0.0}'' (including one
                                         pair of single quotes)
  --import_sectors_json arg              imports exported sectors JSON with
                                         given filename, e.g. '/data/sectors.json'
  --associate_data                       associate target reports
  --load_data                            load data after start
  --export_view_points_report arg        export view points report after start
                                         with given filename, e.g.
                                         '/data/db2/report.tex'
  --evaluate                             run evaluation
  --evaluation_parameters arg            evaluation parameters as JSON string,
                                         e.g. ''{"current_standard": "test",
                                         "dbcontent_name_ref": "CAT062",
                                         "dbcontent_name_tst": "CAT020"}''
                                         (including one pair of single quotes)
  --evaluate_run_filter                  run evaluation filter before evaluation
  --export_eval_report arg               export evaluation report after start
                                         with given filename, e.g.
                                         '/data/eval_db2/report.tex'
  --no_cfg_save                          do not save configuration upon quitting
  --quit                                 quit after finishing all previous steps

If additional command line options are wanted please contact the author.

Options
–create_db filename
Adds the supplied filename and creates a new SQLite3 database.

–open_db filename
Adds the supplied filename and opens an existing SQLite3 database.

–import_data_sources_file filename
Adds the data sources defined in the JSON file to the configuration, as described in Im-
port/Export of Configuration Data Sources.

–import_view_points filename
After a database was opened, adds the supplied filename and starts an import using the
task described in Import View Points.

–import_asterix_file filename
After a database was opened, adds the supplied filename and starts an import using the
task described in Import ASTERIX Recording.

–import_asterix_file_line arg
If an import using the task described in Import ASTERIX Recording is started, it will use
the given line identifier (L1 ...L4).

–import_asterix_network
After a database was opened, an import using the task described in Import ASTERIX from
Network is started.

–import_asterix_network_time_offset arg
If an import using the task described in Import ASTERIX from Network is started, it will use the given time offset, in HH:MM:SS.ZZZ.

–import_asterix_network_max_lines arg
If an import using the task described in Import ASTERIX from Network is started, it will
use the given maximum number of input lines (and deactivate the others).

–asterix_framing framing
When an Import ASTERIX Task is started the given framing is used, the following options
exist:
• none: Raw, netto, unframed ASTERIX data blocks, equivalent to the ’empty’ value
in the GUI
• ioss: IOSS Final Format
• ioss_seq: IOSS Final Format with sequence numbers
• rff: Comsoft RFF format

–asterix_decoder_cfg ’str’
When an Import ASTERIX task is started the given configuration is used, in which the
editions and mapping can be specified for each category using a JSON string.

Using the following string the edition 0.31 can be set for category 010:
’{"10":{"edition":"0.31"}}’ (including one pair of single quotes)

In a nice formatting the string looks like this:


’{
"10":
{
"edition":"0.31"
}
}’

Please note the string "10" to identify category 010.

For one or a number of categories, the following options can be set:


• "edition": ASTERIX editions as string, e.g. "1.0"
• "ref_edition": ASTERIX reserved expansion field as string, e.g. "1.9"
• "spf_edition": ASTERIX special purpose field as string, e.g. "ARTAS"
• "mapping": Mapping used after decoding, e.g. "CAT010 to Radar"

Please note that the naming must be exactly as in the GUI, otherwise the application
quits with an error message.
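For example, both an edition and a mapping could be set for category 010, reusing the values given above (whether this particular combination is useful depends on the local setup):

'{
    "10":
    {
        "edition":"0.31",
        "mapping":"CAT010 to Radar"
    }
}'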

–import_gps_trail filename
After a database was opened, adds the supplied filename and starts an import using the task described in Import GPS Trails.

–import_gps_parameters ’str’
When an Import GPS Trail task is started the given configuration is used, in which the data
source and secondary parameters for the GPS trail is defined using a JSON string.
In a nice formatting the string looks like this:
’{
"callsign":"ENTRPRSE",
"ds_name":"GPS Trail",
"ds_sac":0,
"ds_sic":0,
"mode_3a_code":961,
"set_callsign":true,
"set_mode_3a_code":true,
"set_target_address":true,
"target_address":16702992,
"tod_offset":0.0
}’

Please note that both ’mode_3a_code’ and ’target_address’ must be given as decimal
values.

–import_sectors_json filename
After a database was opened, adds the sectors defined in the supplied filename using the task described in Manage Tab.

–associate_data
After a database with imported content exists, create target report associations using the
task described in Calculate Associations.

–load_data
Triggers a load process after opening the database.

–export_view_points_report filename
After starting the application into the management window, an export of View Points as PDF is triggered as described in Exporting View Points to PDF.
The given argument filename defines the report filename, with the report directory as
the parent directory of the given filename.

–evaluate
After opening a database a pre-configured evaluation run is triggered as described in Eval-
uation.

–evaluation_parameters ’str’
When an Evaluation task is started the given configuration is used, in which all parameters
of the evaluation can be defined using a JSON string.
In a nice formatting such a string can look like this:
’{
"active_sources_ref":{
"CAT062":{
"1234":true
}
},
"active_sources_tst":{
"CAT021":{
"2345":true
}
},
"current_standard":"Dubious Targets",
"dbcontent_name_ref":"CAT062",
"dbcontent_name_tst":"CAT021",
"use_grp_in_sector":{
"Dubious Targets":{
"SectorName1":{
"Optional":false
},
"SectorName2":{
"Optional":false
},
"SectorName3":{
"Optional":true
},
"SectorName4":{
"Optional":false
}
}
}
}’

The configuration of such parameters is rather complex, and it is recommended to contact the author if detailed information is needed.

–evaluate_run_filter
When an Evaluation task is started an automatic filtering of targets is performed, as con-
figured using the ’Filter UTNs’ dialog defined in Filtering Targets.

–export_eval_report filename
After a pre-configured evaluation run was performed, a report PDF is generated as described in Evaluation.
The given argument filename defines the report filename, with the report directory as
the parent directory of the given filename.

–no_cfg_save
When quitting the application later, no configuration changes are saved.

–quit
After the other tasks were run, automatically quits the application.
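As an illustration of combining the documented options (the file paths are placeholders, as in the examples above), a new database could be created, an ASTERIX recording imported, associations calculated and the application closed in a single call:

./COMPASS-release_x86_64.AppImage --create_db '/data/file1.db' --import_asterix_file '/data/file1.ff' --associate_data --quit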
Troubleshooting

Known Issues
CentOS Fuse Usermount Permissions
On some operating systems, the following error message is shown when starting the Ap-
pImage:

fuse: failed to exec fusermount: Permission denied

Cannot mount AppImage, please check your FUSE setup.


You might still be able to extract the contents of this AppImage
if you run it with the --appimage-extract option.
See https://github.com/AppImage/AppImageKit/wiki/FUSE
for more information
open dir error: No such file or directory
As the message states, the current user has no permissions to correctly mount the AppImage. Please refer to the provided link https://github.com/AppImage/AppImageKit/wiki/FUSE for information on how to resolve this.

Missing glibc Library Versions


On some operating systems, the following error message is shown when starting the
AppImage:

/lib64/libc.so.6: version ‘GLIBC_2.14’ not found (required by ...)


/usr/lib64/libstdc++.so.6: version ‘GLIBCXX_3.4.14’ not found (required by ...)
/usr/lib64/libstdc++.so.6: version ‘GLIBCXX_3.4.15’ not found (required by ...)
/lib64/libc.so.6: version ‘GLIBC_2.14’ not found
(required by /tmp/.mount_COMPASS-XWB8RF/appdir/bin/../lib/libosgDB.so.130)
...
As stated in https://github.com/AppImage/AppImageKit/issues/398:

“...inside the AppImage requires a newer glibc version than is present on your target
system (CentOS 6.7 in your example). The recommended way to produce an AppImage


that would run on CentOS 6.7 would be to build on CentOS 6.x...”

Unfortunately this means that for e.g. CentOS 6.* (or older) an AppImage produced using Ubuntu 14.04 will not run, and currently there are no plans to build one (unless requested by a large number of users).

If this is the case for you please comment on https://github.com/hpuhr/COMPASS/issues/35.

White OSGView & Shader Errors in Console Log


When the OSGView is started on older, unsupported Intel graphics cards, the OSGView only shows a white window and the following error messages are logged in the console:
...
glLinkProgram 0x1af6a320"" FAILED

VERTEX glCompileShader "main(vertex)" FAILED


VERTEX Shader "main(vertex)" infolog:
0:1(10): error: GLSL 3.30 is not supported. Supported versions are:
,→ 1.10, 1.20, 1.30, 1.00 ES, 3.00 ES, and 3.10 ES

FRAGMENT glCompileShader "main(fragment)" FAILED


FRAGMENT Shader "main(fragment)" infolog:
0:1(10): error: GLSL 3.30 is not supported. Supported versions are:
,→ 1.10, 1.20, 1.30, 1.00 ES, 3.00 ES, and 3.10 ES

Program "" infolog:


error: linking with uncompiled/unspecialized shadererror: linking
,→ with uncompiled/unspecialized shadererror: linking with
,→ uncompiled/unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shadererror: linking with uncompiled/
,→ unspecialized shader
glLinkProgram 0x82eefa0"" FAILED

This is due to the used graphics libraries OpenSceneGraph and osgEarth depending on shaders defined in GLSL 3.30 (shading language version) and the used graphics card driver not supporting this version of shaders.

There exists a workaround which might work for you: in some cases the shaders are supported by the Mesa graphics driver in use, but are not detected correctly. One can override the reported OpenGL version for one application using a system variable and start the app in that mode using:
MESA_GL_VERSION_OVERRIDE=3.3 ./COMPASS-release.AppImage
At least on the author's workstation with an Intel graphics card, and for some other users, this resolved the issue.

If this is not the case for you please comment on https://github.com/hpuhr/COMPASS/issues/151.

Graphical Issues
For issues of the following nature:
• Application might not even start (OpenGl version error)
• Slow display performance
• Graphical display errors (wrong colours, artefacts, etc)
If such a display issue exists, additional information is required.

Please verify that the installed graphics card and driver in use match the supported
version stated in Graphics Cards & Drivers.

Only if this is the case, please collect the output of glxinfo and create an issue report as
stated below.

Reporting Issues
There are several ways of reporting issues. The following steps have to be taken:
• Check if the issue was already reported
• Collect all required information
• Report the issue
Please also make sure that you are using the latest version of COMPASS, since the issue
might already have been corrected in the current version.

Also, if you’re planning on using COMPASS, it is of benefit if you register on https://github.com, which can be done for free. It is then possible for you to comment on issues, create new ones or even contribute to the project.

Already Reported Issues


Please refer to https://github.com/hpuhr/COMPASS/issues for a list of the currently
known issues. An issue can be “Open” meaning it was not yet fixed, or “Closed”, meaning
it was at least fixed in the source code. If it is closed, it does not mean that the fix is already
included in the current AppImage, but that it will be in the next release.

Please look through the issues and make sure that yours isn’t already listed. If so, you
can comment on it to indicate the severity for you. If not, please proceed to the next step.

Collect Information
Basically everything that is needed to reproduce the error should be submitted. Depending
on the type of the error, this can differ, but at least the following information should be
given:
• Console log of the application
• Exact steps taken until the error occurred

For Application Crashes


If the application crashed, it would also be of interest to get a stacktrace. This can be achieved by running the application using gdb (https://www.gnu.org/software/gdb/), which might have to be installed.
Then, the application can be run using “gdb ./COMPASS-XXX.AppImage”. The output will look similar to this:

GNU gdb (Ubuntu 8.0.1-0ubuntu1) 8.0.1


Copyright (C) 2017 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>.
Find the GDB manual and other documentation resources online at:
<http://www.gnu.org/software/gdb/documentation/>.
For help, type "help".
Type "apropos word" to search for commands related to "word"...
Reading symbols from ./COMPASS-x86_64_RELDBG_0220.AppImage...done.
(gdb)
In the shown console, enter the command “run”. This will run the application. Then,
perform the same steps as previously to reproduce the application crash.
When the crash occurs, the output should look like this:

Thread 1 "AppRun" received signal SIGINT, Interrupt.


0x00007fffef950951 in __GI___poll (fds=0x7fffcc007630, nfds=3, timeout=15036)
at ../sysdeps/unix/sysv/linux/poll.c:29
29 ../sysdeps/unix/sysv/linux/poll.c: No such file or directory
Then, enter the command “backtrace”. This will show output similar to this:

#0 0x00007fffef950951 in __GI___poll (fds=0x7fffcc007630, nfds=3, timeout=15036) at


../sysdeps/unix/sysv/linux/poll.c:29
#1 0x00007fffec5bb169 in ?? () from /lib/x86_64-linux-gnu/libglib-2.0.so.0
#2 0x00007fffec5bb27c in g_main_context_iteration ()
from /lib/x86_64-linux-gnu/libglib-2.0.so.0
#3 0x00007ffff308998c in QEventDispatcherGlib::processEvents
(QFlags<QEventLoop::ProcessEventsFlag>) ()
from /tmp/.mount_COMPASS-1XB9QP/appdir/bin/../lib/libQt5Core.so.5
#4 0x00007ffff303b96b in QEventLoop::exec(QFlags<QEventLoop::ProcessEventsFlag>)
() from /tmp/.mount_COMPASS-1XB9QP/appdir/bin/../lib/libQt5Core.so.5
#5 0x00007ffff30420e1 in QCoreApplication::exec() () from
/tmp/.mount_COMPASS-1XB9QP/appdir/bin/../lib/libQt5Core.so.5
#6 0x00000000006be3c6 in main ()
This is called a stacktrace. Please copy all of this text so that it can be submitted with the issue report.

Issue Reporting
Please either create a new issue on GitHub (https://github.com/hpuhr/COMPASS/issues) or send a mail to compass@openats.at, and include all of the previously collected information.

If the supplied information is not enough you will be contacted as soon as time allows,
with a request for further detail.
Appendix

Appendix: Configuration
The application configuration is stored in the user home directory in a sub-folder ’.compass’.

If such a folder does not exist, or the configuration is outdated, the default configura-
tion is copied from the AppImage into said folder.

Depending on the COMPASS version, different behaviours exist, as described in the following sections.

v0.6.0 and later


In the ’~/.compass’ folder, separate sub-folders exist for each application version, allowing for usage of different versions of COMPASS.
Directory structure:
• 0.6.0/: Configuration & data for version 0.6.0
– conf/: Configuration as described in Configuration Folder
– data/: Data as described in Data Folder
• osgearth_cache: OSGEarth cache folder (for downloaded map information)

Pre-v0.6.0 and older


In the ’~/.compass’ folder, one configuration for the most current application version exists,
which is kept up-to-date with an update mechanism.
Directory structure:
• conf/: Configuration as described in Configuration Folder
• data/: Data as described in Data Folder
In older versions, the osgearth_cache folder is located in data/maps.

Configuration Folder
All application configuration is stored as JSON files, which are read during startup and
saved during (correct) shutdown of the application.


• config.json: Main configuration file defining application version, sub-config location etc.
• log4cpp.properties: Logging configuration
• default/: Sub-configuration

Data Folder
The data folder contains e.g. used images, 3rd party (static) configuration etc.
• fonts: Used fonts
• gdal: GDAL library data & configuration
• icons: Application icons
• jasterix_definitions: jASTERIX library definitions
• maps: OSGView map files
• textures: OSGView symbol textures

Appendix: Data Sources


Currently, two versions of JSON files for data source definitions are supported: the current and the deprecated format.

Current Format

{
"content_type": "data_sources",
"content_version": "0.2",
"data_sources": [
{
"ds_type": "Tracker",
"name": "MRTS",
"sac": 1,
"sic": 3
},
{
"ds_type": "Tracker",
"name": "ARTAS",
"sac": 1,
"sic": 4
},
{
"ds_type": "ADSB",
"info": {
"network_lines": {
"L1": "2.192.31.24:8600",
"L3": "3.192.31.25:8600"

}
},
"name": "ADSB",
"sac": 1,
"short_name": "WADS",
"sic": 5
},
{
"ds_type": "MLAT",
"info": {
"network_lines": {
"L3": "2.192.21.24:8600",
"L4": "3.192.21.25:8600"
}
},
"name": "MLAT1",
"sac": 50,
"short_name": "MLAT1",
"sic": 70
}
]
}

Deprecated Format

{
"Radar": [
{
"altitude": 3.0,
"dbo_name": "Radar",
"latitude": 1.0,
"longitude": 2.0,
"name": "Luqa",
"sac": 120,
"short_name": "LQ",
"sic": 1
},
{
"altitude": 6.0,
"dbo_name": "Radar",
"latitude": 4.0,
"longitude": 5.0,
"name": "Dingli",
"sac": 120,
"short_name": "DG",
"sic": 2

},
{
"altitude": 9.0,
"dbo_name": "Radar",
"latitude": 7.0,
"longitude": 8.0,
"name": "Hal-Far",
"sac": 120,
"short_name": "HF",
"sic": 3
}
],
"Tracker": [
{
"dbo_name": "Tracker",
"name": "ARTAS",
"sac": 120,
"short_name": "ATS",
"sic": 49
}
]
}

Appendix: View Points


This section defines how the view point files have to be structured and written by view
point generators (applications which generate such files).

View point files are written in the JSON format, and include a version number. The current version number is 0.2, the previous version 0.1 is not supported.

Please note that basic JSON format knowledge is required to understand the topics in
this section.

Please note that the format definition has to be fulfilled rigorously, otherwise the ap-
plication may not show the view point or (in rare cases) crash.

Please note that two different use cases were considered: a general one and one for
tracker run comparison. The latter has additional parameters defined, to be used only in
such circumstances.

File Content
The following content is commonly stored in a view point file:
• View points context information
– Information about all later view points

– Defines version
– Can be used for importing the relevant data files
– Also giving contextual information
• View point collection, each consisting of information about
– Unique identifier
– View Point type
– Description text
– A point in time with a time window
– A position with a position window
– A list of DSTypes to be loaded
– A list of data sources to be loaded
– A list of filters
– A list of context variables

Custom Attributes
The view point context, as well as view points, can contain additional information. For
the context, this additional information is shown in the ’Import View Points’ task and not
stored.

Additional data in each view point is read in as is, and persisted in the database. In the
’View Points’ tab, all primitive variables (non-object, non-list) are shown as columns.

Therefore, when generating view points, it might be useful to add such variables (e.g.
for ordering according to error magnitude, priority etc.). This is recommended and will be
supported further.

Version 0.2
The most simple example of a view points file is as follows:
{
"content_type": "view_points",
"content_version": "0.2",
"view_points": [
{
"id": 1,
"name": "All",
"status": "open",
"type": "Saved"
}
]
}
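A minimal view point generator sketch in Python, producing exactly this file (the output filename is just an example):

import json

view_points = {
    "content_type": "view_points",
    "content_version": "0.2",
    "view_points": [
        {"id": 1, "name": "All", "status": "open", "type": "Saved"}
    ]
}

# write the file, to be imported using the 'Import View Points' task
with open("view_points.json", "w") as f:
    json.dump(view_points, f, indent=4)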

There is an encompassing JSON object, which contains the view point context (object)
and view points (list with one entry).

View Point Context


The ’content_version’ attribute is mandatory, and only version 0.2 is currently supported.

In the context, optionally datasets can be added, which define related data to be imported.
{
"content_type": "view_points",
"content_version": "0.2",
"view_point_context": {
"datasets": [
{
"filename": "/home/sk/data/test/asterix/test.ff"
}
]
},
...
}

Each dataset has to have a filename.

Several datasets are also possible:


{
"content_type": "view_points",
"content_version": "0.2",
"view_point_context": {
"datasets": [
{
"filename": "/home/sk/data/test/asterix/test.ff"
},
{
"filename": "/home/sk/data/test/asterix/test2.ff"
},
{
"filename": "/home/sk/data/test/asterix/test3.ff"
}
]
},
...
}

Each of the defined ASTERIX datasets will be automatically imported when using the
’Import View Points’ task.

Please note that the ASTERIX decoder settings have to be set correctly in the configu-
ration and are the same for all files to be imported.

For the tracker run comparison case, additional attributes are used. The assumption
is that there are two recordings, each containing a tracker run, each of which will be
imported into a separate line. The tracker runs will possibly also have the same SAC/SIC
and a time shift, which can be corrected during ASTERIX import using the features de-
scribed in Override Tab.

Such an import without overrides can look as follows:


{
"content_type": "view_points",
"content_version": "0.2",
"view_point_context": {
"datasets": [
{
"filename": "/home/sk/data/test/asterix/test.ff",
"line_id": 0
},
{
"filename": "/home/sk/data/test/asterix/test2.ff",
"line_id": 1,
"time_offset" : -60
}
]
},
...
}

In this case, the first dataset will be imported into line L1 (0 as L1, default line), while the second dataset will be imported into L2 and time shifted by the ’time_offset’ (as Time of Day in seconds).

Exact definition:

Key          Value Description                                       Required  Comment

filename     File path and name (absolute), as string                Y         Needed for import
line_id      Line ID to be used during ASTERIX import
time_offset  ASTERIX tod offset as reference time minus dataset
             time, as number in seconds

View Point
View points are stored in the ’view_points’ attribute, which is a simple list.

A view point only has to contain an ’id’ and a ’type’ attribute, but additional attributes make it more meaningful.
{
...
{
"id":0,
"type":"any string",
"text":"any string",
"position_latitude":49.5,
"position_longitude":12.2,
"position_window_latitude":0.05,
"position_window_longitude":0.02,
"time":666.0,
"time_window":4.0,
"data_sources": [
[
12750,
[
0,
1
]
],
[
12759,
[
2,
3
]
]
],
"data_source_types": [

"Radar",
"RefTraj"
],
"filters": {
"Time of Day": {
"Time of Day Maximum": "16:02:00.00",
"Time of Day Minimum": "16:00:00.00"
},
"UTNs": {
"utns": "4"
}
},
"context_variables": {
"Meta": [
"Ground Bit",
"Track Groundspeed"
]
}
},
...
}
In each View Point object, the following values can be defined:

Key                         Value Description                                        Required

id                          Identifier, as number                                    Y
type                        Type, as string, e.g. ’Short track’, ’Extra track’,      Y
                            ’Content deviation X’
text                        Description text as string
position_latitude           Center position WGS-84 latitude, as number
position_longitude          Center position WGS-84 longitude, as number
position_window_latitude    Geographic window size in WGS-84 latitude, as number
position_window_longitude   Geographic window size in WGS-84 longitude, as number
time                        Center time, as number in seconds
time_window                 Time window size, as number in seconds
data_sources                List of data sources to load, with lists of which
                            lines to load
data_source_types           List of DSTypes to load, as strings
filters                     List of filters defining which data to load, as JSON
                            objects
context_variables           List of extra content data variables to load and display

If the ’data_sources’, ’data_source_types’ or ’filters’ attributes are not defined, all data will be loaded.

View Point Filters


The ’filters’ attribute contains a JSON object, where each filter name is used as a key, and
the value is again an JSON object encompassing the filter conditions. Each filter condition
value is a string, the same as a user would enter in the GUI.
{
...
{
...
"filters": {
"Filter Name 1": {
"Filter Condition 1": "Value 1"
},
"Filter Name 2": {
"Filter Condition 1": "Value 1",
"Filter Condition 2": "Value 2"
}
...
}
...
},
...
}
When a view point is set, only the filters that are defined will be activated; all others will be disabled. If no ’filters’ attribute is defined, filtering will be disabled (load all data).

All possible filters existing in COMPASS can be set using view points, e.g.:
{
...
{
"id": 0,
"name": "all filters",
"position_latitude": 47.550205341739996,
"position_longitude": 14.672879071070001,
"position_window_latitude": 4.913833554478789,
"position_window_longitude": 7.054962643339518,
"status": "open",
"type": "Saved",
"db_objects": [
"Tracker"
],
"filters": {
"Barometric Altitude": {

"Barometric Altitude Maxmimum": "43000",


"Barometric Altitude Minimum": "500"
},
"Callsign": {
"Callsign Values": "%TEST%"
},
"Hash Code": {
"HashCode Values": "%81c20819%"
},
"Mode 3/A Code": {
"Mode 3/A Code Values": "7000"
},
"Position": {
"Latitude Maximum": "50.78493920733",
"Latitude Minimum": "44.31547147615",
"Longitude Maximum": "20.76559892354",
"Longitude Minimum": "8.5801592186"
},
"Target Address": {
"Target Address Values": "FEFE10"
},
"Time of Day": {
"Time of Day Maximum": "05:56:32.297",
"Time of Day Minimum": "05:44:58.445"
},
"Tracker Data Sources": {
"active_sources": [
13040,
13041
]
},
"Tracker Detection Type": {
"Tracker Detection Types": "1"
},
"Tracker Multiple Sources": {
"Tracker Multiple Sources Value": "N"
},
"Tracker Track Number": {
"ARTAS_REF track_num": "7287",
"ARTAS_TST track_num": "4479"
}
}
},
...
}
The filter values have to be defined exactly as a user would enter them in the DBFilter
conditions in the GUI.

An exception are the data source filters, which need an ’active_sources’ condition list with integer values for all sources to be loaded, identified as numbers. Each source number is calculated as SAC ∗ 255 + SIC.
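For example, a source with SAC 50 and SIC 9 is identified as 50 ∗ 255 + 9 = 12759, the value used in the view point example above.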

For the ’Tracker Track Number’ filter, the data source condition is identified by its data
source name. Using the dataset in the context information, the view point file can ensure
that the same name is used.

Please note that setting custom filters (created by the user) is also possible using view
points. Please contact the author for further information.

Appendix: Algorithms
Positions Accuracy Ellipses
According to the 68–95–99.7 rule, 68.27%, 95.45% and 99.73% of the values lie within one,
two and three standard deviations of the mean, respectively.

Therefore, 95% is equivalent to two standard deviations (σ).

Please note that currently only the Cartesian variables are used, and are assumed to be valid for the target report's position (and north direction), not for a system center.

The usage of WGS-84 variables is included; they are converted to Cartesian values at the position of the target report.

ADS-B
The accuracy values were taken from The 1090MHz Riddle online book.

ASTERIX Data Item Variable Comment


210.VN mops_version Used to distinguish between DO-260 versions
090.NACp nac_p Used for V1 & V2 transponders
090.NUCp or NIC nucp_nic Used for V0 transponders

Table 15: ADS-B Position Accuracy Variables



V0 Transponders

NUCp   HPL                  RCu                  σ

9      < 7.5 m              < 3 m                1.5
8      < 25 m               < 10 m               5
7      < 0.1 NM (185 m)     < 0.05 NM (93 m)     46.5
6      < 0.2 NM (370 m)     < 0.1 NM (185 m)     92.5
5      < 0.5 NM (926 m)     < 0.25 NM (463 m)    231.5
4      < 1 NM (1852 m)      < 0.5 NM (926 m)     463
3      < 2 NM (3704 m)      < 1 NM (1852 m)      926
2      < 10 NM (18520 m)    < 5 NM (9260 m)      4630
1      < 20 NM (37040 m)    < 10 NM (18520 m)    9260
0      > 20 NM (37040 m)    > 10 NM (18520 m)    -

Table 16: NUCp Values

• NUCp: Navigation Uncertainty Category - Position


• HPL: Horizontal Protection Limit
• RCu: 95% Containment Radius - Horizontal
• σ: Standard deviation in meters, from RCu/2

For the NUCp values, the values taken for display are listed in the σ column, resulting in error circles (same value for both coordinates). Please note that for the NUCp value 0, no error ellipse is displayed, since the accuracy value is unknown.
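For example, NUCp 7 corresponds to RCu < 0.05 NM (93 m), resulting in a standard deviation of 93/2 = 46.5 m as listed in the σ column.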

V1 & V2 Transponders

NACp   EPU (HFOM)            VEPU (VFOM)   σ

11     < 3 m                 < 4 m         1.5
10     < 10 m                < 15 m        5
9      < 30 m                < 45 m        15
8      < 0.05 NM (93 m)                    46.5
7      < 0.1 NM (185 m)                    92.5
6      < 0.3 NM (556 m)                    278
5      < 0.5 NM (926 m)                    463
4      < 1.0 NM (1852 m)                   926
3      < 2 NM (3704 m)                     1852
2      < 4 NM (7408 m)                     3704
1      < 10 NM (18520 m)                   9260
0      > 10 NM or Unknown                  -

Table 17: NACp Values

• NACp: Navigation Accuracy Category - Position


• EPU (HFOM): 95% horizontal accuracy bounds, Estimated Position Uncertainty
(EPU) a.k.a. Horizontal Figure of Merit (HFOM)
• VEPU (VFOM): 95% vertical accuracy bounds, Vertical Estimated Position Uncer-
tainty (VEPU) a.k.a. Vertical Figure of Merit (VFOM)
• σ: Standard deviation in meters, from EPU/2

For the NACp values, the values taken for display are listed in the σ column, resulting in error circles (same value for both coordinates). Please note that for the NACp value 0, no error ellipse is displayed, since the accuracy value is unknown.

MLAT
The description of the accuracy values was taken from the EUROCONTROL Specification
for Surveillance Data Exchange ASTERIX Part 14 Category 020 Multilateration Target Reports
Appendix A: Reserved Expansion Field (EUROCONTROL-SPEC-0149-14A) document.

ASTERIX Data Item                                      Variable Comment

REF.PA.SDW.SDW (Latitude Component)                    Latitude StdDev
REF.PA.SDW.SDW (Longitude Component)                   Longitude StdDev
REF.PA.SDW.COV-WGS (Lat/Long Covariance Component)     Lat/Long Cov
REF.PA.SDC.SDC (X-Component)                           X StdDev
REF.PA.SDC.SDC (Y-Component)                           Y StdDev
REF.PA.SDC.COV-XY (Covariance Component)               X/Y Covariance

Table 18: MLAT Position Accuracy Variables

• SDC.SDC values: Standard Deviation of Position of the target expressed in Cartesian coordinates, in meters
• SDC.COV-XY: XY Covariance Component; the unit is listed as meters, but should be m² and is assumed as such
• SDW.SDW values: Standard Deviation of Position of the target expressed in WGS-84, in degrees
• SDW.COV-WGS: Lat/Long Covariance Component; the unit is listed as degrees, but should be deg² and is assumed as such

Notes:
• XY covariance component = sign Cov(X,Y) * sqrt abs [Cov (X,Y)]
• WGS-84 covariance component = sign Cov(Lat,Long) * sqrt abs [Cov (Lat,Long)]
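To illustrate the relation: a covariance Cov(X,Y) of -400 m² is transmitted as a covariance component of sign(-400) ∗ sqrt(400) = -20 m.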

Radar
For each Radar plot, the DBContent Variables ’DS ID’ and ’Detection Type’ are used to
define from which data source the plot originated, and what type of plot was measured.

From the data source information (see Data Sources Table Content) the radar standard
deviations are collected. If no data source specific values are set, the default values (see
Radar Default Accuracies) are used.

Based on the plot type defined by ’Detection Type’, one (single technology) or the min-
imum/maximum of the standard deviations (for combined plots, defined by ’Use Radar
Minimum StdDev’ flag) is used. If the ’Detection Type’ is not set or is 0 (no detection), the
PSR values are used.

The standard deviations used for display purposes are calculated based on the range standard deviation and the azimuth standard deviation multiplied by the circumference at the given range, and are of course rotated by the measurement azimuth.
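Reading this literally (an interpretation for illustration, not taken from the source code): the across-range standard deviation at range r is approximately 2π · r · σazimuth[deg] / 360, e.g. roughly 175 m for an azimuth standard deviation of 0.1 deg at a range of 100 km, while the along-range component is the range standard deviation itself.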

RefTraj
Calculated as the Tracker APW values.

Tracker
The description of the accuracy values was taken from the EUROCONTROL Specifica-
tion for Surveillance Data Exchange ASTERIX Part 9 Category 062 SDPS Track Messages
(EUROCONTROL-SPEC-0149-9) document.

ASTERIX Data Item Variable Comment


500.APW.APW (Latitude Component) Latitude StdDev
500.APW.APW (Longitude Component) Longitude StdDev
500.APC.APC (X-Component) X StdDev
500.APC.APC (Y-Component) Y StdDev
500.COV.COV (XY Covariance Component) X/Y Covariance

Table 19: Tracker Position Accuracy Variables

• APC.APC values: Estimated accuracy (i.e. standard deviation) of the calculated position of a target expressed in Cartesian co-ordinates, in meters
• COV.COV: XY Covariance Component; the unit is listed as meters, but should be m² and is assumed as such
• APW.APW values: Standard Deviation of Position of the target expressed in WGS-84, in degrees

Notes:
• XY covariance component = sign Cov(X,Y) * sqrt abs [Cov (X,Y)]

Appendix: Latex Installation


To be able to use the report generation (View Points as well as Evaluation reports), a LaTeX distribution has to be installed on the workstation used.

Depending on which Linux operating system is used, the commands will differ. In this section, installation instructions for a few selected operating systems are provided.

Ubuntu & Debian Variants

sudo apt-get install texlive-full

CentOS & Fedora Variants


The following command should work:
sudo yum install ’texlive-*’
There is also a guide which provides a more cumbersome installation - since the author does not use CentOS, both options are not yet verified. For the more elaborate installation please refer to How To Install Tex Live on CentOS 7.

Testing
After installation, the following command should run and generate similar output:
$ pdflatex --version
pdfTeX 3.14159265-2.6-1.40.21 (TeX Live 2020/Debian)
kpathsea version 6.3.2
Copyright 2020 Han The Thanh (pdfTeX) et al.
There is NO warranty. Redistribution of this software is
covered by the terms of both the pdfTeX copyright and
the Lesser GNU General Public License.
For more information about these matters, see the file
named COPYING and the pdfTeX source.
Primary author of pdfTeX: Han The Thanh (pdfTeX) et al.
Compiled with libpng 1.6.37; using libpng 1.6.37
Compiled with zlib 1.2.11; using zlib 1.2.11
Compiled with xpdf version 4.02

Appendix: Utilities
jASTERIX
For usage of the jASTERIX library/tool please refer to jASTERIX.

SDDL
The SDDL tool is an open-source ASTERIX decoder and lister; please refer to SDDL for more information. This text only aims at providing a short usage guide.

While it is possible to download and build from the source code, this usage guide recommends downloading a release AppImage from Releases. Please make sure that at least version 1.1.0 with included JSON functions is used.
After downloading, set the executable flag on the file:
$ chmod +x SDDL-json-x86_64.AppImage
After this, the file can be executed. The help text can be obtained with the following
command:
./SDDL-json-x86_64.AppImage
*** Surveillance Data Decoder and Lister v1.1.0 ***
2000-2018 by Helmut Kobelbauer, Sinabelkirchen/Austria.

SDDL is free software: you can redistribute it and/or modify it under


the terms of the GNU General Public License as published by the
Free Software Foundation, either version 3 of the License,
or (at your option) any later version.

SDDL is distributed in the hope that it will be useful, but WITHOUT


ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public
,→ License
for more details

...

The ’Surveillance Data Decoder and Lister’ utility may be used as


,→ follows:

sddl { option } [ input_path [ list_path ] ]

where the following options are supported:


-ah=xxx use value (in FL) as assumed height
-all list all levels
-cat list ASTERIX category
-cat=xxx only this ASTERIX category to be listed
-categories print list of supported ASTERIX categories

-f forced overwrite for list file


-fd checking frame against data time
-fl=nn frame limit (only first nn frames are listed)
-formats print list of recording and data formats
-gv list ground vector (for radar and system tracks)
-help print some help info (and abort)
-hex list hex dump
-if=pathname path name of input file
-l=nn list level (1/2=verbose, 3=one message per line)
-lf=pathname path name of list file
-json-type=type output type of json file to be written, possible
,→ types:
none,test,print,text,cbor,msgpack,ubjson,
zip-text,zip-cbor,zip-msgpack,zip-ubjson
-json-file=pathname path name of json file to be written
...
-utc UTC time of day in list file (default)
-wgs84 list WGS-84 position

’input_path’ and ’list_path’ are the (local or full)


path names of the respective files.

For comments or questions our e-mail address is: sddl@gmx.at


Thank you for using this software.

*** End of Surveillance Data Decoder and Lister ***


To print the list of supported ASTERIX categories and editions, use:
./SDDL-json-x86_64.AppImage -categories
...
The ’sddl’ utility at the moment supports the following ASTERIX
,→ categories:
ASTERIX category 000 n.a. April 1998
ASTERIX category 001 1.1 August 2002
ASTERIX category 002 1.0 November 1997
ASTERIX category 003 n.a. April 1998
ASTERIX category 004 1.2 March 2007
ASTERIX category 007 ---- not supported
ASTERIX category 008 1.0 November 1997
ASTERIX category 009 ---- not supported
ASTERIX category 010 1.1 March 2007
option -vsn010=0.24s 0.24* Sensis (Heathrow MDS modifications)
ASTERIX category 011 0.17 December 2001
option -vsn011=0.14 0.14 October 2000
option -vsn011=0.14i 0.14* Sensis (Inn valley modification)
ASTERIX category 016 unknown
ASTERIX category 017 0.5 February 1999

ASTERIX category 018 ---- not supported


ASTERIX category 019 1.1 March 2007
ASTERIX category 020 1.5 April 2008
option -vsn020=1.0 1.0 November 2005
option -vsn020=1.2 1.2 April 2007
option -vsn020=1.5 1.5 April 2008
ASTERIX category 021 2.1 May 2011
option -vsn021=0.12 0.12 February 2001
option -vsn021=0.13 0.13 June 2001
option -vsn021=0.20 0.20 December 2002
option -vsn021=0.23 0.23 November 2003
option -vsn021=1.0P 1.0P April 2008
option -vsn021=1.4 1.4 July 2009
option -vsn021=2.1 2.1 May 2011
option -vsn021=2.4 2.4 15 June 2015
ASTERIX category 023 1.2 March 2009
option -vsn023=0.11 0.11 December 2002
option -vsn023=1.0P 1.0P April 2008
option -vsn023=1.1 1.1 September 2008
option -vsn023=1.2 1.2 March 2009
ASTERIX category 030 2.8.1 26 February 1999
ASTERIX category 031 2.8.1 26 February 1999
ASTERIX category 032 2.8.1 26 February 1999
ASTERIX category 034 1.27 May 2007
ASTERIX category 048 1.15 April 2007
option -vsn048=1.14 1.14 November 2000
option -vsn048=1.15 1.15 April 2007
option -vsn048=1.16 1.16 March 2009
ASTERIX category 062 1.3 April 2005
ASTERIX category 063 1.0 March 2004
ASTERIX category 065 0.12 March 2003
option -vsn065=0.12 0.12 March 2003
option -vsn065=1.3 1.3 April 2007
ASTERIX category 221 ? ?
ASTERIX category 247 1.2 February 2008
ASTERIX category 252 2.8.1 26 February 1999
To list the supported framing formats, use:
./SDDL-json-x86_64.AppImage -formats
*** Surveillance Data Decoder and Lister v1.1.0 ***
2000-2018 by Helmut Kobelbauer, Sinabelkirchen/Austria.

...

The ’sddl’ utility at the moment supports the following recording


,→ formats:

-asf (ASTERIX in) IOSS Final Format recording


-ioss SASS-C IOSS (Final) recording (default)
-net Binary ’netto’ recording
-rec Sequence of records
-rff Comsoft (TM) RFF recording

Our ’sddl’ utility at the moment supports the following data formats:

-asf ASTERIX (in IOSS Final Format recording)


-asx ASTERIX data format (default)
-zzz Unknown data format - ignore

Please be aware that NOT EVERY combination of recording and data


,→ formats
is reasonable.

Thank you for using our software.

*** End of Surveillance Data Decoder and Lister ***


To list an existing ASTERIX recording ’test.rff’ with RFF framing, use:
./SDDL-json-x86_64.AppImage -rff test.rff
This will output the contained ASTERIX data with one line per target report/status
message. To limit parsing to the first 10000 bytes (for testing), use:
./SDDL-json-x86_64.AppImage -rff test.rff -ll=10000
To limit text output (for testing/benchmarking) with progress indication enabled, use:
./SDDL-json-x86_64.AppImage -rff test.rff -l=0 -progress
*** Surveillance Data Decoder and Lister v1.1.0 ***
2000-2018 by Helmut Kobelbauer, Sinabelkirchen/Austria.

...

-> List level set to 0
-> Show progress indication
-> Using ASTERIX as default data format ...
-> Input file ’test.rff’ opened ...
-> 100 KB of input data read and processed
...
-> 220 MB of input data read and processed
; end of input file
; length=227970515 byte(s)
-> End of input file reached

-> Processed 227970515 bytes in 12.819 seconds (about 16.960 MB/sec; 163279 frames/sec)

*** End of Surveillance Data Decoder and Lister ***


JSON output can be obtained in several ways:
• none: No JSON output, default mode
• test: JSON output is generated, but not printed or written
• print: JSON output is generated and printed to console
• text: JSON output is generated and written as text to a file, for which a filename must
be set
• cbor: JSON output is generated and written as CBOR to a file, for which a filename
must be set
• msgpack: JSON output is generated and written as MSGPack to a file, for which a
filename must be set
• ubjson: JSON output is generated and written as UBJSON to a file, for which a filename
must be set
• zip-text: JSON output is generated and written as text to a ZIP archive file, for which
a filename must be set
• zip-cbor: JSON output is generated and written as CBOR to a ZIP archive file, for
which a filename must be set
• zip-msgpack: JSON output is generated and written as MSGPack to a ZIP archive
file, for which a filename must be set
• zip-ubjson: JSON output is generated and written as UBJSON to a ZIP archive file,
for which a filename must be set
For parsing in COMPASS, only the ’text’ and ’zip-text’ modes are currently supported.
Use the former only for smaller datasets; the latter provides good compression. The resulting
ZIP file will be about twice the size of the original ASTERIX file.
To decode and write a JSON zip-text file ’test_json.zip’, use:
./SDDL-json-x86_64.AppImage -rff test.rff -l=0 -progress -json-type=zip-text -json-file=test_json.zip
*** Surveillance Data Decoder and Lister v1.1.0 ***
2000-2018 by Helmut Kobelbauer, Sinabelkirchen/Austria.

...

-> List level set to 0
-> Show progress indication
-> Output JSON type ’zip-text’
-> Export JSON to filename ’test_json.zip’
-> Using ASTERIX as default data format ...
-> Input file ’test.rff’ opened ...
-> 100 KB of input data read and processed
...
-> 220 MB of input data read and processed
; end of input file
; length=227970515 byte(s)
-> End of input file reached

-> Processed 227970515 bytes in 106.411 seconds (about 2.043 MB/sec; 19669 frames/sec)

*** End of Surveillance Data Decoder and Lister ***
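
For smaller recordings, the plain ’text’ mode can be used analogously. A minimal sketch
(the output filename ’test.json’ is illustrative):
./SDDL-json-x86_64.AppImage -rff test.rff -l=0 -progress -json-type=text -json-file=test.json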

ADS-B exchange
For information about ADS-B exchange please refer to section Appendix: ADS-B exchange.

It is usually possible to obtain a dataset from any day prior to the current one, covering
the whole world.

The data content is coded in JSON and is described in https://www.adsbexchange.com/datafields/.

Please be aware that using the produced data for commercial purposes (without a
specific agreement) would violate the ADSB exchange terms & conditions. Please refer to
https://www.adsbexchange.com/legal-and-privacy/ for additional information.

The user could try to obtain a day with:

$ wget http://history.adsbexchange.com/Aircraftlist.json/2016-06-20.zip
The download time depends on the user’s internet connection. The files are
usually around 8 GB in size.
To create a file loadable in COMPASS, the user can (after installing the COMPASS utils)
run the ’test.sh’ script found in the utils directory.

The script expects to find an ADS-B exchange database in the current working directory
and generates a SQLite3 database ready to be loaded into COMPASS containing the traffic
between 08Z and 09Z of the given day.
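
A rough sketch of the complete workflow (the path to ’test.sh’ is an assumption and
depends on where the COMPASS utils were installed; the script is run from the directory
containing the downloaded data):

$ wget http://history.adsbexchange.com/Aircraftlist.json/2016-06-20.zip
$ /path/to/utils/test.sh

The resulting SQLite3 database can then be loaded into COMPASS.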

Appendix: Licensing
The COMPASS source code (database backend & GUI) is released under GNU GPLv3:
https://www.gnu.org/licenses/gpl-3.0.en.html

The AppImage binary is released under Creative Commons Attribution 4.0 International
(CC BY 4.0).
Human readable: https://creativecommons.org/licenses/by/4.0/
Legal: https://creativecommons.org/licenses/by/4.0/legalcode

The OSGView is not released as open-source and can only be used in the AppImage
binary.

Appendix: Disclaimer
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EX-
PRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGE-
MENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE
FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Appendix: Used Libraries
While it is permitted to use the AppImage for commercial purposes, the used open-source
libraries might still prohibit this without further permission. It is the responsibility of
the user to verify that the license information listed here is correct and that their use cases
are permitted under the referenced licenses.

Package                 License                Link
Qt5                     GNU LGPL v3            Link
Boost                   Boost license          Link
OpenSceneGraph          GNU LGPL v3            Link
osgEarth                GNU LGPL v3            Link
MySQL++                 GNU LGPL               Link
MySQL Connector/C++     GNU GPL v2 or later    Link
SQLite3                 Public domain          Link
GDAL                    X11/MIT                Link
Log4cpp                 LGPL                   Link
LibArchive              New BSD                Link
Eigen3                  MPL2                   Link
JSON for Modern C++     MIT                    Link
Intel TBB               Apache 2.0             Link
Catch2                  BSL-1.0                Link
NemaTode                Zlib                   Link

Table 20: Library licenses

Please note that this list is not exhaustive, and that a large number of other, smaller
libraries are used in the background, for which the licenses were not checked. Please use
the ’ldd’ tool to get details.

Appendix: ADS-B exchange


ADS-B exchange is a project hosted at https://www.adsbexchange.com/.

From the preamble of the home page one can read:

Welcome to ADSBexchange.com, a co-op of ADS-B/Mode S/MLAT feeders from around the
world, and the world’s largest source of unfiltered flight data. Thanks to our worldwide community
of participants, if the data is broadcast over the air, you can find it here. This opens up a whole
new world of interesting traffic for hobbyists, without materially affecting security for anyone.

Please be aware that using the produced data for commercial purposes (without a
specific agreement) would violate the ADSB exchange terms & conditions. Please refer to
https://www.adsbexchange.com/legal-and-privacy/ for additional information.
