
applied sciences

Article
Development of UAV Tracing and Coordinate
Detection Method Using a Dual-Axis Rotary Platform
for an Anti-UAV System
Bor-Horng Sheu 1, Chih-Cheng Chiu 2, Wei-Ting Lu 1, Chu-I Huang 1 and Wen-Ping Chen 1,3,*
1 Department of Electrical Engineering, National Kaohsiung University of Science and Technology,
Kaohsiung City 80778, Taiwan
2 Engineering Science and Technology, College of Engineering, National Kaohsiung University of Science
and Technology, Kaohsiung City 80778, Taiwan
3 Ph.D. Program in Biomedical Engineering, Kaohsiung Medical University, Kaohsiung City 80778, Taiwan
* Correspondence: wpc@nkust.edu.tw

Received: 22 May 2019; Accepted: 21 June 2019; Published: 26 June 2019

Abstract: The rapid development of unmanned aerial vehicles (UAVs) has led to many security
problems. In order to prevent UAVs from invading restricted areas or famous buildings, an anti-UAV
defense system (AUDS) has been developed and become a research topic of interest. Topics under
research in relation to this include electromagnetic interference guns for UAVs, high-energy laser
guns, US military net warheads, and AUDSs with net guns. However, these AUDSs use either manual
aiming or expensive radar to trace drones. This research proposes a dual-axis mechanism for automatic UAV tracing. The tracing platform uses visual image processing technology to trace and lock
the dynamic displacement of a drone. When a target UAV is locked, the system uses a nine-axis
attitude meter and laser rangers to measure its flight altitude and calculates its longitude and latitude
coordinates through sphere coordinates to provide drone monitoring for further defense or attack
missions. Tracing tests of UAV flights were carried out using a DJI MAVIC UAV at heights of 30 m to 100 m. Drone image capture and visual identification for tracing were tested under various weather conditions with a thermal imaging camera and a full-color camera, respectively. When there was no cloud during the daytime, the images acquired by the thermal imaging camera and the full-color camera both provided high-quality image identification results. However, in dark weather, black clouds emit radiant energy and seriously affect image capture by the thermal imaging camera. When there is no cloud at night, the thermal imaging camera performs well in drone image
capture. When the drone is traced and locked, the system can effectively obtain the flight altitude and
longitude and latitude coordinate values.

Keywords: unmanned aerial vehicles (UAVs); dynamic coordinate tracing; computer vision; anti-UAV
defense system

1. Introduction
Since Dà-Jiāng Innovations (DJI) released the new DJI XP3.1 (a flight control system) [1,2],
its multi-rotor drones have been characterized by hovering, route planning, a comparatively low price,
and ease of use [3,4]. These factors have led to the rapid development of drones in the consumer
selfie drone market. At present, they have been adopted all over the world in environmental aerial
photography or surveillance. However, with drones flying outside proper control, there have been many drone mishaps, both at home and abroad. News events have included intrusions into navigation areas, collisions with famous buildings, and threats to political figures, and the biggest worry is that people or terrorists may modify drones to carry dangerous bombs,


or use the vehicles themselves as extremely dangerous attack devices. The potential hazards of drones are an urgent problem [5,6], so countries have begun to impose strict regulations on drone
traffic management, and even implement no-fly rules. The Federal Aviation Administration (FAA) and similar authorities in China and Taiwan have all drawn up drone system usage regulations. In addition, in order to prevent and control illicit intrusions by drones, research into AUDSs has progressively become an international research topic of interest [7–9]. There are several common AUDS approaches, such as spoofing attacks, tracing and early warning, detection, defeating, and signal interference [10,11]. The scalable effects net warhead invented by the US military launches
warheads that throw nets to attack enemy drones; scientists have developed other anti-UAV technologies as well. Among them, devices such as radar, satellites, and thermal imaging cameras are used for early warning and detection, detecting the trace of a drone from the ground. These technologies carry out immediate detection, warning, and tracing of the intruding drone and provide
real-time messages and information support [12–15]. Destroying intruding drones is another kind
of approach, which acquires the location of drones first and then defeats them by using missiles,
lasers, or other weapons. An example of this type of approach is a defense system using a high-energy laser developed by the Turkish company ASELSAN. Although the attack method has the effect of destruction and intimidation, there is a risk of accidental injury to nearby people if the UAV is shot down. Interfering with the communications of an intruding drone by electromagnetic interference methods is effective; however, the shortcoming is clear: it also disrupts wireless communications in the surrounding area and thus affects quality of life. The scalable effects net warhead is a technique for quickly and accurately delivering a
net to a remote target using a projectile. It includes a mesh-containing launcher, an ejection spring,
and a releasable ogive. This net warhead can be launched from a suitable barrel using a propellant or
compressed gas. The speed and rotation of the warhead will be determined by the diameter of the
barrel. When the warhead approaches the target, the electronic control board activates the servo motor
and then pulls the central locking plunger to release the ogive. Then, the spring pushes the net out so that it spreads over the target, resulting in its capture.
There are many reports of security issues caused by drone incidents, including incidents in which terrorists around the world used drones to attack targets. Thus, there is an increasing market and a pressing need for AUDSs, which has driven the development of AUDS technologies. However, with limited resources, how to develop AUDS detection technologies for civil purposes is a very important research topic. This research proposes a dual-axis rotary tracing mechanism, which adopts a thermal imaging camera and a full-color camera. With an image identification algorithm, this device traces and locks onto a drone, tracking its dynamic coordinates while the drone remains within the range of visual detection. A nine-axis attitude meter and laser rangers in the proposed device perform the sensing and are used to calculate the flight altitude of the drone. Spherical coordinates are then used to obtain the longitude and latitude of the drone. By continuously locking onto the dynamic coordinates of the drone, a ground control station can direct an attack drone to defeat the target drone with a net gun, laser gun, or other weapons such as rifles. At this stage, real-time transmission of the dynamic coordinates to an attack drone for a fast chase is required. This paper proposes a different approach.

2. System Architecture
The research structure of this paper consists of three units: the dual-axis mechanism, the drone identification and tracing approach, and the drone three-dimensional coordinate calculation method. A detailed description of each unit follows.

2.1. The Dual-Axis Rotary Platform Mechanism


The design of the externals of the proposed dual-axis rotary device in this research is shown in Figure 1. There are two axes: the lateral axis is the x-axis (yaw) and the longitudinal axis is the y-axis (pitch). The rotation of yaw and pitch drives the movement of the image (the picture acquired by the thermal imaging or full-color camera). The image provides flying drone detection with a viewing angle range of 360° for yaw rotation and 180° for pitch rotation. In order to improve the detection capability for the drone, there is a mechanism frame inside the dual-axis mechanism, which is equipped with a fine-tuning device (three sets of laser rangers), a nine-axis attitude meter, a thermal imaging camera, and a full-color camera for drone image capture and position detection. The fine-tuning mechanism is used to provide drone tracing, and the laser ranger is adopted for focusing precision adjustment. In drone image focus and distance measurement, because the image acquisition lens and the laser rangers are not on the same line, distance measurement of a drone in high-altitude flight needs the fine-tuning mechanism for the focus. First, the photographic lens is adopted as the focus point of the center of the image frame; then the three laser rangers are set close to the lens, and the fine-tuning mechanism is adjusted with the focus distance of the drone to be acquired by the tracing device, so that the hitting spot of the laser ranger on the drone overlaps with the spot at the center of the image. GPS and LoRa antennas are placed outside the tracing device in order to avoid metal disturbance and to prevent signal blocking by the thick casing of the tracing device.

[Figure 1 labels: infrared thermal imaging camera; full-color camera; laser ranger sets; pitch rotation; slip ring; yaw rotation; tracking platform.]

Figure 1. Design of the dual-axis mechanism.

2.2. Drone Visual Image Tracing Method
The drone image visual tracing process identifies a drone in motion using images acquired by a thermal imaging camera or a full-color camera. These cameras inside the dual-axis mechanism were adopted for image acquisition in this study. After an image is acquired, vision image processing is performed and then drone tracing is done through motor control of the above platform.

A dual-axis tracing device adopts image processing techniques to deal with images of a dynamic object. An image capture card in the device acquires pictures at a speed of 30 frames per second and then performs image processing, as shown in the image processing flowchart in Figure 2. The acquisition of a dynamic drone contour [15,16] is carried out by the frame difference method with image pre-processing.

[Figure 2 flowchart: Start → UAV image capture → UAV grayscale → UAV image filtering → threshold processing → dilation processing → frame difference → detecting UAV moving objects → calculating the center point of the UAV → tracking the UAV → calculating the amount of displacement → transmitting the amount of displacement → End.]

Figure 2. Drone image tracing flowchart.

First, the original image, as shown in Figure 3a, is converted to a grayscale value Sgray by RGB conversion [15], as shown in Figure 3b. The formula is:

Sgray = 0.299R + 0.587G + 0.114B (1)
where R, G, and B are the three primary color pixel values of the original image. Gaussian and median filters are then applied to the grayscale image to remove noise and smooth the image, so that subsequent images are not misjudged due to noise effects, as shown in Figure 3c. Computing the histogram of Figure 3c yields the histogram of the image, in which a highest value can be found; this value is taken as the threshold value T. After threshold processing, a binary image can be obtained, as shown in Figure 3d.
After threshold processing is performed, the method of dilation in morphology can be used to segment the independent image elements of the UAV so that the UAV is highlighted in the picture for better tracking by the system. The dilation operation is shown in Equation (2), where A is the original image and B is the mask. The core operation is that if there is an object at a neighboring point, the position is set as an object.

A ⊕ B = {z | (B̂)z ∩ A ≠ ∅} (2)
Figure 3e shows the image after dilation (morphology), which fills the gaps in the broken UAV image. The frame difference method allows extraction of a motion region [17,18] from two adjacent images by performing differential operations. It monitors the movement of anomalous objects [19] in the picture along with the movement of the target and the movement of the camera; there will be a significant difference between adjacent frames. For real-time tracing, due to the fast drone flight speed, only a small amount of immediate calculation should be needed to trace the drone steadily. An advantage of the frame difference method is that its background accumulation problem can almost be ignored: its update speed is quick, and little calculation is required to trace a drone immediately, as shown in Figure 3f. The equation is as follows:

D(x, y) = fm+1(x, y) − fm(x, y) (3)

where D(x, y) is the result of the differential operation between the mth frame and the (m+1)th frame.
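To make the processing chain of Equations (1)–(3) concrete (grayscale conversion, filtering, thresholding, dilation, and frame differencing), the following is a minimal OpenCV sketch in Python. It is our illustration, not the authors' released code; the kernel sizes and the use of Otsu's method as a stand-in for the histogram-peak threshold T are assumptions.

import cv2
import numpy as np

def preprocess(bgr):
    # Eq. (1): RGB to grayscale (OpenCV uses the same 0.299/0.587/0.114 weights).
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    # Gaussian and median filtering to remove noise, as in Figure 3c.
    gray = cv2.GaussianBlur(gray, (5, 5), 0)
    gray = cv2.medianBlur(gray, 5)
    # Threshold processing (paper: histogram-peak value T; Otsu used here as a stand-in).
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Eq. (2): dilation to fill gaps in the broken UAV silhouette.
    return cv2.dilate(binary, np.ones((3, 3), np.uint8), iterations=1)

def detect_moving_drone(prev_bgr, curr_bgr):
    # Eq. (3): frame difference D(x, y) = f_{m+1}(x, y) - f_m(x, y).
    diff = cv2.absdiff(preprocess(curr_bgr), preprocess(prev_bgr))
    contours, _ = cv2.findContours(diff, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # no moving object in this frame pair
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    # Centroid of the moving region: the drone center (x2, y2).
    return m["m10"] / m["m00"], m["m01"] / m["m00"]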

[Figure 3 panels: (a) full-color image; (b) grayscale acquisition; (c) image filtering; (d) threshold processing; (e) dilation processing; (f) frame difference method.]

Figure 3. Drone image acquisition procedures.

The drone tracing mechanism of the dual-axis tracing device converts a two-dimensional variation of the pixels of a captured image (∆x, ∆y) into a step-level control amount for the two-axis step motor inside the device. A movement amount is calculated by the above conversion of acquired images, which is applied to complete drone tracing operations. Suppose the image resolution is set as 2P × 2Q pixels, as shown in Figure 4. The origin point of an acquired screen image is set as (x0, y0), and its pixel coordinate is (−P, −Q). The screen's center point is (x1, y1), which is the concentric point in the tracing screen, and its pixel coordinate is defined as (0, 0). Therefore, the pixels of the screen image range from −P to +P on the x-axis and from −Q to +Q on the y-axis. When the target drone's image enters the edge of the rectangular frame of the tracing device, it is acquired by the visual identification of the system. The tracing system instantly estimates the pixel difference (p, q) between the drone's center position (x2, y2) and the concentric point (x1, y1) on the x-axis and the y-axis: (x2 − x1, y2 − y1). When the p value is positive, the center point of the target drone is on the right side of the center point of the screen image, and vice versa for the left side. Furthermore, when the q value is positive, the center point of the target drone is on the lower side of the center point of the screen image, and vice versa (a small code sketch of this sign convention is given after Figure 4).

Figure 4. Diagram of the target drone displacement vector.
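Continuing the Python sketch above (names are ours; the 640 × 480 frame size is the full-color resolution stated in Section 3.2), the pixel offset (p, q) relative to the concentric point can be computed as:

def pixel_offset(center_xy, frame_w=640, frame_h=480):
    # Concentric point (x1, y1) is the middle of the 2P x 2Q screen.
    x1, y1 = frame_w / 2.0, frame_h / 2.0
    x2, y2 = center_xy
    # p > 0: drone right of center; q > 0: drone below center
    # (image y grows downward), matching the convention in Figure 4.
    return x2 - x1, y2 - y1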

In order to trace the invading drone's dynamic position at any time, three-dimensional coordinate parameters such as the flight altitude of the drone and its longitude and latitude are calculated, and these parameters are immediately transmitted to the ground monitoring center for a direct attack on or an indirect hunt for the drone. The control requirements for the two-axis step motor in the dual-axis tracing device are as follows:

Suppose the position of the concentric point of the screen image is (x1, y1) and the position of the drone is (x2, y2), as shown in Figure 4. The system traces the drone in real time, and the requirement for the motor control speed of the dual-axis device is:

Vyaw-rotation = p/∆t and Vpitch-rotation = q/∆t (4)

where ∆t is the duration for the image capture card to acquire images, which is normally 30 frames per second; that is, ∆t = 1/30 s. The smaller the value of ∆t, the more accurately the tracing system can grasp the drone movement trajectory, so that the concentric point of the tracing device's alignment screen can lock onto the drone as soon as possible, but this in turn demands faster software computing.
The system has to convert the pixel difference (p, q) of the image into the tracing step angle of the two-axis step motor of the tracing device; that is, the image coordinates are converted from pixel values (p, q) to the numbers of steps (Sp, Sq) that drive the two-axis step motor, where Sp and Sq represent the step numbers of the step motor for the yaw rotation and pitch rotation, respectively. Since the step motor rotates 360° in one turn, assuming that SR is the total number of steps in one circle of the step motor, the step angle θm of each step motor is:

θm = 360°/SR. (5)
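For example, with the two-phase step motor used later in Section 3.2, one revolution takes SR = 10,000 steps, giving θm = 360°/10,000 = 0.036° per step.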

Due to the fact that the necessary θm value of the step motor in the tracing device is related to the pixel value acquired by the image capture card, it is important to find the correlation between these two quantities. First, suppose the full-color (or thermal imaging) camera acquires a picture in a certain direction, as shown in Figure 5a. Then, suppose the driver of the step motor in the tracing device sends a square wave signal with the number Sp to the step motor of the x-axis, which drives the step motor of the x-axis to carry out a rotation of Sp steps. This makes the dual-axis device carry out a horizontal right movement, after which it acquires the immediate picture, as shown in Figure 5b. Finally, by superimposing the images of Figure 5a,b and calculating the increment p of the x-axis pixel value between these two images after the step motor of the x-axis rotates Sp steps, as shown in Figure 5c, the tracing system can obtain the number of steps pS of the step motor necessary for each pixel in the yaw rotation direction:

pS = Sp/p. (6)

[Figure 5 panels: (a) previous image; (b) image after a displacement of Sp motor steps; (c) two images overlapping, showing p pixels of image shift corresponding to Sp steps of rightward step motor movement.]

Figure 5. Schematic diagram of the relationship between the step angle and image.

Furthermore, the number of steps of the step motor necessary for each pixel in the pitch rotation direction is qS:

qS = Sq/q. (7)
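A compact sketch of the pixel-to-step conversion (Equations (4), (6), and (7); pS and qS are the calibrated steps-per-pixel constants measured as above, and all names are ours):

def steps_for_offset(p, q, p_steps_per_pixel, q_steps_per_pixel):
    # Eqs. (6)-(7): convert the pixel offset (p, q) into motor steps (Sp, Sq).
    return round(p * p_steps_per_pixel), round(q * q_steps_per_pixel)

def required_speed(p, q, dt=1.0 / 30.0):
    # Eq. (4): tracking speed requirement, in pixels per second, per axis.
    return p / dt, q / dt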

2.3. Drone Three-Dimensional Coordinate Calculation
(1) Detection method for drone height: The rotation direction and angle judgment mode of the dual-axis tracing device are based on the values sensed by the nine-axis attitude meter. Therefore, there is a close relationship between the placement position of the attitude meter and the accuracy of the system data. The starting point of orientation determination by an attitude sensing component is based on the preset position of its own module. The drone movement orientation in the screen image is based on the movement of the lens, so the sensing component of the attitude meter should be placed as close as possible to the camera lens, so that the angular deviation will be smaller. After many tests, we found that the best effect is obtained when it is placed under the lens.

When the tracing device finds that the drone has entered the visual image of the screen, as in Figure 6, the system immediately traces the drone image in the screen through visual identification and computes the relative relationship between the drone image and the center point of the visual image screen. Then, it performs target drone tracing and position locking. The drone image is brought to the center point (x1, y1) of the visual image screen of the tracing device, and the laser ranger can measure once the target drone enters the laser ranging range. The distance value (r) from the target is obtained, and the pitch angle (θ) of the rotating stage at this time is obtained by the nine-axis attitude meter in the apparatus. As shown in Equation (8), the flight altitude h of the drone can then be obtained:

h = r × sin θ = √((x2 − x1)² + (y2 − y1)²) × sin θ (8)
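For instance (hypothetical values), if the laser ranger returns r = 100 m and the attitude meter reports a pitch angle of θ = 36°, the flight altitude is h = 100 × sin 36° ≈ 58.8 m.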

[Figure 6 labels: drone at (x2, y2); image concentric point (x1, y1); flight altitude h; pitch rotation; pitch angle θ; laser ranger sets; attitude indicator.]

Figure 6. Schematic diagram of the calculation of the height of the drone at the image concentric point by the attitude meter and the laser ranger.

(2) Detection method for drone longitude and latitude coordinates as well as azimuth: When the tracing system locks the target drone onto the center point position (x1, y1) of the image screen through tracing, the target drone has entered the laser ranging range. The values of the distance and the horizontal elevation angle of the target drone from the test site can be obtained by the laser rangers and the attitude meter of the dual-axis device, and then the longitude and latitude, and the azimuth angle (relative to the north angle), of the drone can be analyzed. In this paper, the drone longitude and latitude coordinates [20,21] can be obtained by using the target coordinate conversion equation. A schematic diagram of the longitude and latitude coordinate algorithm for the target drone is shown in Figure 7. This algorithm is applicable to the distance calculation between any two points on a spherical surface, where O is the center point of the sphere, and A and B are any two points on the sphere. Point A represents the longitude and latitude coordinates (WA, JA) obtained by the GPS in the dual-axis tracing platform, and point B represents the unknown longitude and latitude coordinates (WB, JB) of the drone traced by the dual-axis tracing platform. The longitude and latitude coordinates of point B can be obtained by the sphere longitude and latitude coordinate conversion [21]. The other parameters are as follows:
Point G: The true north position of the sphere; point O: The center of the sphere; R: The average radius of the earth (~6371 km);
(WA, JA): The values of the longitude and latitude of point A obtained by the GPS of the dual-axis device;
ψ: The azimuth angle of points A, B, and G, obtained by the magnetometer and gyroscope of the nine-axis attitude meter in the dual-axis device;
∠a: The angle between the two points B and G and the ground connection GO;
∠b: The angle between the two points A and G and the ground connection GO;
∠c: The angle between the two points A and B and the ground connection GO;
Arc AB: The spherical distance between point A (dual-axis device) and point B (drone aerial location); that is, the length of the minor arc AB generated by the intersection of the plane passing through the three points A, O, and B with the sphere.
∠c is the metric angle:

∠c = (arc AB/R) × (180°/π). (9)

[Figure 7 labels: prime meridian; equator; azimuth ψ; point A (WA, JA); point B (WB, JB); origin (0, 0); point D (0, JB); WB; JB; L.]

Figure 7. Schematic diagram of the longitude and latitude sphere coordinate algorithm for the drone.

Since the spherical distance between point A (dual-axis device) and point B (drone aerial location) is much smaller than R, the spherical distance arc AB between the two points can be approximated as the linear distance AB between the two points, so Equation (9) can be corrected to:

∠c = (AB/R) × (180°/π). (10)

∠a and ∠θ can be obtained from the sphere longitude and latitude coordinates [21], and Equations (11) and (12) can be obtained as follows:

∠a = cos⁻¹(sin Aw × cos ∠c + cos Aw × sin ∠c × cos ψ), (11)

∠θ = sin⁻¹(sin ∠c × sin ψ / sin ∠a), (12)

where Aw is the latitude of point A. Since the drone flight altitude at point B is within a few kilometers above the ground, the lengths between the two points A and B and the point O at the center of the earth can be regarded as the same, and ∠θ can be regarded as the longitude difference between the two points A and B. Thus, the longitude and latitude of point B (WB, JB) are the longitude and latitude coordinates of the drone, where:

JB = JA + ∠θ. (13)

As described above, the longitude and latitude values of the drone flying in the air, which is locked by the dual-axis device, and the flight altitude h can be calculated and acquired, then transmitted wirelessly to the ground control center for the attack tasks of direct attack or tracing and entrapping.
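The chain of Equations (9)–(13) can be sketched as follows (Python, names ours). The latitude of point B is recovered as 90° − ∠a, which follows from the definition of ∠a as the angle at O between B and the north point G, although the paper does not write this step out explicitly, so treat it as our assumption.

import math

EARTH_RADIUS_M = 6371000.0  # average earth radius R (~6371 km)

def drone_lat_lon(lat_a_deg, lon_a_deg, dist_ab_m, azimuth_deg):
    # Eq. (10): for a nearby target, arc AB ~ chord AB, so c = AB / R (radians).
    c = dist_ab_m / EARTH_RADIUS_M
    lat_a = math.radians(lat_a_deg)
    psi = math.radians(azimuth_deg)
    # Eq. (11): spherical law of cosines gives the colatitude angle a of point B.
    a = math.acos(math.sin(lat_a) * math.cos(c)
                  + math.cos(lat_a) * math.sin(c) * math.cos(psi))
    # Eq. (12): longitude difference theta between A and B.
    theta = math.asin(math.sin(c) * math.sin(psi) / math.sin(a))
    lat_b = 90.0 - math.degrees(a)           # assumption: latitude of B = 90 deg - angle a
    lon_b = lon_a_deg + math.degrees(theta)  # Eq. (13)
    return lat_b, lon_b

As a hypothetical usage example, drone_lat_lon(22.648851, 120.328766, 60.0, 180.0) moves the fix roughly 60 m due south of the platform position used in Section 3.4 (latitude decreases by about 0.00054°, longitude unchanged).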
3. Experimental Results and Discussion

The drone tracing test results of the dual-axis device, software, and hardware used in the study are as follows:

3.1. Development Environment of the Software and Hardware


(1) Visual identification software development: This study uses OpenCV (the open-source computer vision library), a software library for computer vision and machine learning. It was originally initiated and developed by the Intel Corporation and is distributed under the BSD (Berkeley Software Distribution) license, so it can be used free of charge in commercial and research settings.
OpenCV includes many image processing functions, machine learning algorithms, and libraries for
computer vision applications.
(2) Sensors and imaging equipment:

• Infrared thermal imaging: As shown in Figure 8, this paper uses an intermediate level infrared
thermal imaging camera (TE–EQ1), which is equipped with a 50 mm fixed-focus lens with 384 × 288 pixels. It can be used for mobile tracing tests with a drone flying altitude of about 100 m. The TE–EQ1 is capable of observing the distribution of heat sources in drones, supporting drone tracing tasks through computer vision technology.
• Full-color single-lens reflex camera: This study uses a Sony α7SII (Japan) with a Sony SEL24240
lens, as shown in Figure 9. The Sony α7SII features ultra-high sensitivity and an ultra-wide
dynamic range. It has 35 mm full-frame 12.2 megapixel image quality with a wide dynamic range of ISO 50 to 409,600 and a BIONZ X processor, which optimizes the performance of the sensor, highlights details, and reduces noise; it records 4K (QFHD: 3840 × 2160) movies in full-frame read-out without image pixel merging, effectively suppressing image edge aliasing and moiré. The main reason for choosing this device and lens is that the Sony α7SII can output images via HDMI, achieving high-quality instant image transmission with less delay than other cameras of a similar price, allowing instant acquisition and processing of images. Additionally, the Sony SEL24240 telephoto lens has a focal length of 240 mm,
allowing drones at short distances (<100 m) to be clearly displayed on the tracing screen.
• Thermal image acquisition: The UPG311 UVC image acquisition device is used to read the infrared
thermal imaging images. The infrared thermal imaging output interface is NTSC or PAL. It is
compatible with multiple systems and supports plug and play and UVC (USB video device class). The UVC protocol supports an image resolution of up to 640 × 480 pixels.
• Frame grabber for full-color camera: We used the Magewell USB Capture HDMI Gen2 capture card,
which supports simultaneous connection of multiple units on a single machine, and is compatible with Windows, Linux, and OS X operating systems with no need to install drivers, offering true plug and play. It supports a completely standard development interface. In addition, the input and output interfaces use HDMI and USB 3.0, respectively; it supports an input image resolution of up to 2048 × 2160 and a frame rate of up to 120 fps, and it automatically selects the
aspect ratio that is the most appropriate for the picture.
• Laser ranger: This article uses the LRF 28-2000 semiconductor laser ranger, as shown in Figure 10.
The main reason for using this laser ranger is that it operates in the 900~908 nm wavelength band, which is safe for the human eye, and it has a range from 3.5 m to as long as 2.0 km. Its measurement resolution
and measurement accuracy are 0.1 m and 1.0 m, respectively. Its specifications are suitable for
the measurement of the flight altitude of the short-range drones in this study, and its significant
distance measurement accuracy can be used as a basis for testing the drone flight altitude.
• Multirotor: At present, the multirotor market is dominated by DJI (Dajiang Innovation). Its drones, which are lightweight, portable, small in volume, and sold at a friendly price, are very popular with the general public. Additionally, the news media have reported that most drone incidents have involved drones from DJI, so this study uses the DJI Mavic Pro multirotor as the main tracing target of the experiment.
[Figure 8 shows the TE–EQ1 infrared thermal imaging device.]

Figure 8. TE–EQ1 infrared thermal imaging device.

[Figure 9 panels: (a) Body: Sony α7SII; (b) Lens: Sony SEL24240.]

Figure 9. Single-lens reflex camera.

[Figure 10 shows the laser ranger.]

Figure 10. Laser Ranger LRF 28-2000B.

3.2. Development of Hardware for the Dual-Axis Device

To make it shockproof, robust, and waterproof, the outer casing of the dual-axis device in this paper is made of Teflon. It is equipped with two sets of step motors (including their step motor drivers). These two step motors respectively control the up and down elevation angle and the left and right rotation of the dual-axis device. The pixels of the full-color image used in this device are 640 × 480; that is, P = 640 and Q = 480, and a two-phase step motor with SR = 10,000 is used; that is, θm = 0.036°. The test height of drone tracing is 10~100 m, and the radius of the dual-axis device is a = 0.25 m. The maximum flight speed of the drone is 20 km/h. As a result, the maximum rotation speed of the dual-axis device is Vyaw,max = (a/Rmin) × Vuav ≈ 0.139 m/s.

Figure 11a shows the dual-axis tracing device and Figure 11b shows the hardware structure inside the dual-axis tracing device. Figure 11c shows the focus fine-tuning structure diagram for the laser ranger.

Laser Ranger
Sets

Image output contact/


Infrared thermal Control board input
imager contact/ System power
contact

Full color camera

Camera focus motor

(a) The dual-axis tracing device (b) Appearance of the front view of the image lens/laser ranger

Lens rotation direction

Pitch rotation

Yaw rotation

Screw clockwise
Screw anti-clockwise
lens pitch down
lens pitch up

Screw clockwise lens Screw clockwise lens


yaw anti-clockwise yaw clockwise

(c) Picture of the focus fine-tuning configuration of the laser ranger


Figure 11. The inner hardware structure of the dual-axis tracing device.

The main control board of the dual-axis tracing device is shown in Figure 12a and the motor control unit is shown in Figure 12b.

[Figure 12 panels: (a) Main control board; (b) Motor control board.]

Figure 12. 3D simulation and physical circuit picture of the main control unit and motor control unit.

3.3. Drone Image Identification and Tracing Test in Different Weather Environments

This paper uses the DJI Mavic Pro four-axis drone to carry out tracing tests of drone flight in all kinds of weather. The results of tracing tests on sunny days at flight altitudes of about 70–105 m are shown in Figure 13. At more than 102 m, drone movement cannot be determined (restricted by the resolution of the full-color camera). The test results for various weather conditions (sunny, cloudy, and rainy) are shown in Figures 14–16. The detailed test description is as follows:

[Figure 13 panels: (a) The distance is 72 m; (b) The distance is 86 m; (c) The distance is 94 m; (d) The distance is 102 m.]

Figure 13. Tests of drone flight altitudes (on a sunny day).

(1) Sunny environment: As shown in Figure 14, the acquisition effects of the drone's image on sunny days differ with brightness and sunshade levels due to different conditions. In the case of large changes in ambient brightness, the effect of light on the lens and on tracing is quite large. In addition, the size of the drone is small, which can cause the target drone to be lost during image processing and then recovered.

Figure 14. Results of the tracing test on sunny days (drone flight altitude is 105 m).

(2) Cloudy environment: Figure 15 shows the acquisition effects of the drone's image on cloudy days. The cloudy background is clean and free of other interference. In addition, the difference in grayscale presented by the sky is not obvious, making the tracing target more distinct. The tracing effect was good, and the target drone was not easy to lose.

Figure 15. Results of the tracing test on cloudy days (drone flight altitude is 130 m).
(3) Rainy environment: Figure 16 shows the acquisition effects of the drone's image on rainy days. In general, no drone appears on rainy days (a normal drone is usually not waterproof); drone players usually choose to take off in clear weather. In the case of rain, the lens is susceptible to raindrops. Coupled with poor visibility, it is impossible to clearly present the tracing image, and it is difficult to judge the target, which leads to a poor tracing effect.

Figure 16. Results of the tracing test on rainy days (drone flight altitude is 80 m).

3.4. Test of Drone Tracing and Longitude and Latitude Coordinates


Figure 17 shows the tracing process through a full-color image. A target drone was successfully
hit with laser rangers in this study. Conversion of the sensed values of the nine-axis attitude meter and
the known longitude and latitude coordinates (W A , JA ) as (120.328766, 22.648851) of the GPS of the
dual-axis device were achieved through the longitude and latitude sphere coordinate algorithm. It can
be calculated that when the drone is locked by the tracing system, its current longitude and latitude
coordinates (W B , JB ) (based on Equations (12) and (13)) are (120.328766, 22.648394), and the flying
altitude is 58.7 m.
Figure 18 shows the tracing process through a thermal image at the same place. A target drone was successfully hit with the laser rangers of the dual-axis device. The sensed values of the nine-axis attitude meter and the known GPS coordinates of the device, (W_A, J_A) = (120.328766, 22.648851), were again converted through the longitude and latitude sphere coordinate algorithm. When the drone is locked by the tracing system, its current longitude and latitude coordinates (W_B, J_B), calculated from Equations (12) and (13), are (120.328697, 22.648524), and the flying altitude is 51.9 m.
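The conversion described above can be sketched in code. Equations (12) and (13) are not reproduced here; the sketch below instead uses the standard destination-point formula on a sphere, which maps the device position (W_A, J_A), the azimuth and elevation angles sensed by the nine-axis attitude meter, and the slant range from the laser rangers to the target coordinates (W_B, J_B) and flight altitude. It is a minimal illustration consistent with the reported numbers, not the exact implementation of this study; the function name, the mean Earth radius constant, and the zero default device height are our assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius (assumed spherical Earth model)

def locate_target(lon_a, lat_a, azimuth_deg, elevation_deg,
                  slant_range_m, device_height_m=0.0):
    """Estimate target longitude, latitude, and altitude from the device pose.

    lon_a, lat_a   -- device GPS position (W_A, J_A) in degrees
    azimuth_deg    -- bearing from true north given by the attitude meter
    elevation_deg  -- elevation angle above the horizon
    slant_range_m  -- laser-ranger distance to the locked target
    """
    elev = math.radians(elevation_deg)
    ground_dist = slant_range_m * math.cos(elev)        # horizontal distance
    altitude = device_height_m + slant_range_m * math.sin(elev)

    lat1, lon1 = math.radians(lat_a), math.radians(lon_a)
    brg = math.radians(azimuth_deg)
    delta = ground_dist / EARTH_RADIUS_M                # angular distance

    # Standard destination-point formula on a sphere.
    lat2 = math.asin(math.sin(lat1) * math.cos(delta)
                     + math.cos(lat1) * math.sin(delta) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(delta) * math.cos(lat1),
                             math.cos(delta) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lon2), math.degrees(lat2), altitude
```

As a sanity check, placing the device at (120.328766, 22.648851) and pointing it due south with about 51 m of ground distance shifts the latitude by roughly 0.00046 degrees while leaving the longitude unchanged, on the order of the full-color test result (120.328766, 22.648394); the 58.7 m flying altitude likewise follows from the slant range and elevation angle.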
Figure 17. Test results of the full-color tracing interface.
Figure 18. Test results of thermal imaging tracing interface.
4. Conclusions
As the number of drones increases, so does the risk to flight safety. It is only a matter of time before a drone is used to attack aircraft and cause damage. Governments are worried that the increasing number of amateur drones will lead to more flight accidents. Therefore, many governments now regulate drone registration, pilot's license requirements, and so forth, and also limit flight range and maximum flight altitude. In addition, various anti-UAV systems have been proposed, but these AUDSs either use manual aiming or expensive radar for tracing drones. This study uses two sets of step motors with a thermal imaging camera/full-color camera and a sensing module for the dynamic tracing of drones flying at high altitudes and the measurement of their three-dimensional coordinates, which makes it a low-cost and practical drone tracing device for anti-UAV systems. Future research will combine an attack drone and a net gun with a throwing function to provide a safe anti-UAV system with tracing and capturing of drones.
Based on the results of this study in UAV positioning and dynamic locking of the moving coordinates of the UAV, high-energy laser guns or electromagnetic interference guns will be applied for direct targeting attacks in future work. In addition, real-time transmission of the target UAV's dynamic coordinate information to an attack UAV will be implemented for fast pursuit. For tracking stealthy, hidden UAVs, we will try to locate the UAV's controller remotely by three-point positioning to eliminate the potential threat of the hidden UAV. The future aim of the project is to make the anti-UAV defense system more practical.
Author Contributions: Conceptualization, W.-P.C. and B.-H.S.; methodology, W.-P.C. and C.-C.C.; software,
C.-I.H.; validation, W.-P.C., B.-H.S. and W.-T.L.; formal analysis, W.-P.C. and B.-H.S.; investigation, C.-I.H.;
resources, C.-C.C.; data curation, C.-I.H.; writing—original draft preparation, B.-H.S.; writing—review and editing,
W.-P.C.; visualization, W.-T.L.; supervision, C.-I.H.; project administration, W.-P.C.; funding acquisition, B.-H.S.
and C.-C.C.
Funding: This research was funded by the Ministry of Science and Technology and the Southern Taiwan Science Park Bureau of Taiwan, grant numbers MOST 107-2622-E-992-006-CC2 and 107B19.
Conflicts of Interest: The authors declare no conflict of interest.
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).