
2019 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computing, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation

Building Smart City Drone for Graffiti Detection and Clean-up

Shuqin Wang (Jiangsu Normal University, Xuzhou, China; email: wsq308@gmail.com)
Jerry Zeyu Gao (San Jose State University, San Jose, USA; corresponding email: jerry.gao@sjsu.edu)
Weiyi Li, Yanning Li, Kaixuan Wang (San Jose State University, San Jose, USA)
Shengqiang Lu (Taiyuan University of Technology, Taiyuan, China)

Abstract—Graffiti on buildings and bridges is often an eyesore, and graffiti on road signs can even pose safety risks to motorists. Not only is graffiti cleaning costly, it also disrupts normal traffic. Graffiti is a widespread problem in many cities in the U.S. This paper proposes a machine learning approach to unmanned aerial vehicle (UAV) graffiti detection and removal. Our solution builds on the smart city framework and is expected to lower graffiti cleaning cost and minimize the impact on city traffic.

Keywords—UAV, Smart City, Graffiti detection

I. INTRODUCTION

Unmanned Aerial Vehicles (UAVs) were originally created for military purposes only. However, they are now widely used in civil applications, for example agriculture, environmental protection, public safety, and traffic flow control. One of the fastest-evolving areas for UAVs is their involvement in smart cities: cities that use digital technology to connect, protect, and enhance the lives of citizens. Drones are just one of the many important pieces of the puzzle when it comes to developing and operating a smart city. Today, thousands of drones are already being used to improve city life: drones document accident scenes, support first responder activities, and monitor construction sites.

Many cities in the U.S. face a serious graffiti problem. There are many reports of graffiti at various locations, including buildings, bridges, and road signs; some even contain profanities or lewd images. Such graffiti usually leaves an undesirable impression. Graffiti cleaning is very expensive and sometimes a logistical nightmare when highway or street lane closures are required for a crew to clean overhead signs. Current graffiti removal is carried out by a cleaning crew, typically using a Skylift crane, and the whole process is tedious and expensive. For example, Los Angeles spends about $7 million per year dealing with graffiti, and since 2011 the City of San José has spent upward of $4 million on graffiti eradication efforts. Ordinary methods for graffiti cleaning, which are chemical, physical, and biological [1], pose a health risk to the working crew. Therefore, there is a strong need for convenient and cost-effective ways to remove graffiti from highway overpasses and tall buildings.

Inspired by UAV applications in forest fire detection [2], we propose a smart city solution using a UAV to support graffiti detection and removal based on machine learning. Designing a sufficiently safe control system is a daunting challenge: operators controlling a UAV need to skillfully navigate it and avoid obstacles and collisions along its flight path. We therefore adopt machine learning techniques for automatic obstacle recognition, so that a drone in flight can avoid barriers without human intervention. The drone should also be able to determine whether the target area is covered by graffiti. The drone in our system can thus observe the ambient environment for graffiti and, upon recognition, clean it up using the carried spray paint. The drone's flight is controlled by an operator through a mobile application, and a cloud server is set up to monitor drone status and give commands to the operator. This paper proposes a machine learning based smart city solution using a UAV to support graffiti detection and removal, and discusses the challenges in drone obstacle avoidance and graffiti detection. Our proposed solution has promising potential to lower graffiti cleaning cost and minimize the impact on traffic.

This paper thus provides a promising smart system that addresses the challenges in graffiti detection and removal. The rest of the paper is organized as follows. Section II reviews and summarizes related work and existing solutions in graffiti detection and UAV obstacle avoidance. Section III discusses our proposed UAV-based graffiti detection and cleaning system. Section IV evaluates the system performance. Finally, conclusions are presented in Section V.

II. RELATED WORK

A. Graffiti Detection

A lot of research has been done on graffiti detection. Di Stefano et al. [3] presented a novel video surveillance approach designed to detect acts of vandalism occurring on the background of the monitored scene, such as graffiti painting on walls and surfaces, defacing or etching of public and private property, and unauthorized sticking of posters. Angiati et al. [4] implemented graffiti detection by catching appearance changes against a reference background that is stationary in both space and time. Tombari et al. [5] used a time-of-flight (TOF) camera to detect acts of vandalism such as dirtying, etching, and defacing walls and object surfaces.

Real-time recognition of graffiti-making behavior can fundamentally prevent the formation of graffiti, but it requires monitoring at every moment; therefore, recognition of graffiti images is critical. Parra et al. [6] took advantage of the touchscreen capabilities of modern mobile devices to aid recognition of color in gang graffiti images and then used color recognition based on touchscreen tracing to recognize gang graffiti.
Yang et al. [7] proposed an efficient graffiti image retrieval system that uses character detection results and integrates both image-level and semantic-level understanding of graffiti characters. Parra et al. [8] designed a mobile-based gang graffiti system that uses location information to query a database of graffiti images. Munsberg et al. [9] investigated the use of convolutional neural networks (CNNs) to classify weakly labeled images and identify the presence or absence of graffiti art. Li et al. [10] proposed a deep CNN to classify graffiti components, which reaches 89.3% accuracy with dropout regularization.

B. UAV

The primary goal of smart city design is to provide effective infrastructure and services at a low cost. UAVs can help achieve this goal, which is why they are utilized in a wide range of smart city applications.

Avoiding obstacles while in flight is a non-trivial task for a UAV because the obstacles may be positioned such that avoiding them requires delicate and agile maneuvers. Wang et al. [11] developed a new technique for obstacle and collision avoidance using a modified Grossberg neural network. Foina et al. [12] presented a cloud-based system to manage unmanned air traffic, an innovative approach to obstacle and collision avoidance in smart cities. Their work proposes solutions for low-altitude Unmanned Aerial Systems flight through a community platform engaging cities and tools for law enforcement officers, and breaks the new flight problems down into three broad areas: citizen engagement via a community platform, vehicle-city coordination, and vehicle-to-vehicle coordination. Since unmanned air traffic management and law enforcement functions require real-time identification of a drone, an LED color sequence is used as a unique identifier, with assistance from an identification and tracking system using cellular networks; a ground identification device decodes the LED color sequence.

As UAVs are now used for many recreational and technological purposes, there are several privacy and interference issues to cope with [13]. It is becoming increasingly clear that drones will play a critical role in smart city applications and will be connected to, if not directly a part of, the Internet of Things. Moreover, drones will play an essential role in providing network relay connectivity and situational awareness, particularly in disaster assessment and recovery efforts. The authors of [13] discuss these challenges and consider future research directions for privacy- and security-related issues for UAVs.

There is growing interest in utilizing information and communication technology services and smart solutions in smart city development. Mohammed et al. [14] introduced several opportunities for UAV use in support of a smart city that would benefit its economic growth and development, such as geo-spatial and surveying activities, civil security control, traffic and crowd management, natural disaster control and monitoring, agriculture and environmental management, urban security, increasing a city's attractiveness, big data processing, and coordination between heterogeneous systems. The same paper [14] also addressed business challenges in this area, for example ethics and privacy, cost, licensing and legislation, and business adoption, as well as other technical challenges.

Ma et al. [15] proposed a generic framework that integrates an autonomous obstacle detection module and a reinforcement learning (RL) module to develop reactive obstacle avoidance behavior for a UAV. Choi et al. [16] presented a new two-layer obstacle avoidance algorithm that allows an unmanned aircraft system to avoid multiple obstacles with minimal effort. Prévost et al. [17] presented a novel model-based predictive UAV guidance algorithm: obstacle avoidance is achieved by minimizing the UAV's collision probability with all known dynamic ellipsoidal obstacles over the prediction horizon, while ensuring that the collision probability with any given obstacle at each prediction step does not surpass a preset threshold. Mannar et al. [18] presented a monocular-vision-based control algorithm for quadrotors specifically designed for obstacle avoidance in forest environments. Uzol et al. [19] developed a strategy based on panel methods for fixed-wing aircraft; however, their work assumed that a global map of the environment is available. Call et al. [20] presented a 2D sliding mode obstacle avoidance controller in the horizontal plane. Griffiths et al. [21] made use of optical flow to detect obstacles and computed the lateral bias necessary to avoid them. Aguilar et al. [22] proposed an obstacle avoidance system for UAVs using a monocular camera; to detect obstacles, the system compares the image obtained in real time from the UAV with a database of obstacles that must be avoided.

Object recognition and scene classification from aerial imagery using deep learning techniques have also acquired a relevant role in many applications, such as jellyfish monitoring [23], road traffic monitoring from UAVs [24], assisting avalanche search and rescue operations with UAV imagery [25], and terrorist identification [26]. In these applications, UAVs provide a low-cost platform for aerial image acquisition, while deep learning technologies are mainly utilized for feature extraction [27].

III. SYSTEM OVERVIEW

An operator controlling a UAV needs to be skillful in navigating the UAV and avoiding obstacles and crashes along its flight path. Therefore, we adopt machine learning technologies to enable the UAV to recognize obstacles automatically, without human intervention. The UAV should also have detection features for determining whether the target area is covered with graffiti.

This paper proposes a semi-autonomous UAV graffiti detection and removal system based on machine learning. The semi-autopilot drone can sense the nearby environment, recognize graffiti, and decide whether to clean it. The drone is also equipped with spray paint. An operator controls the drone through a mobile application, and a cloud server monitors drone status and gives commands to the operator. This system can effectively reduce human labor cost and minimize the impact on traffic.

The key component in the system is the drone, which is wirelessly connected to a remote control that is in turn linked to a mobile phone through a self-developed app. All commands sent from the operator go through the remote control before reaching the drone. These communications are based on the DJI Mobile SDK. There is also a backend server called the Ground Station, which acts as the admin server and sends all graffiti missions to the mobile application.
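The paper does not specify the transport between the Ground Station and the mobile application, so the following is only a minimal sketch of how a graffiti mission might be dispatched. The endpoint, payload fields, and HTTP transport are all illustrative assumptions, not part of the authors' system.

```python
import requests  # assumed HTTP transport; the paper does not specify one

# Hypothetical Ground Station endpoint; the paper only states that the
# Ground Station acts as the admin server and sends missions to the app.
GROUND_STATION_URL = "http://ground-station.example/api/missions"

def dispatch_mission(target_lat, target_lon, surface):
    """Post one graffiti mission for the operator's mobile app to pick up."""
    mission = {
        "target": {"lat": target_lat, "lon": target_lon},
        "surface": surface,              # e.g. "wall" or "traffic_sign"
        "action": "detect_and_clean",
    }
    response = requests.post(GROUND_STATION_URL, json=mission, timeout=10)
    response.raise_for_status()          # fail loudly if the server rejects it
    return response.json()               # mission id assigned by the server
```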

The drone has several onboard components. For example, the onboard server can directly control the drone without instructions from the remote control; the embedded camera provides visual information and, together with the onboard guidance system, helps avoid obstacles; and the spray system can paint over graffiti once it is detected.

The system architecture is shown in Figure 1. The whole system consists of drone navigation, graffiti detection, spray can control, and graffiti and user management. The components in blue are the focus of our work.

Fig.1. System architecture

Communication and User Interfaces: The DJI Matrice 100 has three interfaces: the Guidance SDK, the Onboard SDK, and the Mobile SDK. The Guidance SDK is used for various navigation commands and obstacle avoidance. The Onboard SDK is connected to the remote controller via a Lightbridge that is connected to the Ground Control Station (GCS) via a mobile navigation app. The mobile app for the system is developed using the DJI Mobile SDK to control the drone. The Onboard SDK is also used for the spray controller.

Navigation: The onboard camera and GPS keep sending the UAV's current location to the operator. For semi-automatic navigation and collision avoidance, we use machine learning algorithms to detect obstacles automatically and alert the operator when obstacles are detected within an unsafe distance. The UAV has four flight modes: Follow-Me, To-Location, Explore, and Configure. Follow-Me is achieved by GPS follow or camera follow. To-Location enables the drone to fly to a designated location. In Explore mode, the drone can generate a work report resulting from its panoramic exploration. The Configure command provides the map view and camera view of the drone.

Onboard Server: Our onboard server hosts a pre-trained model specifically designed to detect graffiti. Once a graffiti-like image is detected, the onboard server sends a message to the navigation module to survey the proximity around the drone for further confirmation.

Spray Control: After the operator receives the graffiti image and sends commands to clean it up, the UAV system switches to a cleaning mode and paints over the graffiti with carry-on paint to cover the original.

Figure 2 provides a more detailed characterization of the system architecture presented in Figure 1. The modules enclosed in the red line in Figure 2 remain our focus.

Fig.2. Subsystem architecture

A. Obstacle Avoidance

During flight, the drone may encounter different types of obstacles. The guidance system helps the drone avoid hitting those obstacles, crashing, or hurting people nearby.

To solve these problems, we come up with an obstacle avoidance solution. We use a guidance sensor to provide several guidance filters for the drone. Whenever an obstacle hits a guidance filter, the guidance system issues a warning to the drone, and the drone then finds a way to avoid the obstacle. With these filters, we can ensure that the drone moves directly forward, backward, left, or right without being affected by obstacles.

Before we can build the guidance filters, we must know the flight envelope of the drone. Table I shows the basic performance details of the Matrice 100. From this table, we can see that when the Matrice 100 hovers in P-GPS mode, it may drift up or down by 0.5 meters and horizontally by as much as 2.5 meters. We call this the unstable zone of the drone. We need to account for this zone, as we do not want the drone to drift and suddenly hit an obstacle while performing obstacle avoidance.

TABLE I. MATRICE 100 PERFORMANCE SPECS
Hovering Accuracy (P-Mode with GPS): Vertical 0.5 m, Horizontal 2.5 m
Max. Angular Velocity: Pitch 300°/s, Yaw 150°/s
Max. Tilt Angle: 35°
Max. Speed of Ascent: 5 m/s
Max. Speed of Descent: 4 m/s
Max. Wind Resistance: 10 m/s
Max. Speed: 22 m/s (ATTI mode, no payload, no wind); 17 m/s (GPS mode, no payload, no wind)

Now we can start to build the guidance system. In order to calculate the guidance filters, we need to know the capabilities of the guidance sensors; Table II shows the guidance sensor specifications.
TABLE II. GUIDANCE SENSOR SPECIFICATIONS
Velocity Detection Range: 0~16 m/s (from the ground 2 m; the measurement shall prevail)
Velocity Detection Accuracy: 0.04 m/s (from the ground 2 m)
Positioning Accuracy: 0.05 m (from the ground 2 m)
Effective Sensor Range: 0.20 m ~ 20 m
External Requirements: Good lighting; texture-rich surface with clear patterns

According to Table II and Figure 3, a guidance sensor can sense obstacles at a range of 0.2 ~ 20 meters. The sensing field of a guidance sensor is 60° horizontally and 58° vertically. For these different sensing fields, we need to design guidance filters both vertically and horizontally.

Fig.3. Guidance sensor sensing field

Figure 4 shows the vertical sliding of the drone system. The green rectangle represents the drone, the gray rectangle represents the unstable zone, and the pink rectangle represents the guidance sensor.

Fig.4. Guidance filter design - vertically

The goal of this design is to determine the minimum obstacle sensing distance (Ds) of the drone. To calculate Ds vertically (Dsv, line ab in Figure 4), we must ensure that the height of the guidance filter (Hf, line ac in Figure 4) is equal to or greater than the height of the unstable zone (Huz, line gh in Figure 4). Huz equals the drone height (Hd, line kl in Figure 4) plus twice the vertical hover accuracy (HAv): Huz = Hd + 2*HAv. In Figure 4, this becomes gh = kl + 2*HAv. Since ∠abc = 58° and ac ≥ gh, we get ab ≥ (gh/2 + Os)/sin 28°, where Os is the offset of the guidance sensor from the center of the drone (Os = jm/2 − mb), as the guidance sensor may be installed at different positions on the drone. In this way, we obtain the minimum vertical obstacle sensing distance:

Dsv ≥ (Huz/2 + Os)/sin 28°

One guidance sensor has two guidance cameras, each with a sensing area of 60°; therefore, most of their sensing areas coincide. To achieve higher accuracy, we use the coincidence field as the horizontal guidance filter.

Similar to the vertical direction, we have to calculate Ds horizontally (Dsh, line af in Figure 5). We need to ensure that the width of the guidance filter (Wf, line ac in Figure 5) is equal to or greater than the width of the unstable region (Wuz, line gh in Figure 5). Wuz equals the drone width (Wd, line jk in Figure 5) plus twice the horizontal hover accuracy (HAh): Wuz = Wd + 2*HAh. In Figure 5, this becomes gh = jk + 2*HAh.

Fig.5. Guidance filter design - horizontally
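To make the filter sizing concrete, the sketch below computes the unstable zone and the minimum sensing distances from the Table I hover accuracies. The drone dimensions and sensor offset are illustrative assumptions, and the horizontal bound is an assumed analogue of the vertical derivation using the 60° field, since the paper leaves the Dsh formula implicit.

```python
import math

HA_V = 0.5   # vertical hover accuracy, m (Table I, P-GPS mode)
HA_H = 2.5   # horizontal hover accuracy, m (Table I, P-GPS mode)

def unstable_zone(hd, wd):
    """Unstable zone: drone dimensions plus twice the hover accuracy."""
    return hd + 2 * HA_V, wd + 2 * HA_H        # Huz, Wuz

def dsv_min(huz, os_offset, angle_deg=28):
    """Vertical bound from the text: Dsv >= (Huz/2 + Os) / sin(28 deg)."""
    return (huz / 2 + os_offset) / math.sin(math.radians(angle_deg))

def dsh_min(wuz, os_offset, angle_deg=30):
    """Assumed horizontal analogue with the 60 deg field:
    Dsh >= (Wuz/2 + Os) / sin(30 deg)."""
    return (wuz / 2 + os_offset) / math.sin(math.radians(angle_deg))

# Illustrative airframe: 0.3 m tall, 0.65 m wide, sensor offset 0.1 m.
huz, wuz = unstable_zone(hd=0.3, wd=0.65)
print(f"Huz = {huz:.2f} m, Wuz = {wuz:.2f} m")
print(f"Dsv >= {dsv_min(huz, 0.1):.2f} m, Dsh >= {dsh_min(wuz, 0.1):.2f} m")
```

With these illustrative numbers the horizontal requirement dominates, which matches the intuition that the 2.5 m horizontal hover drift is the binding constraint.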
As we can see, even with the guidance system, the drone still has a blind field. In order to prevent the drone from entering the blind zone and to make our guidance system work properly, we only need to let it fly forward, backward, left, or right. This way, we can fully trust the guidance system: if there is no warning, there is no obstacle.

Fig.6. Obstacle avoidance decision tree
For the guidance system, we developed a decision tree to implement obstacle avoidance, as shown in Figure 6. The decision tree covers all scenarios a drone might encounter in its attempt to avoid obstacles. Most decisions are about moving right or left until no obstacle warnings are received, after which the task continues; if the drone cannot move left, right, or forward, it can only stop and ask the operator for help. It is easy to move left or right horizontally and keep the drone safe, as those two directions are covered by a guidance sensor for obstacle sensing. However, there is no guidance sensor for vertical movement.
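A minimal sketch of this decision logic is given below. The movement and warning queries are hypothetical placeholders rather than real DJI SDK calls; the loop simply mirrors the tree: sidestep until the warning clears, otherwise stop and hand control to the operator.

```python
def avoid_obstacle(drone):
    """Sidestep left/right until the guidance warning clears; if the drone
    is boxed in, stop and ask the operator for help (per Figure 6)."""
    while drone.obstacle_warning():       # a guidance filter has been hit
        if drone.can_move("right"):
            drone.move("right")
        elif drone.can_move("left"):
            drone.move("left")
        else:
            # Blocked left, right, and forward: hover and alert the human.
            drone.stop_and_alert_operator()
            return False
    return True                           # path clear; resume the mission
```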
As we know, a guidance filter keeps moving along with the drone until it hits an obstacle; it then leaves a safe path for the drone to fly through. In Figure 7, we can see that the guidance sensor bm (purple rectangle) is not in the middle of the drone (jm). According to the previous formulas, we can ensure that ac is longer than fi. If Ds = Dsv (and thus Dsv ≥ Dsh), then when the guidance sensor is installed in the upper position of the drone, point a will be higher than point f by a distance az. If Ds = Dsh (and thus Dsv < Dsh), the value of az becomes larger.

The line segment az is the distance between the unstable zone of the drone and the safe path created by the guidance filter. It is the maximum distance the drone can safely fly upward when it is not moving horizontally. But if we keep the guidance filter moving, a new safe path will continue to be generated by the filter, so the drone can fly horizontally while flying up as long as the climb angle (denoted Dc) is no greater than ∠agz:

Dc = arctan[(Hf − 2*HAv − 2*Os) / (2*(Wd + HAh) + Huz/tan 28°)]

Fig.7. Guidance system - vertically
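The climb-angle bound can be evaluated directly. In the sketch below, all geometry inputs are illustrative assumptions in meters; 28° is the angle used in the derivation above.

```python
import math

def max_climb_angle(hf, huz, wd, os_offset, hav=0.5, hah=2.5, angle_deg=28):
    """Dc = arctan[(Hf - 2*HAv - 2*Os) / (2*(Wd + HAh) + Huz/tan(28 deg))]."""
    num = hf - 2 * hav - 2 * os_offset
    den = 2 * (wd + hah) + huz / math.tan(math.radians(angle_deg))
    return math.degrees(math.atan2(num, den))

# Illustrative values: 2.0 m filter height, 1.3 m unstable zone,
# 0.65 m wide drone, 0.1 m sensor offset -> roughly a 5 degree climb limit.
print(f"Dc <= {max_climb_angle(hf=2.0, huz=1.3, wd=0.65, os_offset=0.1):.1f} deg")
```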
B. Graffiti Detection

Since the TensorFlow Object Detection API is open source and easy to use, we chose it for the machine learning work to achieve graffiti auto-detection before the drone starts the cleaning process. Based on transfer learning, it provides more accurate results while consuming less time than other machine learning modules, and it offers various pre-trained models to choose from. After careful research, we selected the ssd_mobilenet_v2_coco model, which is optimized for mobile platforms and pre-trained on the COCO dataset. The training process is shown in Figure 8.

Fig.8. Graffiti detection model training process

The graffiti on walls and the graffiti on traffic signs are very different: the former are usually large and complex, while the latter are mostly limited in size due to space constraints. Therefore, two distinct models are trained in this project to recognize wall graffiti and traffic sign graffiti, respectively.

The model for wall graffiti is trained to recognize the style and pattern of graffiti on walls and to classify it into one of several styles, such as throw-up style, cartoon style, and wild style. It can also distinguish whether the graffiti is a picture or alphabet characters. The label map of this model is listed in Table III.

TABLE III. LABEL MAP FOR MODEL DETECTING GRAFFITI ON WALLS
Item ID  Name
1  Throw up Graffiti
2  Wild style Graffiti
3  Cartoon Graffiti
4  Throw up Alphabet
5  Wild style Alphabet
6  Cartoon Eye

For graffiti on traffic signs, we simply train a model to detect whether there is graffiti on a given traffic sign. The label map of that model is listed in Table IV.

TABLE IV. LABEL MAP FOR MODEL DETECTING GRAFFITI ON TRAFFIC SIGNS
Item ID  Name
1  Traffic Sign
2  Graffiti Traffic Sign
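For reference, the two label maps in Tables III and IV can be expressed as the category-index dictionaries consumed by the TensorFlow Object Detection API (the same structure that label_map_util.create_category_index() produces from a label_map.pbtxt file); the class names below come straight from the tables.

```python
# Category index for the wall-graffiti model (Table III).
WALL_CATEGORY_INDEX = {
    1: {"id": 1, "name": "Throw up Graffiti"},
    2: {"id": 2, "name": "Wild style Graffiti"},
    3: {"id": 3, "name": "Cartoon Graffiti"},
    4: {"id": 4, "name": "Throw up Alphabet"},
    5: {"id": 5, "name": "Wild style Alphabet"},
    6: {"id": 6, "name": "Cartoon Eye"},
}

# Category index for the traffic-sign model (Table IV).
SIGN_CATEGORY_INDEX = {
    1: {"id": 1, "name": "Traffic Sign"},
    2: {"id": 2, "name": "Graffiti Traffic Sign"},
}
```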
IV. SYSTEM EVALUATION

In this project, we trained two models: one for detecting graffiti on traffic signs and the other for detecting graffiti on walls. Figure 9 shows the detection results for graffiti on a wall; the system clearly recognizes it as throw-up graffiti. We also implemented a model to check the content of the graffiti. As shown in Figure 9, although the white frame overlaps with the result presented in the green frame, the system recognizes the first letter of the graffiti as a throw-up alphabet character. The accuracies of both models are up to 99%.
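Inference with such a model typically follows the TF 1.x Object Detection API pattern: load the exported frozen graph and fetch the standard detection tensors. The sketch below assumes a hypothetical checkpoint filename; the tensor names are the standard ones the API exports.

```python
import numpy as np
import tensorflow as tf  # TF 1.x, as used by the original Object Detection API

def load_graph(frozen_pb_path):
    """Load an exported frozen_inference_graph.pb into a new graph."""
    graph = tf.Graph()
    with graph.as_default():
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(frozen_pb_path, "rb") as f:
            graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name="")
    return graph

def detect(graph, image):
    """image: HxWx3 uint8 array; returns boxes, scores, and class ids."""
    with tf.Session(graph=graph) as sess:
        names = ("detection_boxes", "detection_scores",
                 "detection_classes", "num_detections")
        fetches = [graph.get_tensor_by_name(n + ":0") for n in names]
        boxes, scores, classes, num = sess.run(
            fetches,
            feed_dict={graph.get_tensor_by_name("image_tensor:0"):
                       np.expand_dims(image, axis=0)})
        n = int(num[0])
        return boxes[0][:n], scores[0][:n], classes[0][:n].astype(int)

# Hypothetical usage with the wall-graffiti model:
# graph = load_graph("wall_graffiti_frozen_inference_graph.pb")
# boxes, scores, classes = detect(graph, camera_frame)
```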

Fig.9. Graffiti detection on walls

Figure 10 shows the detection results for graffiti on traffic signs. In this test, we have two types of traffic signs: one with graffiti painted over it and the other clean and clear. The results indicate that our system can distinguish between a clear traffic sign and a graffiti traffic sign.

Fig.10. Graffiti detection on traffic signs: (a) graffiti traffic sign; (b) clear traffic sign

The two tests given above demonstrate that the drone itself can recognize graffiti on walls and on traffic signs during its flight and automatically locate the graffiti with an accuracy of up to 99%.

Figures 11 and 12 show the losses for the final classifier. The loss for classifying detected objects into the various classes is illustrated in Figure 11, and the localization loss (the loss of the bounding box regressor) is shown in Figure 12. Figures 13 and 14 show the losses for the region proposal network (RPN): the localization loss of the RPN's bounding box regressor is shown in Figure 13, and Figure 14 illustrates the objectness loss, i.e., the loss of the classifier that decides whether a bounding box contains an object of interest or merely background.

Fig.11. Classification loss of box classifier
Fig.12. Localization loss of box classifier
Fig.13. Localization loss of RPN
Fig.14. Objectness loss of RPN
Fig.15. Total loss result

The corresponding total loss is shown in Figure 15, where we can see that the total loss decreases to 0.05% at 15,000 iterations.

V. CONCLUSIONS

Prevalent graffiti has become one of the main problems in urban development. This paper proposes a semi-autonomous UAV graffiti detection and removal system based on machine learning to address this issue. The semi-autopilot drone can sense the nearby environment, recognize graffiti, and decide whether to clean it up. The drone carries spray paint to the target location and cleans the graffiti if necessary. There is also a cloud server that can monitor drone status and issue commands to the operator. This system will effectively reduce human labor cost and minimize the impact on traffic.

Due to limitations on test areas, our system needs to be further tested in more practical places and more complicated environments. In addition, better machine learning algorithms should be tested to improve the recognition accuracy of graffiti on walls and traffic signs.

REFERENCES

[1] P. Sanmartín, F. Cappitelli, and R. Mitchell, "Current methods of graffiti removal: A review," Construction and Building Materials, vol. 71, pp. 363-374, 2014.
[2] P. Skorput, S. Mandzuka, and H. Vojvodic, "The use of Unmanned Aerial Vehicles for forest fire monitoring," in 2016 International Symposium ELMAR, 2016, pp. 93-96.
[3] L. Di Stefano, F. Tombari, A. Lanza, S. Mattoccia, and S. Monti, "Graffiti detection using two views," in The Eighth International Workshop on Visual Surveillance - VS2008, 2008.
[4] D. Angiati, G. Gera, S. Piva, and C. S. Regazzoni, "A novel method for graffiti detection using change detection algorithm," in Advanced Video and Signal Based Surveillance, 2005. AVSS 2005. IEEE Conference on, 2005, pp. 242-246.
[5] F. Tombari, L. Di Stefano, S. Mattoccia, and A. Zanetti, "Graffiti detection using a time-of-flight camera," in International Conference on Advanced Concepts for Intelligent Vision Systems, 2008, pp. 645-654.
[6] A. Parra, B. Zhao, J. Kim, and E. J. Delp, "Recognition, segmentation and retrieval of gang graffiti images on a mobile device," in Technologies for Homeland Security (HST), 2013 IEEE International Conference on, 2013, pp. 178-183.
[7] C. Yang, P. C. Wong, W. Ribarsky, and J. Fan, "Efficient graffiti image retrieval," in ACM International Conference on Multimedia Retrieval, 2012.
[8] A. Parra, M. Boutin, and E. J. Delp, "Location-aware gang graffiti acquisition and browsing on a mobile device," in Multimedia on Mobile Devices 2012; and Multimedia Content Access: Algorithms and Systems VI, 2012, p. 830402.
[9] G. R. Munsberg, P. Ballester, M. F. Birck, U. B. Correa, V. O. Andersson, and R. M. Araujo, "Towards Graffiti Classification in Weakly Labeled Images Using Convolutional Neural Networks," in Latin American Workshop on Computational Neuroscience, 2017, pp. 39-48.
[10] H. Li, J. Kim, and E. J. Delp, "Deep Gang Graffiti Component Analysis," Electronic Imaging, vol. 2018, pp. 201-1-201-4, 2018.
[11] X. Wang, V. Yadav, and S. Balakrishnan, "Cooperative UAV formation flying with obstacle/collision avoidance," 2007.
[12] A. G. Foina, R. Sengupta, P. Lerchi, Z. Liu, and C. Krainer, "Drones in smart cities: Overcoming barriers through air traffic control research," in Research, Education and Development of Unmanned Aerial Systems (RED-UAS), 2015 Workshop on, 2015, pp. 351-359.
[13] J. P. Sterbenz, "Drones in the smart city and IoT: Protocols, resilience, benefits, and risks," in Proceedings of the 2nd Workshop on Micro Aerial Vehicle Networks, Systems, and Applications for Civilian Use, 2016, pp. 3-3.
[14] F. Mohammed, A. Idries, N. Mohamed, J. Al-Jaroodi, and I. Jawhar, "UAVs for smart cities: Opportunities and challenges," in Unmanned Aircraft Systems (ICUAS), 2014 International Conference on, 2014, pp. 267-273.
[15] Z. Ma, C. Wang, Y. Niu, X. Wang, and L. Shen, "A saliency-based reinforcement learning approach for a UAV to avoid flying obstacles," Robotics and Autonomous Systems, vol. 100, pp. 108-118, 2018.
[16] Y. Choi, H. Jimenez, and D. N. Mavris, "Two-layer obstacle collision avoidance with machine learning for more energy-efficient unmanned aircraft trajectories," Robotics and Autonomous Systems, vol. 98, pp. 158-173, 2017.
[17] C. G. Prévost, A. Desbiens, E. Gagnon, and D. Hodouin, "UAV optimal obstacle avoidance while respecting target arrival specifications," IFAC Proceedings Volumes, vol. 44, pp. 11815-11820, 2011.
[18] S. Mannar, M. Thummalapeta, S. K. Saksena, and S. Omkar, "Vision-based Control for Aerial Obstacle Avoidance in Forest Environments," IFAC-PapersOnLine, vol. 51, pp. 480-485, 2018.
[19] O. Uzol, I. Yavrucuk, and N. Sezer Uzol, "Collaborative target tracking for swarming MAVs using potential fields and panel methods," in AIAA Guidance, Navigation and Control Conference and Exhibit, 2008, p. 7167.
[20] B. Call, R. Beard, C. Taylor, and D. Barber, "Obstacle avoidance for unmanned air vehicles using image feature tracking," in AIAA Guidance, Navigation, and Control Conference and Exhibit, 2006, p. 6541.
[21] S. R. Griffiths, "Remote terrain navigation for unmanned air vehicles," 2006.
[22] W. G. Aguilar, V. P. Casaliglla, J. L. Pólit, V. Abad, and H. Ruiz, "Obstacle avoidance for flight safety on unmanned aerial vehicles," in International Work-Conference on Artificial Neural Networks, 2017, pp. 575-584.
[23] H. Kim, D. Kim, S. Jung, J. Koo, J.-U. Shin, and H. Myung, "Development of a UAV-type jellyfish monitoring system using deep learning," in Ubiquitous Robots and Ambient Intelligence (URAI), 2015 12th International Conference on, 2015, pp. 495-497.
[24] N. V. Kim and M. A. Chervonenkis, "Situation control of unmanned aerial vehicles for road traffic monitoring," Modern Applied Science, vol. 9, p. 1, 2015.
[25] M. B. Bejiga, A. Zeggada, A. Nouffidj, and F. Melgani, "A convolutional neural network approach for assisting avalanche search and rescue operations with UAV imagery," Remote Sensing, vol. 9, p. 100, 2017.
[26] A. Sawarkar, V. Chaudhari, R. Chavan, V. Zope, A. Budale, and F. Kazi, "HMD vision-based teleoperating UGV and UAV for hostile environment using deep learning," arXiv preprint arXiv:1609.04147, 2016.
[27] A. Carrio, C. Sampedro, A. Rodriguez-Ramos, and P. Campoy, "A review of deep learning methods and applications for unmanned aerial vehicles," Journal of Sensors, vol. 2017, 2017.