Computers and Electronics in Agriculture 157 (2019) 339–350

Contents lists available at ScienceDirect

Computers and Electronics in Agriculture


journal homepage: www.elsevier.com/locate/compag

Original papers

Development and evaluation of a low-cost and smart technology for precision weed management utilizing artificial intelligence

Victor Partel, Sri Charan Kakarla, Yiannis Ampatzidis

Agricultural and Biological Engineering Department, Southwest Florida Research and Education Center, University of Florida, IFAS, 2685 SR 29 North, Immokalee, FL
34142, USA

ARTICLE INFO

Keywords:
Weed detection
Artificial intelligence
Machine learning
Smart agriculture
Precision agriculture
Neural networks
Deep learning

ABSTRACT

Most conventional sprayers apply agrochemicals uniformly, despite the fact that the distribution of weeds is typically patchy, resulting in wastage of valuable compounds, increased costs, crop damage risk, pest resistance to chemicals, environmental pollution and contamination of products. To reduce these negative impacts, a smart sprayer was designed and developed utilizing machine vision and artificial intelligence to distinguish target weeds from non-target objects (e.g. vegetable crops) and precisely spray on the desired target/location. Two different experimental scenarios were designed to simulate a vegetable field and to evaluate the smart sprayer's performance. The first scenario contained artificial weeds (targets) and artificial plants (non-targets). The second and most challenging scenario contained real plants: portulaca weeds as targets, and sedge weeds and pepper plants as non-targets. Two different embedded graphics processing units (GPUs) were evaluated as the smart sprayer's processing unit (for image processing and target detection). The more powerful GPU (NVIDIA GTX 1070 Ti) achieved an overall precision of 71% and recall of 78% (for plant detection and target spraying accuracy) on the most challenging scenario with real plants, and 91% precision and recall for the first scenario with artificial plants. The less powerful GPU (NVIDIA Jetson TX2) achieved an overall precision and recall of 90% and 89% respectively on the first scenario with artificial plants, and 59% and 44% respectively on the second scenario with real plants. Finally, an RTK GPS was connected to the smart sprayer and an algorithm was developed to automatically generate weed maps and visualize the collected data (after every application). This smart technology integrates a state-of-the-art (AI-based) weed detection system, a novel fast and precise spraying system, and a weed mapping system. It can significantly reduce the quantity of agrochemicals required, especially compared with traditional broadcast sprayers that usually treat the entire field, resulting in unnecessary application to areas that do not require treatment. It could also reduce costs, risk of crop damage and excess herbicide residue, as well as potentially reduce environmental impact.

1. Introduction

Farmers mainly use agrochemicals for plant disease, pest and weed control, and they follow conventional crop protection strategies (utilizing a vast amount of chemicals) despite the negative impacts on the environment and human health. For example, more than 90% of the acreage of crops in the United States is sprayed with herbicides (Gianessi and Reigner, 2007). The use of herbicides has eliminated the need for manual labor to pull weeds out of fields, and has resulted in reduced production costs and increased crop yields in the United States. United States farmers dedicate around 65% of their total expenditure towards herbicides for weed control, and it is estimated that around $26 billion is spent on herbicides each year in the United States (Gianessi and Reigner, 2006). Pests reduce global potential crop yield by up to 40%; that figure could be twice as large if no agrochemicals were used (Deutsch et al., 2018; Oerke, 2006). Global pesticide use was assessed to be 3.5 billion kg/year, with an estimated cost of $45 billion in 2015 (Pretty and Bharucha, 2015).

Apart from the advantages of using agrochemicals for pest and weed control, there are also disadvantages, mainly due to limitations of conventional spraying technologies. Minimizing the negative impacts of agrochemicals (and spraying technologies) is a major global societal challenge; 72% of citizens state that agrochemical residues are one of the most important food-related concerns (European Food Safety Authority - EFSA, 2013). EFSA announced that 98.9% of food products contain agrochemical residues (with 1.5% of them in excess of the legal limits). Additionally, plant resistance to agrochemicals (e.g. herbicides) is posing a great threat to crop production in many countries (Jeanmart


Corresponding author.
E-mail address: i.ampatzidis@ufl.edu (Y. Ampatzidis).

https://doi.org/10.1016/j.compag.2018.12.048
Received 5 November 2018; Received in revised form 19 December 2018; Accepted 23 December 2018
Available online 11 January 2019
0168-1699/ © 2018 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license
(http://creativecommons.org/licenses/BY/4.0/).

et al., 2016; Owen and Zelaya, 2005; Powles, 1996).

Hence, there is an urgent need to reduce reliance on conventional agrochemicals (and sprayers) without affecting agricultural yield. Integrated Pest Management (IPM) provides an alternative and changes the philosophy of crop protection, emphasizing the understanding of insect, pest and crop ecology, and implements several complementary tactics, reducing risks to human health and the environment (Lamichhane et al., 2016; Flint and van den Bosch, 1981). Its main strategy is to maintain a pest population below the level that causes economic injury, not to eradicate the entire pest population (Naranjo et al., 2015; Peterson et al., 2016; Miedaner et al., 2018).

Agrochemicals (e.g. herbicides) are mainly applied using hydraulic and hydro-pneumatic sprayers that have high inefficiencies, and significant amounts of the active ingredient end up elsewhere in the environment (Creech et al., 2017), contaminating natural resources (Sankhla et al., 2016; Ranga Rao et al., 2015). Contamination can be caused by run-off from the field, discharge from drainage and spray drift (off-target deposition of spray). Chemical spray drift usually occurs when small droplets of the spray liquid are carried by the wind onto neighboring crops/fields, which results in herbicide (or, in general, agrochemical) residues on plant products. This may damage the crops and can also be carried over to the end consumer, where it can possibly have a significant effect on their health (Swanson et al., 2014). Therefore, the use of herbicides should be as minimal and as efficient as possible in order to eliminate the negative environmental impacts, which would be a step closer to agricultural sustainability.

To reduce these negative impacts, new spraying technologies have shown significant improvements in efficiency and safety by adopting the latest advances in electronics (Ampatzidis et al., 2018), artificial intelligence (AI) (Abdulridha et al., 2019; Ampatzidis and Cruz, 2018) and automation (Ampatzidis et al., 2017; Luvisi et al., 2016). Still, most agrochemicals (e.g. herbicides) are applied uniformly, despite the fact that the distribution of target pests, pathogens and weeds is typically patchy. Uniform application wastes valuable agrochemicals by applying them where little or no problem exists. The result is increased costs, risk of crop damage, pest resistance to chemicals, environmental pollution and contamination of the edible products (Balafoutis et al., 2017).

In recent decades there has been rising interest in pest and disease detection (Abdulridha et al., 2018; Cruz et al., 2019; Cruz et al., 2017) and in the automation of weed sprayers (Fernandez-Quintanilla et al., 2018). Moller (2010) concluded that using computer vision technologies in agricultural operations lowers operator stress levels. A smart sprayer system needs to be able to locate weed spots in real time and to spray the desired chemical only on the proper location. Sun et al. (2012) and Song et al. (2015) analyzed various sensors and techniques for weed detection, such as machine vision, spectral analysis, remote sensing and thermal imaging. Machine vision has been used for many years to distinguish vegetation from the soil background through image segmentation, exploiting the color difference between them. McCarthy et al. (2010) reviewed, at the time, the three main commercial machine-vision systems for smart weed spraying and concluded that they are limited to detecting weeds on a soil background but cannot distinguish between crop plants and weeds.

Lee et al. (1999) developed and evaluated a smart sprayer to distinguish between weed leaves and tomato plants. At that time, processing speeds were very slow, but the detection results achieved were promising. Kargar and Shirzadifar (2013) developed a machine vision weed spot-sprayer for corn crops. The system used image segmentation and feature extraction to distinguish grass leaves from corn plants with over 90% accuracy. It has to be noted that in this case the corn leaves are much wider than the grass, which increases the detection accuracy. Aitkenhead et al. (2003) analyzed two processes to detect weeds in carrot crops. The first process included image segmentation operations based on the color and shape of the leaves. The second process utilized a self-organized neural network to distinguish weeds. Although the results at the time were not as good as required for commercial purposes, the work showed how artificial intelligence can be used to analyze many more properties than segmentation processes alone.

H-Sensor (Agricon GmbH, Ostrau, Germany) and See and Spray (Blue River Technology, Sunnyvale, CA, USA) are recent commercial spraying technologies that utilize artificial intelligence and are able to distinguish between crop plants and various weeds (for row crops) (Chostner, 2017). A precision spray technology can significantly reduce the quantity of herbicide required, compared with traditional broadcast sprayers that usually treat the entire field to control pest populations, which potentially results in unnecessary application to areas that do not require treatment. Applying herbicide only where weeds occur could reduce costs, risk of crop damage and excess pesticide residue, as well as potentially reduce environmental impact (Balafoutis et al., 2017).

In this paper, a low-cost and smart technology for precision weed management (for specialty crops) is presented and evaluated. This technology utilizes artificial intelligence (deep and transfer learning) to distinguish between target and non-target plants in real time, and sprays only on a selected target (e.g. a specific weed). The utilized deep learning neural network analyzes much more complex properties than image segmentation alone; thus, it can be used to distinguish weeds from crop plants in a variety of situations (e.g. an open field environment). To the best of the authors' knowledge, we are the first to approach this problem with deep learning and to integrate the machine vision system with a low-cost, fast-response and precise spraying system, and a weed and spraying mapping system.

2. Materials and methods

The Smart Sprayer prototype includes (Fig. 1): (i) individual nozzle control (12 nozzles with an adjustable spraying cone and 12 valves); (ii) a low-cost pump; (iii) a real time kinematic GPS (RTK-GPS; around 2 cm accuracy); (iv) three video cameras (Webcam Logitech c920); (v) a smart controller (Arduino Mega) to control all sensors and actuators (e.g., pump); (vi) a computational unit (e.g., NVIDIA Jetson TX2 GPU) for the image processing and weed detection; and (vii) several relay

boards, tubes, pressurized manifolds, etc.

Fig. 1. (a) The Smart Sprayer mounted on an all-terrain vehicle (ATV); (b) main components of the Smart Sprayer.

Fig. 2. Hardware components of the smart sprayer.

2.1. Smart sprayer hardware

The sprayer prototype was designed to operate attached to a small all-terrain vehicle (ATV) through a hitch (Fig. 1). Fig. 2 presents the main components of the smart sprayer. The design considered a working length of 1.08 m to be covered by the sprayer behind the vehicle, in between its tires. It utilizes 12 spray nozzles, each covering a width of 0.09 m, as shown in Fig. 3.

The hardware components were selected to develop a prototype with a fast response and a precise spray upon a received signal from the control system. The liquid to be sprayed is stored in a 95 L tank equipped with a 4.1 bar, 8 L/min pump (FIMCO LG-25-SM, North Sioux City, SD, USA), as shown in Fig. 4. The pump maintains the liquid inside the sprayer at a constant 4.1 bar pressure.

Fig. 4. Tank and pump.

A structural frame was fabricated to hold all the components by bending an aluminum sheet of 3.2 mm thickness. The frame dimensions are 1.2 m in length, 0.16 m in width and 0.10 m in height (Fig. 5). In order to properly distribute a constant pressure to all the nozzles, a manifold was designed with 1 input and 12 outputs, one for each valve and nozzle individually. The manifold material was PVC plastic, and its dimensions are shown in Fig. 6.

Fig. 5. Frame structure for the nozzles, dimensions in mm.

Fig. 6. Manifold to distribute even pressure to the nozzles (dimensions in mm).

12 V solenoid valves (WALFRONT 2V025, China), with a response time of less than 50 ms, are used to control the nozzle system. Each nozzle (TEEJET 5500-X5, Glendale Heights, IL, USA) is adjustable in order to change the angle of the spraying cone.

For the real time image acquisition system, three low-cost web cameras were utilized (LOGITECH c920, Newark, CA, USA). The cameras were mounted on 3D-printed frames and positioned to cover the working length of 1.08 m, equally divided into three spans of 0.36 m, one per camera. The three cameras were positioned to achieve minimum (near-zero) overlap.

For the positioning system of the detected targets, an RTK GPS (TOPCON HiperXT, Tokyo, Japan) with a 2.5 Hz update rate was utilized. Based on the position data received, a heading angle is also calculated in order to accurately geo-locate the target on the soil after detection.

Finally, two different computational units were utilized and evaluated: (i) the NVIDIA Jetson TX2 (NVIDIA TX2 Developer Kit, Santa Clara, CA, USA); and (ii) the NVIDIA GTX 1070 Ti GPU (Santa Clara, CA, USA). The Jetson TX2 is an embedded computer unit that runs a Pascal-architecture GPU with 256 CUDA cores at a base clock frequency of 854 MHz; it has 8 GB of memory. The TX2 runs on 7.5 Watts of power and its current commercial price is $468 (Embedded Systems Developer Kits, Modules, & SDKs, NVIDIA Jetson, n.d.). The GTX 1070 Ti graphics card also uses the Pascal architecture, with 2432 CUDA cores and a base clock of 1607 MHz; its GPU memory is also 8 GB. The GTX 1070 Ti uses 180 Watts of power and its current price is $449 (GTX 1070 Ti Gaming Graphics Card, NVIDIA GeForce, n.d.). These two units

Fig. 3. Nozzles arrangement design.


Fig. 7. Overall workflow of the smart system.

were chosen as the best low-cost GPUs available on the market at the time.

2.2. Smart sprayer software

Software was developed in order to achieve precise spraying on the target and to develop an application (weed) map. Fig. 7 presents the overall workflow of the smart system. The program was developed in the C programming language. The software runs in real time, processing up to 28 fps (frames per second) with all the steps combined: image acquisition, object detection and communication.

2.2.1. Image acquisition
The software acquires the most recent frame from the three cameras simultaneously, at a resolution of 640 × 480 pixels each. The images are then merged side-by-side into a single 1920 × 480 pixels image, which is then resized to a 1024 × 256 pixels final image; this was found to be a good size to achieve real time processing speeds. The cameras are limited to acquiring up to 30 fps. The overall processing speed is determined by the network utilized and the capabilities of the GPU.

2.2.2. Target detection
For the target (object) detection, YOLOv3, a state-of-the-art detection network (Redmon et al., 2016), was utilized. The network was trained for each experimental scenario using around 1000 images of targets and non-targets, labeled manually with each target position on the images. For all the experiments, three different networks were trained:

1. Tiny YOLOv3, trained on pictures of artificial weeds as targets and artificial plants (sunflowers) as non-targets.
2. Tiny YOLOv3, trained on pictures of the weed portulaca as targets and the pepper plants and the weed sedge as non-targets (Fig. 8).
3. YOLOv3, trained on the same pictures as network number 2.

The YOLO network runs on the GTX 1070 Ti GPU at a 24 fps rate, a good rate for real time applications. The same network on the Jetson TX2 unit produced only 2.4 fps, which is not practical for our purposes. To increase speed on the Jetson TX2 computer unit, Tiny YOLOv3 networks were used, achieving a 22 fps rate. The Tiny YOLO networks have the advantage of being faster but lose accuracy in comparison to full YOLO networks.

2.2.3. Convolutional neural network and deep learning
The YOLO object detection system uses convolutional neural networks (CNNs) to train and detect objects. A basic CNN is made up of four main operations known as convolution, non-linearity, pooling (or sub-sampling) and classification. The four main operations are illustrated in Fig. 9. Every image can be represented as a three-dimensional matrix of pixel values based on its visual features.

Fig. 8. Smart sprayer detection on weed portulaca as target and pepper as non-target.


Fig. 9. A graphical illustration of a Convolutional Neural Network.

In Fig. 9, the first step is where the image is scanned for features, and the rectangle represents the filter that passes over it. Several filters can be applied to a single image. The matrix of values from the filter and the corresponding matrix of values in the input image are multiplied; the multiplication of the two matrices is known as convolution, hence the name convolutional layer. After the convolution layer, there will be several feature maps, one for each filter used. The larger rectangle shown in the figure is one patch of the image that needs to be down-sampled. In the next step, feature maps are down-sampled; a new set of feature maps is created by down-sampling the stack from the previous set. Then, another down-sampling is used to create a new set of feature maps. At the end, a fully connected layer classifies the output with one label per node. This cycle repeats for every image in the training set.

2.2.4. Weed and nozzle position and triggering
The communication between the computational unit and the Arduino unit is done through a universal serial bus (USB) connection. Upon detection, the detected box center (weed) relative coordinates are used to calculate the nozzle position to be sprayed and the height position of the target on the image. Eq. (1) shows the calculation of the nozzle position (NP) to be triggered, and Eq. (2) shows the value used to identify the target distance from the nozzle (TDN). This value is later used by the Arduino unit to calculate the spraying time (ST) on the target.

NP = (11 × X_target) + 1    (1)

where X_target is the normalized horizontal position of the target on the image (0 < X_target < 1) and 1 ≤ NP ≤ 12.

TDN = 9 × Y_target    (2)

where Y_target is the normalized vertical position of the target on the image (0 < Y_target < 1) and 0 ≤ TDN ≤ 9.

After every frame detection, one string of 12 bytes containing all the calculated target values for the frame is sent to the Arduino controller for processing and to trigger the respective nozzle.

2.2.5. Weed position registration for mapping
After every frame detection on the computational unit, GPS data are recorded by the external RTK GPS. Using the calculated orientation angle of the sprayer (based on previous GPS data) and the target position on the image, an absolute position for every target is calculated and saved in a text file for future reading and for creating an absolute-position map of the target weeds (weed map). If no GPS device is connected to the system, the position is not saved.

2.2.6. Valve control system
An Arduino script (that controls the valves) was developed to read the serial data coming from the computational unit, containing the "target to be sprayed" (weed) position and the vehicle speed calculated from the GPS data. The identifier of target distance from the nozzle (TDN) and the vehicle speed are used to calculate the spraying time, as shown in Eq. (3).

ST = (TDN × 360) / (1000 × Speed)    (3)

where:
ST = spraying time in seconds;
TDN = identifier of target distance to nozzle, an integer between 0 and 9;
Speed = vehicle speed in m/s.

After calculating the spraying time for each nozzle, the Arduino unit (Arduino Mega2560, Italy) sends a 5 V signal to a relay (JBTEK 4450182, China) that then triggers the selected 12 V solenoid valve. A workflow of the control system upon target detection is shown in Fig. 10.

Fig. 10. Workflow of valve control system upon detection.


Fig. 11. A part of the experimental track created with artificial plants (non-target) and weeds (target).

3. Experimental design

The experiments described below were arranged in order to evaluate: (i) the detection accuracy of the vision-based system; (ii) the precision and accuracy of the spraying system; and (iii) the performance of the two computational units (TX2 GPU and GTX 1070 Ti GPU). The experiments were conducted at the Southwest Florida Research and Education Center (SWFREC) located in Immokalee, Florida, USA. For each scenario, a set of experimental tracks was developed to simulate a vegetable field, where the weeds are placed randomly along with the crop plants. The weeds and the plants (or anything that is not supposed to be sprayed) are referred to as targets and non-targets, respectively. All the experiments were conducted in the morning, between 9 am and 11 am, in August 2018, on sandy soil. The spraying liquid was water mixed with pink fluorescent dye, which makes it easier to determine, in all cases, where the targets or non-targets were sprayed.

3.1. Experiments 1 & 2: TX2 GPU and GTX 1070 Ti GPU with artificial plants and weeds

In these experiments, the detection accuracy of the vision-based system and the precision and accuracy of the spraying system were evaluated with artificial plants using the two computational units (experiment 1 using the TX2 GPU; experiment 2 using the GTX 1070 Ti GPU). A track with artificial yellow plants (non-target) and green artificial weeds (target) was developed for this experiment. The yellow plants were placed in a straight line, resembling the actual crop, and the green artificial plants, acting as weeds, were randomly placed around the crop (Fig. 11). The length of the track was 10 m and its width was 0.9 m; the actual spraying width was 0.6 m, and the remaining 0.3 m was used by the wheels of the ATV to pass through. This experiment was repeated 10 times with each of the two GPUs, where in each repetition the targets were rearranged.

3.2. Experiments 3 & 4: TX2 GPU and GTX 1070 Ti GPU with real weeds and plants

In these experiments, the performance of the smart sprayer using both computational units was evaluated with real plants and weeds (experiment 3 using the TX2 GPU; experiment 4 using the GTX 1070 Ti GPU). In this scenario, the experimental track contained two weeds, portulaca (target) and sedge (non-target), placed randomly among the pepper plants (non-target), which were placed in a straight line (to simulate a vegetable field where plants are placed in straight rows; Fig. 12). Only the portulaca weeds (Fig. 13a) were considered as the target, in order to evaluate the smart sprayer in a more complex environment; the smart sprayer had to spray only on a specific weed and not on anything else (non-targets include pepper plants, Fig. 13c, and sedge weeds, Fig. 13b). The length and width of the track are the same as for the track with the artificial plants and weeds. This experiment was repeated 10 times with each of the two GPUs, where in each repetition the targets were rearranged. The pepper plants placed in a straight line were used to resemble the crop, and the weeds were placed around them randomly (e.g. Fig. 12) so as to resemble an actual scenario of weeds around the crop.

3.3. Experiment 5: Weed mapping using GTX 1070 Ti GPU and RTK-GPS system

In this experiment, the GTX 1070 Ti GPU was used with an RTK-GPS to develop a spray and weed map. For this scenario, a separate experimental track was developed to simulate the actual row-middles in Florida's vegetable fields (Fig. 14), where the plants are grown on raised beds. In this scenario, the smart sprayer was used for weed management in the row-middles, which contain only weeds (no crops).

Fig. 12. A part of the experimental track created with real plants (pepper, non-target) and weeds (portulaca, target; and sedge, non-target).


Fig. 13. Images showing: (a) portulaca (target weed); (b) sedge (non-target weed); (c) and pepper plant (non-target).

The width of the track was 0.90 m and the length of the track was 35 m, which is three-fourths as long as the actual row-middles. The portulaca weeds were the only weeds used in this experiment and were considered as the target. The weeds were placed randomly in clusters to simulate weed hot spots. This experiment was repeated 10 times, where in each repetition the targets were rearranged.

3.4. Evaluation metrics

In all experiments, the performance of the smart sprayer was evaluated after spraying, based on visual observations, determining where the targets or non-targets were sprayed (observing the pink fluorescent dye on the ground). Table 1 presents and describes the evaluation metrics. Additionally, for every experiment, a GoPro camera (HERO5 model, San Mateo, CA, USA) was utilized to take a video of the sprayer passing through. The video was used to validate that the visual observations and the evaluation metrics were accurately measured.

The precision and recall (Fig. 15) of the vision-based detection system and of the overall spraying system are calculated using the collected evaluation metrics. For both systems, precision is defined as the number of true positives over the number of true positives plus the number of false positives, and recall is defined as the number of true positives over the number of true positives plus the number of false negatives. The precision of the detection (machine vision) system (Eq. (4a)) is the percentage of the number of targets detected (sum of cases A, B, and C) over the number of total detections (sum of cases A, B, C, and E). The precision of the spraying system (Eq. (4b)) is the percentage of the number of targets sprayed (sum of cases A and B) over the number of total sprays (sum of cases A, B, C, and E).

Precision of the detection system = (Case A + Case B + Case C) / (Case A + Case B + Case C + Case E) × 100    (4a)

Precision of the spraying system = (Case A + Case B) / (Case A + Case B + Case C + Case E) × 100    (4b)

The recall of the detection system (Eq. (5a)) is the percentage of the number of targets detected (sum of cases A, B, and C) over the number of targets in the track. The number of targets varies with every experiment. The recall of the spraying system (Eq. (5b)) is the percentage of the number of targets sprayed (sum of cases A and B) over the number of targets in the track.

Recall of the detection system = (Case A + Case B + Case C) / (Number of Targets) × 100    (5a)

Recall of the spraying system = (Case A + Case B) / (Number of Targets) × 100    (5b)

Furthermore, in all experiments, the percentages of: (i) fully sprayed targets (Case A/No. of targets); (ii) half-sprayed targets (Case B/No. of targets); (iii) missed sprayed targets (Case C/No. of targets); and (iv) missed targets (Case D/No. of targets) were calculated.

Fig. 14. A part of the track created for the weed mapping experiment.


Table 1
Description of the evaluation metrics.

Case A (target is fully sprayed): desired situation where the target is successfully identified (by the vision system) and sprayed.
Case B (target is partially sprayed): target successfully identified by the vision system, but only a portion of the target is sprayed (e.g. half of the weed is sprayed).
Case C (target is identified but the spray missed): target successfully identified (by the vision system) but the spray missed the target (e.g. sprayed next to the weed and not on it).
Case D (target is not identified and not sprayed): the vision-based system does not detect the target (weed).
Case E (a non-target is sprayed): a non-target falsely detected as a target and sprayed, fully or partially (e.g. the vision system falsely detects a plant as a target and sprays on it).
No. of Targets: the total number of weeds that needed to be sprayed.
Pd: precision of the detection system, Eq. (4a).
Rd: recall of the detection system, Eq. (5a).
Ps: precision of the spraying system, Eq. (4b).
Rs: recall of the spraying system, Eq. (5b).
FST: percentage of fully sprayed targets, Case A/No. of targets (%).
HST: percentage of half sprayed targets, Case B/No. of targets (%).
MST: percentage of missed spraying targets, Case C/No. of targets (%).
MT: percentage of missed targets, Case D/No. of targets (%).

4. Results and discussion

4.1. Experiments 1 & 2: TX2 GPU and GTX 1070 Ti GPU with
artificial plants and weeds

The results for experiments 1 and 2, with artificial plants and weeds, are presented in Table 2 (TX2 GPU) and Table 3 (GTX 1070 Ti GPU), respectively. The total number of targets in both experiments was 30. It can be clearly noticed that the GTX 1070 Ti GPU performs better than the TX2 GPU, especially when comparing the percentages of fully sprayed and missed targets (Fig. 16). The percentage of fully sprayed targets (FST) increased by 10% (from 83% to 91%) using the GTX 1070 Ti GPU, whereas the percentage of missed targets (MST) was reduced to zero (from 10% using the TX2 GPU). There is a small percentage (5%) of half sprayed targets (HST) when using the TX2 GPU, but this problem seems to be eliminated when using the GTX 1070 Ti GPU. Another interesting observation is that there is not a single instance of a non-target (Case E) being sprayed in either experiment. Fig. 17 compares the evaluation metrics (e.g. precision, recall, etc.) for both experiments (exp. 1 using the TX2 GPU; exp. 2 using the GTX 1070 Ti GPU).

4.2. Experiments 3 & 4: TX2 GPU and GTX 1070 Ti GPU with real
weeds and plants

The results for experiments 3 and 4, with real weeds and plants, are presented in Table 4 (TX2 GPU) and Table 5 (GTX 1070 Ti GPU), respectively. The average values for all the evaluation metrics are included at the end of each table. The number of targets in both experiments was 20.
It can be observed from the data that the sprayer with the TX2 GPU
has missed (MT) almost half of the targets. This can be attributed to the
capability limitation of the GPU as it may not be powerful enough to
process the images of the weeds/plants and detect them. When the TX2
GPU is replaced by a superior performing GTX 1070 Ti GPU, the missed
targets (MT) were reduced by 81% (from 43% to 8%). Also, the pre-
cision (Ps) and recall (Rs) of the spraying system have been increased
significantly by 20% (from 59% to 71%) and 77% (from 44% to 78%)
respectively (Fig. 18). The precision (Pd) and recall (Rd) of the detec-
Fig. 15. Precision and Recall explained in a graphical format (Wikipedia).
tion system have been also increased similarly by 10% (from 77% to

346
V. Partel et al. Computers and Electronics in Agriculture 157 (2019) 339–350

Table 2
Results for experiment 1, with artificial plants, using the TX2 GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT

Rep 1 26 1 3 0 0 100% 100% 90% 90% 87% 3% 10% 0%
Rep 2 23 3 4 0 0 100% 100% 87% 87% 77% 10% 13% 0%
Rep 3 25 2 3 0 0 100% 100% 90% 90% 83% 7% 10% 0%
Rep 4 23 2 3 2 0 100% 93.3% 89% 83% 77% 7% 10% 6%
Rep 5 24 2 4 0 0 100% 100% 87% 87% 80% 7% 13% 0%
Rep 6 26 4 0 0 0 100% 100% 100% 100% 87% 13% 0% 0%
Rep 7 29 1 0 0 0 100% 100% 100% 100% 97% 3% 0% 0%
Rep 8 26 1 3 0 0 100% 100% 90% 90% 87% 3% 10% 0%
Rep 9 25 0 3 2 0 100% 93.3% 89% 83% 83% 0% 10% 7%
Rep 10 23 0 6 1 0 100% 96.7% 79% 77% 77% 0% 20% 3%
Average 100% 98% 90% 89% 83% 5% 10% 2%

Table 3
Results for experiment 2, with artificial plants, using the GTX 1070 Ti GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT

Rep 1 28 0 2 0 0 100% 100% 93% 93% 93% 0% 7% 0%
Rep 2 30 0 0 0 0 100% 100% 100% 100% 100% 0% 0% 0%
Rep 3 28 0 2 0 0 100% 100% 93% 93% 93% 0% 7% 0%
Rep 4 27 0 3 0 0 100% 100% 90% 90% 90% 0% 10% 0%
Rep 5 25 0 5 0 0 100% 100% 83% 83% 83% 0% 17% 0%
Rep 6 27 0 3 0 0 100% 100% 90% 90% 90% 0% 10% 0%
Rep 7 25 0 5 0 0 100% 100% 83% 83% 83% 0% 17% 0%
Rep 8 27 0 3 0 0 100% 100% 90% 90% 90% 0% 10% 0%
Rep 9 29 0 1 0 0 100% 100% 97% 97% 97% 0% 3% 0%
Rep 10 27 0 3 0 0 100% 100% 90% 90% 90% 0% 10% 0%
Average 100% 100% 91% 91% 91% 0% 9% 0%

Fig. 16. Pie charts showing the percentage of spray results with the: (a) TX2 GPU; and (b) GTX 1070 Ti GPU.

Fig. 17. Experimental results with artificial plants comparing the evaluation metrics between the two GPUs.


Table 4
Results for experiment 3, with real plants, using the TX2 GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT

Rep 1 12 0 1 7 3 81% 65% 75% 60% 60% 0% 5% 35%
Rep 2 6 0 2 12 2 80% 40% 60% 30% 30% 0% 10% 60%
Rep 3 9 0 4 7 5 72% 65% 50% 45% 45% 0% 20% 35%
Rep 4 6 0 2 12 1 89% 40% 67% 30% 30% 0% 10% 60%
Rep 5 8 0 3 9 1 92% 55% 67% 40% 40% 0% 15% 45%
Rep 6 11 0 3 6 8 64% 70% 50% 55% 55% 0% 15% 30%
Rep 7 13 0 5 2 6 75% 90% 54% 65% 65% 0% 25% 10%
Rep 8 8 0 2 10 4 71% 50% 57% 40% 40% 0% 10% 50%
Rep 9 8 0 3 9 5 69% 55% 50% 40% 40% 0% 15% 45%
Rep 10 7 0 2 11 3 75% 45% 58% 35% 35% 0% 10% 55%
Average 77% 58% 59% 44% 44% 0% 14% 43%

Table 5
Results for experiment 4, with real plants, using the GTX 1070 Ti GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT

Rep 1 18 0 2 0 4 83% 100% 100% 90% 90% 0% 10% 0%
Rep 2 14 0 3 3 6 74% 85% 82% 70% 70% 0% 15% 15%
Rep 3 20 0 0 0 2 91% 100% 100% 100% 100% 0% 0% 0%
Rep 4 14 0 4 2 3 86% 90% 88% 70% 70% 0% 20% 10%
Rep 5 19 0 1 0 4 83% 100% 100% 95% 95% 0% 5% 0%
Rep 6 14 0 4 2 3 86% 90% 88% 70% 70% 0% 20% 10%
Rep 7 18 0 2 0 1 95% 100% 100% 90% 90% 0% 10% 0%
Rep 8 11 0 6 3 4 81% 85% 79% 55% 55% 0% 30% 15%
Rep 9 14 0 3 3 3 85% 85% 82% 70% 70% 0% 15% 15%
Rep 10 13 0 4 3 3 85% 85% 81% 65% 65% 0% 20% 15%
Average 85% 92% 71% 78% 78% 0% 14% 8%

Fig. 18. Experimental results with real plants comparing the evaluation metrics between the two GPUs.

85%) and 59% (from 58% to 92%) respectively.
In both experiments, there were several instances of non-targets being sprayed (Case E). The non-target spraying percentage was calculated by comparing the non-target sprays to the total number of sprays; it was 15% with the TX2 GPU and 14% with the GTX 1070 Ti GPU, so there is no significant difference between the two GPUs in Case E.
Another important observation across all these experiments concerns the number of missed spraying targets (Fig. 18). This is the case where the sprayer detects the target but misses spraying on it, instead spraying on the sidelines of the target. It remained at about 14% in all the experiments. It was noted visually during the experiments that the majority of the missed spraying targets were on the right side of the sprayer; this aspect needs to be investigated further. We also observed that the precision and recall of the sprayer were significantly higher when the spraying region was covered in shadows compared to when it was not.

4.3. Experiment 5: Weed mapping using the GTX 1070 Ti GPU with artificial plants

Fig. 19 presents the weed (and spray) map developed in experiment 5. The smart sprayer detected most of the weeds (Table 6), sprayed on them, and recorded their position (in UTM coordinates). An average offset of 0.25 m was calculated compared to the actual position of the weeds (measured by the RTK GPS). However, the weed clusters (hot spots) are easy to distinguish (Fig. 19). A sensor fusion algorithm (e.g. Kalman filter; Ampatzidis et al., 2006) will be developed to combine GPS data with data from other sensors (e.g. an inertial measurement unit, IMU) to reduce noise and improve localization accuracy. The precision (Ps) and recall (Rs) of the spraying system were 95% and 89% respectively (Table 6).
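The reported 0.25 m average offset can be computed directly in UTM coordinates, where easting and northing are already in meters. The sketch below uses hypothetical placeholder coordinates, not measured values from the experiment.

```python
import math

def average_offset(detected, actual):
    """Mean Euclidean distance (m) between sprayer-logged and RTK-GPS
    weed positions, both given as (easting, northing) tuples in UTM."""
    return sum(math.dist(p, q) for p, q in zip(detected, actual)) / len(detected)

# Hypothetical positions (easting, northing in meters)
logged = [(446210.10, 2920315.40), (446212.35, 2920316.10)]
rtk    = [(446210.30, 2920315.55), (446212.15, 2920316.25)]
print(round(average_offset(logged, rtk), 2))  # prints 0.25
```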


Fig. 19. Weed map developed using the smart technology.

Table 6
Results for experiment 5, with artificial plants, using the GTX 1070 Ti GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT

Rep 1 18 0 2 2 0 100% 91% 90% 82% 82% 0% 9% 9%
Rep 2 18 0 1 3 0 100% 86% 95% 82% 82% 0% 5% 14%
Rep 3 19 0 1 2 0 100% 91% 95% 86% 86% 0% 5% 9%
Rep 4 18 0 1 3 0 100% 86% 95% 82% 82% 0% 5% 14%
Rep 5 19 0 1 2 0 100% 91% 95% 86% 86% 0% 5% 9%
Rep 6 19 0 3 0 0 100% 100% 86% 86% 86% 0% 14% 0%
Rep 7 22 0 0 0 0 100% 100% 100% 100% 100% 0% 0% 0%
Rep 8 22 0 0 0 0 100% 100% 100% 100% 100% 0% 0% 0%
Rep 9 21 0 0 1 0 100% 95% 100% 95% 95% 0% 0% 5%
Rep 10 19 0 2 1 0 100% 95% 90% 86% 86% 0% 9% 5%
Average 100% 94% 95% 89% 89% 0% 5% 6%
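The GPS/IMU sensor fusion proposed for improving weed localization can be sketched as a minimal one-dimensional Kalman filter: predict the position from IMU-derived velocity, then correct with the noisy GPS fix. The noise variances and inputs below are illustrative assumptions, not values from the authors' implementation.

```python
def kalman_1d(positions_gps, velocities_imu, dt=0.1, q=0.05, r=0.25**2):
    """Minimal 1-D Kalman filter fusing GPS fixes with IMU velocity.

    q: process-noise variance (m^2); r: GPS measurement-noise variance
    (m^2, here the ~0.25 m offset squared). Both are illustrative.
    """
    x, p = positions_gps[0], 1.0      # initial state estimate and variance
    fused = [x]
    for z, v in zip(positions_gps[1:], velocities_imu):
        # Predict: advance position using the IMU-derived velocity
        x += v * dt
        p += q
        # Update: blend in the GPS measurement via the Kalman gain
        k = p / (p + r)
        x += k * (z - x)
        p *= (1 - k)
        fused.append(x)
    return fused

track = kalman_1d([0.0, 0.3, 0.5, 0.9], [2.0, 2.0, 2.0])
```

The fused track stays between the raw GPS fixes and the dead-reckoned prediction, smoothing out measurement noise.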

5. Conclusion

A prototype for a smart sprayer using artificial intelligence was developed and evaluated. The sprayer comprises AI-based machine vision software that uses deep learning to detect specific target weeds, and hardware with 12 individual fast-response nozzles for spraying. The system was designed from low-cost and accessible components, at a total cost of less than $1,500. The smart sprayer was evaluated with artificial and real plants, and two embedded GPU (graphics processing unit) units were compared: (i) the NVIDIA TX2 GPU; and (ii) the NVIDIA GTX 1070 Ti GPU. The GTX 1070 Ti GPU performed better than the TX2 GPU, especially in the experiment with real plants. Using the GTX 1070 Ti GPU, the missed targets (portulaca weed) were reduced by 81% (from 43% to 8%), the precision and recall of the detection system increased by 10% and 59%, and the precision and recall of the overall spraying system increased by 20% and 77%, respectively (compared with the TX2 GPU). This is explained by the fact that in challenging cases, where the targets are highly similar to the non-targets, a heavier neural network, and thus more computational processing, is required. In general, the smart sprayer was able to accurately distinguish the portulaca weeds (target) from pepper plants (non-target) and sedge weeds (non-target) and spray only on the target (portulaca weed).
Finally, a weed map was developed using the smart sprayer and an RTK GPS. The generated map showed a good visual correlation to the real scenario's picture; a sensor fusion algorithm (e.g. Kalman filter) will be developed to remove noise and increase the weed localization accuracy. This technology has real potential to deliver more productive and sustainable agriculture, based on a more precise and cost-efficient approach, especially in a scenario of farm labor shortage and climate change.

6. Future Research

An algorithm will be developed to vary the amount of chemicals applied (variable rate technology, VRT) based on the weed canopy size and vehicle speed (measured with multiple sensors, e.g., GPS, odometer, radar). A sensor fusion algorithm will be developed for the VRT component of the smart sprayer. Additionally, the smart sprayer will be evaluated in commercial vegetable fields for precision weed management; its performance on weed management and agrochemical use will be compared with traditional broadcast sprayers that usually treat the entire field to control pest populations.


Appendix A. Supplementary material

Supplementary data to this article can be found online at https://doi.org/10.1016/j.compag.2018.12.048.

References

Abdulridha, J., Ampatzidis, Y., Ehsani, R., de Castro, A., 2018. Evaluating the performance of spectral features and multivariate analysis tools to detect laurel wilt disease and nutritional deficiency in avocado. Comput. Electron. Agric. 155, 203–211.
Abdulridha, J., Ehsani, R., Abd-Elrahman, A., Ampatzidis, Y., 2019. A remote sensing technique for detecting laurel wilt disease in avocado in presence of other biotic and abiotic stresses. Comput. Electron. Agric. 156, 549–557.
Aitkenhead, M., Dalgetty, I., Mullins, C., McDonald, A., Strachan, N., 2003. Weed and crop discrimination using image analysis and artificial intelligence methods. Comput. Electron. Agric. 39 (3), 157–171.
Ampatzidis, Y., Bellis, L.D., Luvisi, A., 2017. iPathology: robotic applications and management of plants and plant diseases. Sustainability 9 (6), 1010. https://doi.org/10.3390/su9061010.
Ampatzidis, Y., Cruz, A.C., 2018. Plant disease detection utilizing artificial intelligence and remote sensing. In: International Congress of Plant Pathology (ICPP) 2018: Plant Health in a Global Economy, July 29 – August 3, Boston, USA.
Ampatzidis, Y., Kiner, J., Abdolee, R., Ferguson, L., 2018. Voice-Controlled and Wireless Solid Set Canopy Delivery (VCW-SSCD) system for mist-cooling. Sustainability 10 (2), 421. https://doi.org/10.3390/su10020421.
Ampatzidis, Y., Vougioukas, S., Blackmore, S., Bochtis, D., 2006. An object-oriented asynchronous Kalman filter with outlier rejection for autonomous tractor navigation. In: Proceedings of the XVI CIGR World Congress (International Commission of Agricultural Engineering), Bonn, Germany, September 3–7.
Balafoutis, A., Beck, B., Fountas, S., Vangeyte, J., Wal, T., Soto, I., Gómez-Barbero, M., Barnes, A., Eory, V., 2017. Precision agriculture technologies positively contributing to GHG emissions mitigation, farm productivity and economics. Sustainability 9, 1339.
Chostner, B., 2017. See & spray: the next generation of weed control. Res. Mag. 24 (4), 4–5.
Creech, C.F., Henry, R.S., Werle, R., Sandell, L.D., 2017. Performance of postemergence herbicides applied at different carrier volume rates. Weed Technol. 29 (3), 611–624.
Cruz, A., Ampatzidis, Y., Pierro, R., Materazzi, A., Panattoni, A., De Bellis, L., Luvisi, A., 2019. Detection of grapevine yellows symptoms in Vitis vinifera L. with artificial intelligence. Comput. Electron. Agric. 157, 63–76.
Cruz, A.C., Luvisi, A., De Bellis, L., Ampatzidis, Y., 2017. X-FIDO: an effective application for detecting olive quick decline syndrome with novel deep learning methods. Front. Plant Sci. https://doi.org/10.3389/fpls.2017.01741.
Deutsch, C.A., Tewksbury, J.J., Tigchelaar, M., Battisti, D.S., Merrill, S.C., Huey, R.B., Naylor, R.L., 2018. Increase in crop losses to insect pests in a warming climate. Science 361 (6405), 916–919. https://doi.org/10.1126/science.aat3466.
EFSA, 2013. The 2010 European Union report on pesticide residues in food. EFSA J. 11 (3), 3130.
Embedded Systems Developer Kits, Modules, & SDKs | NVIDIA Jetson. Retrieved October 25, 2018, from https://www.nvidia.com/en-us/autonomous-machines/embedded-systems-dev-kits-modules/?section=jetsonTX2.
Fernandez-Quintanilla, C., Peña-Barragán, J.M., Andújar, D., Dorado, J., Ribeiro, A., López-Granados, F., 2018. Is the current state-of-the-art of weed monitoring suitable for site-specific weed management in arable crops? Weed Res. 58, 259–272. https://doi.org/10.1111/wre.12307.
Flint, M.L., van den Bosch, R., 1981. Introduction to Integrated Pest Management. Plenum Press, New York, NY.
Gianessi, L., Reigner, N., 2007. The value of herbicides in U.S. crop production. Weed Technol. 21 (2), 559–566. https://doi.org/10.1614/WT-06-130.1.
GTX 1070 Ti Gaming Graphics Card | NVIDIA GeForce. Retrieved October 25, 2018, from https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1070-ti/.
Jeanmart, S., Edmunds, J.F.A., Lamberth, C., Pouliot, M., 2016. Synthetic approaches to the 2010–2014 new agrochemicals. Bioorg. Med. Chem. 24 (3), 317–341.
Kargar, B.A.H., Shirzadifar, A.M., 2013. Automatic weed detection system and smart herbicide sprayer robot for corn fields. In: 2013 First RSI/ISM International Conference on Robotics and Mechatronics (ICRoM), Tehran, Iran, Feb 13–15, 2013, pp. 468–473.
Lamichhane, J.R., Dachbrodt-Saaydeh, S., Kudsk, P., Messéan, A., 2016. Toward a reduced reliance on conventional pesticides in European agriculture. Plant Disease 100 (1), 10–24.
Lee, W.S., Slaughter, D.C., Giles, D.K., 1999. Robotic weed control system for tomatoes. Precis. Agric. 1, 95–113.
Luvisi, A., Ampatzidis, Y., Bellis, L.D., 2016. Plant pathology and information technology: opportunity and uncertainty in pest management. Sustainability 8 (8), 831. https://doi.org/10.3390/su8080831.
McCarthy, C., Rees, S., Baillie, C., 2010. Machine vision-based weed spot spraying: a review and where next for sugarcane. In: 32nd Annual Conference of the Australian Society of Sugar Cane Technologists (ASSCT 2010), 11–14 May 2010, Bundaberg, Australia.
Miedaner, T., Schmid, J.E., Flath, K., Koch, S., Jacobi, A., Ebmeyer, E., Taylor, M., 2018. A multiple disease test for field-based phenotyping of resistances to Fusarium head blight, yellow rust and stem rust in wheat. Eur. J. Plant Pathol. 2, 451–461.
Moller, J., 2010. Computer vision – a versatile technology in automation of agricultural machinery. In: 21st Annual Meeting, EIMA International, Bologna, Nov 13–14, 2010.
Naranjo, S.E., Ellsworth, P.C., Frisvold, G.B., 2015. Economic value of biological control in integrated pest management of managed plant systems. Annu. Rev. Entomol. 60, 621–645.
Oerke, E.C., 2006. Crop losses to pests. J. Agric. Sci. 144 (1), 31–43.
Owen, D.K.K., Zelaya, A.I., 2005. Herbicide resistant crops and weed resistance to herbicides. Pest Manage. Sci. 301–311. https://doi.org/10.1002/ps.1015.
Peterson, J.A., Ode, P.J., Oliveira-Hofman, C., Harwood, J.D., 2015. Integration of plant defense traits with biological control of arthropod pests: challenges and opportunities. Front. Plant Sci. https://doi.org/10.3389/fpls.2016.01794.
Powles, B.S., 1996. Herbicide Resistance in Plants. CRC Press, Boca Raton.
Pretty, J., Bharucha, P.Z., 2015. Integrated pest management for sustainable intensification of agriculture in Asia and Africa. Insects 6 (1), 152–182. https://doi.org/10.3390/insects6010152.
Ranga Rao, G., Kumari, B., Sahrawat, K., Wani, S., 2015. Integrated pest management (IPM) for reducing pesticide residues in crops and natural resources. In: Chakravarthy, A. (Ed.), New Horizons in Insect Science: Towards Sustainable Pest Management. Springer, New Delhi.
Redmon, J., Divvala, S., Girshick, R., Farhadi, A., 2016. You only look once: unified, real-time object detection. In: The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 779–788.
Sankhla, M.S., Kumari, M., Nandan, M., Kumar, R., Agrawal, P., 2016. Heavy metals contamination in water and their hazardous effect on human health – a review. Int. J. Curr. Microbiol. App. Sci. 5 (10), 759–766. https://doi.org/10.20546/ijcmas.2016.510.082.
Song, Y., Sun, H., Li, M., Zhang, Q., 2015. Technology application of smart spray in agriculture: a review. Intell. Automat. Soft Comput. 21 (3), 319–333.
Sun, H., Li, M.Z., Zhang, Q., 2012. Detection system of smart sprayer: status, challenges, and perspectives. Int. J. Agric. Biol. Eng. 5 (3), 10–23.
Swanson, N.L., Leu, A., Abrahamson, J., Wallet, B., 2014. Genetically engineered crops, glyphosate and the deterioration of health in the United States of America. J. Org. Syst. 9, 6–37.
Wikipedia. Precision and recall. https://en.wikipedia.org/wiki/Precision_and_recall.
