Original papers
Agricultural and Biological Engineering Department, Southwest Florida Research and Education Center, University of Florida, IFAS, 2685 SR 29 North, Immokalee, FL 34142, USA
Keywords: Weed detection; Artificial intelligence; Machine learning; Smart agriculture; Precision agriculture; Neural networks; Deep learning

Most conventional sprayers apply agrochemicals uniformly, despite the fact that the distribution of weeds is typically patchy, resulting in wastage of valuable compounds, increased costs, risk of crop damage, pest resistance to chemicals, environmental pollution and contamination of products. To reduce these negative impacts, a smart sprayer was designed and developed utilizing machine vision and artificial intelligence to distinguish target weeds from non-target objects (e.g. vegetable crops) and precisely spray on the desired target/location. Two different experimental scenarios were designed to simulate a vegetable field and to evaluate the smart sprayer's performance. The first scenario contained artificial weeds (targets) and artificial plants (non-targets). The second and most challenging scenario contained real plants: portulaca weeds as targets, and sedge weeds and pepper plants as non-targets. Two different embedded graphics processing units (GPUs) were evaluated as the smart sprayer's processing unit (for image processing and target detection). The more powerful GPU (NVIDIA GTX 1070 Ti) achieved an overall precision of 71% and recall of 78% (for plant detection and target spraying accuracy) on the most challenging scenario with real plants, and 91% precision and recall on the first scenario with artificial plants. The less powerful GPU (NVIDIA Jetson TX2) achieved an overall precision and recall of 90% and 89% respectively on the first scenario with artificial plants, and 59% and 44% respectively on the second scenario with real plants. Finally, an RTK GPS was connected to the smart sprayer and an algorithm was developed to automatically generate weed maps and visualize the collected data (after every application). This smart technology integrates a state-of-the-art (AI-based) weed detection system, a novel fast and precise spraying system, and a weed mapping system. It can significantly reduce the quantity of agrochemicals required, especially compared with traditional broadcast sprayers that usually treat the entire field, resulting in unnecessary application to areas that do not require treatment. It could also reduce costs, risk of crop damage and excess herbicide residue, as well as potentially reduce environmental impact.
⁎ Corresponding author. E-mail address: i.ampatzidis@ufl.edu (Y. Ampatzidis).
https://doi.org/10.1016/j.compag.2018.12.048
Received 5 November 2018; Received in revised form 19 December 2018; Accepted 23 December 2018
Available online 11 January 2019
0168-1699/ © 2018 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license
(http://creativecommons.org/licenses/BY/4.0/).
V. Partel et al. Computers and Electronics in Agriculture 157 (2019) 339–350
et al., 2016; Owen and Zelaya, 2005; Powles, 1996).

Hence, there is an urgent need to reduce reliance on conventional agrochemicals (and sprayers) without affecting agricultural yield. Integrated Pest Management (IPM) provides an alternative and changes the philosophy of crop protection: it emphasizes understanding of insect, pest and crop ecology and implements several complementary tactics, reducing risks to human health and the environment (Lamichhane et al., 2016; Flint and van den Bosch, 1981). Its main strategy is to maintain a pest population below the level that causes economic injury, not to eradicate the entire pest population (Naranjo et al., 2015; Peterson et al., 2016; Miedaner et al., 2018).

Agrochemicals (e.g. herbicides) are mainly applied using hydraulic and hydro-pneumatic sprayers that have high inefficiencies, and significant amounts of the active ingredient end up elsewhere in the environment (Creech et al., 2017), contaminating natural resources (Sankhla et al., 2016; Ranga Rao et al., 2015). Contamination can be caused by run-off from the field, discharge from drainage, and spray drift (off-target deposition of spray). Chemical spray drift usually occurs when small droplets of the spray liquid are carried by the wind onto neighboring crops/fields, which results in herbicide (or, in general, agrochemical) residues on plant products. This may damage the crops, and residues can also be carried over to the end consumer, where they can possibly have a significant effect on health (Swanson et al., 2014). Therefore, the use of herbicides should be as minimal and as efficient as possible in order to eliminate negative environmental impacts, bringing agriculture a step closer to sustainability.

To reduce these negative impacts, new spraying technologies have shown significant improvements in efficiency and safety by adopting the latest advances in electronics (Ampatzidis et al., 2018), artificial intelligence (AI) (Abdulridha et al., 2019; Ampatzidis and Cruz, 2018) and automation (Ampatzidis et al., 2017; Luvisi et al., 2016). Still, most agrochemicals (e.g. herbicides) are applied uniformly, despite the fact that the distribution of target pests, pathogens and weeds is typically patchy. Uniform application wastes valuable agrochemicals by applying them where little or no problem exists. The result is increased costs, risk of crop damage, pest resistance to chemicals, environmental pollution and contamination of edible products (Balafoutis et al., 2017).

In recent decades there has been rising interest in pest and disease detection (Abdulridha et al., 2018; Cruz et al., 2019; Cruz et al., 2017) and in the automation of weed sprayers (Fernandez-Quintanilla et al., 2018). Moller (2010) concluded that using computer vision technologies in agricultural operations lowers operator stress levels. A smart sprayer system needs to be able to locate weed spots in real time and spray the desired chemical only on the proper location. Sun et al. (2012) and Song et al. (2015) analyzed various sensors and techniques for weed detection, such as machine vision, spectral analysis, remote sensing and thermal imaging. Machine vision has been used for many years to distinguish vegetation from the soil background through image segmentation, exploiting the color difference between them. McCarthy et al. (2010) reviewed, at the time, the three main commercial machine-vision systems for smart weed spraying and concluded that they are limited to detecting weeds on a soil background and cannot distinguish between crop plants and weeds.

Lee et al. (1999) developed and evaluated a smart sprayer to distinguish between weed leaves and tomato plants. At that time, processing speeds were very slow, but the detection results achieved were promising. Kargar and Shirzadifar (2013) developed a machine-vision weed spot-sprayer for corn crops. The system used image segmentation and feature extraction to distinguish grass leaves from corn plants with over 90% accuracy. It has to be noted that in this case the corn leaves are much wider than the grass, which increases the detection accuracy. Aitkenhead et al. (2003) analyzed two processes to detect weeds in carrot crops. The first process included image segmentation operations based on the color and shape of the leaves. The second process utilized a self-organizing neural network to distinguish weeds. Although the results at the time were not as good as required for commercial purposes, the work showed how artificial intelligence can analyze many more properties than segmentation processes alone.

H-Sensor (Agricon GmbH, Ostrau, Germany) and See & Spray (Blue River Technology, Sunnyvale, CA, USA) are recent commercial spraying technologies that utilize artificial intelligence and are able to distinguish between crop plants and various weeds (for row crops) (Chostner, 2017). A precision spray technology can significantly reduce the quantity of herbicide required, compared with traditional broadcast sprayers that usually treat the entire field to control pest populations, which potentially results in unnecessary application to areas that do not require treatment. Applying herbicide only where weeds occur could reduce costs, risk of crop damage and excess pesticide residue, as well as potentially reduce environmental impact (Balafoutis et al., 2017).

In this paper, a low-cost and smart technology for precision weed management (for specialty crops) is presented and evaluated. This technology utilizes artificial intelligence (deep and transfer learning) to distinguish between target and non-target plants in real time, and sprays only on a selected target (e.g. a specific weed). The deep learning neural network analyzes much more complex properties than image segmentation alone; thus, it can be used to distinguish weeds from crop plants in a variety of situations (e.g. an open-field environment). To the best of the authors' knowledge, we are the first to approach this problem with deep learning and to integrate the machine vision system with a low-cost, fast-response and precise spraying system, and a weed and spraying mapping system.

2. Materials and methods

The Smart Sprayer prototype includes (Fig. 1): (i) individual nozzle control (12 nozzles with an adjustable spraying cone and 12 valves); (ii) a low-cost pump; (iii) a real-time kinematic GPS (RTK-GPS; around 2 cm accuracy); (iv) three video cameras (Logitech c920 webcams); (v) a smart controller (Arduino Mega) to control all sensors and actuators (e.g., pump); (vi) a computational unit (e.g., NVIDIA Jetson TX2 GPU) for image processing and weed detection; and (vii) several relay
Fig. 1. (a) The Smart Sprayer mounted on an All-terrain vehicle (ATV); (b) main components of the Smart Sprayer.
were chosen as the best low-cost GPUs available on the market at the time.

2.2. Smart sprayer software

Software was developed to achieve precise spraying on the target and to build an application (weed) map. Fig. 7 presents the overall workflow of the smart system. The program was developed in the C programming language. The software runs in real time, processing up to 28 fps (frames per second) across all steps combined: image acquisition, object detection and communication.

2.2.1. Image acquisition
The software acquires the most recent frame from the three cameras simultaneously, at a resolution of 640 × 480 pixels each. The images are then merged side by side into a single 1920 × 480 pixel image, which is then resized to a final 1024 × 256 pixel image; this size was found to achieve real-time processing speeds. The cameras are limited to acquiring up to 30 fps. The overall processing speed is determined by the network utilized and the capabilities of the GPU.

2.2.2. Target detection
For target (object) detection, YOLOv3, a state-of-the-art detection network (Redmon et al., 2016), was utilized. The network was trained for each experimental situation using around 1000 images of targets and non-targets, labeled manually for each target position in the images. For all the experiments, three different networks were trained:

1. Tiny YOLOv3, trained on pictures of artificial weeds as targets and artificial plants (sunflowers) as non-targets.
2. Tiny YOLOv3, trained on pictures of the weed portulaca as targets and pepper plants and the weed sedge as non-targets (Fig. 8).
3. YOLOv3, trained on the same pictures as network number 2.

The YOLOv3 network runs on the GTX 1070 Ti GPU at 24 fps, a good rate for real-time applications. The same network on the Jetson TX2 unit produced only 2.4 fps, which is not practical for our purposes. To increase speed on the Jetson TX2 computational unit, Tiny YOLOv3 networks were used, achieving 22 fps. The Tiny YOLO networks have the advantage of being faster but lose accuracy in comparison to full YOLO networks.

2.2.3. Convolutional neural network and deep learning
The YOLO object detection system uses convolutional neural networks (CNNs) to train on and detect objects. A basic CNN is made up of four main operations: convolution, non-linearity, pooling (sub-sampling) and classification. The four main operations are illustrated in Fig. 9. Every image can be represented as a three-dimensional matrix of pixel values based on its visual features.
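The image-acquisition step of Section 2.2.1 (three 640 × 480 frames merged side by side into a 1920 × 480 mosaic, then resized to 1024 × 256) can be sketched as follows. The single-channel row-major buffer layout and the nearest-neighbor resize are assumptions; the paper does not state which resampling method its C implementation used.

```c
#include <stdlib.h>

/* Merge three w x h single-channel frames side by side, then resize the
 * 3w x h mosaic to ow x oh with nearest-neighbor sampling. For the paper's
 * values: w=640, h=480 -> 1920x480 mosaic -> ow=1024, oh=256. Returns a
 * malloc'd ow x oh buffer (caller frees), or NULL on allocation failure. */
unsigned char *merge_and_resize(const unsigned char *cam[3],
                                int w, int h, int ow, int oh)
{
    int mw = 3 * w;                      /* mosaic width */
    unsigned char *mosaic = malloc((size_t)mw * h);
    unsigned char *out = malloc((size_t)ow * oh);
    if (!mosaic || !out) { free(mosaic); free(out); return NULL; }

    for (int y = 0; y < h; y++)          /* place cameras left-to-right */
        for (int c = 0; c < 3; c++)
            for (int x = 0; x < w; x++)
                mosaic[y * mw + c * w + x] = cam[c][y * w + x];

    for (int y = 0; y < oh; y++) {       /* nearest-neighbor downscale */
        int sy = y * h / oh;
        for (int x = 0; x < ow; x++) {
            int sx = x * mw / ow;
            out[y * ow + x] = mosaic[sy * mw + sx];
        }
    }
    free(mosaic);
    return out;
}
```

A color pipeline would do the same per channel; only the geometry matters here.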
Fig. 8. Smart sprayer detection on weed portulaca as target and pepper as non-target.
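The first two CNN operations named in Section 2.2.3, convolution and non-linearity, can be illustrated with a minimal single-channel, single 3 × 3 filter, valid-padding sketch. This is a didactic example, not the YOLOv3 implementation.

```c
/* One convolution step followed by a ReLU non-linearity: slide a 3x3
 * filter k over a w x h input (valid padding), writing a (w-2) x (h-2)
 * feature map. A real CNN stacks many such filters, then pools the maps. */
void conv3x3_relu(const float *in, int w, int h,
                  const float k[9], float *out /* (w-2) x (h-2) */)
{
    for (int y = 0; y < h - 2; y++)
        for (int x = 0; x < w - 2; x++) {
            float s = 0.0f;
            for (int ky = 0; ky < 3; ky++)        /* element-wise multiply */
                for (int kx = 0; kx < 3; kx++)    /* and accumulate        */
                    s += in[(y + ky) * w + (x + kx)] * k[ky * 3 + kx];
            out[y * (w - 2) + x] = s > 0.0f ? s : 0.0f;  /* ReLU */
        }
}
```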
In Fig. 9, the first step is where the image is scanned for features, and the rectangle represents the filter that passes over it. Several filters can be applied to a single image. The matrix of values from the filter is multiplied with the corresponding matrix of values in the input image; this multiplication of two matrices is known as convolution, hence the name convolutional layer. After the convolution layer there are several feature maps, one for each filter used. The larger rectangle shown in the figure is one patch of the image that needs to be down-sampled. In the next step, a new set of feature maps is created by down-sampling the stack of feature maps from the step before. Then another down-sampling is used to create a new set of feature maps. Finally, a fully connected layer classifies the output, with one label per node. This cycle repeats for every image in the training set.

2.2.4. Weed and nozzle position and triggering
The communication between the computational unit and the Arduino unit is done through a universal serial bus (USB) connection. Upon detection, the relative coordinates of the detected box center (weed) are used to calculate the nozzle position to be sprayed and the height position of the target in the image. Eq. (1) shows the calculation of the nozzle position (NP) to be triggered and Eq. (2) shows the value used to identify the target distance from the nozzle (TDN). The latter value is later used by the Arduino unit to calculate the spraying time (ST) on the target.

NP = (11 × Xtarget) + 1    (1)

where Xtarget is the relative horizontal coordinate (0 < Xtarget < 1) and NP is the nozzle position (1 ≤ NP ≤ 12).

TDN = 9 × Ytarget    (2)

where Ytarget is the relative vertical coordinate (0 < Ytarget < 1) and TDN is an integer (0 ≤ TDN ≤ 9).

After every frame detection, one string of 12 bytes containing all the target values calculated for the frame is sent to the Arduino controller for processing and to trigger the respective nozzles.

2.2.6. Valve control system
An Arduino script (which controls the valves) was developed to read the serial data coming from the computational unit, containing the "target to be sprayed" (weed) position and the vehicle speed calculated from the GPS data. The identifier of target distance from the nozzle (TDN) and the vehicle speed are used to calculate the spraying time, as shown in Eq. (3).

ST = (TDN × 360) / (1000 × Speed)    (3)

where:
ST = spraying time in seconds;
TDN = identifier of target distance to nozzle, an integer between 0 and 9;
Speed = vehicle speed in m/s.

After calculating the spraying time for each nozzle, the Arduino unit (Arduino Mega2560, Italy) sends a 5 V signal to a relay (JBTEK 4450182, China), which then triggers the selected 12 V solenoid valve. A workflow of the control system upon target detection is shown in Fig. 10.
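Eqs. (1)–(3) can be combined into one small planning routine. The exact printed form of Eq. (2) is not recoverable from this copy, so TDN = 9 × Ytarget (which satisfies the stated bounds 0 ≤ TDN ≤ 9 for 0 < Ytarget < 1) and truncation of the products to integers are assumptions.

```c
/* Map a detection's relative box-center (x, y in (0,1)) to a nozzle index
 * and a spraying time.
 *   Eq. (1): NP  = (11 * x) + 1, truncated, giving 1..12.
 *   Eq. (2): TDN = 9 * y, truncated, giving 0..9 (reconstructed; assumption).
 *   Eq. (3): ST  = TDN * 360 / (1000 * speed), in seconds. */
typedef struct { int np; int tdn; double st; } spray_cmd;

spray_cmd plan_spray(double x, double y, double speed_mps)
{
    spray_cmd cmd;
    cmd.np  = (int)(11.0 * x) + 1;                     /* nozzle, 1..12 */
    cmd.tdn = (int)(9.0 * y);                          /* distance id, 0..9 */
    cmd.st  = cmd.tdn * 360.0 / (1000.0 * speed_mps);  /* Eq. (3), seconds */
    return cmd;
}
```

For example, a target centered at (0.5, 0.5) with the vehicle moving at 1 m/s maps to nozzle 6, TDN 4, and a 1.44 s spray.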
Fig. 11. A part of the experimental track created with artificial plants (non-target) and weeds (target).
3. Experimental design

The experiments described below were arranged to evaluate: (i) the detection accuracy of the vision-based system; (ii) the precision and accuracy of the spraying system; and (iii) the performance of the two computational units (TX2 GPU and GTX 1070 Ti GPU). The experiments were conducted at the Southwest Florida Research and Education Center (SWFREC) located in Immokalee, Florida, USA. For each scenario, a set of experimental tracks was developed to simulate a vegetable field, where the weeds are placed randomly along with the crop plants. The weeds and the plants (or anything that is not supposed to be sprayed) are referred to as targets and non-targets, respectively. All the experiments were conducted in the morning, between 9 am and 11 am, in August 2018, on sandy soil. The spraying liquid was water mixed with pink fluorescent dye, which makes it easier to determine where the targets or non-targets were sprayed in all cases.

3.1. Experiments 1 & 2: TX2 GPU and GTX 1070 Ti GPU with artificial plants and weeds

In these experiments, the detection accuracy of the vision-based system and the precision and accuracy of the spraying system were evaluated with artificial plants using the two computational units (experiment 1 using the TX2 GPU; experiment 2 using the GTX 1070 Ti GPU). A track with artificial yellow plants (non-targets) and green weeds (targets) was developed for this experiment. The yellow plants were placed in a straight line, resembling the actual crop, and the green artificial plants, acting as weeds, were randomly placed around the crop (Fig. 11). The length of the track was 10 m and its width was 0.9 m; the actual spraying width was 0.6 m, and the remaining 0.3 m was used by the wheels of the ATV to pass through. This experiment was repeated 10 times with each of the two GPUs, where in each repetition the targets were rearranged.

3.2. Experiments 3 & 4: TX2 GPU and GTX 1070 Ti GPU with real weeds and plants

In these experiments, the performance of the smart sprayer using both computational units was evaluated with real plants and weeds (experiment 3 using the TX2 GPU; experiment 4 using the GTX 1070 Ti GPU). In this scenario, the experimental track contains two weeds, portulaca (target) and sedge (non-target), placed randomly among the pepper plants (non-target), which were placed in a straight line (to simulate a vegetable field where plants are placed in straight rows; Fig. 12). Only the portulaca weeds (Fig. 13a) were considered as the target, in order to evaluate the smart sprayer in a more complex environment; the smart sprayer had to spray only on a specific weed and not on anything else (non-targets include pepper plants, Fig. 13c, and sedge weeds, Fig. 13b). The length and width of the track are the same as for the one with the artificial plants and weeds. This experiment was repeated 10 times with each of the two GPUs, where in each repetition the targets were rearranged. The pepper plants placed in a straight line were used to resemble the crop, and the weeds were placed around them randomly (e.g. Fig. 12) so as to resemble an actual scenario of weeds around the crop.

3.3. Experiment 5: Weed mapping using GTX 1070 Ti GPU and RTK-GPS system

In this experiment, the GTX 1070 Ti GPU was used with an RTK-GPS to develop a spray and weed map. For this scenario, a separate experimental track was developed to simulate the actual row-middles in Florida's vegetable fields (Fig. 14), where the plants are grown on raised beds. In this scenario, the smart sprayer was used for weed management in the row-middles, which contain only weeds (no crops).
Fig. 12. A part of the experimental track created with real plants (pepper, non-target) and weeds (portulaca, target; and sedge, non-target).
Fig. 13. Images showing: (a) portulaca (target weed); (b) sedge (non-target weed); and (c) pepper plant (non-target).
The width of the track was 0.90 m and its length was 35 m, which is three-fourths the length of the actual row middles. The portulaca weeds were the only ones used in this experiment and they were considered as the target. The weeds were placed randomly in clusters to simulate weed hot spots. This experiment was repeated 10 times, where in each repetition the targets were rearranged.

cases A and B) to the number of total sprays (sum of cases A, B, C, and E).

Precision of the detection system = (Case A + Case B + Case C) / (Case A + Case B + Case C + Case E) × 100    (4a)
Fig. 14. A part of the track created for the weed mapping experiment.
Table 1
Description of the evaluation metrics.

Case A (target is fully sprayed): the desired situation, where the target is successfully identified (by the vision system) and sprayed.
Case B (target is partially sprayed): the target is successfully identified by the vision system but only a portion of the target is sprayed (e.g. half of the weed is sprayed).
Case C (target is identified but spray missed): the target is successfully identified (by the vision system) but the spray missed the target (e.g. sprayed next to the weed and not on it).
Case D (target is not identified and not sprayed): the vision-based system does not detect the target (weed).
Case E (a non-target is sprayed): a non-target is falsely detected as a target and sprayed, fully or partially (e.g. the vision system falsely detects a plant as a target and sprays on it).
No. of Targets (total number of targets): the total number of weeds that needed to be sprayed.
Pd: precision of the detection system, Eq. (4a).
Rd: recall of the detection system, Eq. (4b).
Ps: precision of the sprayer system, Eq. (5a).
Rs: recall of the spraying system, Eq. (5b).
FST: percentage of fully sprayed targets, Case A/No. of targets (%).
HST: percentage of half sprayed targets, Case B/No. of targets (%).
MST: percentage of missed spraying targets, Case C/No. of targets (%).
MT: percentage of missed targets, Case D/No. of targets (%).
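The metrics of Table 1 can be computed from the per-run case counts. Only Eq. (4a) survives in this copy of the paper; the formulas used below for Rd, Ps and Rs (Eqs. (4b), (5a), (5b)) are assumptions consistent with the Table 1 definitions: recall divides by detections plus misses (Case D), and the spraying metrics count only fully or partially sprayed targets (Cases A and B) as successes.

```c
/* Evaluation metrics (percentages) from the case counts of Table 1.
 * Pd follows Eq. (4a); the other three formulas are assumptions, since
 * Eqs. (4b), (5a) and (5b) are not reproduced in this extract. */
typedef struct { double pd, rd, ps, rs; } metrics;

metrics evaluate(int a, int b, int c, int d, int e)
{
    metrics m;
    m.pd = 100.0 * (a + b + c) / (a + b + c + e);  /* Eq. (4a)          */
    m.rd = 100.0 * (a + b + c) / (a + b + c + d);  /* assumed Eq. (4b)  */
    m.ps = 100.0 * (a + b) / (a + b + c + e);      /* assumed Eq. (5a)  */
    m.rs = 100.0 * (a + b) / (a + b + c + d);      /* assumed Eq. (5b)  */
    return m;
}
```

For a run with 8 fully sprayed targets, 1 half sprayed, 1 missed spray, no undetected targets and no sprayed non-targets, this gives Pd = Rd = 100% and Ps = Rs = 90%.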
4.1. Experiments 1 & 2: TX2 GPU and GTX 1070 Ti GPU with artificial plants and weeds

The results for experiments 1 and 2, with artificial plants and weeds, are presented in Table 2 (TX2 GPU) and Table 3 (GTX 1070 Ti GPU), respectively. The total number of targets in both experiments was 30. It can be clearly seen that the GTX 1070 Ti GPU performs better than the TX2 GPU, especially comparing the percentages of fully sprayed and missed targets (Fig. 16). The percentage of fully sprayed targets (FST) increased by 10% (from 83% to 91%) using the GTX 1070 Ti GPU, whereas the percentage of missed spraying targets (MST) was reduced to zero (from 10% using the TX2 GPU). There is a small number (5%) of half sprayed targets (HST) when using the TX2 GPU, but this problem seems to be eliminated when using the GTX 1070 Ti GPU. Another interesting observation is that there was not a single instance of a non-target (Case E) being sprayed in either experiment. Fig. 17 compares the evaluation metrics (e.g. precision, recall) for both experiments (exp. 1 using the TX2 GPU; exp. 2 using the GTX 1070 Ti GPU).
4.2. Experiments 3 & 4: TX2 GPU and GTX 1070 Ti GPU with real weeds and plants

The results for experiments 3 and 4, with real weeds and plants, are presented in Table 4 (TX2 GPU) and Table 5 (GTX 1070 Ti GPU), respectively. The average values of all the evaluation metrics are included at the end of each table. The number of targets in both experiments was 20.

It can be observed from the data that the sprayer with the TX2 GPU missed (MT) almost half of the targets. This can be attributed to the capability limitations of the GPU, as it may not be powerful enough to process the images of the weeds/plants and detect them. When the TX2 GPU is replaced by the superior-performing GTX 1070 Ti GPU, the missed targets (MT) were reduced by 81% (from 43% to 8%). Also, the precision (Ps) and recall (Rs) of the spraying system increased significantly, by 20% (from 59% to 71%) and 77% (from 44% to 78%) respectively (Fig. 18). The precision (Pd) and recall (Rd) of the detection system similarly increased, by 10% (from 77% to

Fig. 15. Precision and Recall explained in a graphical format (Wikipedia).
Table 2
Results for experiment 1, with artificial plants, using the TX2 GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT
Table 3
Results for experiment 2, with artificial plants, using the GTX 1070 Ti GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT
Fig. 16. Pie Charts showing the percentage of spray results with the: (a) TX2 GPU; and (b) GTX 1070 Ti GPU.
Fig. 17. Experimental results with artificial plants comparing the evaluation metrics between the two GPUs.
Table 4
Results for experiment 3, with real plants, using the TX2 GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT
Table 5
Results for experiment 4, with real plants, using the GTX 1070 Ti GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT
Fig. 18. Experimental results with real plants comparing the evaluation metrics between the two GPUs.
85%) and 59% (from 58% to 92%) respectively.

In both experiments there were several instances of non-targets being sprayed (Case E). The non-target spraying percentage was calculated by comparing the number of non-target sprays to the total number of sprays. The non-target spraying percentages when using the TX2 GPU and the GTX 1070 Ti GPU were found to be 15% and 14% respectively; there is no significant difference between the two GPUs in Case E.

Another important observation across all these experiments relates to the number of missed spraying targets (Fig. 18). This is the case where the sprayer detects the target but misses spraying on it, instead spraying beside the target. Notably, it is around 14% in all the experiments. It was observed visually during the experiments that the majority of the missed spraying targets were on the right side of the sprayer; why this happens needs to be investigated further. We also observed that the precision and recall of the sprayer were significantly higher when the spraying region was covered in shadow compared to when it was not.

4.3. Experiment 5: Weed mapping using GTX 1070 Ti GPU with artificial plants

Fig. 19 presents the weed (and spray) map developed in experiment 5. The smart sprayer detected most of the weeds (Table 6), sprayed on them, and recorded their positions (longitude and latitude in UTM). An average offset of 0.25 m was calculated compared to the actual positions of the weeds (measured by the RTK GPS). However, the weed clusters (hot spots) are easy to distinguish (Fig. 19). A sensor fusion algorithm (e.g. Kalman filter; Ampatzidis et al., 2006) will be developed to combine GPS data with data from other sensors (e.g. an inertial measurement unit, IMU) to reduce data noise and improve localization accuracy. The precision (Ps) and recall (Rs) of the spraying system were 95% and 89% respectively (Table 6).
Table 6
Results for experiment 5, with artificial plants, using the GTX 1070 Ti GPU.
Reps Case A Case B Case C Case D Case E Pd Rd Ps Rs FST HST MST MT
Appendix A. Supplementary material

Supplementary data to this article can be found online at https://doi.org/10.1016/j.compag.2018.12.048.

References

Abdulridha, J., Ampatzidis, Y., Ehsani, R., de Castro, A., 2018. Evaluating the performance of spectral features and multivariate analysis tools to detect laurel wilt disease and nutritional deficiency in avocado. Comput. Electron. Agric. 155, 203–211.
Abdulridha, J., Ehsani, R., Abd-Elrahman, A., Ampatzidis, Y., 2019. A remote sensing technique for detecting laurel wilt disease in avocado in presence of other biotic and abiotic stresses. Comput. Electron. Agric. 156, 549–557.
Aitkenhead, M., Dalgetty, I., Mullins, C., McDonald, A., Strachan, N., 2003. Weed and crop discrimination using image analysis and artificial intelligence methods. Comput. Electron. Agric. 39 (3), 157–171.
Ampatzidis, Y., Bellis, L.D., Luvisi, A., 2017. iPathology: robotic applications and management of plants and plant diseases. Sustainability 9 (6), 1010. https://doi.org/10.3390/su9061010.
Ampatzidis, Y., Cruz, A.C., 2018. Plant disease detection utilizing artificial intelligence and remote sensing. In: International Congress of Plant Pathology (ICPP) 2018: Plant Health in a Global Economy, July 29 – August 3, Boston, USA.
Ampatzidis, Y., Kiner, J., Abdolee, R., Ferguson, L., 2018. Voice-Controlled and Wireless Solid Set Canopy Delivery (VCW-SSCD) system for mist-cooling. Special Issue: Information and Communications Technologies (ICT) for Sustainability. Sustainability 10 (2), 421. https://doi.org/10.3390/su10020421.
Ampatzidis, Y., Vougioukas, S., Blackmore, S., Bochtis, D., 2006. An object-oriented asynchronous Kalman filter with outlier rejection for autonomous tractor navigation. In: Proceedings of the XVI CIGR World Congress (International Commission of Agricultural Engineering), Bonn, Germany, September 3–7.
Balafoutis, A., Beck, B., Fountas, S., Vangeyte, J., Wal, T., Soto, I., Gómez-Barbero, M., Barnes, A., Eory, V., 2017. Precision agriculture technologies positively contributing to GHG emissions mitigation, farm productivity and economics. Sustainability 9, 1339.
Chostner, B., 2017. See & Spray: the next generation of weed control. Resource Mag. 24 (4), 4–5.
Creech, C.F., Henry, R.S., Werle, R., Sandell, L.D., 2017. Performance of postemergence herbicides applied at different carrier volume rates. Weed Technol. 29 (3), 611–624.
Cruz, A., Ampatzidis, Y., Pierro, R., Materazzi, A., Panattoni, A., De Bellis, L., Luvisi, A., 2019. Detection of grapevine yellows symptoms in Vitis vinifera L. with artificial intelligence. Comput. Electron. Agric. 157, 63–76.
Cruz, A.C., Luvisi, A., De Bellis, L., Ampatzidis, Y., 2017. X-FIDO: an effective application for detecting olive quick decline syndrome with novel deep learning methods. Front. Plant Sci. https://doi.org/10.3389/fpls.2017.01741.
Deutsch, C.A., Tewksbury, J.J., Tigchelaar, M., Battisti, D.S., Merrill, S.C., Huey, R.B., Naylor, R.L., 2018. Increase in crop losses to insect pests in a warming climate. Science 361 (6405), 916–919. https://doi.org/10.1126/science.aat3466.
EFSA, 2013. The 2010 European Union report on pesticide residues in food. EFSA J. 11 (3), 3130.
Embedded Systems Developer Kits, Modules, & SDKs | NVIDIA Jetson. (n.d.). Retrieved October 25, 2018, from https://www.nvidia.com/en-us/autonomous-machines/embedded-systems-dev-kits-modules/?section=jetsonTX2.
Fernandez-Quintanilla, C., Peña-Barragán, J.M., Andújar, D., Dorado, J., Ribeiro, A., López-Granados, F., 2018. Is the current state-of-the-art of weed monitoring suitable for site-specific weed management in arable crops? Weed Res. 58, 259–272. https://doi.org/10.1111/wre.12307.
Flint, M.L., van den Bosch, R., 1981. Introduction to Integrated Pest Management. Plenum Press, New York, NY.
Gianessi, L., Reigner, N., 2007. The value of herbicides in U.S. crop production. Weed Technol. 21 (2), 559–566. https://doi.org/10.1614/WT-06-130.1.
GTX 1070 Ti Gaming Graphics Card | NVIDIA GeForce. (n.d.). Retrieved October 25, 2018, from https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1070-ti/.
Jeanmart, S., Edmunds, J.F.A., Lamberth, C., Pouliot, M., 2016. Synthetic approaches to the 2010–2014 new agrochemicals. Bioorg. Med. Chem. 24 (3), 317–341.
Kargar, B.A.H., Shirzadifar, A.M., 2013. Automatic weed detection system and smart herbicide sprayer robot for corn fields. In: 2013 First RSI/ISM International Conference on Robotics and Mechatronics (ICRoM), Tehran, Iran, February 13–15, pp. 468–473.
Lamichhane, J.R., Dachbrodt-Saaydeh, S., Kudsk, P., Messéan, A., 2016. Toward a reduced reliance on conventional pesticides in European agriculture. Plant Disease 100 (1), 10–24.
Lee, W.S., Slaughter, D.C., Giles, D.K., 1999. Robotic weed control system for tomatoes. Precis. Agric. 1, 95–113.
Luvisi, A., Ampatzidis, Y., Bellis, L.D., 2016. Plant pathology and information technology: opportunity and uncertainty in pest management. Sustainability 8 (8), 831. https://doi.org/10.3390/su8080831.
McCarthy, C., Rees, S., Baillie, C., 2010. Machine vision-based weed spot spraying: a review and where next for sugarcane. In: 32nd Annual Conference of the Australian Society of Sugar Cane Technologists (ASSCT 2010), Bundaberg, Australia, May 11–14.
Miedaner, T., Schmid, J.E., Flath, K., Koch, S., Jacobi, A., Ebmeyer, E., Taylor, M., 2018. A multiple disease test for field-based phenotyping of resistances to Fusarium head blight, yellow rust and stem rust in wheat. Eur. J. Plant Pathol. 2, 451–461.
Moller, J., 2010. Computer vision – a versatile technology in automation of agricultural machinery. In: 21st Annual Meeting, EIMA International, Bologna, November 13–14.
Naranjo, S.E., Ellsworth, P.C., Frisvold, G.B., 2015. Economic value of biological control in integrated pest management of managed plant systems. Annu. Rev. Entomol. 60, 621–645.
Oerke, E.C., 2006. Crop losses to pests. J. Agric. Sci. 144 (1), 31–43.
Owen, D.K.K., Zelaya, A.I., 2005. Herbicide-resistant crops and weed resistance to herbicides. Pest Manage. Sci. 301–311. https://doi.org/10.1002/ps.1015.
Peterson, J.A., Ode, P.J., Oliveira-Hofman, C., Harwood, J.D., 2015. Integration of plant defense traits with biological control of arthropod pests: challenges and opportunities. Front. Plant Sci. https://doi.org/10.3389/fpls.2016.01794.
Powles, B.S., 1996. Herbicide Resistance in Plants. CRC Press, Boca Raton.
Pretty, J., Bharucha, P.Z., 2015. Integrated pest management for sustainable intensification of agriculture in Asia and Africa. Insects 6 (1), 152–182. https://doi.org/10.3390/insects6010152.
Ranga Rao, G., Kumari, B., Sahrawat, K., Wani, S., 2015. Integrated pest management (IPM) for reducing pesticide residues in crops and natural resources. In: Chakravarthy, A. (Ed.), New Horizons in Insect Science: Towards Sustainable Pest Management. Springer, New Delhi.
Redmon, J., Divvala, S., Girshick, R., Farhadi, A., 2016. You only look once: unified, real-time object detection. In: The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 779–788.
Sankhla, M.S., Kumari, M., Nandan, M., Kumar, R., Agrawal, P., 2016. Heavy metals contamination in water and their hazardous effect on human health – a review. Int. J. Curr. Microbiol. App. Sci. 5 (10), 759–766. https://doi.org/10.20546/ijcmas.2016.510.082.
Song, Y., Sun, H., Li, M., Zhang, Q., 2015. Technology application of smart spray in agriculture: a review. Intell. Automat. Soft Comput. 21 (3), 319–333.
Sun, H., Li, M.Z., Zhang, Q., 2012. Detection system of smart sprayer: status, challenges, and perspectives. Int. J. Agric. Biol. Eng. 5 (3), 10–23.
Swanson, N.L., Leu, A., Abrahamson, J., Wallet, B., 2014. Genetically engineered crops, glyphosate and the deterioration of health in the United States of America. J. Org. Syst. 9, 6–37.
Wikipedia: Precision and recall. https://en.wikipedia.org/wiki/Precision_and_recall.