

A Low Cost System to Optimize Pesticide Application based on Mobile
Technologies and Computer Vision
Felipe Weber1, Gabrielle Rosa1, Fábio Terra2, André Oldoni2, and Paulo Drews-Jr1

Abstract— Over the past few years, agriculture has undergone profound changes due to technological advances. The application of agrochemicals revolutionized agriculture by reducing manpower, increasing productivity, and providing greater control of spontaneous plants. However, the application of pesticides using boom sprayers has remained the same despite the advances in autonomous tractors. Since crops are usually planted in rows, this paper optimizes and automates the application of pesticides through an autonomous system based on local information obtained with computer vision. The system is implemented on smartphones and is capable of detecting rows in plantations and activating the related sprayer nozzles. Additionally, the system is easy to install on many types of tractor-mounted boom sprayers. Experimental results are obtained in an onion family farm, where the system detects planting lines and controls the nozzles.

I. INTRODUCTION

In the fifties, the Green Revolution introduced new technologies to the production of agricultural commodities [1]. Since then, agriculture has been associated with the use of an impressive quantity of agrochemicals to control and avoid pests that cause crop losses. The main adopted products are synthetic organic compounds (SOCs), including herbicides, pesticides, and other chemicals [2]. They present high biological activity, which can cause environmental pollution and imbalance of the agroecosystem [3]. In some cases, the effective application of pesticides in plantations is just 0.1%, i.e., 99.9% of the applied product tends to move to other places such as surface water and groundwater [4].

The pesticide waste during the agricultural operation results in economic and environmental losses. Thus, we propose a system to reduce this waste by using image-based perception capable of detecting plantation lines. Moreover, we control the nozzle openings of a sprayer boom according to the crop necessity inferred from this perception. The automation system is based on image analysis of the environment. A smartphone is responsible for the data input and its processing, as well as for the control signals sent to the boom sprayer nozzles.

Although the application of agricultural chemicals is currently carried out by several types of sprayers, the discussion presented here is relative to boom sprayers (Fig. 1). These sprayers are generally mounted on the back of the tractor. They are composed of a tank, to store the pesticides, and two booms, one for each side, where the nozzles are hung. The nozzles are always open to allow the spray mix to flow through them during the application. Normally, there is no control of the spraying application. This leads to the problem of eventually applying chemicals in areas where there is no plantation line at all, which causes damage to the environment and product waste.

Early studies about precision agriculture suggest that, as agriculture develops, there will be increasing demands for better control systems. Thus, precisely targeted application systems will have to be developed. Moreover, the threat of pesticide taxes will force farmers to significantly reduce the usage of agrochemicals [5].

On the other hand, robots are more and more present in agricultural applications, and much research has been done in recent years. Tractor implements and robots for agricultural tasks have to be flexible, reliable, and maintainable [6], [7], [8]. The use of the Global Positioning System (GPS) to control boom sprayers is common in precision agriculture [9], [10]. However, the capability to reduce the quantity of agrochemicals is still limited due to global perception, high costs, and limited accuracy.

Fig. 1: Tractor mounted boom sprayer [11].

This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001. Also thanks to FAPERGS for the support under grant number 17/2551-0000896-0.
1 Federal University of Rio Grande (FURG), Brazil. webercomp@gmail.com, gabrielle almeida 12@hotmail.com, paulodrews@furg.br
2 Federal Institute of Education, Science and Technology Sul Rio-Grandense (IFSul), Brazil. fabioterra@gmail.com, andreoldoni@gmail.com

Fig. 2a exemplifies how pesticide application is traditionally accomplished: all the nozzles are driven independently of the existence of planting lines below them. In contrast, Fig. 2b presents an application using our system, where the planting lines are recognized and the nozzles are activated according to the detected plants.
The proposed system enables a fully autonomous and optimized pesticide application. There are many kinds of commercial autonomous tractors [12], [13], including low cost systems [14]. However, improving the autonomy of boom sprayers is still an open issue [15]. This is exactly the focus of this work.

(a) Traditional pesticide application using a boom sprayer.
(b) Proposed autonomous system using perception based on images and nozzle control.
Fig. 2: Comparison between the traditional boom sprayer system (a) and the one proposed in this work (b). Green circles represent the activated nozzles and red ones represent the nozzles turned off.

This work is part of a larger project whose scope is the development of a fully autonomous solution that turns a tractor-mounted sprayer into a fully autonomous robotic system, allowing the machine to collect environmental information, define strategies based on expert knowledge, and act on the spraying system to optimize its use.

This paper proposes a low cost system based on mobile technologies capable of identifying plants in the field using a camera. Furthermore, the system communicates with the boom sprayer to control the nozzles. It uses modern technologies to reduce waste and minimize environmental impacts, being an important step for agricultural robotics.

The remainder of the paper is organized as follows. Section II presents the related work. Section III describes the proposed method and its principles. Section IV evaluates the methodology using experimental field data collected in a family-based onion farm. Finally, Section V summarizes the paper and draws future directions.

II. RELATED WORK

The literature on the problem dealt with in this work is still limited. Bossu et al. [16] proposed a machine vision system for precision spraying in real time. They adopted a tractor on which a camera, a computer, and a sprayer are coupled. Electropneumatic valves are inserted just before the nozzles to control the spray nozzles. Their system detects the planting lines in the image using a monochrome camera. Image segmentation is performed using a Gabor filter [17] for planting line detection. Furthermore, a linear filter for texture analysis, edge detection, and feature extraction is also applied. A method to differentiate crops from spontaneous vegetation is also presented. However, their method is only able to detect weeds present between lines, which does not allow them to be recognized when they are on the same line as the plantation.

Underwood et al. [18] show a real-time green detection system with an orientation-controlled spray for horticultural crops. The system consists of a mobile robot developed by the Australian Centre for Field Robotics at the University of Sydney. The apparatus comprises a directional mechanical arm, batteries, a computer, and cameras, among other components. In their work, image acquisition is performed by polychromatic cameras and only the vegetation is detected. They obtain the vegetation pixels in the image by removing the excess of green and specifying a minimum intensity threshold, which results in the detection of the vegetation. Finally, the centroid of each plant is estimated to provide the location where the robot arm applies the pesticide.

Although the aforementioned works present significant technological advances, they require high computational cost and complexity, which makes them impractical for most systems. In contrast, a low-cost system based on a smartphone as the computing device is an important alternative solution, due to its reduced cost, highly competitive market, and constant technological improvement without price increases.

III. METHODOLOGY

The proposed method involves three steps: image segmentation, selection of planting lines, and decision making. Within these steps, there are other important sub-processes involving image processing and computer vision. The following sections describe all steps involved up to the decision making process.

A. Image Resizing

An important step that precedes the processing of an image is its resizing. An image with a huge pixel resolution would lead to a longer processing time. Area interpolation is adopted in order to preserve the characteristics of the resized image without significant losses. In this kind of interpolation, each new resized pixel is projected over the old pixels and its value is computed as the average of the covered pixel values; thus, the new pixel is the average of its neighboring pixels [19]. After the image resizing step, the image resolution is reduced from 1080 × 1920 pixels to 360 × 480 pixels. This new resolution reduces the system processing time without compromising important image information, since the plants are well defined in the images.
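As a concrete illustration, this resizing step can be reproduced with OpenCV's area interpolation, on which the rest of the pipeline builds [19]. The snippet below is a minimal sketch in Python; the function name and the use of the Python bindings are our choice for illustration, since the actual application runs on Android.

```python
import cv2

def resize_for_processing(frame, target_size=(360, 480)):
    """Downscale a camera frame with area interpolation, which averages the
    source pixels covered by each destination pixel, as described above.
    target_size follows OpenCV's (width, height) convention and matches the
    360 x 480 resolution reported in the paper."""
    return cv2.resize(frame, target_size, interpolation=cv2.INTER_AREA)
```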
B. Plant Segmentation
Among the steps to identify the lines, it is crucial to segment the plants contained in the image. At this point, the plants need to be identified and separated in the image.
Several methods for segmentation have been proposed [20], [21], [18]. This work is based on the method described in [18], which performs the pixel analysis in the RGB (red, green, blue) color space. Thus, the conditions described by Equations 1 and 2 are necessary and sufficient for a pixel to be considered green in the image and, consequently, to represent the plantation.

G > k(R + B) (1)

(R + B) > t (2)

The RGB color space represents the image using three layers. The constants k and t define the thresholds for a pixel to be considered part of the leaves of the plantation; they are defined by default as 0.68 and 20, respectively. These thresholds are sensitive to the scene illumination [18], but they are robust to changes in the leaf colors. Our empirical evaluation shows that it is easy to define a good set of parameters for a long spraying operation. The approach produces a binary image, as shown in Fig. 3b: the black pixels represent the plants, while the white pixels represent the non-plant pixels.
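A minimal sketch of this segmentation rule in Python with OpenCV and NumPy is shown below. The function name and the vectorized formulation are ours; only the conditions of Equations 1 and 2 and the default values k = 0.68 and t = 20 come from the paper. Note that the mask returned here marks plants as white (255), whereas Fig. 3b displays the inverted convention (plants in black).

```python
import cv2
import numpy as np

def segment_green(bgr, k=0.68, t=20):
    """Apply Eqs. (1)-(2): a pixel is considered plant when
    G > k * (R + B) and (R + B) > t."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    rb = r + b
    plant = (g > k * rb) & (rb > t)
    return plant.astype(np.uint8) * 255  # plants = 255, background = 0
```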

C. Morphological Operators

The detection provided by the green identification can be limited and lose some parts of the plants due to shadows or other significant color changes. Thus, we apply two morphological operators: erosion and dilation.
The erosion operation reduces the shape of an object in the scene and also the noise [22]. Therefore, the proposed work uses erosion to remove noise and to separate clusters of plants. One of the main parameters to be chosen is the kernel size. We adopted a 3 × 3 kernel, which reduces the noise while preserving the detected plants. The erosion operation applied to Fig. 3b is shown in Fig. 4.
After the erosion, dilation is applied to complement the morphological processing. Its main function is to highlight and enlarge the plants. The adopted kernel size is 7 × 7 pixels. A sample result obtained after the application of dilation is shown in Fig. 5.
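These two operations can be sketched as follows, again assuming an OpenCV-based implementation in Python. The kernel sizes (3 × 3 for erosion, 7 × 7 for dilation) are the ones reported above, and the mask is assumed to have plants as the white foreground (the inverse of the display convention used in Figs. 3-5).

```python
import cv2
import numpy as np

def clean_mask(plant_mask):
    """Erode with a 3x3 kernel to remove noise and split plant clusters,
    then dilate with a 7x7 kernel to highlight and enlarge the plants."""
    eroded = cv2.erode(plant_mask, np.ones((3, 3), np.uint8))
    return cv2.dilate(eroded, np.ones((7, 7), np.uint8))
```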
(a) Original image resized.
(b) Identified green.
Fig. 3: Identification of green pixels after image resizing in a sample image acquired in a real onion farm. Black pixels represent the plants, while white pixels represent the non-plants.

D. Plant Detection

The plants are detected based on the centroids of each segmented area. Finding the centroid of objects is important to obtain the best target location and, in our case, to determine each crop line [19]. Fig. 6 shows the detected centers of the plants in the image, i.e., the center of each plant. This method finds the plant centroids in order to obtain the plantation lines.
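One way to realize this step, under the OpenCV-based sketch assumed above, is a connected-component pass that returns one centroid per plant blob. The function below is our illustration, not the authors' code; it expects plants as the white foreground, and the small-area filter is an assumed noise guard rather than a value from the paper.

```python
import cv2

def plant_centroids(plant_mask, min_area=10):
    """Return (x, y) centroids of the detected plant blobs.
    min_area is an assumed noise filter, not a value reported in the paper."""
    n, _, stats, centroids = cv2.connectedComponentsWithStats(plant_mask)
    # Label 0 is the background; keep components above the area filter.
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]
```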
Fig. 4: Erosion operation applied to Fig. 3b.

E. Plantation Lines Estimation

We adopt line fitting to obtain the plantation lines, based on the knowledge that seeding always occurs in lines. Although there are many line fitting methods [23], in this work the Progressive Probabilistic Hough Transform (PPHT) [24], [25] is adopted, applied to the previously detected plants, because of its robustness. PPHT requires the adjustment of several parameters; we found that the best values for the pixel resolution and angle are 1 pixel and 1 degree, respectively. The minimum crossing number is another parameter, which is sensitive to the number of plants in the image. Experimental evaluation shows that the interval [20; 30] covers most of the plants; for the onion farm, the minimum crossing number is defined as 25. The other parameters, the minimum number of points and the maximum distance between two points to form a line, are defined based on the height in pixels of the processed image. Their values are defined as 200 and 480, respectively [19]. The result of the planting line identification can be seen in Fig. 7.
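Assuming the plant centroids are rendered into a binary image, the PPHT step maps naturally onto OpenCV's HoughLinesP. The sketch below uses the parameter values quoted above (1 pixel, 1 degree, 25 crossings, 200 and 480); the mapping of the last two values onto minLineLength and maxLineGap, as well as the drawing radius, are our reading of the text and should be treated as assumptions.

```python
import math

import cv2
import numpy as np

def plantation_lines(centroids, shape=(480, 360), crossings=25):
    """Fit plantation lines with the Progressive Probabilistic Hough
    Transform applied to an image of the detected plant centroids."""
    points = np.zeros(shape, dtype=np.uint8)
    for x, y in centroids:
        cv2.circle(points, (int(x), int(y)), 3, 255, -1)  # assumed radius
    lines = cv2.HoughLinesP(points, rho=1, theta=math.pi / 180,
                            threshold=crossings,
                            minLineLength=200, maxLineGap=480)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```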

Fig. 5: Dilation operation applied to the result in Fig. 4.


Fig. 7: Plantation lines detected by our method using previously detected plants (Fig. 6).

Fig. 6: Plants detected in the results shown in Fig. 5. They are presented using red circles.

F. Spray Nozzle Selection

The decision whether a spray nozzle should be activated is based on its position and the acquisition time: this position must coincide with some planting line. Consider a simple boom sprayer composed of five nozzles, each disposed horizontally and equidistantly under the boom. The number of pixels and the region of the image corresponding to each spraying area depend on the height of the camera during acquisition, which is approximately constant. We adopted the camera calibration method proposed by Zhang [26]. The pinhole model [27] enables us to associate pixels with positions in the 2D space. Therefore, nozzle actuation is performed when a planting line is within 30 pixels, in either direction, of the position of the nozzle. Fig. 8 exemplifies how nozzle selection is performed by our system. This information is exchanged using Bluetooth communication with a controllable boom sprayer that uses solenoid-based nozzles.
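A simplified sketch of this decision rule is given below, assuming the five-nozzle boom described above, a 360-pixel-wide resized image, and the fitted line segments from the previous step. The uniform mapping of nozzles to image columns and the idea of serializing the states for the Bluetooth link are our assumptions for illustration, not the authors' protocol.

```python
def select_nozzles(lines, image_width=360, n_nozzles=5, tolerance=30):
    """Return one boolean per nozzle: True when a planting line passes
    within `tolerance` pixels (in either direction) of the nozzle position."""
    # Nozzles are assumed equally spaced across the camera's field of view.
    nozzle_x = [(i + 0.5) * image_width / n_nozzles for i in range(n_nozzles)]
    states = [False] * n_nozzles
    for x1, y1, x2, y2 in lines:
        line_x = (x1 + x2) / 2.0  # rows are roughly longitudinal in the image
        for i, nx in enumerate(nozzle_x):
            if abs(line_x - nx) <= tolerance:
                states[i] = True
    # The states would then be serialized (e.g. as a bit string such as
    # '10110') and sent over Bluetooth to the solenoid valve controller.
    return states
```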

Fig. 8: Spray nozzle selection method. The green squares represent activated nozzles while the red ones represent turned off nozzles.

IV. EXPERIMENTAL RESULTS


The experimental evaluation of the system was conducted at a family-based farm in Rio Grande - RS, Brazil. The system runs in an onion plantation of 0.4 ha. The camera was positioned under perspective projection in the longitudinal direction of the planting lines. This choice of direction is due to the application of pesticides, which are applied longitudinally. This factor allows the calibration of the nozzle positions in the 3D space and enables us to control their state. The evaluation was performed using a Motorola X 2014 XT1097 smartphone, with 1080 × 1920 pixels resolution in portrait orientation. We highlight that the adopted smartphone is a low cost one, on which we achieved approximately real-time performance, i.e., 15 Hz. The slow dynamics of the tractor during spraying and, by consequence, of the boom sprayer allow us to correctly control the spray nozzles.

Several tests were carried out using the smartphone in order to validate the proposed method. Plantation identification was performed in several parts of the crop, with different densities of onion plants. This results in a smaller or larger number of identified plants in the images. Manual adjustment of some parameters can be required in order to improve the plant detection, as shown in the following examples. Fig. 9a shows the results when the k parameter is changed to 0.64 and the parameter C, which represents the minimum number of crossings to form a line, is set to 27. The change in some values can be required due to the difference in plant densities in the image. The results obtained by changing k and C to 0.67 and 26, respectively, are shown in Fig. 9b. These figures show screenshots of the application developed in the present work, called App Green. The Android technology1 is adopted.
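For reference, such an adjustment only changes the arguments passed to the pipeline; the sketch below expresses the two settings of Fig. 9 using the hypothetical helper functions introduced in Section III, with a hypothetical test image standing in for the live camera feed.

```python
import cv2

raw = cv2.imread("field_sample.jpg")    # hypothetical test image
frame = resize_for_processing(raw)      # sketch functions from Sec. III

# Settings of Fig. 9a: k = 0.64, minimum crossing number C = 27.
mask_a = clean_mask(segment_green(frame, k=0.64))
lines_a = plantation_lines(plant_centroids(mask_a), crossings=27)

# Settings of Fig. 9b: k = 0.67 and C = 26.
mask_b = clean_mask(segment_green(frame, k=0.67))
lines_b = plantation_lines(plant_centroids(mask_b), crossings=26)
```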
(a) Using parameters k = 0.64 and C = 27.
(b) Using parameters k = 0.67 and C = 26.
Fig. 9: Comparison between two images with different parameters.

1 http://www.android.com/
V. CONCLUSIONS

This work presents a new approach to optimize and automate pesticide application in plantations, allowing agricultural machines to become more autonomous. Its differential is the use of local information and mobile technologies to perform the acquisition, processing, and analysis of images, and also to control the activation of the sprayer boom nozzles. By using computer vision methods to extract image information, the system presents a satisfactory degree of robustness with a low implementation cost. The results obtained are satisfactory, detecting green and planting lines even under several conditions.

For future work, we will propose improvements to the current detection system in order to allow the identification of a wider range of green intensities in an automatic way. Another color space, such as HSV [28], will also be evaluated. Another direction is to develop an automatic calibration procedure in which the parameters could be adjusted at initialization, without the need for changes during execution.

REFERENCES

[1] R. E. Evenson and D. Gollin, "Assessing the impact of the green revolution, 1960 to 2000," Science, vol. 300, no. 5620, pp. 758–762, 2003. [Online]. Available: http://science.sciencemag.org/content/300/5620/758
[2] W. B. Wheeler, "Role of research and regulation in 50 years of pest management in agriculture prepared for the 50th anniversary of the journal of agricultural and food chemistry," Journal of Agricultural and Food Chemistry, vol. 50, no. 15, pp. 4151–4155, 2002.
[3] L. Carbo, V. Souza, E. F. G. C. Dores, and M. L. Ribeiro, "Determination of pesticides multiresidues in shallow groundwater in a cotton-growing region of Mato Grosso, Brazil," Journal of the Brazilian Chemical Society, vol. 19, pp. 1111–1117, 2008.
[4] H. Sabik, R. Jeannot, and B. Rondeau, "Multiresidue methods using solid-phase extraction techniques for monitoring priority pesticides, including triazines and degradation products, in ground and surface waters," Journal of Chromatography A, vol. 885, no. 1, pp. 217–236, 2000. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0021967399010845
[5] J. Stafford, "Implementing precision agriculture in the 21st century," Silsoe Research Institute, 2000.
[6] M. Norremark, H. W. Griepentrog, J. Nielsen, and H. Sogaard, "The development and assessment of the accuracy of an autonomous GPS-based system for intra-row mechanical weed control in row crops," Biosystems Engineering, vol. 101, no. 4, pp. 396–410, 2008. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S153751100800278X
[7] Y. Nagasaka, H. Saito, K. Tamaki, M. Seki, K. Kobayashi, and K. Taniwaki, "An autonomous rice transplanter guided by global positioning system and inertial measurement unit," Journal of Field Robotics, vol. 26, pp. 537–548, 2009.
[8] L. Emmi, M. Gonzalez-de Soto, G. Pajares, and P. Gonzalez-de Santos, "New trends in robotics for agriculture: Integration and assessment of a real fleet of robots," The Scientific World Journal, p. 21, 2014.
[9] M. T. Batte and M. R. Ehsani, "The economics of precision guidance with auto-boom control for farmer-owned agricultural sprayers," Computers and Electronics in Agriculture, vol. 53, no. 1, pp. 28–44, 2006. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0168169906000366
[10] F. J. Garcia-Ramos, M. Vidal, A. Boné, and A. Serreta, "Methodology for the regulation of boom sprayers operating in circular trajectories," Sensors (Basel, Switzerland), vol. 11, no. 4, pp. 4295–4311, 2011.
[11] ViaRural Brazil, "Tractor mounted boom sprayer," http://br.viarural.com/agricultura/pulverizacao-acessorios/jacto/barras-tres-pontos/barras-condor-pd-01.htm, online; accessed 30 August 2018.
[12] H. Mousazadeh, "A technical review on navigation systems of agricultural autonomous off-road vehicles," Journal of Terramechanics, vol. 50, no. 3, pp. 211–232, 2013. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0022489813000220
[13] E. Kayacan, E. Kayacan, H. Ramon, and W. Saeys, "Towards agrobots: Identification of the yaw dynamics and trajectory tracking of an autonomous tractor," Computers and Electronics in Agriculture, vol. 115, pp. 78–87, 2015. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0168169915001441
[14] S. A. Garcia, J. G. Gil, and J. I. Arribas, "Evaluation of the use of low-cost GPS receivers in the autonomous guidance of agricultural tractors," Spanish Journal of Agricultural Research, vol. 9, no. 2, pp. 377–388, 2011. [Online]. Available: http://revistas.inia.es/index.php/sjar/article/view/1727
[15] R. Berenstein, O. B. Shahar, A. Shapiro, and Y. Edan, "Grape clusters and foliage detection algorithms for autonomous selective vineyard sprayer," Intelligent Service Robotics, vol. 3, no. 4, pp. 233–243, 2010. [Online]. Available: https://doi.org/10.1007/s11370-010-0078-z
[16] J. Bossu, C. Gée, and F. Truchetet, "Development of a machine vision system for a real time precision sprayer," Electronic Letters on Computer Vision and Image Analysis (ELCVIA), vol. 7, no. 3, pp. 54–66, 2008.
[17] H. G. Feichtinger and T. Strohmer, Eds., Gabor Analysis and Algorithms: Theory and Applications, 1st ed. Secaucus, NJ, USA: Birkhauser Boston, Inc., 1997.
[18] J. P. Underwood, M. Calleija, Z. Taylor, C. Hung, J. Nieto, R. Fitch, and S. Sukkarieh, "Real-time target detection and steerable spray for vegetable crops," International Conference on Robotics and Automation (ICRA), Workshop on Robotics in Agriculture, 2015.
[19] G. Bradski and A. Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library. O'Reilly Media, Inc., 2008.
[20] D. M. Woebbecke, G. E. Meyer, K. Von Bargen, and D. A. Mortensen, "Color indices for weed identification under various soil, residue, and lighting conditions," Transactions of the ASAE, vol. 38, no. 1, pp. 259–269, 1995.
[21] X. Yang, H. Beyenal, G. Harkin, and Z. Lewandowski, "Evaluation of biofilm image thresholding methods," Water Research, vol. 35, no. 5, pp. 1149–1158, 2001.
[22] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 3rd ed. Pearson, 2007.
[23] P. Mukhopadhyay and B. B. Chaudhuri, "A survey of Hough transform," Pattern Recognition, vol. 48, no. 3, pp. 993–1010, 2015. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0031320314003446
[24] C. Galamhos, J. Matas, and J. Kittler, "Progressive probabilistic Hough transform for line detection," in IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, 1999, p. 560.
[25] J. Matas, C. Galambos, and J. Kittler, "Robust detection of lines using the progressive probabilistic Hough transform," Computer Vision and Image Understanding, vol. 78, no. 1, pp. 119–137, 2000.
[26] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[27] R. Szeliski, Computer Vision: Algorithms and Applications, 1st ed. Berlin, Heidelberg: Springer-Verlag, 2010.
[28] M. W. Schwarz, W. B. Cowan, and J. C. Beatty, "An experimental comparison of RGB, YIQ, LAB, HSV, and opponent color models," ACM Transactions on Graphics, vol. 6, no. 2, pp. 123–158, 1987. [Online]. Available: http://doi.acm.org/10.1145/31336.31338
