
Article
Automatic Modeling Prediction Method of Nitrogen Content in
Maize Leaves Based on Machine Vision and CNN
Lei Sun 1, *, Chongchong Yang 1 , Jun Wang 2 , Xiwen Cui 1 , Xuesong Suo 1 , Xiaofei Fan 1 , Pengtao Ji 1 , Liang Gao 1
and Yuechen Zhang 1

1 State Key Laboratory of North China Crop Improvement and Regulation, Hebei Agricultural University,
Baoding 071001, China; 17633315534@163.com (C.Y.); cuixiwen0720@163.com (X.C.);
13903120861@163.com (X.S.); fanxiaofei@hebau.edu.cn (X.F.); jipengtao1987@126.com (P.J.);
gl13582397068@163.com (L.G.); zhangyc1964@126.com (Y.Z.)
2 College of Life Sciences, Cangzhou Normal University, Cangzhou 061001, China; czsywangjun@163.com
* Correspondence: slslsl0811@126.com; Tel.: +86-15176382329

Abstract: Existing maize production is grappling with the hurdles of not applying nitrogen fertilizer
accurately due to subpar detection accuracy and responsiveness. This situation presents a significant
challenge, as it has the potential to impact the optimal yield of maize and ultimately, the profit margins
associated with its cultivation. In this study, an automatic modeling prediction method for nitrogen
content in maize leaves was proposed based on machine vision and convolutional neural network.
We developed a program designed to streamline the image preprocessing workflow. This program
can process multiple images in batches, automatically carrying out the necessary preprocessing steps.
Additionally, it integrates an automated training and modeling system that correlates the images
with nitrogen content values. The primary objective of this program is to enhance the accuracy of
the models by leveraging a larger dataset of image samples. Secondly, the fully connected layer
of the convolutional neural network was reconstructed to transform the optimization goal from
classification based on 0–1 tags into regression prediction, so that the model can output numerical
values of nitrogen content. Furthermore, the prediction model of nitrogen content in maize leaves
was gained by training many samples, and samples were collected in three key additional fertilizing
stages throughout the growth period of maize (i.e., jointing stage, bell mouth stage, and tasseling
stage). In addition, the proposed method was compared with the spectral detection method under
full-wave band and characteristic wavelengths. It was verified that our machine vision and CNN
(Convolutional Neural Network)-based method offers a high prediction accuracy rate that is not
only consistently better—by approximately 5% to 45%—than spectral detection approaches but also
features the benefits of easy operation and low cost. This technology can significantly contribute to
the implementation of more precise fertilization practices in maize production, leading to potential
yield optimization and increased profitability.

Keywords: detection method; machine vision; automatic modeling; nitrogen content; maize leaves

Citation: Sun, L.; Yang, C.; Wang, J.; Cui, X.; Suo, X.; Fan, X.; Ji, P.; Gao, L.; Zhang, Y. Automatic Modeling Prediction Method of Nitrogen Content in Maize Leaves Based on Machine Vision and CNN. Agronomy 2024, 14, 124. https://doi.org/10.3390/agronomy14010124

Received: 4 December 2023; Revised: 26 December 2023; Accepted: 29 December 2023; Published: 3 January 2024

Copyright: © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction
Nitrogen (N), as the nutrient element with the largest demand, plays a vital role in the
growth and development of maize, and is also key to characterizing the growth conditions
and photosynthesis of maize plants [1,2]. Nitrogen deficiency leads to short plant height
and slow growth of maize, whereas excessive nitrogen causes lodging and heat injuries
in maize plants. Leaves reflect nitrogen content in maize plants effectively. Real-time,
accurate acquisition of nitrogen content information in maize leaves, as an important
condition for scientific management, is conducive to implementing on-demand fertilization
and increasing the yield and production profits of maize [3–5]. Traditional N content
detection methods in plants, such as the Kjeldahl method and the Dumas method, are
mainly chemical ones [6]. Chemical methods are characterized by high accuracy and reliability.
However, these methods have complicated detection processes and require professionals
in complex operations. Consequently, they are primarily employed as a benchmark to
calibrate other detection methodologies [7]. The most important disadvantage of these
methods is that they are invasive and time-consuming (considerable time lag between
sampling and result acquisition). In recent years, spectral technology has advanced, providing
a new approach to N content detection in maize plants. Spectroscopy and remote sensing
technologies are superior to chemical methods in terms of ease of operation, analysis
speed, and accuracy [8–11]. For example, based on spectral remote-sensing technology,
Dayananda constructed a model for the relationship between N content and the growth
condition of maize plants [12]. Wang Lifeng utilized the successive projections algorithm
(SPA) to identify the characteristic wavelengths that are representative of the biochemical
and biophysical properties of maize leaves specifically during their jointing stage. At these
wavelengths, they constructed a mathematical model relating N content in maize leaves to
spectral reflectivity, which achieved good prediction results: the SPA-based PLS (Partial
Least Squares) model showed strong predictive accuracy, with Rc² = 0.944 and Rp² = 0.749,
while greatly simplifying the model by eliminating 95.07% of the variables [13]. Some imaging
detection methods used in leaf detection include hyperspectral imaging, near-infrared
imaging, and magnetic resonance imaging. However, it should be noted that the equipment
costs for spectroscopy detection methods can be significant, particularly when specialized
equipment is required and operated by trained personnel, while subsequent data analysis
is very complex. As pointed out by Taheri–Garavand, the cost factor is compensated by
employing cameras in the visible portion of the electromagnetic spectrum. However, such
methodology requires not only positioning of the leaf at a specific orientation in relation to
the camera but also defined illumination conditions. Apart from that, the required specific
illumination limits the method applicability to controlled-light environments. Advantages
of this method include affordability/cost-effectiveness, portability, and provision of rapid
measurements [14]. Over the last decade, Convolutional Neural Networks (CNNs) have
been increasingly employed in the plant phenotyping community. They have been very effec-
tive in modeling complicated concepts, owing to their ability to distinguish patterns and
extract regularities from data. Examples include variety identification in seeds [15] as well
as the identification of plant varieties based on the morphological analysis of leaves from
intact plants [16]. Detection of nutrients including N contents in plants based on machine
vision is appreciated increasingly by industrial experts and farmers due to the low cost and
easy operation [17,18].
In N content detection, establishing a model of the relationship between detection
objects and N content is vital to ensure high detection accuracy [19,20]. Traditional model-
ing techniques, such as multiple linear regression (MLR) and backpropagation (BP) neural
networks, increasingly cannot meet researchers' demands for prediction accuracy.
Recently, deep learning based on big data has provided a strong guarantee of accurate
modeling prediction [21–23]. Deep learning algorithms, represented by CNNs, have at-
tracted the extensive attention of experts in various industrial sectors and become widely
applied [24–27]. When training and processing models with large amounts of data, these
methods achieve higher efficiency than traditional machine learning algorithms, such as
fully connected neural networks. Thus, they are highly appreciated by researchers engaging
in maize detection, aiming to improve modeling accuracy through an increase in training
sample size [28–30]. Nevertheless, deep learning algorithms are generally used for classification
and identification, rather than accurate numerical prediction. For example, Lu Hao trained
images of maize tassels in different states by using a deep learning algorithm to recognize
maize development state automatically and provide guidance to irrigation and fertiliza-
tion [31]. If the deep learning algorithm based on CNN can be applied to construct a model
of image and N content data output, the accuracy of such algorithms has the potential to
be enhanced, particularly if the model architecture and training processes are carefully
optimized. However, the degree of improvement may vary depending on specific dataset
characteristics and application contexts. Moreover, these deep learning algorithms tend to
require preprocessing of images (e.g., segmentation and normalization) in the process of
training and even prediction, increasing operation complexity. The sample size is restricted
by the workload of image preprocessing. Thus, modeling accuracy cannot be improved
simply by increasing data size [32].
In order to address the above problems, an automatic modeling prediction method of
nitrogen content in maize leaves based on machine vision and CNN was proposed using
maize leaf samples which were collected by machine vision equipment. In this method,
programs of image preprocessing and data extraction were combined and the model could
be constructed quickly by inputting images. Beyond that, prediction results can be gained
directly by inputting one image. The proposed method simplifies the modeling process
significantly and realizes the goal of improving modeling accuracy through an increase in
sample size. This method can predict N content in maize leaves quickly and accurately at a
lower equipment cost, laying a good foundation to promote the application of the proposed
method in the agricultural field.

2. Materials and Methods
2.1. Experiments and Materials
The experimental field is located in Baoding, Hebei Province, China. Zheng Dan 958
was adopted as the plant cultivar. Our team was responsible for cultivation in 2021. Irrigation
and other work were completed with sensor monitoring. The maize planting area is presented
in Figure 1. A block-based control variable method was used to perform fertilizing and
additional fertilizing. Different doses of nitrogen fertilizer were applied in different sections.
Moreover, N content in maize leaves was controlled to improve modeling accuracy.

Figure 1. Maize planting area.
Our experiment, guided by skilled agricultural staff, divided the experimental field
into 20 test blocks, each with an area of about 120 square meters (roughly 0.2 mu). To simulate
varying fertilization conditions, we established a gradient of nitrogen application ranging
from low to high. The first test block was treated with 1 kg of nitrogen as a starting point.
Subsequently, for each additional block, the amount of nitrogen was increased by 0.2 kg,
culminating with a nitrogen application of 4 kg in the final block. This design was intended
to investigate the effect of varying nitrogen levels on maize growth and effectively collect
maize leaf samples across a spectrum from low to optimal to high nitrogen availability.
In order to ensure that the proposed prediction model can provide accurate detection
data support to additional fertilization of maize plants, about 1000 maize leaf samples were
collected from blocks with different fertilizer contents in three key stages (jointing stage,
bell mouth stage, and tasseling stage), respectively. A total of 3102 maize leaf samples were
collected in the laboratory. The ground data of the samples can be seen in Table 1.

Table 1. Temperature and soil relative humidity record.

Stage               Date    Temperature/°C    Soil Relative Humidity/%
Jointing Stage      4.13    22.5              70
                    4.14    23.7              76
                    4.15    23.3              83
                    ...     ...               ...
Bell mouth Stage    5.07    25.1              75
                    5.08    27.7              78
                    5.09    25.8              86
                    ...     ...               ...
Tasseling Stage     6.19    31.7              76
                    6.20    30.6              81
                    6.21    33.2              78

Each leaf sample was numbered so that the leaf images could be matched to the N content
values measured for those leaves in a professional laboratory. Across all samples, leaf N
content ranged from 17.93 to 26.55 mg·g−1 in the jointing stage, 15.38 to 22.07 mg·g−1 in the
bell mouth stage, and 12.83 to 20.75 mg·g−1 in the tasseling stage.

2.2. Image Acquisition and Pre-Processing


A Canon EOS 70D camera, manufactured by Canon Inc. in Tokyo, Japan, was used as
the image acquisition equipment to predict N content. In our research methodology, the
primary programming and model establishment are conducted using Python 3.6. Due to
its extensive library support and flexibility, Python plays a central role during the data
modeling stage, responsible for automating script writing, constructing, and validating
models. Furthermore, Python’s various data science libraries are extensively utilized in the
data preparation stage to facilitate effective data preprocessing and feature engineering.
When it comes to the post-processing steps of model predictive results, we opt to use
Matlab 2018a for its robust capabilities in mathematical computations, to delve deeper into
the interpretation of model outputs, and to perform necessary statistical tests.
In the collection of maize leaf sample images, the camera was used during sunny daytime
with good lighting conditions, and the whole leaf sample was positioned to fill the field of
view as much as possible, in order to prevent interference with modeling from other objects
in the image, such as other vegetation, soil, and artificial objects. Some images collected by
the camera are presented in Figure 2.

Figure 2. Some images collected by the camera.

2.3. Automatic Modeling Prediction Based on CNN


The classification is performed in the fully connected (FC) layer of the CNN, which
leverages the characteristics of the input data to categorize the images effectively. Based on
classification results, the probability of allocating samples to different classes is calculated.
In this study, the objective function in the last layer of a standard classification model is based on
class labels rather than regression values. The value range of N content in collected leaf
samples was used as data classification labels. After processing these data classification
labels, the target values of different classes that are divided according to the probability of
output vector classification were gained, which are the desired predictions of N content.
In this way, a regression prediction was realized. Among the collected 3102 maize leaf
samples, 80% were used as the training set and 20% were used as the validation set.
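The reconstruction of the FC layer described above can be illustrated with a short sketch. This is not the authors' code; it assumes the Keras API and uses placeholder layer sizes to show how a softmax classification head is swapped for a single linear output so that the network can be trained with a regression loss.

```python
import tensorflow as tf

def regression_head(features):
    """Map extracted CNN features to a single nitrogen-content value."""
    x = tf.keras.layers.Dense(512, activation="relu")(features)  # fully connected layer
    return tf.keras.layers.Dense(1, activation="linear")(x)      # numerical N content output

# A classification head would instead end in Dense(num_classes, activation="softmax")
# trained with cross-entropy; the regression head above is trained with a loss such as MSE.
```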

2.3.1. Automatic Data Processing


In this study, we analyzed maize leaf images as the primary data source. From these
images, we extracted several key parameters: the average grayscale value, the mean of the
RGB (Red, Green, Blue) components, and the standard deviation and coefficient of variation
for the RGB values. These parameters served as the input variables for our predictive
model. The model was tasked with estimating the nitrogen (N) content in the maize leaves,
which we defined as the model’s output variable. The modeling steps are as follows:
• Inputting images collected by a camera.
• Preprocessing the images.
• Inputting images into the program to obtain intermediate data: mean gray value, RGB
component mean, RGB standard deviation, and RGB coefficient of variation. These data
are used as the inputs of the model.
• Training the model by using the model inputs from Step (3) and the laboratory-detected
N content in maize leaves.
Following the data collection in Step (1), we developed an integrated program that
consolidates Steps (2) to (4) into a comprehensive workflow. Consequently, the system is
capable of performing three automated procedures—batch preprocessing, extraction of
model inputs, and construction of models incorporating nitrogen content—once the images
are uploaded into the program. Contrarily, in conventional practices, models typically
rely on a single color channel due to constraints in modeling efficiency, which in turn
compromises accuracy. The method proposed in this paper can not only decrease manual
workload but also realize the goal of improving modeling accuracy by increasing training
samples. So we can use RGB images directly to improve modeling accuracy.
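As a rough sketch of the kind of batch preprocessing and data extraction described above (not the authors' program; OpenCV, NumPy, and the folder path are assumptions), the following computes the four groups of model inputs for every image in a directory.

```python
import glob
import cv2
import numpy as np

def extract_features(image_path):
    """Return mean gray value, RGB means, RGB standard deviations, and RGB coefficients of variation."""
    bgr = cv2.imread(image_path)                                   # OpenCV reads images as BGR
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB).reshape(-1, 3).astype(np.float64)
    means = rgb.mean(axis=0)                                       # per-channel means
    stds = rgb.std(axis=0)                                         # per-channel standard deviations
    cvs = stds / means                                             # coefficients of variation
    return np.concatenate(([gray.mean()], means, stds, cvs))

# Batch extraction: one feature row per leaf image (hypothetical folder name).
feature_matrix = np.array([extract_features(p) for p in sorted(glob.glob("leaf_images/*.jpg"))])
```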
Each maize leaf image was a 200 × 200 pixel RGB image, processed using a 6 × 6 × 3
convolutional kernel (6 × 6 filters with three channels); that is, the kernel had one channel
for each of the three color components, and a 4 × 4 characteristic pattern was generated.
Through gray processing of the RGB images, the average gray value of the maize leaves
was obtained. At this point, the input was reduced to a single channel, from which the
average gray value, RGB mean, RGB component standard deviation, and RGB coefficient of
variation were obtained.

2.3.2. Training and Modeling


CNNs are a class of deep learning algorithms that have shown remarkable success
in areas such as image recognition, classification, and analysis [33]. These networks are
specifically designed to process data that come in the form of multiple arrays, such as 1D
for time series, 2D for images, or 3D for video data. The architecture of CNNs is inspired by
the organization of the animal visual cortex and is particularly adept at capturing spatial
hierarchies in data. The fundamental elements of a CNN include convolutional layers,
pooling layers, and fully connected layers—each contributing toward the feature extraction
and pattern learning capabilities of the network.

In the design of our proposed network, we embraced CNN’s powerful feature ex-
traction and dimensionality reduction mechanisms to handle complex patterns in our
dataset. By leveraging proven CNN methodologies, we stand on the shoulders of decades
of research that have optimized the processing of high-dimensional data. Therefore, the
techniques employed, such as normalization, weight initialization, and pooling, follow a
well-established theoretical framework conducive to CNN architecture and function.
The convolutional structure of the proposed method is as follows. To facilitate a
coherent evaluation of indicators measured in different unit systems, we initially normalized
individual groups of numerical data. This process allowed for the
alignment of sample distributions across a specific range while maintaining compatibility.
Acknowledging the variability in units and scales, standardization of each indicator was
imperative before their assessment could commence. Subsequently, to address the varying
nitrogen content levels amongst maize leaf samples, we transformed the initial values
into a unified numerical range. This normalization was not only crucial to homogenize
the data but also a requisite step before constructing the mathematical model. Finally, we
applied a comprehensive normalization to the entire dataset, ensuring a consistent scale and
range. This pivotal procedure guaranteed that the data comparability and integrity were
upheld, setting a solid foundation for the subsequent initialization of the weight matrix.
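The normalization steps above can be summarized in a minimal sketch (an assumed min-max formulation, since the paper does not give the exact formula): each indicator and the N content values are rescaled to a common [0, 1] range before modeling.

```python
import numpy as np

def min_max_normalize(values):
    """Rescale a group of numerical data to the range [0, 1]."""
    values = np.asarray(values, dtype=np.float64)
    v_min, v_max = values.min(axis=0), values.max(axis=0)
    return (values - v_min) / (v_max - v_min)

# Example with illustrative jointing-stage N contents (reported range 17.93 to 26.55 mg/g).
normalized = min_max_normalize([17.93, 20.10, 23.40, 26.55])
```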
Subsequently, the weight matrix was initialized, with normally distributed noise with a
standard deviation of 0.1 added to improve training accuracy. The bias was initialized with
small positive values to avoid dead nodes: the tf.constant function was used to create a
matrix of shape [1, 151] in which all elements were set to 0.1. In the definition of the pooling
layer, equal strides were used in both directions (strides[1] = strides[2]), meaning that the
kernel window moves the same distance horizontally and vertically during the pooling
operation. With this setting, the pooling operation takes the maximum value within a
2 × 2 filter window and moves by 2 pixels both horizontally and vertically across the image,
downsampling the feature map and reducing its size by half. Because the stride value is the
same in both directions, the downsampling does not alter the aspect ratio of the feature map;
the same relative amount of information is retained in both dimensions. In order to decrease
the number of parameters and thereby reduce the complexity of the system, pooling was
applied for parameter sparsification. Here, maximum pooling was used, with the size and
step length of the pooling kernel set to 2 × 2 and 2, respectively. Through x_image, the
original data were transformed into 6 × 6 two-dimensional images.
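Read literally, the initialization and pooling settings above correspond to small TensorFlow helpers such as the following sketch (written with current TensorFlow ops; the helper names are ours, and only tf.constant is explicitly mentioned in the text).

```python
import tensorflow as tf

def weight_variable(shape):
    # Weights drawn from a normal distribution with a standard deviation of 0.1.
    return tf.Variable(tf.random.truncated_normal(shape, stddev=0.1))

def bias_variable(shape=(1, 151)):
    # Small positive constant (0.1) to avoid dead nodes, e.g. a matrix of shape [1, 151].
    return tf.Variable(tf.constant(0.1, shape=shape))

def max_pool_2x2(x):
    # 2 x 2 max pooling with a stride of 2 in both directions; equal strides halve
    # the feature map while preserving its aspect ratio.
    return tf.nn.max_pool2d(x, ksize=2, strides=2, padding="SAME")
```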
In convolutional neural networks, each convolution kernel usually only processes
single-channel information. Therefore, the original RGB image is grayscale processed
and the number of channels is set to 1. The convolution layer is an important hierarchical
structure used to extract image features. For the parameter setting of the convolutional layer,
selecting the appropriate size of the convolutional kernel, the number of image channels,
and the number of convolutional kernels are the key factors. The following reasons are
mainly taken into account for the parameter setting of the convolutional layer. Convolution
kernel size: The choice of a 2 × 2 convolution kernel size can capture more intricate image
features while avoiding high parameter counts caused by excessively large convolution
kernel sizes. In addition, a 2 × 2 size is used in another pooling layer, which helps maintain
consistency and stability between layers. Image channels: after grayscaling, the three channels
of the original RGB image become redundant. Therefore, setting the number of image
channels to 1 saves training and computing resources, and the value of the
channel does not affect the model’s classification ability. Number of convolution kernels:
The selection of the number of convolution kernels needs to consider the computational
capacity of each kernel and the feature differences among different kernels. In this paper,
choosing 16 and 32 convolution kernels can guarantee the effectiveness and accuracy of the
model while minimizing the complexity of the convolutional layer, and fewer convolution
kernels are also beneficial for reducing overfitting problems. Convolutional kernel and
offset: In the convolutional layer, matrix convolutional operation mainly involves the
calculation of convolutional kernels and offsets. Convolutional kernels are used to capture
features in the image, while offsets can be used to adjust model bias or increase anti-
noise ability. When setting parameters such as convolution kernel size, image channels,
and the number of convolution kernels, it is necessary to adjust the value of the offset
appropriately to ensure the accuracy and generalization ability of the model. Therefore,
the first convolutional layer was added, in which the convolution kernel size, number
of image channels, number of convolution kernels, and corresponding offset were set to
2 × 2, 1, 16, and 16, respectively. Later, the second convolutional layer was added, in which
the convolution kernel size, number of image channels, number of convolution kernels
(multiplied by 16), and corresponding offset were set to 2 × 2, 16, 32, and 32, respectively.
The third layer, namely an FC layer, was added, in which the 4 × 4 × 64 three-dimensional
feature maps were flattened into 512-long one-dimensional arrays. Finally, the output layer
was added, in which each 512-long one-dimensional array was compressed into an array of
length 1; the offset was set to 1.
The Rectified Linear Unit (ReLU) layer implemented nonlinear mapping to the outputs of
the convolutional layers. The calculation formula can be expressed as:

f(x) = max(0, x)    (1)

The structure of the proposed CNN is shown in Figure 3.

Figure 3. The structure of the proposed CNN.
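Putting the layer sizes above together, a high-level Keras approximation might look like the sketch below. The input size, activation placement, pooling positions, and optimizer are assumptions on our part (the paper works with lower-level TensorFlow code), so this is illustrative rather than the authors' implementation; the commented training call mirrors the learning rate, epoch count, and early-stopping rule given in the next paragraph.

```python
import tensorflow as tf

def build_model(input_shape=(200, 200, 1)):      # single-channel (grayscale) input; size assumed
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, (2, 2), activation="relu"),   # first convolutional layer
        tf.keras.layers.MaxPooling2D((2, 2), strides=2),         # 2 x 2 max pooling, stride 2
        tf.keras.layers.Conv2D(32, (2, 2), activation="relu"),   # second convolutional layer
        tf.keras.layers.MaxPooling2D((2, 2), strides=2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(512, activation="relu"),           # fully connected layer
        tf.keras.layers.Dense(1),                                # single regression output (N content)
    ])
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01), loss="mse")
    return model

# Training sketch (x_train/y_train are placeholders): stop if the validation loss does not
# improve for 100 consecutive epochs, with at most 10,000 epochs.
# model = build_model()
# stop = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=100)
# model.fit(x_train, y_train, validation_split=0.2, epochs=10000, callbacks=[stop])
```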
The hyperparameters for initializing the CNN model were carefully selected based on
preliminary experiments aimed at optimizing model performance. The learning rate and
number of training epochs were set at 0.01 and 10,000, respectively. This particular learning
rate was determined after evaluating various values and observing that 0.01 offered a
sufficient rate of convergence without causing instability in the learning process. Furthermore,
to prevent overfitting, we implemented an iteration stop condition utilizing early stopping
based on validation loss: if no improvement in the loss was seen over 100 consecutive epochs
on the validation set, training would terminate. One loss value was output every 100 training
iterations. Sample training results are presented in Figure 4. The loss value decreased
continuously and approached 0 as training progressed, while the accuracy increased
continuously to 100%. This verifies that the proposed method achieves a good modeling effect.

Figure 4. Training results.

The finished model was stored and called directly at the prediction step. It outputs the N
content in maize leaves directly from an input image of the maize leaf to be detected.

2.4. Prediction
In the modeling process of this article, a total of 3102 samples were used for training and
testing. Deep learning was carried out on the training set, with a train-to-test ratio of 8:2. To
measure the performance of the proposed modeling method, 30 new samples were collected
from different blocks in each of the jointing stage, bell mouth stage, and tasseling stage of
maize plants. These 90 new samples were used as the test set and predicted using the
different methods. Prediction accuracy was obtained by comparing laboratory detection
results with the prediction results of the different methods. The calculation formula for
prediction accuracy can be expressed as:

Accuracy = (1 − |yp − yc| / yc) × 100%    (2)

where yp represents the prediction result of a given method and yc represents the laboratory
detection result.
If the predicted result yp is exactly the same as the laboratory test result yc , the accuracy
is 100%. If it is higher or lower than yc , the accuracy will decrease accordingly.
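Equation (2) can be computed directly; the sketch below uses the absolute difference, which matches the statement that predictions above or below the laboratory value both reduce accuracy (our reading of the printed formula), and the example values are hypothetical.

```python
def prediction_accuracy(yp, yc):
    """Prediction accuracy of Equation (2): yp is the predicted N content, yc the laboratory value."""
    return (1 - abs(yp - yc) / yc) * 100

# Example: a prediction of 20.1 mg/g against a laboratory value of 21.0 mg/g.
print(prediction_accuracy(20.1, 21.0))   # about 95.7
```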
In order to verify the advantages of the method proposed in this paper, the spectral
detection method under full-wave band and under the characteristic wavelength of N
content in maize leaves were treated as comparison methods. Prediction accuracies of the
proposed method to samples collected in three growth stages are shown in Figure 5a–c.
The prediction accuracy of the proposed method ranges from 93% to 99% among the
three growth stages. For contrast verification, the above samples were analyzed using the
spectrum detection method. Light reflectance information of maize leaves was collected
with a PSR-1100F spectrometer covering wavelengths from 320 nm to 1100 nm. The prediction
accuracies of the light reflectivity–N content model of maize leaves, constructed with the
MLR method under the full-wave band, for samples in the three stages are shown in
Figure 6a–c (85–95%). The prediction accuracies of the multiple regression model for samples
in the three stages under the characteristic wavelengths of N content in maize leaves are
presented in Figure 7a–c.
In order to analyze the application and convenience of different methods in agricultural
practices, 30 samples were selected from 90 samples randomly. The prediction accuracy of
the proposed model for these 30 samples is shown in Figure 5d. The prediction accuracy of
the spectral detection-MLR model under a full-wave band is presented in Figure 6d. Given
that characteristic wavelengths of N content in maize leaves vary in different stages, three
stages correspond to MLR models under three different characteristic wavelengths. The
characteristic wavelengths were selected according to the introduction in Reference [34].
The characteristic wavelengths are shown in Table 2. The prediction accuracies of MLR
models under characteristic wavelengths to 30 random samples in the jointing stage, bell
mouth stage, and tasseling stage are presented in Figure 7d–f.
Figure 5. Prediction accuracies of the proposed method to samples.

Figure 6. Prediction accuracies of multiple regression models based on full-wave bands.

Figure 7. Prediction accuracies of multiple regression model to samples in three stages under
characteristic wavelength of N content in maize leaves.
Table 2. Characteristic wavelengths of N content in maize leaves in different stages.

Stage               Characteristic Wavelength Number    Characteristic Wavelength/nm
Jointing Stage      7                                   321, 349, 509, 633, 690, 901, 1083
Bell mouth stage    7                                   321, 510, 603, 684, 821, 894, 1076
Heading stage       7                                   323, 344, 529, 610, 690, 764, 854

The mean accuracies of different models to samples in three stages are shown in Table 3.

Table 3. Mean prediction accuracy of different models.

Accuracy (%)                                  Jointing Stage    Bell Mouth Stage    Tasseling Stage    Random Samples
The proposed method                           96.46400          95.75625            95.80349           96.25322
MLR model under full wave band                89.79837          90.59979            91.54138           90.85356
MLR model under characteristic wavelength:
  Jointing stage model                        94.16378          -                   -                  48.69417
  Bell mouth stage model                      -                 94.33698            -                  46.63486
  Tasseling stage model                       -                 -                   94.86209           51.53014

3. Results
The proposed modeling method and experiment process can be seen in Figure 8. A
total of 3102 maize leaf samples had been collected. We collected maize leaf sample images
under sunny conditions to ensure good lighting, angling the camera to capture the entire
leaf and avoid background interference. We developed an automatic method for predicting
leaf nitrogen content using CNN-based machine vision. Our validation involved 90 new
samples from different maize growth stages (jointing, bell mouth, and tasseling), using
30 samples from each to compare various prediction methods.

Figure 8. The proposed modeling method and experiment process.

As shown in Figure 5, the prediction accuracy of the proposed method ranges from 93% to
99% among the three growth stages. It can be observed from Table 3 that the average
prediction accuracy of the proposed method to samples in three stages is about 95%. The
prediction accuracy of the proposed method to random samples is basically consistent with
the prediction accuracy to samples in three stages. However, as revealed in Figure 6, the
prediction accuracy ranges from 85% to 95%, while the average prediction accuracy of the
spectral detection-MLR model under full-wave band to samples in three stages is about 90%.
The prediction accuracy of the spectral detection-MLR model under full-wave band to
random samples is basically consistent with the prediction accuracy to samples in three
stages. As shown in Figure 7, the prediction accuracies of the spectral detection-MLR model
under characteristic wavelength to samples in three stages are relatively high, about 94% on
average. However, its prediction accuracy to random samples fluctuates greatly and the
average prediction accuracy is about 50%.
4. Discussion
The CNN-based machine vision technique we have developed stands out in comparison
to recent research due to its exceptional accuracy and stability across various growth stages of
maize leaves. It achieves a predictive accuracy ranging from 93% to 99% across the jointing,
bell mouth, and tasseling stages, with an average prediction accuracy of approximately 95%. This
is notably superior to methods employed by Silva et al. (2024) [35], which reported only
a modest correlation coefficient of around 0.6 and an MAE (Mean Absolute Error) below
0.5 when predicting chlorophyll content using random forest models, failing to match the
precision achieved by our CNN approach. Additionally, Cao et al. (2021) [36] demonstrated
the effectiveness of combining dimensionality reduction techniques with different regression
methods for hyperspectral data analysis, highlighting the EN-PLSR (Elastic Net Partial Least
Squares Regression) model as the most accurate option, with an R² value of 0.96 and an RMSE
value of 0.19. Despite their promising results, our CNN model offers comparable or even
superior accuracy without requiring complex data preprocessing steps typically associated
with traditional spectral detection and dimensionality reduction methods.
Furthermore, our method consistently demonstrates strong predictive capabilities through-
out different growth stages while maintaining model stability that promises practical applica-
tion benefits in agricultural monitoring tasks. In contrast, although the PLSR model used by
Cao et al. shows high R2 values, its real-world implementation may involve more intricate
data preprocessing steps compared to our CNN approach which directly processes image
data without complicated spectral data dimensionality reduction requirements; thus holding
significant promise for streamlined predictive analysis in agricultural monitoring.
Given that the collected data are within the whole wave band, the spectral detection-MLR
model under the full-wave band does not need modeling for different stages. It is applicable
to samples in different growth stages and random samples. However, it involves various
inputs that are spectral reflectivity under different wavelengths in the full-wave band of the
spectrograph, which increases modeling complexity. Moreover, prediction accuracy declines
due to multicollinearity. Thus, the prediction accuracy fluctuates around 90%.
Characteristic wavelengths of N content in maize leaves are different in the jointing
stage, bell stage, and tasseling stage. Therefore, corresponding models are constructed
for these three stages according to the introduction in Reference [34]. In each stage, light
reflectivity under only 7 characteristic wavelengths is used as the input, determining low
model complexity. Moreover, the constructed models could fully reflect the accuracy of
spectrum detection and achieve high prediction accuracies (about 94%) in the corresponding
stage. The prediction accuracy of the spectral detection-MLR model under characteristic
wavelength is higher than that of the spectral detection-MLR model under full-wave
band. Given that characteristic wavelength is different in different stages, the constructed
models are only effective for samples in the corresponding stage. Therefore, the prediction
accuracies of models for three stages to random samples are relatively low, which is similar
to the random distribution prediction effect. Hence, different characteristic wavelength
models should be constructed for samples in different stages, restricting their applications
to the agricultural field to some extent. Yu et al. (2022) [37] proposed remote sensing of
nitrogen content in rice leaves to realize precision fertilization; however, their hyperspectral
vegetation index approach to rapid inversion requires costly equipment and relies on
indirect detection. Apart from that, Hong et al. [38] proposed a digital imaging detection
method for nitrogen content in cotton leaves, but this method is based on linear regression
and performs poorly when the number of samples is large.
On a commercial scale, an initial capital investment will be required to adopt the employed
approach [39]. Nevertheless, wide-ranging, large-scale commercial applications may provide
high returns through considerable process enhancement and cost reduction. The proposed
method in this paper will collect
images using machine vision. The image information will cover information in the whole
wave band. Therefore, it is not necessary to distinguish samples in different stages. In
comparison to the other two models, the proposed method achieves relatively higher
prediction accuracy (greater than 93% and about 95% on average) for both samples in
specific stages and random samples. The proposed method, which can predict N contents
in maize leaf samples in different stages accurately, is simpler than the spectral detection-
MLR model under characteristic wavelengths.
The accuracy of the proposed method is higher than that of the two spectrum detection
methods. One important reason is that it collects abundant samples as the training set,
but previous image processing and modeling methods have low degrees of automation.
Researchers should preprocess samples before modeling, which requires considerable
workloads. The proposed method integrates image preprocessing and data extraction pro-
grams, which allow automatic batch preprocessing of sample images and direct modeling
by batch input of images. Therefore, it can realize the goal of improving modeling accuracy
through the increase in sample size. Moreover, N content can be detected directly by simply
inputting images. This is convenient to be extensively applied in agricultural practices.
With another advantage of low equipment cost, the proposed method in this paper
only requires a machine vision device. However, the cost of spectrographs presents a
significant barrier, which limits the widespread application of spectrum detection methods
in agriculture. The proposed method is characterized by high accuracy, simple modeling,
low operation complexity, and low construction cost.
For future research, the continuous advancements in machine vision and artificial in-
telligence offer immense potential. Exploring the application of deep learning, particularly
CNNs could enhance the accuracy and efficiency of the proposed method. Furthermore,
expanding the model’s applicability to encompass a wider range of plant health indicators
remains a crucial area for investigation. Researchers might also evaluate integrating this
method with existing agricultural technologies to create a more comprehensive approach
to precision farming. From a commercial perspective, transitioning from research to prac-
tical applications is pivotal. The potential integration of this machine vision system with
drones or portable devices opens up possibilities for on-the-go plant health monitoring and
precision fertilization solutions. This technology has the potential to become an essential
component of an advanced agricultural ecosystem when combined with user-friendly
software interfaces that enable farmers to make data-driven decisions rapidly. Additionally,
lower costs associated with machine vision systems compared to traditional spectrograph
equipment could significantly reduce barriers to entry, making this technology accessible
to a broader range of agricultural businesses. Lastly, incorporating this proposed method
into IoT (Internet of Things) -enabled smart farming systems signifies significant progress
toward automated and intelligent crop management. By leveraging the interconnectivity
provided by IoT, data from machine vision systems can be analyzed and acted upon in
real-time, fostering an environment for enhanced productivity and resource optimization.
In conclusion, while the proposed method already exhibits high accuracy, simple
modeling, and low operation complexity, its full potential lies in the ease with which it
can be scaled up and adapted for future research endeavors, commercial ventures, and
practical agricultural applications. The integration of this method could lead to significant
improvements in crop management and the overall efficiency of agricultural operations,
with substantial economic benefits for the agricultural sector. Therefore, we recommend
a concerted effort to explore these avenues to ensure that the proposed method enjoys a
broad and impactful implementation.

5. Conclusions
Nitrogen, a nutrient element with the largest demands, plays a vital role in the growth
and development of maize. In N content detection, establishing a model of the relationship
between detection objects and N content is crucial for ensuring detection accuracy. In this
study, an automatic modeling prediction method of nitrogen content in maize leaves based
on machine vision and a convolutional neural network was proposed. In this method,
programs of image preprocessing and data extraction were combined and the model could
be constructed quickly by inputting images. Apart from that, prediction results could be
obtained directly by inputting one image. The proposed method simplifies the modeling
process significantly and realizes the goal of improving modeling accuracy through an
increase in sample size.
In order to verify the advantages of the method proposed in this paper, the spectral
detection-MLR method under full-wave band and under the characteristic wavelength of N
content in maize leaves have been treated as comparison methods. Based on the comparison
of the results, it can be seen that the method proposed in this paper is characterized by
high accuracy, low equipment cost, and simple modeling. It may lay a solid foundation
for N content detection in maize leaves and provide strong support for fast and accurate
fertilization as well as high yield and high production profits of maize.

Author Contributions: L.S., X.C., J.W. and C.Y. conceived the idea and proposed the method. L.S.,
X.S., C.Y., J.W. and X.C. contributed to the preparation of equipment and acquisition of data, and
wrote the code and tested the method. L.S., P.J., L.G., Y.Z. and X.F.: validation results. L.S. and X.S.
wrote the paper. L.S., X.S., C.Y., J.W. and X.C. revised the paper. L.S., J.W., X.F. and Y.Z. are considered
co-first authors with equal contributions. All authors have read and agreed to the published version
of the manuscript.
Funding: The study was funded by the National Natural Science Foundation of China (32202474 and
32072572), State Key Laboratory of North China Crop Improvement and Regulation (NCCIR2023ZZ-
19). Hebei Talent Support Foundation (E2019100006), Hebei Modern Agricultural Industry Tech-
nology System—Grains and Beans Industry Innovation team “Quality Improvement and Brand
Cultivation” (HBCT2023050204), the earmarked fund for CARS (CARS-23), and the Science and
Technology Project of Hebei Education Department (QN2020444).
Data Availability Statement: Data will be made available upon request.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Liu, Q. Spatio-temporal changes of fertilization intensity and environmental safety threshold in China. Trans. Chin. Soc. Agric.
Eng. 2017, 33, 214–222.
2. Zhang, Y.; Wang, L.; Bai, Y. Nitrogen nutrition diagnostic based on hyperspectral analysis about different layers leaves in maizes.
Spectrosc. Spectr. Anal. 2019, 39, 2829–2835.
3. Zhang, J.; Tian, H.Q.; Zhao, Z.Y.; Zhang, L.; Zhang, J.; Li, F. Moisture content dectection in silage maizes raw material based on
hyperspectrum and improved discrete particle swarm. Trans. Chin. Soc. Agric. Eng. 2019, 35, 285–293.
4. Guo, W.; Xue, X.; Yang, B.; Zhou, C.; Zhu, X. Non-destructive and rapid Detection Method on Nitrogen Content of Maizes Leaves
Based on Android Mobile Phone. Trans. Chin. Soc. Agric. Eng. 2017, 48, 137–143.
5. Dheri, G.S.; Lal, R.; Verma, S. Effects of Nitrogen Fertilizers on Soil Air Concentration of N2 O and Maizes Growth in a Greenhouse
Study. Taylor J. 2015, 29, 95–105.
6. Gao, F.; Wu, J. Comparison of determining the plant total nitrogen with two methods. Mod. Agric. Sci. Technol. 2012, 2015, 204–205.
(In Chinese)
7. Wang, X.; Yuan, X.; Zhang, X.; Xie, R.; Xiao, W.; Han, L. Analysis of the total nitrogen content of crop residues determined by
using Kjeldahl and Dumas methods. Trans. Chin. Soc. Agric. Eng. 2020, 26, 207–214.
8. Chen, X.; Li, M.; Sun, H.; Yang, W.; Zhang, J.; Mao, B. Rapid determination of moisture content in maizes leaf based on
transmission spectrum. Trans. Chin. Soc. Agric. Eng. 2017, 33, 137–142.
9. Wang, L.; Wei, Y. Progress in inversion of vegetation nitrogen concentration by hyperspectral remote sensing. Spectrosc. Spectr.
Anal. 2013, 33, 2823–2827.
10. Hen, P.; Li, G.; Shi, Y.; Xu, Z.; Yang, F.; Cao, Q. Validation of an unmanned aerial vehicle hyperspectral sensor and its application
in maizes leaf area index estimation. Sci. Agric. Sin. 2018, 51, 1464–1474.
11. Zhang, S.; Zhang, J.; Bai, Y.; Xun, L.; Wang, J.; Zhang, D.; Yang, S.; Yuan, J. Developing a Method to Estimate Maizes Area
in North and Northeast of China Combining Crop Phenology Information and Time-Series MODIS EVI. IEEE Access 2019, 7,
144861–144873. [CrossRef]
12. Dayananda, S.; Astor, T.; Wijesingha, J. Multi-temporal monsoon crop biomass estimation using hyperspectral imaging. Remote
Sens. 2019, 11, 1771. [CrossRef]
13. Wang, L.; Zhang, C.; Zhao, Y. Detection Model of Nitrogen Content in Maizes Leaves Based on Hyperspectral Imaging. Res. Agric.
Mech. 2017, 2017, 140–147.
14. Taheri-Garavand, A.; Rezaei Nejad, A.; Fanourakis, D.; Fatahi, S.; Ahmadi Majd, M. Employment of artificial neural networks for
non-invasive estimation of leaf water status using color features: A case study in Spathiphyllum wallisii. Acta Physiol. Plant. 2021,
43, 78. [CrossRef]
15. Taheri-Garavand, A.; Nasiri, A.; Fanourakis, D.; Fatahi, S.; Omid, M.; Nikoloudakis, N. Automated In Situ Seed Variety
Identification via Deep Learning: A Case Study in Chickpea. Plants 2021, 10, 1406. [CrossRef] [PubMed]
16. Nasiri, A.; Taheri-Garavand, A.; Fanourakis, D.; Zhang, Y.-D.; Nikoloudakis, N. Automated Grapevine Cultivar Identification via
Leaf Imaging and Deep Convolutional Neural Networks: A Proof-of-Concept Study Employing Primary Iranian Varieties. Plants
2021, 10, 1628. [CrossRef] [PubMed]
17. Liu, S.; Yang, G.; Jing, H. Retrieval of winter wheat nitrogen content based on UAV digital image. Trans. Chin. Soc. Agric. Eng.
2019, 35, 75–85.
18. Zhang, L.; Wang, D.; Zhang, Y.; Cheng, Y.; Li, H.; Hu, C. Diagnosis of N nutrient status of maizes using digital image processing
technique. Chin. J. Eco-Agric. 2021, 18, 1340–1344. (In Chinese) [CrossRef]
19. Dong, Z.; Yang, D.; Zhu, H.; Guo, Q.; Wang, Z.; Bai, J. Hyperspectral Estimation of SPAD Value in Maizes Leaves Based on
Continuous Projection Algorithm and BP Neural Network. J. Shanxi Agric. Sci. 2019, 47, 751–755.
20. Wei, P.; Xu, X.; Li, Z. Remote sensing estimation of nitrogen content in summer maizes leaves based on multispectral images of
UAV. Trans. Chin. Soc. Agric. Eng. 2019, 35, 126–134.
21. Fan, L.L.; Zhao, H.W.; Zhao, H.Y.; Hu, H.; Wang, Z. Survey of target detection based on deep convolutional neural networks. Opt.
Precis. Eng. 2020, 28, 1152–1164.
22. Ali, M.; Gilani, S.O.; Waris, A.; Zafar, K.; Jamil, M. Brain Tumour Image Segmentation Using Deep Networks. IEEE Access 2020, 8,
153589–153598. [CrossRef]
23. Bui, T.-A.; Lee, P.-J.; Lum, K.-Y.; Loh, C.; Tan, K. Deep Learning for Landslide Recognition in Satellite Architecture. IEEE Access
2020, 8, 143665–143678. [CrossRef]
24. Ma, Y.; Liu, P. The evolutionary design of convolutional neural net work for image classification. J. Northwest Norm. Univ. 2020,
56, 55–61.
25. Cao, W.; Wu, R.; Cao, G.; He, Z. A Comprehensive Review of Computer-Aided Diagnosis of Pulmonary Nodules Based on
Computed Tomography Scans. IEEE Access 2020, 8, 154007–154023. [CrossRef]
26. Mao, Y.; Zhou, H.; Gui, X.; Shen, J. Exploring Convolution Neural Network for Branch Prediction. IEEE Access 2020, 8, 152008–
152016. [CrossRef]
27. Gao, Y.; Wang, S.Q.; Li, J.H.; Hu, M.Q.; Xia, H.Y.; Hu, H.; Wang, L.J. A Prediction Method of Localizability Based on Deep
Learning. IEEE Access 2020, 8, 110103–110115. [CrossRef]
28. Liu, Y.; Cen, C.; Che, Y.; Ke, R.; Ma, Y.; Ma, Y. Detection of Maizes Tassels from UAV RGB Imagery with Faster R-CNN. Remote
Sens. 2020, 12, 338. [CrossRef]
29. Quan, L.; Feng, H.; Lv, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maizes seedling detection under different growth stages and
complex field environments based on an improved Faster R-CNN. Biosyst. Eng. 2019, 184, 1–23. [CrossRef]
30. Ahila Priyadharshini, R.; Arivazhagan, S.; Arun, M.; Mirnalini, A. Maizes leaf disease classification using deep convolutional
neural networks. Neural Comput. Appl. 2019, 31, 8887–8895. [CrossRef]
31. Lu, H. Image Understanding and Analysis in Automated Growth Status Observation of Maizes Tassels. Ph.D. Thesis, Huazhong
University of Science and Technology, Wuhan, China, 2018.
32. Öztürk, Ş.; Akdemir, B. Effects of Histopathological Image Pre-processing on Convolutional Neural Networks. Procedia Comput.
Sci. 2018, 132, 396–403. [CrossRef]
33. Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86,
2278–2324. [CrossRef]
34. Liu, D. Nitrogen Content Detection of Maizes Leaves Based on Spectral Analysis and Image Technology. Master’s Thesis, Hebei
Agricultural University, Baoding, China, 2020.
35. da Silva, B.C.; Prado, R.d.M.; Baio, F.H.R.; Campos, C.N.S.; Teodoro, L.P.R.; Teodoro, P.E.; Santana, D.C.; Fernandes, T.F.S.; da
Silva, C.A., Jr.; Loureiro, E.d.S. New approach for predicting nitrogen and pigments in maize from hyperspectral data and
machine learning models. Remote Sens. Appl. Soc. Environ. 2024, 33, 101110. [CrossRef]
36. Cao, C.; Wang, T.; Gao, M.; Li, Y.; Li, D.; Zhang, H. Hyperspectral inversion of nitrogen content in maize leaves based on different
dimensionality reduction algorithms. Comput. Electron. Agric. 2021, 190, 106461. [CrossRef]
37. Yu, F.H.; Xing, S.M.; Guo, Z.H.; Bai, J.C.; Xu, T.Y. Remote sensing inversion of the nitrogen content in rice leaves using character
transfer vegetation index. Trans. Chin. Soc. Agric. Eng. 2022, 38, 175–182.
38. Hong, B.; Zhang, Z.; Zhang, Q. The Nitrogen Content in Cotton Leaves: Estimation Based on Digital Image. Chin. Agric. Sci. Bull.
2022, 38, 49–55.
39. Taheri-Garavand, A.; Mumivand, H.; Fanourakis, D.; Fatahi, S.; Taghipour, S. An artificial neural network approach for non-
invasive estimation of essential oil content and composition through considering drying processing factors: A case study in
Mentha aquatica. Ind. Crops Prod. 2021, 171, 113985. [CrossRef]

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.
