
Advancing Plant Phenotyping through Conventional RGB and Hyperspectral Imaging: A Case Study on Maize Plants

Neha B. Chavan1, Diksha R. Pawar2, Dr. Ramesh Manza3

1,2Research Scholar, 3Professor,
Department of CS and IT, Dr. Babasaheb Ambedkar Marathwada University,
Aurangabad (Maharashtra), India.

Email: nc46317@gmail.com1, dikshasalunke97@gmail.com2, manzaramesh@gmail.com3

____________________________________________________________________

Abstract: Plants hold a vital role in sustaining life on Earth, impacting human health,
ecosystems, and environmental processes. Addressing food security challenges is essential to
achieving the Sustainable Development Goals, prompting researchers to explore strategies such as
Site-Specific Management, Precision Agriculture, and plant breeding. These approaches,
empowered by DNA sequencing and genotyping, demand extensive field trials and
phenotypic measurements, a labor-intensive process [2][4][5]. To enhance efficiency, this
study leverages computer science techniques, including Computer Vision, Machine Learning,
Image Processing, Remote Sensing, and Geographic Information Systems, to analyze maize
plants, a globally significant crop. The study reviews various remote sensing techniques used
for plant analysis, such as Multispectral Imaging, Hyperspectral Imaging, LiDAR, and Thermal
Imaging [2]. Hyperspectral imaging, in particular, captures data across a wide
spectrum, offering valuable insights into plant characteristics based on reflectance patterns.

Two experiments were conducted using the UNL plant phenotyping dataset. The first
experiment focused on RGB image processing, involving plant trait measurements such as
height, width, and size. Linear regression analysis revealed genotype and greenhouse
position effects on these traits. The second experiment employed hyperspectral imaging,
utilizing the Normalized Difference Vegetation Index and stem-leaf segmentation to isolate
regions of interest (ROIs). Principal Component Analysis was used to visualize data variations
across different wavelengths. This research underscores the importance of plant analysis,
especially in maize cultivation, and demonstrates the potential of computer science techniques
to automate and enhance phenotyping processes. These advancements are crucial for
addressing food security challenges and ensuring sustainable agriculture practices.

____________________________________________________________________

Keywords: RGB image, Hyperspectral image, Image processing, Feature extraction and
selection, HSI classifier

1. Introduction:
According to the World Health Organization, plants are vital not only for human sustenance and
well-being but also for a wide range of organisms, ecosystems, and environmental processes [i].
Plants support our physical health and also play a role in our mental, social, cultural, economic,
and environmental well-being. Worries regarding food security have emerged as a significant
hurdle for global agricultural production; consequently, the United Nations has officially
designated it as one of the 17 Sustainable Development Goals [1][2]. To tackle the issue of
balanced food production, researchers have dedicated their efforts to strategies such as Site-
Specific Management (SSM), Precision Agriculture (PA), and plant breeding. These
approaches have effectively doubled productivity while also being environmentally
conscious, yielding increased output with reduced resource inputs [2][3].

A fundamental aspect of the green revolution involved the development of novel strains of
key grain crops using traditional phenotypic selection techniques, aiming for increased yield
potential. Since the advent of the green revolution, the demand for food has continued to rise,
prompting extensive endeavours in both public and private sectors towards the creation of
crop varieties with even greater yield potential [4]. Plant breeders are actively using DNA
sequencing, which has reduced the cost of genotyping in crop development [5]. To analyse the genetic
connection with the performance of each variety, hundreds of field trials must be conducted
in real-world environments, and key traits must be measured in different locations and at
different times [2][6]. Yield-based phenotyping consumes a significant amount of human
labor, which could potentially be replaced by data obtained from remote sensing
platforms capable of providing the required efficiency for extensive experiments.

Maize (Zea mays L.) holds a position of utmost significance on a global scale due to its
critical role in agriculture, economic stability, and ensuring food security [2]. Maize
cultivation is widespread across the world, and in India the highest production is usually
concentrated in states such as Karnataka, Andhra Pradesh, Telangana, and Maharashtra [ii]. The
overall maize cultivation area in India experiences fluctuations from year to year, influenced
by factors such as weather patterns and market requirements. Over the years, maize
production in India has demonstrated consistent growth due to its diverse applications and
increasing demand. Nevertheless, maize farming in India encounters challenges related to
unpredictable weather conditions, pest infestations, and disease control. It is of utmost
importance to establish a stable and competitive market environment for maize farmers to
ensure the continued expansion of this crop.

Computer science techniques can play a significant role in automating and enhancing the
analysis process. Computer Vision, Machine Learning, Image Processing, Remote Sensing,
and Geographic Information Systems (GIS) are some popular approaches used in maize plant
analysis that simplify this work. Using these approaches, we can examine plant
development and phenotypic characteristics, predominantly centred on traits relevant to
harvest analysis [2]. The primary challenge in contemporary plant breeding lies in the process
of phenotyping. Phenotyping serves a dual purpose in this context. Initially, when applied to
a wide array of plant lines, it enables a plant breeder to discern which lines exhibit the most
promising traits, such as high yield potential or enhanced stress tolerance, within a specific
environmental context. Additionally, by accumulating comprehensive phenotyping data from
a sufficiently diverse set of plants, it becomes possible to integrate this information with
genotypic data to pinpoint specific regions of a plant species' genome harbouring
advantageous or disadvantageous alleles [4].
To deal with these challenges, we build on recent pilot studies that have explored the use
of various image-processing techniques to derive phenotypic measurements from crop plants.
We used computer vision-based plant phenotyping with RGB camera technology and hyperspectral
image processing [7][8], which support applications such as high-throughput plant phenotyping.
These camera technologies have primarily been used in investigations of
how plants respond to various abiotic stresses [4]. The main objective of this research is to
extract features from hyperspectral data to provide useful information for measuring plant
height, weight, projected plant area, stem diameter, number of leaves, and leaf area. This
information equips farmers and managers with the insights needed to discern how plants
respond to their local surroundings, enabling timely detection, diagnosis, and corrective
actions for agricultural management challenges.

2. Literature review:
2.1. Remote sensing:

Several remote sensing techniques are commonly used for plant analysis, each offering
unique advantages for different aspects of plant research. The most popular techniques include
Multispectral Imaging, Hyperspectral Imaging, LiDAR (Light Detection and Ranging), Thermal
Imaging, Fluorescence Imaging, Radar Imaging, UAV (Unmanned Aerial Vehicle) Remote Sensing,
Visible and Near-Infrared (VNIR) Spectroscopy, Satellite Remote Sensing, and Ground-Based
Spectro-radiometry [2].

Remote sensing, a technology enabling non-contact observation, data collection, and analysis,
plays a crucial role in agriculture by connecting optical properties of plants to their
morphological characteristics. This approach aids in identifying spectral features related to
various aspects of plant growth, including growing conditions, nutrient status, pest and disease
presence, and yield prediction [1][9][10]. It combines ground-based measurements with data
collected from sensors mounted on various platforms such as vehicles, Unmanned Aerial
Systems (UAS), manned aircraft, and even spaceborne satellites. While spaceborne hyperspectral
missions have been limited, medium-resolution satellite imagery such as that from EO-1's
Hyperion instrument has been used for land monitoring and agriculture [2]. Challenges include
low revisit times and cloud cover during satellite passes, leading to the increasing use of
hyperspectral cameras on manned aircraft, helicopters, and zeppelins for flexible and reliable
data acquisition [2].

AVIRIS (Airborne Visible/Infrared Imaging Spectrometer), a NASA hyperspectral sensor,
captures images with a 4 m pixel size and 224 bands featuring a spectral resolution of 10 nm,
spanning from 380 nm to 2500 nm, but it requires manual operation and has a high cost [2][11]. UAV
(Unmanned Aerial Vehicle) remote sensing equips farmers and managers with the insights
needed to discern how plants respond to their local surroundings, enabling timely detection,
diagnosis, and corrective actions for agricultural management challenges [10]. However, UAV
systems face limitations related to payload capacity, power demands, the need for technical
expertise in flight operations (particularly for advanced sensors), and the management of
substantial data volumes for storage, processing, and analysis [2]. Wheel-based systems
capture data at the plot scale by utilizing advanced global positioning systems (GPS),
yielding data with both high spatial and spectral resolutions. Challenges associated with these
systems include relatively slow platform speeds, hindering simultaneous data acquisition over
large fields for plot comparisons, as well as issues related to wet soils, soil compaction, and
platform/sensor vibrations due to varying terrain [2][12].
2.2 Hyperspectral Imaging (HSI):

Electromagnetic energy, spanning various wavelengths, interacts with Earth's surface objects
through absorption, transmission, and reflection processes. These interactions create a unique
spectral signature, revealing the object's characteristics and chemistry. Hyperspectral
imaging (HSI) sensors typically acquire data across the visible and near-infrared (VNIR) as
well as the short-wave infrared (SWIR) spectrum, encompassing wavelengths from
approximately 400 nm to 2500 nm. These sensors utilize narrow bands (<10 nm) to produce a
continuous and contiguous data cube [2]. The way plants reflect light is influenced by their
physical and chemical characteristics, which can change depending on the type of plant, the
amount of water in their tissues, and their growth stage. When it comes to remote sensing of
vegetation, passive sensors are commonly used to capture how plant canopies reflect light.
The visible reflectance of leaves and canopies is mainly determined by plant pigments such as
chlorophylls, carotenoids, and anthocyanins. The patterns of reflectance in the
red edge and near-infrared regions are distinctive features resulting from the absorption
patterns of chlorophyll and the way light scatters within leaves [2].

2.2.1 Feature Extraction and Selection:

Various techniques have been developed to address feature space management, reduce data
dimensionality, and extract more meaningful information from hyperspectral imaging (HSI)
data while reducing computational complexity [2]. Feature selection methods encompass
Jeffries–Matusita (J–M) distance, Bhattacharyya distance, Mutual Information (MI), and
signal-noise ratio [2][13]. Feature extraction techniques involve both knowledge-based
approaches like vegetation indices and statistical methods, which can be unsupervised (linear
or nonlinear) or supervised (parametric or nonparametric) combinations of the original
features. Common unsupervised methods include linear approaches such as principal
component analysis (PCA), independent component analysis (ICA), minimum noise fraction
(MNF), and nonlinear methods like isometric feature mapping (ISOMAP) and locally linear
embedding (LLE). Additionally, supervised feature extraction techniques encompass linear
discriminant analysis (LDA), local Fisher discriminant analysis, and nonparametric
discriminant analysis (NDA) [2][13].
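As an illustration of the unsupervised, linear feature-extraction route mentioned above, the following minimal Python sketch applies PCA to a hyperspectral cube and keeps the leading components per pixel. It is not tied to any particular dataset: the synthetic cube, its dimensions, and the component count are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_reduce(cube, n_components=10):
    """Project each pixel spectrum of a (rows, cols, bands) cube onto its leading principal components."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)            # one spectrum per row
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(pixels)          # unsupervised, linear feature extraction
    print("variance retained:", pca.explained_variance_ratio_.sum())
    return scores.reshape(rows, cols, n_components)

# Illustrative only: a synthetic cube with 243 bands, as in the dataset used later
cube = np.random.rand(100, 100, 243).astype(np.float32)
reduced = pca_reduce(cube, n_components=10)     # (100, 100, 10) feature image
```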

2.2.2 HSI classification/clustering:

Various algorithms are employed to classify data, extract valuable information, and assign
labels to images related to specific land cover categories. These methods aim to exploit
distinct patterns or spectral signatures associated with particular phenomena, which are
subsequently linked to specific classes [2][13]. In agriculture, passive sensors are used to
measure the reflectance of canopies and leaves, providing essential reflectance data [2][14].
Machine learning encompasses a wide array of tools for data management and analysis,
offering solutions ranging from predictions to inferences. In the context of image
classification, unsupervised methods identify coherent groups within the dataset, while
supervised classification algorithms, through iterative learning from labelled data (training
data), establish mappings for classes with intricate attributes [2][15]. Some of the popular
classifiers used in HSI are Spectral Angle Mapper (SAM), Support Vector Machines (SVM),
Random Forest, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA),
Maximum Likelihood Classifier, K-Nearest Neighbors (K-NN), Neural Networks, and
Endmember Extraction with Linear Unmixing.
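Among the classifiers listed, the Spectral Angle Mapper is simple enough to sketch directly. The following is a minimal, illustrative implementation, assuming reflectance data in a (rows, cols, bands) array and a small set of user-supplied reference (endmember) spectra; it is not a production classifier.

```python
import numpy as np

def spectral_angle_mapper(cube, references):
    """Label each pixel with the reference spectrum that has the smallest spectral angle.

    cube:       (rows, cols, bands) reflectance array
    references: (n_classes, bands) array of reference (endmember) spectra
    """
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    # cosine of the angle between every pixel spectrum and every reference spectrum
    dots = pixels @ references.T                                     # (n_pixels, n_classes)
    norms = np.linalg.norm(pixels, axis=1, keepdims=True) * np.linalg.norm(references, axis=1)
    angles = np.arccos(np.clip(dots / norms, -1.0, 1.0))
    labels = np.argmin(angles, axis=1)                               # smallest angle = best match
    return labels.reshape(rows, cols)
```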
3. EXPERIMENTAL METHODOLOGY
3.1 Dataset:

3.1.i. UNL plant phenotyping dataset

For the experimental work, the UNL plant phenotyping dataset is used. Dr. Sruti Das Choudhury
has provided this open-access maize plant dataset (https://plantvision.unl.edu/dataset). As
shown in Fig. 1, the dataset consists of images captured using the LemnaTec Scanalyzer 3D High
Throughput Plant Phenotyping facility located at the University of Nebraska-Lincoln (UNL),
USA [iv]. These datasets are primarily used for advancing research in plant phenotyping,
which involves the detailed quantification and analysis of a plant's physical and physiological
traits.

Fig.1 a) Side view camera, b) Top view camera.

The UNL Plant Phenotyping Datasets encompass a diverse array of plant-centric data,
comprising high-resolution imagery, sensor-generated data, detailed environmental
parameters, and comprehensive metadata pertaining to the subject plants under investigation.
In many studies, scientists harness these datasets to delve into the genetic diversity and
variability inherent in crop species, enabling the identification of traits conducive to increased
yields, heightened stress resilience, and enhanced resistance to diseases [iv][v]. Such insights
play a pivotal role in breeding initiatives dedicated to the cultivation of hardier and more
productive crop varieties. Additionally, these datasets are instrumental in the creation of
algorithms and tools geared towards expediting the phenotyping process, ultimately
enhancing its efficiency and precision, which is why we selected this dataset for our
research.

3.1.ii. Data selection

Every day, the plants were subjected to imaging procedures conducted in four distinct
chambers, each equipped with a different type of camera. These cameras encompassed
thermal infrared, fluorescence, traditional RGB (Red, Green, Blue), and hyperspectral
technologies, and a total of 500 GB of image data was generated [4][16]. However, in our research
we used images from only two cameras (RGB and hyperspectral) and selected a small subset of
images so that results could be obtained quickly.

With the RGB camera, the plants were captured from two side angles 90 degrees apart (0 and
90 degrees), in addition to a view from directly above, with a resolution
of 2454 × 2056 pixels. The RGB camera was employed to acquire images at two different
zoom levels. Over the first 27 days, the camera operated at a reduced zoom level, with each
pixel equating to a length of around 1.507 mm at the camera's distance from the pot; in the last
5 days, the camera settings were modified (an approximate 2x change in zoom) so that each
pixel represented 0.746 mm at the camera's distance from the pot [4]. Specifically,
each side view consists of 32 days' worth of images, shown in Fig. 2, totalling 96 RGB
images we used for RGB image processing.

Fig. 2 Vegetative stage of the growing maize plant, captured by the side-view camera (0 degrees)
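Because the pixel-to-millimetre calibration differs between the two zoom settings, converting pixel measurements to physical units must account for the imaging day. The helper below is a minimal sketch based on the calibration values quoted above; the day-27 cutoff and the function name are our assumptions, not part of the dataset documentation.

```python
def pixels_to_mm(pixel_length, day):
    """Convert a length measured in pixels (e.g. plant height) to millimetres.

    Calibration from the dataset description: ~1.507 mm per pixel for the first
    27 imaging days and ~0.746 mm per pixel for the final 5 days (assumed cutoff).
    """
    mm_per_pixel = 1.507 if day <= 27 else 0.746
    return pixel_length * mm_per_pixel

# e.g. a plant measuring 430 pixels tall on imaging day 30
height_mm = pixels_to_mm(430, day=30)
```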

The hyperspectral camera recorded images at a resolution of 320 horizontal pixels, with the
vertical resolution varying between 494 and 499 pixels. The imaging process utilized halogen
bulbs for illumination, specifically the Sylvania model # ES50 HM UK 240V 35W 25°
GU10. For each pixel, a total of 243 distinct intensity values were recorded, as shown in Fig. 3,
covering a spectrum of light wavelengths spanning from 546 nm to 1700 nm. Each of these
wavelength-specific datasets was saved as an individual grayscale image [4].
Fig. 3 Grayscale images recording the 243 distinct intensity values per pixel
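Since each wavelength is stored as a separate grayscale image, a hyperspectral cube for one plant can be reconstructed by stacking those images band by band. The sketch below assumes hypothetical per-band PNG files sorted by wavelength; the actual file naming and format in the dataset may differ.

```python
import glob
import numpy as np
from PIL import Image

def load_hyperspectral_cube(folder):
    """Stack one plant's per-wavelength grayscale images into a (rows, cols, bands) cube."""
    paths = sorted(glob.glob(f"{folder}/*.png"))     # one grayscale image per wavelength band
    bands = [np.asarray(Image.open(p), dtype=np.float32) for p in paths]
    return np.dstack(bands)                          # expected shape: (rows, cols, 243)

# cube = load_hyperspectral_cube("plant_001/hyperspectral")   # hypothetical folder layout
```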

3.2 Experiments with Results


3.2.I.i) Experiment 1: RGB image processing:

To isolate portions of the plant within RGB images, a segmentation process was applied
based on a green index formula, (2×G)/(R+B); the resulting plot is shown in Fig. 4. Pixels
with a computed index value exceeding 1.15 were identified as plant pixels, an approach we
adopted from [4][16]. This approach resulted in false-positive plant pixel
identifications within the reflective metal columns situated at the image's perimeter. In order
to mitigate the influence of these false positives, those regions were intentionally omitted from the
analysis. Consequently, when plant leaves extended across the reflective metal frame, certain
genuine plant pixels were inadvertently excluded. In instances where no plant pixels were
detected in the image, commonly observed during the initial days when the plant had either
not yet germinated or had not risen above the pot, the recorded value was marked as "NA" in
the output file.

Fig.4 RGB image plot using Green Index formula
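A minimal sketch of the green-index thresholding described above is given below, assuming an RGB array with channels in R, G, B order. The small epsilon added to the denominator (to avoid division by zero) and the omission of the border-exclusion step are our simplifications, not part of the original procedure.

```python
import numpy as np

def green_index_mask(rgb, threshold=1.15):
    """Return a boolean plant mask from the green index (2*G)/(R+B)."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    green_index = (2.0 * g) / (r + b + 1e-6)     # epsilon avoids division by zero on dark pixels
    return green_index > threshold

# projected plant area in pixels for one image:
# plant_area = green_index_mask(image).sum()
```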


Heritability evaluation was conducted using a linear regression model to examine the impact
of genotype and greenhouse position on plant features. In this analysis, genotype ZL019,
which necessitated replication, was deliberately omitted. The responses were treated
individually for each day and were modelled as follows:

Y_{h,ij,t} = M_{h,t} + G_{h,i,t} + Gp_{h,ν(i,j),t} + E_{h,ij,t}

In this equation:

• Y_{h,ij,t} represents the response variable for trait h (e.g. height) on day t for genotype i in greenhouse position j.
• M_{h,t} is the overall mean for the trait on day t.
• G_{h,i,t} represents the genotype effect for genotype i on day t.
• Gp_{h,ν(i,j),t} is the effect of the greenhouse position j nested within genotype i on day t.
• E_{h,ij,t} represents the error term or residual for the model.

Included within the analysis were three distinct traits derived from the images: plant height,
width, and size, all observed from two different perspectives, namely, 0 and 90 degrees.
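To illustrate how such a per-day, per-trait model can be fitted, the sketch below uses Python's statsmodels formula interface. It is a minimal sketch, not the original analysis code: the data frame and its column names (day, genotype, position, height) are assumed for illustration, and the nested greenhouse-position effect is expressed as a genotype-by-position interaction term.

```python
import statsmodels.formula.api as smf

# df is assumed to hold one row per plant per imaging day with columns
# 'day', 'genotype', 'position', and the trait of interest (e.g. 'height').
def fit_daily_trait_model(df, trait="height", day=10):
    """Fit the per-day model: trait ~ genotype + greenhouse position nested within genotype."""
    day_df = df[df["day"] == day]
    formula = f"{trait} ~ C(genotype) + C(genotype):C(position)"
    return smf.ols(formula, data=day_df).fit()

# result = fit_daily_trait_model(df, trait="height", day=10)
# print(result.summary())   # genotype and nested position effects for that day
```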

3.2.I.ii) Result Analysis for Experiment1:

In this analysis, we employed a linear regression model to explore the connection between
two key plant attributes: Plant Height and Plant Width, and their influence on the Projected
Plant Area. Our primary aim was to gain a comprehensive understanding of how these two
independent variables impact the dependent variable, Projected Plant Area. The model's
outcomes revealed crucial coefficients that shed light on the linear associations within this
botanical context. Specifically, the coefficient for Plant Height was determined to be
[Coefficient_Height], and for Plant Width, it was [Coefficient_Width]. Additionally, the
model's intercept was calculated as [Intercept] shown in Fig.5 plot. These coefficients offer
valuable insights into the linear interplay between the independent variables (Plant Height
and Plant Width) and the dependent variable (Projected Plant Area).

Fig.5 Linear Regression plot


Moreover, our analysis extended to making predictions using the trained linear regression
model. We predicted the Projected Plant Area for a hypothetical data point featuring Plant
Height of 500 units and Plant Width of 400 units, yielding a predicted Projected Plant Area of
[Predicted_Area]. Furthermore, we applied the model to the existing dataset, generating
predictions and visualizing the outcomes with regression lines. The scatter plot showcased
actual data points for both Plant Height and Plant Width against the Projected Plant Area,
denoted by blue and red points, respectively. Concurrently, the green and orange regression
lines illustrated the relationships between each independent variable and the Projected Plant
Area, offering a comprehensive visualization of these associations. This analysis provides
valuable insights into the dynamics of Plant Height and Plant Width in relation to Projected
Plant Area, with potential implications for plant-related research and decision-making
processes.
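The regression described above can be reproduced in outline with scikit-learn, as in the sketch below. The toy height/width/area values are placeholders rather than measurements from the dataset, and the bracketed coefficients in the text are intentionally left unfilled.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# X holds [plant_height, plant_width] per observation; y is the projected plant area (toy values)
X = np.array([[320, 210], [450, 300], [510, 340], [600, 410]], dtype=float)
y = np.array([15000, 31000, 38000, 52000], dtype=float)

model = LinearRegression().fit(X, y)
print("coefficients (height, width):", model.coef_)
print("intercept:", model.intercept_)

# predicted projected area for a plant 500 units tall and 400 units wide
predicted_area = model.predict([[500, 400]])
print("predicted projected plant area:", predicted_area[0])
```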

3.2.II.i) Experiment 2: Hyperspectral image processing

In the process of hyperspectral image analysis, two distinct methods with specific thresholds
were employed to isolate regions of interest (ROIs) corresponding to the plants:

1. Normalized Difference Vegetation Index (NDVI) Method: The first method
utilized the well-established NDVI formula, which computes the index for each pixel
using the expression (R750nm - R705nm) / (R750nm + R705nm). Pixels with an
NDVI value exceeding 0.25 were identified as originating from the plant,
as shown in Fig. 6(a) [4][17].
2. Stem and Leaf Segmentation Method: The second method focused on the
differentiation in reflectance between the stem and the leaves, particularly at
wavelengths of 1056 nm and 1151 nm. To segment the stem from other plant parts,
pixels were selected where (R1056nm / R1151nm) produced a value greater than 1.2.
Meanwhile, leaf pixels were defined as those identified as plant pixels based on NDVI
but not classified as stem pixels [4]. The isolated stem is shown in Fig. 6(b) and the
isolated leaves in Fig. 6(c); a code sketch of both methods is given after Fig. 6.

Fig. 6 Plant analysis: (a) original plant, (b) stem only, (c) leaves only
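A minimal sketch combining the two thresholding methods is given below. It assumes the band indices closest to 705, 750, 1056 and 1151 nm have already been identified from the camera's wavelength table; those indices, like the function name and the small epsilons, are illustrative assumptions.

```python
import numpy as np

def segment_plant_regions(cube, idx_705, idx_750, idx_1056, idx_1151):
    """Split a hyperspectral cube into plant, stem and leaf masks.

    idx_* are the band indices closest to 705, 750, 1056 and 1151 nm
    (they depend on how the 243 bands map to wavelengths).
    """
    r705, r750 = cube[..., idx_705], cube[..., idx_750]
    r1056, r1151 = cube[..., idx_1056], cube[..., idx_1151]

    ndvi = (r750 - r705) / (r750 + r705 + 1e-6)
    plant_mask = ndvi > 0.25                                      # NDVI threshold for plant pixels

    stem_mask = plant_mask & ((r1056 / (r1151 + 1e-6)) > 1.2)     # stem-specific band ratio
    leaf_mask = plant_mask & ~stem_mask                           # leaves = plant pixels that are not stem
    return plant_mask, stem_mask, leaf_mask
```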


These methods and associated thresholds were applied to hyperspectral images to distinguish
and delineate plant regions effectively. In addition to the inherent biological variation among
individual plants, there was notable overall intensity fluctuation, both among different plants
imaged on the same day and within the same plant on different days. These variations were
primarily due to fluctuations in the performance of the lighting system employed in the
hyperspectral imaging chamber.

3.2.II.ii) Experiment 2: HSI result analysis:

For visualizing the variation across the 243 distinct wavelength measurements across multiple
plant images, a principal component analysis (PCA)-based approach was utilized. After
normalization to account for the intensity fluctuations described above, PCA was applied to the
intensity values of individual pixels. The PCA values for each individual plant pixel within the
analysed plants were then transformed into intensity values using the formula
(x - min(x)) / (max(x) - min(x)) [4]. Subsequently, false colour RGB images were generated, with
the values of the first principal component stored in the red channel, the second principal
component in the green channel, and the third principal component in the blue channel. This
approach facilitated the visualization of variations in plant data across multiple wavelengths
and images, as shown in the Fig. 7 plot.

Fig.7. PCA scatter plot
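The PCA-to-false-colour step can be sketched as follows, assuming a hyperspectral cube and a boolean plant mask from the earlier segmentation (both hypothetical inputs here); the min-max rescaling mirrors the (x - min(x)) / (max(x) - min(x)) formula above.

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_false_color(cube, plant_mask):
    """Build a false-colour RGB image from the first three principal components of the plant pixels."""
    spectra = cube[plant_mask]                     # (n_plant_pixels, bands)
    scores = PCA(n_components=3).fit_transform(spectra)

    # rescale each component to [0, 1] with (x - min(x)) / (max(x) - min(x))
    scores = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0) + 1e-9)

    rgb = np.zeros(cube.shape[:2] + (3,), dtype=np.float32)
    rgb[plant_mask] = scores                       # PC1 -> red, PC2 -> green, PC3 -> blue
    return rgb
```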

We applied Principal Component Analysis (PCA) to a dataset composed of measurements at
different wavelengths. PCA is a dimensionality reduction technique used to find the underlying
structure within the data and reduce its complexity; in this context, it is applied to identify the
most important patterns or components. The PCA results are visualized with scatter plots that
demonstrate how the data points are distributed in a two-dimensional PCA space, with colors
indicating the values from the original dataset. This visualization helps in understanding the
relationships and variance within the data.
4. Conclusion:
This paper discusses the importance of plants in various aspects of human life, including their
role in food security and the environment. It highlights the challenges in agriculture and the
need for advanced techniques like Site-Specific Management (SSM), Precision Agriculture
(PA), and plant breeding to increase productivity while reducing resource inputs. The focus
of the paper is on maize cultivation in India and the challenges it faces. The application of
computer science techniques such as Computer Vision, Machine Learning, Image Processing,
Remote Sensing, and Geographic Information Systems (GIS) in plant analysis is discussed,
particularly for maize, and the paper emphasizes the importance of phenotyping in
contemporary plant breeding and how remote sensing technologies can assist in this process.

This paper mentions the use of various remote sensing techniques for plant analysis,
including multispectral imaging, hyperspectral imaging, LiDAR, thermal imaging, and
others. It also discusses the challenges associated with remote sensing, such as cloud cover
and data acquisition. Furthermore, it explains hyperspectral imaging in detail, including how
it captures data across different wavelengths and its applications in plant analysis, along with
the feature extraction and selection techniques used on hyperspectral imaging data and the
various classification/clustering algorithms employed for data analysis. It also describes the
experimental methodology, focusing on the UNL plant phenotyping dataset used for this
research, and explains the selection of data from different cameras and the specific experiments
conducted, namely RGB image processing and hyperspectral image processing. Finally, we
discuss the results of the experiments, including linear regression analysis for plant traits based
on RGB images and the use of Principal Component Analysis (PCA) for hyperspectral data
analysis.

From the experiments, we collected phenotypic measurements including plant height, weight,
projected plant area, total number of visible leaves, and total fully extended leaves, and also
analysed the effects of genotype and greenhouse position on plant traits. By using NDVI, we
were able to measure the average stem diameter at the base of the plant as well as the stem
diameter at the collar of the top fully extended leaf.

5. Acknowledgment
The authors gratefully acknowledge support from the Shree Chhatrapati Shahu Maharaj
Research, Training and Human Development Institute (SARTHI), An Autonomous Institute
of Govt. of Maharashtra for providing financial assistance for the Major Research Project.
This work was supported by Dr. Babasaheb Ambedkar Marathwada University.

6. References
[1] W. Rosa, Ed., “Transforming Our World: The 2030 Agenda for Sustainable
Development,” in A New Era in Global Health, New York, NY: Springer Publishing
Company, 2017.
[2] K. Quijano, "Hyperspectral Image Classification for Detecting Flowering in Maize."
[3] R. Bongiovanni and J. Lowenberg-DeBoer, “Precision Agriculture and
Sustainability,”Precis. Agric., vol. 5, pp. 359–387, Aug. 2004, doi:
10.1023/B:PRAG.0000040806.39604.aa.
[4] Z. Liang, P. Pandey, V. Stoerger, Y. Xu, Y. Qiu, Y. Ge, and J. C. Schnable, "Conventional
and hyperspectral time-series imaging of maize lines widely used in field trials."
[5] J. W. White et al., “Field-based phenomics for plant genetics research,” Field Crops
Res.,vol. 133, pp. 101–112, Jul. 2012, doi: 10.1016/j.fcr.2012.04.003.
[6] N. Yu, L. Li, N. Schmitz, L. F. Tian, J. A. Greenberg, and B. W. Diers, “Development of
methods to improve soybean yield estimation and predict plant maturity with an unmanned
aerial vehicle based platform,” Remote Sens. Environ., vol. 187, pp. 91–101, Dec. 2016, doi:
10.1016/j.rse.2016.10.005.
[7] Hartmann A, Czauderna T, Hoffmann R et al. HTPheno: an image analysis pipeline for
high-throughput plant phenotyping. BMC Bioinformatics 2011;12(1):148.
[8] Zhang X, Huang C, Wu D et al. High-throughput phenotyping and QTL mapping reveals
the genetic architecture of maize plant growth. Plant Physiol 2017;173:1554–64.
[9] P. J. Pinter, Jr. et al., “Remote Sensing for Crop Management,” Photogramm. Eng.
Remote Sens., vol. 69, no. 6, pp. 647–664, Jun. 2003, doi: 10.14358/PERS.69.6.647.
[10] S. Manfreda et al., “On the Use of Unmanned Aerial Systems for Environmental
Monitoring,” Remote Sens., vol. 10, no. 4, p. 641, Apr. 2018, doi: 10.3390/rs10040641.
[11] P. S. Thenkabail, J. G. Lyon, and A. Huete, Hyperspectral indices and image
classifications for agriculture and vegetation, vol. II. London: CRC Press, 2019.
[12] S. Sankaran et al., “Low-altitude, high-resolution aerial imaging systems for row and
field crop phenotyping: A review,” Eur. J. Agron., vol. 70, pp. 112–123, Oct. 2015, doi:
10.1016/j.eja.2015.07.004.
[13] J. A. Benediktsson and P. Ghamisi, Spectral-Spatial Classification of Hyperspectral
Remote Sensing Images. Artech House, 2015.
[14] J. Xue and B. Su, "Significant Remote Sensing Vegetation Indices: A Review of
Developments and Applications," Journal of Sensors, 2017.
https://www.hindawi.com/journals/js/2017/1353691/ (accessed Apr. 16, 2020).
[15] A. E. Maxwell, T. A. Warner, and F. Fang, “Implementation of machine-learning
classification in remote sensing: an applied review,” Int. J. Remote Sens., vol. 39, no. 9, pp.
2784–2817, May 2018, doi: 10.1080/01431161.2018.1433343.
[16]. Ge Y, Bai G, Stoerger V et al. Temporal dynamics of maize plant growth, water use,
and leaf water content using automated high throughput RGB and hyperspectral imaging.
Comput Electron Agricult 2016;127:625–32.
[17] Gamon J, Surfus J. Assessing leaf pigment content and activity with a reflectometer. New
Phytologist 1999;143(1):105–17.

7. Websites:
[i] https://www.who.int/news-room/fact-sheets/detail/biodiversity-and-health
[ii] https://iimr.icar.gov.in/?page_id=51
[iii]
https://www.lkouniv.ac.in/site/writereaddata/siteContent/202004021910156883ajay_misra_g
eo_principles_of_RS.pdf
[iv] https://plantvision.unl.edu/dataset
[v] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4383386/
