
HardwareX 15 (2023) e00468


SPOT: Scanning plant IoT facility for high-throughput plant phenotyping

Stephen Lantin a, *, Kelli McCourt a, b, Nicholas Butcher c, Varun Puri d,
Martha Esposito a, Sasha Sanchez a, Francisco Ramirez-Loza a, Eric McLamore e,
Melanie Correll a, Aditya Singh a
a Department of Agricultural and Biological Engineering, University of Florida, Gainesville, FL 32611, USA
b Department of Environmental Engineering and Earth Science, Clemson University, SC 29634, USA
c Department of Mechanical and Aerospace Engineering, University of Florida, Gainesville, FL 32611, USA
d Department of Computer and Information Sciences and Engineering, University of Florida, Gainesville, FL 32611, USA
e Department of Agricultural Sciences, Clemson University, SC 29634, USA

A R T I C L E  I N F O

Keywords: Plant phenotyping, Hyperspectral imaging, Thermal imaging, Remote sensing, Plant breeding

A B S T R A C T

Many plant phenotyping platforms have been kept out of the reach of smaller labs and institutions due to high cost and proprietary software. The Scanning Plant IoT (SPOT) Facility, located at the University of Florida, is a mobile, laboratory-based platform that facilitates open-source collection of high-quality, interoperable plant phenotypic data. It consists of three main sensors: a hyperspectral sensor, a thermal camera, and a LiDAR camera. Real-time data from the sensors can be collected in its 10 ft. × 10 ft. scanning region. The mobility of the device allows its use in large growth chambers, environmentally controlled rooms, or greenhouses. Sensors are oriented nadir and positioned via computer numerical control of stepper motors. In a preliminary experiment, data gathered from SPOT was used to autonomously and nondestructively differentiate between cultivars.

Specifications table

Hardware name Scanning Plant IoT (SPOT) Facility
Subject area Engineering and material science
Environmental, planetary, and agricultural sciences
Hardware type Imaging tools
Measuring physical properties and in-lab sensors
Closest commercial analog LemnaTec Greenhouse and Growthroom Scanalyzers
Open-source license CC BY-SA
Cost of hardware Up to $38,000, less depending on sensor selection
Source file repository Open Science Framework (OSF): https://doi.org/10.17605/osf.io/4d3hp

* Corresponding author.
E-mail address: slantin@ufl.edu (S. Lantin).

https://doi.org/10.1016/j.ohx.2023.e00468
Received 4 July 2022; Received in revised form 21 July 2023; Accepted 24 August 2023
Available online 26 August 2023
2468-0672/© 2023 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license
(http://creativecommons.org/licenses/by-nc-nd/4.0/).

Hardware in context

In general, remote sensing is the process of acquisition and measurement of information about an object by a device not in contact
with the object being studied. At its inception, optical remote sensing involved taking aerial photographs from balloons and aircraft to collect data for the creation of maps and for wartime reconnaissance. As satellite technology developed, cameras were sent into space to collect data over the Earth's surface [1]. More recently, unmanned aerial vehicles (UAVs) mounted with cameras have been used to
analyze the spatial extent, structure, quality, and variability of agricultural crops to inform tailored treatments (precision agriculture)
[2]. While imaging systems have successively increased in spatial resolution, often by bringing the sensor closer to the target (i.e.,
mounted on UAVs) or by using higher resolution sensors, knowledge gaps in correlating UAV-based image data to fundamental plant
biochemistry and function remain. There is also a burgeoning need for the development of high-throughput crop phenotyping (HTP)
techniques for rapid evaluation of cultivars and for elucidating responses to biotic or abiotic stresses [3]. The Scanning Plant IoT
Facility (SPOT) was developed to fill this gap and to promote hypothesis-based research in rapid plant phenotyping and assessment of
plant responses to stress. SPOT utilizes remote sensing equipment in a controlled laboratory setting to provide high-frequency, high-
resolution, and repeatable hyperspectral, thermal, and structural imaging of plants. The overall intent is to use insights from experiments conducted in SPOT to corroborate field-scale UAV data, enabling high-throughput phenotyping at the plot and field scales.
Beyond its applications for corroborating field data, the design of SPOT and its placement in the laboratory have several advantages
for acquiring plant health information compared to other remote sensing options. UAVs and other field spectroscopy equipment must
be powered by batteries, meaning a user can only take data for as long as the battery lasts. This limitation stifles continuous data
collection. SPOT’s ability to be located near a constant power source allows for use at any time, promoting reliable, automated data
collection. Additionally, the indoor nature of SPOT enables the use of controlled experiments to characterize the effects of intended
stressors on plant phenotype without confounding environmental effects (e.g., variable weather during imaging, pests). Using the
accompanying software, SPOT can be programmed to capture image data at a specified time for a specified duration. Furthermore, it
can be programmed to move the cameras at a specific speed and in a specific direction (on the x,y plane). This allows the user to set the
system and then automate sensor data acquisition.
Many high-throughput phenotyping systems have been developed over the years with varying capabilities and accessibility.
Phenopsis [4,5] was developed in 2005 to obtain an understanding of Arabidopsis’ response to water deficiency. Phenopsis utilizes a
standard red–green–blue (RGB) imaging system and weighing systems in its platform. SPOT expands upon such sensing systems by incorporating thermal, hyperspectral, and LiDAR imaging devices, which can better characterize plant responses to treatments by generating a much more detailed dataset of measurements. The Plant Accelerator [6], developed by the Australian Plant
Phenomics Facility uses steady-state fluorescence imaging, visible light imaging, hyperspectral imaging, and an automated weighing
and watering station. This setup is housed in four large greenhouses and utilizes more than one kilometer of conveyors to move the
plants for imaging. While it effectively promotes phenotyping research, its size and capital cost may limit its replication in other
research laboratories. SPOT was designed to be much more economical while still allowing for controlled block design experiments.
In addition to size, other lab-based plant phenotyping systems have been inaccessible to the research community as designs and
results are not available to the general public. For example, PhenoWell® [7], a screening system for soil-grown plants, is trademarked
and its design has been submitted for a patent. Traitmill [8] has been used to gather information such as plant morphology and plant
color; however, its experimental design and outcomes are also proprietary. Many companies, such as Qubit Phenomics [9], PhenoVation [10], and Photon Systems Instruments [11], have also developed novel phenotyping designs, but they are considered trade
secrets. In response to closed-source systems such as Traitmill, researchers have developed the Phenobox, an open-source, high-
throughput phenotyping system [12]. The Phenobox system, much like SPOT, provides a reliable alternative to costly phenotyping
systems. However, its sensors are mounted laterally. Phenobox data may be more useful with sensors mounted above the scanning
region, as canopy images contain adaxial views that can be utilized for scaling experiments with aerial sensors. Additionally, a 3-D
robotic system was developed at Washington State University (WSU), featuring a multispectral sensor and an infrared thermal
camera [13]. A joint NASA/USDA collaboration has also recently developed a powerful hyperspectral/fluorescence imaging system
[14]. Though the WSU and NASA/USDA systems can extract several features, such as drought stress, from images, their control programs are written in the closed-source LabVIEW programming system, which is less conducive to interfacing with remote clients and to UI/UX development. Additionally, not all sensor companies provide software drivers that interface with LabVIEW. All source code for SPOT is written in Python and is open-source, and its cameras are mounted above a scanning region pointing nadir at the plants. Another
plant phenotyping system, Ghent University’s Phenovision and other technologies housed at the Netherlands Plant Eco-phenotyping
Centre (NPEC) are similarly closed-source and not easily reproduced [15–17]. SPOT improves on its predecessor, the HyperScanner at
the University of Wisconsin, Madison, with support for multiple sensors, a much sturdier frame, and a larger scanning area [18].

Hardware description

The SPOT Facility is a 10 ft. × 10 ft. × 10 ft. cubical, open-air structure made of extruded aluminum. It contains three main sensors:
a Headwall Nano-Hyperspec® hyperspectral sensor (VNIR, 400–1000 nm, 270 spectral bands, Headwall, Bolton, MA), a FLIR Vue Pro
R thermal camera (Teledyne Technologies, Thousand Oaks, CA), and an Intel RealSense L515 LiDAR camera. All sensors are pointed
nadir (downward) toward the scanning region and are mounted on a plate near the top of the structure; the X-Y position of the plate is
controlled via custom Python commands fed to stepper motors on each axis. As such, the field of view of the sensors can be adjusted to
capture subjects in the entire scanning region.


Hyperspectral data acquisition

Compared to the ‘staring’ arrays (single-shot imaging systems, or frame cameras) present in many consumer cameras and other
phenotyping systems, SPOT’s hyperspectral sensor (Headwall Nano-Hyperspec®) uses a pushbroom scanning configuration to capture
image data. This imaging technique allows the sensor to acquire data in hundreds (270) of spectral bands over the visible-near infrared (VNIR, 400–1000 nm) region as the sensor is moved across the subject area. With more wavebands, SPOT broadens the scope of analyses beyond what is available to systems with fewer wavebands, such as multispectral imagers. Additionally, the camera's relatively high spectral resolution (~2 nm) enables the differentiation between spectrally similar phytochemical responses. A live waterfall feed of the hyperspectral data, available for viewing on SPOT's computer from Headwall's Hyperspec III® desktop application, enables observation of spectral data collection in real time. The sensor is shipped from the manufacturer with an (optional) radiometric calibration that allows the conversion of raw image data into at-sensor radiance. A Spectralon® panel can be used as a spectrally 'flat' white reference to convert at-sensor radiance to apparent at-surface reflectance, ensuring image spectra are standardized across images.

Thermal image data acquisition

SPOT uses a FLIR Vue Pro R thermal camera to collect thermal imagery. Automated data collection can be performed alongside
hyperspectral imaging using either the free FLIR UAS phone application, or via serial commands issued from a desktop computer.

LiDAR structural data acquisition

Canopy structural data may inform plant phenotyping decisions, such as selecting for shorter cultivars during plant breeding or assessing biomass increments through growth cycles. SPOT utilizes an Intel RealSense L515 LiDAR imaging system to collect high-
resolution point clouds of scanned objects. Standard Python scripts available for free on Intel’s website facilitate data acquisition.
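For illustration, a minimal Python sketch along the lines of those scripts is shown below, using the publicly documented pyrealsense2 API; the stream configuration and output filename are assumptions, not the contents of SPOT's own scripts.

```python
import pyrealsense2 as rs

# Start a default pipeline on the L515 (depth + color streams).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth)
config.enable_stream(rs.stream.color)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()      # block until a frameset arrives
    depth = frames.get_depth_frame()
    color = frames.get_color_frame()

    # Project the depth frame to a 3-D point cloud, textured with color.
    pc = rs.pointcloud()
    pc.map_to(color)
    points = pc.calculate(depth)
    points.export_to_ply("scan.ply", color)  # assumed output filename
finally:
    pipeline.stop()
```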

Motion control

SPOT employs high-precision motion control to ensure that sensors are positioned properly above the plants. Because the hyperspectral scanner captures image data with a pushbroom scanning mechanism, the hyperspectral sensor, thermal imager, and LiDAR imaging system are all mounted on a single block pointed nadir toward plants in the enclosure. This mounting block is connected to two sets of perpendicular lead screws, which in turn are mounted on guide rails for translational motion in two axes. Each lead screw is connected to SPOT's cubical aluminum frame by pillow blocks on one end and one stepper motor (AutomationDirect SureStep® Stepper Motor STP-MTRH-34127) per axis on the other. When supplied with the requisite signals from the control system, the stepper motors drive the lead screws and move the mounting block with the cameras across the scanning region. Each stepper motor is controlled by a separate motor drive (AutomationDirect SureStep® Advanced Microstepping Drive STP-DRV-80100) that accepts serial communication messages from an automated Python script. The linear velocity of the sensor mount block (see Fig. 2) and the frame rate of the hyperspectral pushbroom sensor are calibrated to produce square image pixels.
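As a minimal sketch of what issuing such serial messages can look like in Python with pyserial: the port name, baud rate, and SCL-style command mnemonics below are assumptions for illustration, not the verbatim contents of SPOT's control script.

```python
import serial

# Open the serial port connected to one microstepping drive
# (port name and baud rate are assumptions for illustration).
drive = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)

def send(cmd: str) -> None:
    """Send one command string, terminated with a carriage return."""
    drive.write((cmd + "\r").encode("ascii"))

# Hypothetical SCL-style move sequence: set velocity, set distance
# in steps, then execute a feed-to-length move.
send("VE2")        # velocity, rev/s (assumed mnemonic)
send("DI20000")    # distance, steps (assumed mnemonic)
send("FL")         # feed to length: start the move (assumed mnemonic)

drive.close()
```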

Wiring and lighting

The electronic components of the SPOT facility are externally powered and plug into standard high-capacity power strips attached
to the structure using zip ties. Four standard 60 W halogen light bulbs are attached to the sensor mount and move with the pushbroom
sensors to provide consistent illumination through the entire scan. Halogen light bulbs were chosen to provide illumination with near-
solar spectral coverage, although other illumination sources may also work in specific conditions. Standard uncoated LEDs are not
recommended as illumination sources, however, as many either do not provide infrared radiation or provide photons only in narrow spectral ranges.

Use cases

- Allows for relatively inexpensive high-throughput plant phenotyping and breeding.
- Can also acquire data for stress response characterization for scaling to plot and field level.
- Controlled laboratory environment allows performance of block design experiments.
- Can be used for other analyses, including entomology studies and historical painting analysis.

Design files summary

Design files are listed in Table 1, with a CAD model rendering of the SPOT Facility shown in Fig. 1.

Bill of materials summary

A complete list of materials is shown in Table 2.


Table 1
All CAD files are saved as .STEP files in the Open Science Framework (OSF) web application to promote interoperability.
Design file name File type Open source license Location of the file

2313N16_CORNER MACHINE BRACKET.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp


47065T604_T-SLOTTED FRAMING.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
47065T604_T-SLOTTED FRAMING EXTRUSION.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
airplane plate large (CNC).step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
Arduino MEGA 2560 R3.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
bearing.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
Bottom_plate_holding_camera.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
braccketscrewB.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
bracket_screwA.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
camera.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
corner_support.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
GuideRail_open_bearing.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
lead_screw_bearing_plate.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
lead_screw_bearing.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
Lead_Screw_BearingB.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
lead_screw_plate_large_below.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
Lead_Screw.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
lights.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
motion_coupler.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
nutA.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
Part1.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
Part2.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
pillow_block.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
rail attachment.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
SRA16SSCTL202_000_3.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
SRA16SSCTL202_000_5.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
stp-mtrh-34127.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
support_wheel.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
wheelz.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
x_axis_bracket_standoff.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
x_axis_end_bracket.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
x_axis_motor_bracket.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp
x_axis_offset_bracket.step CAD files CC BY-SA https://doi.org/10.17605/osf.io/4d3hp

Build instructions

Hardware

Building the frame

1. Cut 45° wedges into the short ends of the smaller extruded aluminum pieces (double six slot rails, 6″ × 3″ × 1.5″) so that they have a
trapezoidal cross section. Use a hole saw with a drill guide jig to cut four holes in each piece’s slots at an angle (Fig. 3).
Assemble the base of the cubical frame with the black oxide socket head screws, washers, and end nuts. To do this, align a
smaller extruded aluminum piece with the end of a larger extruded aluminum piece (6′, 3″ × 3″) as shown above. Slide in two end-
feed single nuts into each of the two slots of the larger piece, aligning the nuts with the previously drilled holes of the smaller piece.
Screw in the head screws with washers into both end nuts.
Repeat, this time aligning another larger extruded aluminum piece with the just attached smaller piece. Once the two larger pieces
have been joined by the smaller piece, join a third large aluminum piece perpendicular to both larger pieces to form a corner of the
frame. Combining four corners this way forms the frame base.
Elevate the frame base and attach a swivel caster on each lower corner using an L-shaped aluminum piece with the T-slot framing
fasteners.

Assembling the guide rail mechanism – Y-axis

To form the first end of the y-axis, slide six end nuts (with button flange head screws) each into both slots on the same side of
an unattached, large, extruded aluminum piece.
Attach one end of two linear sliders (spaced ~ 24″ apart), and an end pillow block supported by an aluminum block using the
button flange head screws as shown below. Insert a lead screw into the pillow block (Fig. 4).
To finish the other side of the y-axis, slide four end nuts (with button flange head screws) each into both slots on the same side of an
unattached, large, extruded aluminum piece.
Attach the loose ends of the linear sliders to the large aluminum piece with the button flange head screws. Slide two pillow blocks
with open bushings onto each linear slider.


Fig. 1. CAD model of the Scanning Plant IoT (SPOT) Facility.

Using a hole saw, drill a 3″ diameter hole into a piece of channel aluminum and four holes for bolts. Attach a stepper motor (STP-
MTRH-34127) to the outside of the channel using provided nuts and bolts.
Slide a 1″ stanchion base onto the lead screw, and then fit the lead screw into the SureStep stepper motor.
Attach the stepper motor assembly to the outer side of the extruded aluminum piece using button flange head screws and nuts, as
shown in Fig. 5.
Attach the completed y-axis to the top of the cubical frame base, using the remaining two large aluminum extruded pieces and the
remaining six smaller aluminum pieces to complete the cube structure.

Assembling the guide rail mechanism – X-axis


The x-axis of SPOT’s guide rail mechanism will be hung from a thin aluminum block (27″ × 10″ × ½”) attached to the pillow blocks
and stanchion of the y-axis (Fig. 6).
The length of the aluminum block (27″) attaches to the pillow blocks and stanchion of the y-axis, while the width (10″) of the
aluminum block attaches to the linear sliders of the x-axis.

Drill holes into the aluminum block corresponding to the locations of 1) the y-axis’s pillow blocks and stanchion on the upper side,
and 2) the screw holes on the two remaining linear sliders, which will form the basis of the x-axis.
Attach the y-axis parts on top of the block and the x-axis parts on the bottom of the block with Mil Spec socket head screws.

SPOT’s x-axis is formed by the two remaining linear sliders, which are attached by pillow blocks and a stanchion to another, smaller
aluminum block (sensor mount), which serves as the base for sensor attachment (Fig. 7).

Slide the remaining four pillow blocks with open bushings onto the x-axis linear sliders (two each).
Drill holes into the sensor mount block to connect the pillow blocks and the remaining stanchion, similar to the previous block.
Attach the smaller block to the pillow blocks and stanchion with Mil Spec socket head screws and insert the remaining lead screw
into the stanchion. See the previous figure.


Fig. 2. The completed structure of the SPOT Facility.

Fit one end of the lead screw into the other stepper motor (STP-MTRH-34127) and attach the stepper motor to one end of the linear
sliders using a piece of channel aluminum.
On the other end of the lead screw, fit the remaining end pillow block, securing to the ends of the linear sliders.

Peripherals and sensor setup

Cut the female ends of the two 16/3 15 ft. black extension cords off, then cut the cords in half so that there are a total of four ~ 7.5
ft. parts, two with plugs on one end and two without.
Connect each of the stepper motors to a microstepping drive (STP-DRV-80100) using the 20 ft. (STP-EXTH-020) extension cables,
and then the microstepping drives to the power supplies (STP-PWR-7005) using the cord halves without plugs. Wire the power
supplies to a wall outlet using the remaining two halves of the 16/3 black extension cords with plugs.
Fasten two power strips on the top of the aluminum frame.
Attach the Headwall Nano-Hyperspec hyperspectral sensor to the smaller aluminum block (see previous figure). The sensor should be nadir (downward) pointing and oriented such that the pushbroom scanning line is perpendicular to the x-axis. Connect the Cat6 20 ft. ethernet cable to the sensor, and then to the desktop computer for power and serial communication. Attach the provided power cable.
Attach the FLIR Vue Pro R thermal camera next to the hyperspectral camera. The camera should be nadir pointing. Connect the USB
cable to the sensor, and then to the desktop computer for power.
Attach the Intel RealSense LiDAR camera next to the FLIR Vue Pro R thermal camera. The camera should be nadir pointing. Connect
the USB cable to the sensor, and then to the desktop computer for power and serial communication.
All sensor cables (power and communications) should be fed through the black split tubing wire loom for proper cable management. The wire loom should be hung along the x-axis in a way that does not interfere with the imaging, lighting, or lead screw motion mechanisms.

Table 2
The hyperspectral imaging sensor constitutes the bulk of the cost of the system. Depending on the use case, other suitable sensors are available for a fraction of the cost.
Designator Component Number Cost/unit Total cost Source of materials Material type

LiDAR imaging system Intel RealSense L515 LiDAR camera 1 $369.00 $369.00 Intel Electronics
Thermal camera FLIR Vue Pro R 1 $3,349.00 $3,349.00 FLIR Electronics
Hyperspectral camera Headwall Nano-Hyperspec® hyperspectral imaging sensor 1 $25,750.00 $25,750.00 Headwall Photonics Electronics
Hyperspectral camera wiring Cat6 ethernet cable, 20 ft. 1 $8.99 $8.99 Amazon Electronics
Structural frame Extruded aluminum (silver anodized, hollow, quad rail profile), 6ft. length, 3″ × 3″ 12 $137.14 $1,645.68 McMaster-Carr Metal
Structural frame Extruded aluminum (double six slot rails), 6″ length, 3″ × 1.5″ 24 $8.44 $202.56 McMaster-Carr Metal
Stepper motor SureStep stepper motor, STP-MTRH-34127 2 $192.00 $384.00 AutomationDirect Electronics
Wiring SureStep extension cable, 20 ft., STP-EXTH-020 2 $38.00 $76.00 AutomationDirect Metal
Motor drive SureStep® Advanced Microstepping Drive STP-DRV-80100 2 $332.00 $332.00 AutomationDirect Electronics
Power supply Linear power supply, STP-PWR-7005 2 $233.00 $466.00 AutomationDirect Electronics
Power supply cord Woods 990261 16/3 SJTW 15 ft. black extension cord 2 $9.98 $19.96 Amazon Electronics
Guide rail mechanism Linear slider, 6 ft. (1″ diameter) 4 $776.88 $3,107.52 Thomson Linear Metal
Guide rail mechanism ACME lead screw (1″ diameter) 2 $99.82 $199.64 McMaster-Carr Metal
Guide rail mechanism Pillow block bushing, open, SSUPBO16 8 $202.35 $3,237.60 Thomson Linear Metal
Guide rail mechanism 15 Series 1″ narrow horizontal stanchion base 2 $33.97 $67.94 8020.net Metal
Guide rail mechanism Channel aluminum, 7″ × 6″ × 2″ 2 $70.73 $141.46 Grainger Metal
Guide rail mechanism End pillow block (FYHP205) 2 $24.37 $48.74 Amazon Metal
Cable management Gardner bender 7-ft × 0.5-in plastic black split tubing wire loom 1 $3.28 $3.28 Lowe’s Plastic
Fasteners T-slot framing fasteners, end-feed nut with flanged head, 5/16″-18 thread, steel 28 $2.93/4 $20.51 McMaster-Carr Metal
Fasteners Mil. spec. alloy steel socket head screw, 5/16″-18 thread size, ¾” long 48 $4.33 $207.84 McMaster-Carr Metal
Fasteners Black-oxide alloy steel socket head screw, 5/16″-18 thread size, ¾” long 96 $9.75/50 $19.50 McMaster-Carr Metal
Fasteners Black-oxide 18–8 stainless steel washer for 5/16″ screw size, 0.344″ ID, 0.75″ OD 96 $7.14/100 $7.36 McMaster-Carr Metal
Fasteners T-slotted framing, end-feed single nut, 5/16″-18 thread 96 $8.06/10 $80.60 McMaster-Carr Metal
Casters Clear polyurethane swivel caster with brake, 3″, 62,274 4 $5.49 $21.96 McMaster-Carr Plastic
Peripherals Power strips 3 $10.89 $32.67 Amazon Plastic
Lighting Fixture Classic 60-watt EQ PAR16 dimmable warm white reflector flood halogen light Bulb (2-pack) 2 $16.98 $33.96 Lowe’s Other
Lighting Fixture Lithonia Lighting white incandescent outdoor switch-controlled floodlight 2 $16.98 $33.96 Lowe’s Other
Lighting Fixture Southwire 25-ft 16/2 white stranded lamp cord 1 $8.64 $8.64 Lowe’s Other
Lighting Fixture Project source 15-Amp 125-Volt NEMA 5-15p general-duty straight plug 4 $4.32 $17.28 Lowe’s Other


Fig. 3. Frame corner assembly.

Fig. 4. y-axis lead screw and attachment to aluminum frame via pillow block.

Fig. 5. y-axis stepper motor and attachment to aluminum frame.


Fig. 6. Top view, connecting the x- and y- axes.

Fig. 7. Sensor mount aluminum block connection to the x-axis linear sliders.

Installing the lighting fixture

Wire the four halogen bulbs with about 2 feet of lamp cord each.
Screw the halogen floodlights into the two floodlight holders.
Mount the floodlight holders on either side of the hyperspectral sensor, ensuring that the line of lights is aligned with the
pushbroom scanning line (Fig. 8).
Attach the ends of the wires to the internal live and neutral prongs of the plugs and insert each of them into a socket on a power
strip.
Mount the power strip onto the sensor mount.


Fig. 8. The lighting fixture moves with the pushbroom camera to maintain constant imaging illumination during a scan. Overhead room lights are
turned off during the scan.

Software

Before attempting to run programs on SPOT, ensure that the software is installed correctly.

Required software

• Python 3.6 or newer
• IDLE 3.6 or newer, or suitable Python IDE
• Hyperspec III® - available from Headwall
• SpectralView® - available from Headwall
• “main.py,” the Python script used to control the sensor/camera positions.

No external Python packages are needed to run the stepper motors; however, operation of the LiDAR camera requires the pyrealsense2 package, which can be installed through pip.
Note that the software setup may work with other versions of the listed software, but intercompatibility is not guaranteed.

Potential safety hazards


Electrical current – ensure that power supplies are unpowered when connecting the stepper motors and motor drives. Follow all
guidelines from local Environmental Health and Safety organizations.

Operation instructions

Initialization and communication testing

a. Ensure that all power and communication wiring is properly connected according to the Peripherals and Sensor Setup section. All
software as denoted in the Software section should be installed.
b. Configure the Headwall Nano-Hyperspec® as per the manufacturer's instructions. This should include making a directory with the radiometric calibration file accessible and creating a folder for file transfer of hyperspectral images.
c. Turn on the computer and open “main.py.”
d. Change the directories at the top of the script to match desired locations for the thermal and LiDAR images.


e. Uncomment the “move” commands at the bottom of the script. These commands should have argument values that will move the
sensor plate a small distance in both axes. Make note of the directions in which they move, as these are the positive directions. To
move the plate back, replace the positive-valued arguments with negative-valued ones. Save the file every time arguments are
changed and before each run of the program to ensure that the stepper motors receive the desired values.
f. Once you have confirmed that the sensor plate moves according to the stepper motor commands, uncomment the “data_capture” commands. These will save image data to the Headwall Nano-Hyperspec®.
g. Open the Hyperspec III® program and click on the “Waterfall” button on the left panel. A live feed of the hyperspectral sensor line
scan should appear to test image collection.
h. Before each experiment: Test image collection in SPOT by adjusting lighting, hyperspectral sensor integration time, stepper motor
speed, and subject height as necessary. A good image should balance the integration time and stepper motor speed to produce
square pixels in the image. Be sure to include the Spectralon® white reference reflectance panel in each camera pass.
i. The FLIR Vue Pro R thermal camera and Intel RealSense L515 LiDAR camera are staring arrays (cf. pushbroom line scanning). As
such, only the frequency of image capture needs to be adjusted to ensure that enough overlap exists in each image for stitching in
post-processing.
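The square-pixel condition in step h can be reasoned about numerically: the along-track distance the mount travels per frame must equal the across-track ground sample distance (GSD). Below is a small sketch of this relationship; the field of view, sensor height, and pixels-per-line values in the example are illustrative assumptions, not the Nano-Hyperspec's specifications.

```python
import math

def scan_speed_for_square_pixels(frame_rate_hz, height_m, fov_deg, n_pixels):
    """Return the sensor-mount speed (m/s) that yields square pixels.

    Across-track GSD = swath width / pixels per line; square pixels
    require the mount to advance exactly one GSD per frame.
    """
    swath_m = 2 * height_m * math.tan(math.radians(fov_deg / 2))
    gsd_m = swath_m / n_pixels
    return frame_rate_hz * gsd_m

# Example with assumed values: 100 Hz frame rate, sensor 1.5 m above
# the canopy, 16 degree field of view, 640 pixels per scan line.
print(scan_speed_for_square_pixels(100, 1.5, 16.0, 640))  # ~0.066 m/s
```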

Regular operation

j. Open “main.py.”
k. “main.py” will have a loop at the bottom of the script to automate data collection in SPOT. Change “move” commands as
necessary to capture the subject.
l. Once data is collected, hyperspectral data files should be downloaded from the camera to the desktop computer using the “File
Transfer” button on the left panel in Hyperspec III®.
m. To view the hyperspectral image files, navigate to the directory that was transferred and open the “.hdr” file. This will launch
the SpectralView program.
n. If a radiance calibration file is available, use the “Radiance” button on the left panel in SpectralView to calibrate your image
files. They will be saved in the same folder and have “_rd” appended to the end of the file name. Open the radiance-calibrated
file in SpectralView, and if there is a reflectance panel in the image, right-click on it and select “Classify.” Then, adjust the blue
boundaries to cover the reflectance panel. Click the “Recalculate” button on the bottom and save the white reference as a 99%
reflectance file. Once the white reference is saved, click the “Reflectance” button on the left panel in SpectralView and similarly
apply the reflectance correction. Your file will have “_rd_rf” appended to the end of the file name, indicating that it has been
properly radiometrically calibrated.
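A minimal sketch of what the automation loop at the bottom of “main.py” (step k) might look like is given below. The move() and data_capture() stubs stand in for the script's actual commands named above; their signatures, units, and the raster geometry are assumptions for illustration.

```python
def move(x=0.0, y=0.0):
    """Stub standing in for the script's move command, which issues
    serial messages to the stepper drives (units assumed here)."""
    print(f"move: x={x}, y={y}")

def data_capture():
    """Stub standing in for the script's capture command, which
    triggers the hyperspectral, thermal, and LiDAR sensors."""
    print("capture")

# Hypothetical raster pattern over the scanning region: scan one pass
# along the x-axis, return, step over in y, and repeat.
N_PASSES = 4       # assumed number of passes
PASS_LEN = 2.0     # assumed pass length
STEP_OVER = 0.5    # assumed spacing between passes

for _ in range(N_PASSES):
    data_capture()
    move(x=PASS_LEN)                # scan one pass along the x-axis
    move(x=-PASS_LEN, y=STEP_OVER)  # return and step to the next pass
```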

Remote operation

SPOT can also be run remotely via control of Python scripts using the cron command-line utility and/or operation via remote
desktop.
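For example, a crontab entry along the following lines could start a scan each morning; the interpreter and script paths shown are hypothetical.

```
# Run the SPOT control script daily at 06:00, appending output to a log
0 6 * * * /usr/bin/python3 /home/spot/main.py >> /home/spot/spot.log 2>&1
```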

Radiometric calibration

The radiometric calibration pipeline used in this research is as follows:


First, prior to data collection, dark and white reference images are collected to establish proper scaling for scanned spectra. The
dark reference is taken to be the spectral output of the sensor under no illumination conditions (practically, with the sensor cap on) and
is saved internally for all future data collection. The white reference is taken with every hyperspectral scan to accommodate minor
lighting changes between scans. To do this, a reflectance panel constructed of Spectralon (>99% reflectance across 400–1000 nm) is
aligned with the top of the imaging target (i.e., the plant canopy) such that the panel’s radiance will be the scan’s maximum across all
wavebands. Once a hyperspectral scan is complete and images are saved, the factory radiance calibration file is applied to correct for
pixel aberrations due to nonuniformities in sensor response. The digital numbers within the hyperspectral cube are now in units of at-sensor radiance. Though these spectral radiance values are informative, they are relative to the experimental setup and the lighting conditions in which they are taken. As such, they are not ideal for building generalized prediction models and must be converted to apparent at-surface reflectance for models to be generalizable.
The reflectance calibration step scales all radiance values by dividing all pixels by an average of the radiance of the white reference.
Next, a correction for the specular reflectance (anisotropy) of the plant canopy is needed: canopies can either be assumed to be diffusely reflective (isotropic/Lambertian) or corrected with a bidirectional reflectance distribution function (BRDF). In this pipeline, plant canopies are assumed to be Lambertian, which has been found to be a satisfactory approximation in the visible and NIR domains [19]. A simple semantic segmentation via thresholding on a vegetation index (commonly the normalized difference vegetation index, NDVI) helps mask out the background.
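A compact numpy sketch of these two steps (white-reference scaling and NDVI masking) is shown below; the panel row range, band indices, and NDVI threshold are assumptions for illustration, not values from this pipeline.

```python
import numpy as np

def reflectance_and_mask(radiance, panel_rows, red_band, nir_band, ndvi_min=0.5):
    """Convert an at-sensor radiance cube (rows, cols, bands) to apparent
    reflectance and mask out non-vegetation pixels via NDVI thresholding."""
    # Mean spectrum of the Spectralon panel pixels (assumed row range).
    white = radiance[panel_rows, :, :].mean(axis=(0, 1))
    reflectance = radiance / white          # broadcast division over bands

    # NDVI from assumed red and NIR band indices.
    red = reflectance[:, :, red_band]
    nir = reflectance[:, :, nir_band]
    ndvi = (nir - red) / (nir + red)
    mask = ndvi > ndvi_min                  # True where a pixel is vegetation
    return reflectance, mask

# Example with a synthetic cube: 100 x 100 pixels, 270 bands, panel in rows 0-4.
cube = np.random.rand(100, 100, 270) + 0.1
refl, veg = reflectance_and_mask(cube, slice(0, 5), red_band=120, nir_band=220)
```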


Fig. 9. Radiometric calibration pipeline. In addition to establishing a white reference and dark reference with the SPOT experimental setup, several post-processing steps are needed to ensure that the remaining hyperspectral data contain only phytochemically relevant information.

While the post-processing described above is sufficient for many general purposes, the non-uniformity of plant heights under the sensor can cause some reflectance values to evaluate at over 1.0. This generally happens when the plant canopy is tall and some leaves angle toward incident light, causing 'hot spots'. These situations produce pixels with spectra that are similar in shape to others but scaled by factors approaching twice the average. To correct for these aberrations, we calculate the Euclidean norm (2-norm) of each spectrum and divide the spectrum by this value to generate brightness-adjusted spectra [20]. These 'vector-normalized' plant reflectance spectra are then ready for downstream analyses. A summary of this processing pipeline is shown in Fig. 9.
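A one-step numpy sketch of this brightness normalization, assuming the masked plant spectra are stacked as rows of an array:

```python
import numpy as np

def vector_normalize(spectra):
    """Divide each spectrum (row) by its Euclidean (2-) norm to remove
    brightness scaling from canopy hot spots [20]."""
    norms = np.linalg.norm(spectra, axis=1, keepdims=True)
    return spectra / norms

# Example: 500 plant pixels x 270 bands of reflectance values.
spectra = np.random.rand(500, 270)
normalized = vector_normalize(spectra)  # each row now has unit 2-norm
```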

Validation and characterization

To validate the SPOT data collection setup and analysis, an experiment was performed to see if it could differentiate between
different cultivars of lettuce (Lactuca sativa). Three cultivars of lettuce, ‘Rex’, ‘Skyphos’, and ‘Outredgeous’ (Johnny’s Selected Seeds,
Winslow, ME, USA), were grown from seed, starting with a one-week germination period in a hand-built ebb-and-flow system, with
rockwool (CropKing, Lodi, OH, USA) as the substrate. To grow the lettuce, the growing procedure in the Cornell Lettuce Handbook was
used [21] with slight modifications. On planting day, seeds were placed in the rockwool and placed in the ebb-and-flow system. The ebb-and-flow system was flooded for 15 min every 12 h with deionized water and placed under a plastic humidity cover. KingLED 1000 W LED Grow Lights (Shenzhen King Lighting Co., Ltd., Shenzhen, Guangdong, China) were powered and set to the “VEG” setting, and shade cloths were placed over the humidity cover until a quantum sensor (Apogee Instruments, Logan, UT) measured a photosynthetic photon flux density (PPFD) of ~50 μmol m⁻² s⁻¹. The next day (1 DAP), a nutrient solution mix of 0.60 g/L CHEMGRO Master Mix (CHEMGRO, East Petersburg, PA), 0.30 g/L magnesium sulfate, and 0.45 g/L calcium nitrate was added, and shade cloths were removed until the PPFD reached 250 μmol m⁻² s⁻¹. Lights were set on an automatic timer for a photoperiod of 16 h on/8 h off. Three days after planting (3 DAP), germination (>95% in all cultivars) was recorded and humidity covers were removed. Five days after planting (5 DAP), the flood frequency was increased to 15 min every 6 h. Seven days after planting (7 DAP), the most uniform seedlings were transplanted to six separate hydroponic desktop nutrient film technique (NFT) systems (eight plants each, two systems for each cultivar; CropKing, Lodi, OH), with 0.60 g/L CHEMGRO, 0.60 g/L magnesium sulfate, and 0.37 g/L calcium nitrate, with deionized water as the solvent. The PPFD and photoperiod established at 5 DAP were continued for a total grow-out of 28 days. All plants grew in ambient room conditions (~23 °C, ~45% RH when not under the humidity covers).


Fig. 10. Raw hyperspectral data collected from SPOT (left), after the radiance correction is applied (center), and after the reflectance correction is
applied (right). Representative reflectance panel pixel (red) and plant pixel (blue) show appropriate scaling. (For interpretation of the references to
color in this figure legend, the reader is referred to the web version of this article.)

Fig. 11. Normalized reflectance spectra for the ‘Outredgeous’ (‘O’), ‘Rex’ (‘R’), and ‘Skyphos’ (‘S’) cultivars at 7 DAP (20220926) and 14 DAP (20221003) vs. wavelength (nm). Averages (solid color) and standard deviations (translucent) are shown. Clear spectral differences between ‘Outredgeous’ plants and the other two cultivars are apparent, and the shape of the spectra appears unique to each cultivar.

Hyperspectral data collection

At 7 DAP (transplant) and 14 DAP, hyperspectral scans of plants were taken in SPOT and the radiometric calibration process described under the Radiometric Calibration heading was applied (Fig. 10).
Once the reflectance correction is applied, the scan undergoes semantic segmentation to mask out non-plant pixels; the remaining plant spectra are aggregated into a single comma-separated value (CSV) file, where they are vector-normalized. The spectra are then ready to be used as predictors for evaluating plant responses to stressors, nutrient applications, etc. (Fig. 11).
While ‘Skyphos’ is a red lettuce cultivar, its canopy had not received enough radiation by 14 DAP to express anthocyanins. As such, two of the cultivars, ‘Rex’ and ‘Skyphos’, have normalized reflectance spectra that overlap when plotted together. However, the shapes of each of the spectra appear to be unique to the cultivar. To elucidate the effect of shape, delta-normalized difference spectral index plots (differences of normalized difference spectral indices [NDSI] from the overall mean) are calculated for all plants. The NDSIs are computed as (bx − by)/(bx + by) for all combinations of waveband values bx and by, and the resulting plots are shown in Fig. 12. Example LiDAR and thermal data are shown in Fig. 13.
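A numpy sketch of this delta-NDSI computation is shown below, assuming each cultivar's vector-normalized spectra are available as pixel-by-band arrays; the dictionary layout and synthetic data are assumptions for illustration.

```python
import numpy as np

def ndsi_matrix(mean_spectrum):
    """All-pairs NDSI: (b_x - b_y) / (b_x + b_y) for every waveband pair."""
    b = mean_spectrum
    return (b[:, None] - b[None, :]) / (b[:, None] + b[None, :])

def delta_ndsi(spectra_by_cultivar):
    """Return each cultivar's NDSI matrix minus the across-cultivar mean."""
    per_cultivar = {name: ndsi_matrix(s.mean(axis=0))
                    for name, s in spectra_by_cultivar.items()}
    overall = np.mean(list(per_cultivar.values()), axis=0)
    return {name: m - overall for name, m in per_cultivar.items()}

# Example with synthetic spectra (pixels x bands) for the three cultivars.
rng = np.random.default_rng(0)
data = {c: rng.random((200, 270)) + 0.1 for c in ("Rex", "Skyphos", "Outredgeous")}
deltas = delta_ndsi(data)  # each value is a symmetric 270 x 270 matrix
```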


Fig. 12. Delta-normalized difference spectral index (NDSI) values are estimated by first calculating the NDSIs as (bx − by)/(bx + by) for all combinations of waveband values bx and by, followed by differencing by the average NDSI across all cultivars. Values are displayed as an upper diagonal matrix, as all matrices are symmetric. While the normalized spectral reflectance plots show similar shapes between cultivars, the NDSI plots clearly demonstrate biochemical differences between cultivars.


Fig. 13. Example of LiDAR scan from Intel RealSense L515 (Bibb lettuce, left) and thermal image from FLIR Vue Pro R (‘Outredgeous’ red romaine
lettuce, right). (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

LiDAR and thermal data collection

LiDAR data are acquired with the Intel RealSense L515 and thermal images are taken with the FLIR Vue Pro R. As both sensors are
staring arrays, data may be taken independently of hyperspectral data by positioning the sensors above the scanning region and
initiating data collection. To collect and save data with the RealSense, open-source, plug-and-play code to export point clouds as .PLY files is available from www.intelrealsense.com. Data collection for the FLIR Vue Pro R can be initiated through the free FLIR UAS phone application (Android and iOS). Once the Vue Pro R is plugged in and powered via a standard desktop connection, the application connects to the camera via Bluetooth and can be triggered as needed. The Vue Pro R saves images to an onboard micro-SD card as 14-bit TIFF files.

Ethics statements

This work did not involve human or animal subjects.

CRediT authorship contribution statement

Stephen Lantin: Writing – original draft, Writing – review & editing, Methodology, Investigation, Formal analysis, Software, Data
curation, Validation, Visualization. Kelli McCourt: Writing – original draft. Nicholas Butcher: Visualization, Data curation. Varun
Puri: Software. Martha Esposito: Investigation. Sasha Sanchez: Investigation. Francisco Ramirez-Loza: Investigation. Eric
McLamore: Conceptualization, Supervision, Project administration, Funding acquisition. Melanie Correll: Conceptualization, Su­
pervision, Project administration, Funding acquisition. Aditya Singh: Conceptualization, Supervision, Project administration,
Funding acquisition, Resources, Visualization.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to
influence the work reported in this paper.

Acknowledgments

This work was supported by UF IFAS [Equipment and Infrastructure Award] and USDA-NIFA Hatch [FLA-ABE-006356]. Funding
for graduate students is provided by the UF Graduate School [Graduate Student Funding Award (UF ABE Pathfinders Fellowship)] and
the NASA Space Technology Graduate Research Opportunity (NSTGRO21) [80NSSC21K1257].


References

[1] J.N. Pelton, S. Madry, S. Camacho-Lara (Eds.), Handbook of Satellite Applications, Springer New York, New York, NY, 2013.
[2] F.J. Pierce, P. Nowak, Aspects of Precision Agriculture, Adv. Agron. 67 (1999) 1–85.
[3] P. Song, J. Wang, X. Guo, W. Yang, C. Zhao, High-throughput phenotyping: Breaking through the bottleneck in future crop breeding, Crop J. 9 (3) (2021)
633–645, https://doi.org/10.1016/j.cj.2021.03.015.
[4] C. Granier, et al., PHENOPSIS, an automated platform for reproducible phenotyping of plant responses to soil water deficit in Arabidopsis thaliana permitted the identification of an accession with low sensitivity to soil water deficit, New Phytol. (2005).
[5] J. Fabre, M. Dauzat, V. Nègre, N. Wuyts, A. Tireau, E. Gennari, P. Neveu, S. Tisné, C. Massonnet, I. Hummel, C. Granier, PHENOPSIS DB: an Information System
for Arabidopsis thaliana phenotypic data in an environmental context, BMC Plant Biol. 11 (1) (2011).
[6] Australian Plant Phenomics Facility. https://www.plantphenomics.org.au/.
[7] J. Li, M.A.C. Mintgen, S. D’Haeyer, A. Helfer, H. Nelissen, D. Inzé, S. Dhondt, PhenoWell®—A novel screening system for soil-grown plants, Plant-Environ.
Interact. 4 (2) (2023) 55–69, https://doi.org/10.1002/pei3.10098.
[8] C. Reuzeau, J. Pen, V. Frankard, J. De Wolf, R. Peerbolte, W. Broekaert, W. Camp, TraitMill: A Discovery Engine for Identifying Yield-enhancement Genes in
Cereals, Plant Gene and Trait 1 (2010), https://doi.org/10.5376/pgt.2010.01.0001.
[9] Qubit Phenomics. https://qubitphenomics.com/ (2023).
[10] PhenoVation. https://www.phenovation.com/ (2023).
[11] Photon Systems Instruments. https://plantphenotyping.com/pro%E2%80%A6/plantscreen-compact-system/ (2023).
[12] A. Czedik-Eysenberg, S. Seitner, U. Güldener, S. Koemeda, J. Jez, M. Colombini, A. Djamei, The ‘PhenoBox’, a flexible, automated, open-source plant
phenotyping solution, New Phytol. 219 (2) (2018) 808–823.
[13] C. Zhang, M.O. Pumphrey, J. Zhou, Q. Zhang, S. Sankaran, Development of an automated high-throughput phenotyping system for wheat evaluation in a controlled environment, Trans. Am. Soc. Agric. Biol. Eng. (2019).
[14] J. Qin, O. Monje, M.R. Nugent, J.R. Finn, A.E. O’Rourke, R.F. Fritsche, I. Baek, D.E. Chan, M.S. Kim, Development of a hyperspectral imaging system for plant
health monitoring in space crop production, Sens. Agricult. Food Qual. Saf. XIV 12120 (2022) 16–21, https://doi.org/10.1117/12.2618635.
[15] Phenovision. https://www.psb.ugent.be/phenotyping/phenovision. (2023).
[16] S. Warris, R. van de Zedde, High-throughput phenotyping and machine learning applications (2022).
[17] Netherlands Plant Eco-phenotyping Centre. https://www.npec.nl/ (2023).
[18] M.R. Lien, R.J. Barker, Z. Ye, M.H. Westphall, R. Gao, A. Singh, S. Gilroy, P.A. Townsend, A low-cost and open-source platform for automated imaging, Plant
Methods 15 (1) (2019), https://doi.org/10.1186/s13007-019-0392-1.
[19] M. Chelle, Could plant leaves be treated as Lambertian surfaces in dense crop canopies to estimate light absorption? Ecol. Model. 198 (1-2) (2006) 219–228.
[20] H. Feilhauer, G.P. Asner, R.E. Martin, S. Schmidtlein, Brightness-normalized Partial Least Squares Regression for hyperspectral data, J. Quant. Spectrosc. Radiat.
Transf. 111 (12-13) (2010) 1947–1957.
[21] M. Brechner, A.J. Both, Hydroponic Lettuce Handbook (2013).

Stephen Lantin is a Ph.D. student and NASA Space Technology Graduate Researcher in the Agricultural & Biological Engineering Department
at the University of Florida. His current research focuses on hyperspectral imaging and crop modeling, with applications in space agriculture,
Earth-based controlled environment agriculture, and ecosystem dynamics. Prior to matriculating, Stephen received a B.S. in Chemical Engineering from the University of California, Santa Barbara in 2019 and has previously worked in technology transfer and designing testbeds to
characterize electric propulsion technology for NASA.
