
https://doi.org/10.20965/ijat.2021.p0313

Technical Paper:

Forest Data Collection by UAV Lidar-Based 3D Mapping:
Segmentation of Individual Tree Information from 3D Point Clouds
Taro Suzuki∗1,†, Shunichi Shiozawa∗2, Atsushi Yamaba∗3, and Yoshiharu Amano∗4

∗1 Chiba Institute of Technology
2-17-1 Tsudanuma, Narashino, Chiba 275-0016, Japan
† Corresponding author, E-mail: taro@furo.org
∗2 Terra Drone Corporation, Tokyo, Japan
∗3 Forestry Research Center, Hiroshima Prefectural Technology Research Institute, Miyoshi, Japan
∗4 Waseda University, Tokyo, Japan

[Received November 11, 2020; accepted February 12, 2021]

In this study, we develop a system for efficiently measuring detailed information on trees in a forest environment using a small unmanned aerial vehicle (UAV) equipped with light detection and ranging (lidar). The main purposes of forest measurement are to predict the volume of wood for harvesting and to delineate forest boundaries by tree location. Herein, we propose a method for extracting the positions, number, and vertical heights of trees from a set of three-dimensional (3D) point clouds acquired by a UAV lidar system. The point cloud obtained from a UAV is dense at the tree crowns, whereas the trunk 3D points are sparse because the crowns obstruct the laser beam. It is therefore difficult to extract single-tree information from these 3D point clouds, whose characteristics differ significantly from those of conventional point clouds acquired by ground-based laser scanners. In this study, we segment the forest point cloud into three regions with different point densities, i.e., canopy, trunk, and ground, and process each region individually to extract the target information. Comparing the proposed method against a ground laser survey in an actual forest environment, 94.6% of the trees in an area measuring 100 m × 100 m were correctly detected. The root mean square error of the tree position is 0.3 m, whereas that of the vertical height is 2.3 m, indicating that single-tree information can be measured with sufficient accuracy for forest management.

Keywords: remote sensing, 3D point cloud, UAV, segmentation

1. Introduction

Japan is a heavily forested country, with forests constituting 68% of its land area; 41% of the forest area is artificial forest of needle-leaved trees. Many of such artificially expanded forests, which are currently at their harvest time, are left unharvested because they do not generate any profit. As aged trees degrade, they not only become vulnerable to typhoon damage but also lose their water-retention capacity, contributing to natural disasters such as landslides. In addition, Japan's declining birthrate and young people's increasing exodus to urban areas have rapidly reduced the forestry workforce, while efficient forest management and measurement remain unaddressed. From the perspective of sustainable forest management, individual management, in which single trees are managed individually, would be more suitable than areawide forest management [1]. However, as most forests are distributed in mountainous areas, measuring single trees individually on sloped terrain over a wide range of mountains imposes a significant burden on workers.

Meanwhile, small unmanned aerial vehicles (UAVs) have been applied to measuring objects from the sky as an affordable and easy alternative to conventional airborne measurements using aircraft. In particular, three-dimensional (3D) measurements using light detection and ranging (lidar) are expected to replace conventional photographic measurements [2]. In measuring a forest with ground-based lidar, single trees are often obstructed by branches and leaves, making it difficult to acquire tree crown point clouds, and workers are compelled to repeatedly move their measuring devices; hence, the burden on workers remains an issue to be addressed [3, 4].

The aim of this study is to build a system that efficiently measures single-tree information for artificial forest management. Fig. 1 shows the tree information used in this study. The position of a tree is defined at the trunk 1.2 m above the ground (the terrain on the mountain side in the case of a slope), with the trunk's cross-sectional center taken as the location of the tree in the forest. The vertical distance measured from the top of the tree to the terrain is known as the vertical height. The tree locations used for the land's boundary information, the number of trees used as an index for thinning, and
Int. J. of Automation Technology Vol.15 No.3, 2021 313

© Fuji Technology Press Ltd. Creative Commons CC BY-ND: This is an Open Access article distributed under the terms of
the Creative Commons Attribution-NoDerivatives 4.0 International License (http://creativecommons.org/licenses/by-nd/4.0/).
Suzuki, T. et al.

Fig. 1. Target single-tree information to be extracted from 3D point cloud using proposed method (labels: top, tree canopy, number of trees, position of tree, vertical height, 1.2 m).

the vertical heights used to estimate tree volumes provide the most important information for single-tree measurements. As the distance between standing trees in a well-maintained forest is approximately 2–4 m, we aim to estimate the horizontal positions of single trees with an accuracy of 0.5 m or less.

In this study, we developed a small UAV equipped with lidar and created 3D point clouds of a wide range of forests, including their absolute coordinates. We further propose a method for processing these 3D point clouds to extract the numbers, locations, and vertical heights of trees. Three-dimensional point clouds acquired by a small UAV from the sky are particularly sparse in the trunk region, unlike those measured using conventional ground-based lidar. Moreover, as a small UAV flies at a low altitude, significant differences in density occur between the point clouds immediately under the flight path and those distant from it, owing to the difference in the laser's angle of incidence. Therefore, in this study, we propose a single-tree segmentation method for 3D point clouds acquired using a small UAV equipped with lidar. We expect the proposed method to reduce measurement time and mitigate the burden on workers. Furthermore, we evaluated the proposed method by comparing its measurements with single-tree information acquired by ground-based lidar in an actual forest.

2. Related Studies

The conventional forest measurement and management method uses ground-based lidar [3]. It can acquire 3D point clouds as dense as 100,000 or more points per square meter, enabling not only measurement of the locations, heights, and trunk diameters of trees, but also determination of tree shapes. Furthermore, it can measure the height of a tree with an accuracy of 0.25–1.5 m by defining a tree top as the maximum point in the planar region estimated from terrain point clouds acquired near it [4]. However, tree height measurement using ground laser is often hindered by branches and leaves, which make it difficult to acquire tree top point clouds, so vertical heights cannot be measured correctly. Moreover, ground laser can only measure a limited range at a time.

In the currently prevailing airborne laser surveys, high-accuracy lidar mounted on medium- or large-sized aircraft irradiates the terrain to measure a wide range [5–9]. Such surveys can measure the digital surface model (DSM) of the branches and leaves by irradiating laser directly from the sky, as well as the digital terrain model (DTM) from point clouds that pass through the branches and leaves to reach the ground. Hence, the digital canopy height model can be calculated from the difference between the DSM and DTM. In areas where the forests to be measured are distinctly bounded, approximate quantities of trees can be measured using airborne surveys. However, because the laser is irradiated straight down from a high altitude, the trunk point clouds can barely be acquired, and the tree crown point clouds are as sparse as four points per square meter. Therefore, the number of trees cannot be determined correctly, nor can individual information on each tree be measured; these issues remain to be addressed [10].

Visible-light and near-infrared images taken by cameras have also been applied to forest surveys. As cameras are relatively lightweight compared with lidars, they are used not only in aircraft but also in small UAVs [11–13]. A DSM can be created by applying the structure-from-motion method to images photographed continuously from the sky; however, DTM information cannot be obtained because it is difficult to acquire terrain and trunk information from such images.

UAV laser surveys using UAVs equipped with lidar have recently garnered increasing attention. They survey forests from the sky, thereby improving work efficiency compared with conventional single-tree forest measurement using ground laser. Furthermore, they can acquire detailed point clouds by flying at low altitudes over forests [14]. Despite the many operational advantages of laser surveys by small UAVs, they are not used extensively for forest surveys. To create precise 3D point clouds using a UAV equipped with lidar, the position and attitude of the lidar in flight must be estimated with high accuracy. Conventionally, a composite system comprising a global navigation satellite system (GNSS) and an inertial navigation system (INS) has been used for this purpose [15]. Because a composite GNSS/INS system is expensive and requires initializing flights, it poses problems when operating in mountainous areas with limited space. In a few previous studies [14, 16–18], forests were measured using a UAV equipped with GNSS and lidar; however, the measurements did not include detailed single-tree data.

Our contributions from the current study are as follows:



• We develop a UAV equipped with multiple GNSS receivers and lidar and build a 3D point cloud mapping system capable of measuring forests with an accuracy of 10 cm.

• To measure single trees from 3D point clouds that are characteristically uneven in density and sparse at the trunk, we propose a method for extracting composite tree information by segmenting each tree into a tree crown and a trunk.

• Considering the shape characteristics of trees, we propose a method to cluster their 3D point clouds using vertical compression.

• We acquire data from an actual forest environment to comparatively evaluate single-tree measurements obtained by ground lidar and by the proposed method using a UAV equipped with lidar.

3. 3D Mapping System

3.1. Overview

Figure 2 shows the developed hexacopter-type UAV equipped with multiple GNSS antennas. We used a Matrice 600 from DJI Inc. as the UAV platform. The mapping system, comprising multiple GNSS receivers and lidar, was isolated from the UAV via a passive damper.

Fig. 2. UAV 3D mapping system with multiple GNSS receivers and lidar.

Figure 3 shows an enlarged photograph of the mapping system. It comprises six GNSS antennas installed approximately 1.8 m apart from each other; it uses the VLP-16 from Velodyne Inc. as the lidar; it uses the NEO-M8T from u-blox Inc., a single-frequency GNSS receiver capable of outputting the GNSS carrier wave phase, as the receiver for real-time kinematic (RTK)-GNSS; and it uses the TW2712 from Tallysman Inc. for its six GNSS antennas. All the installed equipment, including the GNSS receivers, antennas, and lidar, weighs approximately 3.5 kg in total [19].

Fig. 3. Lidar (Velodyne VLP-16) and GNSS receivers (u-blox NEO-M8T) mounted on the UAV.

The developed system, which combines RTK-GNSS with multiple low-cost single-frequency GNSS receivers and antennas, can estimate the UAV's position and attitude with high accuracy. The use of multiple GNSS antennas and RTK-GNSS processing between them enables the 3D attitude of the UAV to be measured directly from the geometrical arrangement of the antennas, without any expensive INS. Whereas one GNSS receiver is typically used to estimate the position of a UAV, the developed system uses six GNSS receivers, exploiting the redundancy of the installed receivers to improve the accuracy of RTK-GNSS with low-cost single-frequency receivers. Moreover, because the system measures the UAV's 3D attitude directly from the multiple GNSS antennas, it obviates the initial flights that conventional similar systems require to estimate attitude. The developed system is therefore well suited to measurements in mountainous areas with limited space. For more details, please refer to [20].

3.2. 3D Point Clouds

Figure 4 shows an example of the 3D point cloud of a forest acquired using the developed UAV, with points colored by elevation. This example comprises approximately 30 million points in a 100-m square, acquired by the UAV flying at an altitude of 60 m. Fig. 5 shows cross-sectional views of this point cloud. Because the laser reached the ground surface only through clearances between the trees and leaves, the acquired ground-surface points are sparse. Furthermore, as the trunk points are extremely sparse compared with the crown points, the density of the acquired point cloud differs among trees. Next, we describe the proposed method for estimating the locations, numbers, and vertical heights of trees from 3D point clouds acquired by a UAV equipped with lidar.



Fig. 4. Example of 3D point cloud in forest acquired by proposed UAV 3D mapping system.

Fig. 5. Cross-sectional view of acquired forest point cloud.

4. Proposed Method

4.1. Overview

The algorithm used for extracting single-tree information from point clouds acquired by ground laser identifies a single tree by fitting its crown, modeled as a cone, and its trunk, modeled as a column [3]. Meanwhile, the point clouds acquired by a small UAV irradiating laser from the sky are characteristically dense at the tree crowns and sparse at the trunks, because the laser is obstructed by the crowns. In many cases, some of the trunk point cloud data are simply unavailable, and it is difficult to extract a single tree by model-fitting such point clouds. Therefore, in this study, we segmented the forest's 3D point clouds into tree crown, tree trunk, and terrain regions, whose point densities differ from each other, and processed the three regions separately. Fig. 6 shows the flow of extracting single-tree information from the 3D point clouds.

Fig. 6. Proposed measurement flow for number, position, and vertical height of single trees from forest 3D point cloud (3D mapping by UAV lidar → terrain point cloud extraction by progressive morphological filter → segmentation of tree point cloud into canopy and trunk by principal component analysis and B-spline fitting → B-spline terrain model, local maximum computation for tree tops, trunk segmentation using vertical compression → estimation of vertical height and integration of tree canopy and trunk information → vertical height, 3D location, number of trees).

We first segmented the points in the terrain region and then defined the curvilinear surface passing through the bottom face of the tree crowns via B-spline fitting initialized by principal component analysis. Hence, we segmented the tree 3D point cloud, excluding the terrain points, into crown-region and trunk-region points. We extracted tree top information from the crown-region points, and tree locations from the trunk-region points via the segmentation method using vertical compression; subsequently, we integrated this information to calculate the numbers, locations, and vertical heights of the trees.

4.2. Tree Top Detection

First, we extracted the terrain-region points from the acquired 3D point cloud. To detect the points that pass through the tree crowns and trunks to reach the terrain, we removed the points of ground objects other than the terrain (non-terrain points), such as trees, using the progressive morphological filter proposed by Zhang et al. [21]. The progressive morphological filter removes non-terrain points by a morphological opening operation, which removes ground objects whose width is smaller than the window region w. Ground objects of different widths can be removed more precisely by gradually increasing the window width from a small value to a large one, processing the points sequentially, and distinguishing ground objects from the terrain based on height changes between adjacent points. Subsequently, to calculate the intersection between the vertical line from each tree top and the terrain, which determines the vertical height, we created a curvilinear surface representing the terrain by B-spline fitting [22] to the terrain points.
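As a rough illustration of the terrain extraction step above, the following Python sketch implements a simplified Zhang-style progressive morphological filter. This is not the authors' implementation: the rasterization to a minimum-elevation grid, the parameters `cell`, `max_window`, and `slope_thresh`, and the window progression are illustrative choices, and scipy's `grey_opening` stands in for the opening operation.

```python
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(points, cell=0.5, max_window=16,
                                     slope_thresh=0.5):
    """Sketch of a progressive morphological ground filter.

    points: (N, 3) array of x, y, z. Returns a boolean mask of terrain points.
    """
    # Rasterize the cloud to a minimum-elevation grid.
    xy = ((points[:, :2] - points[:, :2].min(axis=0)) / cell).astype(int)
    shape = xy.max(axis=0) + 1
    surface = np.full(shape, np.inf)
    np.minimum.at(surface, (xy[:, 0], xy[:, 1]), points[:, 2])
    # Fill empty cells so the opening has finite values to work with.
    surface[np.isinf(surface)] = np.nanmax(surface[~np.isinf(surface)])

    ground = np.ones(len(points), dtype=bool)
    window = 3
    while window <= max_window:
        # Opening removes objects narrower than the current window.
        opened = grey_opening(surface, size=(window, window))
        # Height-difference threshold grows with the window width.
        thresh = slope_thresh * window * cell
        diff = points[:, 2] - opened[xy[:, 0], xy[:, 1]]
        ground &= diff < thresh
        surface = opened
        window = 2 * window - 1  # progressively enlarge the window
    return ground
```

With a flat synthetic ground and a single tall "tree," the mask keeps the ground points and rejects the tree points; real slopes would need a more careful slope-adaptive threshold.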



Fig. 7. Estimation of boundary plane of crown and trunk 3D points from 3D tree point cloud.

In the B-spline fitting, we determined a B-spline curvilinear surface that fits the input point cloud through repeated calculations using point distance minimization, which minimizes the residual distance between the created surface and the input points [22–24]. The B-spline fitting was performed as follows [22, 23]:

1. An appropriate initial shape of the B-spline surface was specified.

2. The distance from every input point to the surface, as well as the error function, was calculated.

3. The control points minimizing the error function were identified, and the B-spline surface was updated.

4. Steps 2 and 3 were repeated until the prespecified error threshold was reached.

Next, the entire-tree point cloud, excluding the terrain points, was segmented into crown-region and trunk-region points. In mountainous areas with diverse terrain, the shape of the curvilinear surface separating the tree crowns from the trunks is complex; therefore, in this study, we determined this surface using B-spline fitting.

First, we downsampled the entire-tree point cloud in advance using a voxel grid filter [25]. The voxel grid filter divides the point cloud by an arbitrary grid size and replaces the points in each grid cell with a single point at their center of gravity.

Next, we took the third principal component vector from the principal component analysis of the entire-tree point cloud as the normal vector and used the plane passing through the center of gravity of the point cloud as the initial plane for the B-spline fitting. As the crown of a needle-leaved tree can be assumed to be cone shaped, the density of points at the bottom of the crown region is the highest across the forest, and the plane obtained by the principal component analysis approximates the average plane of the tree bottom faces (light gray line in Fig. 7). Fig. 8 shows an example of the plane estimated via principal component analysis in an actual forest; the plane passes slightly above the bottom of the tree crowns.

Fig. 8. Example of initial plane estimated using PCA.

Next, using the abovementioned B-spline fitting, we estimated a curvilinear surface separating the tree crowns from the trunks (gray line in Fig. 7). With the plane acquired via principal component analysis as the initial shape, we estimated the curvilinear surface that minimizes the distance residuals to the entire-tree point cloud. Because the density of points near the bottom of the tree crowns is higher than that at the trunks and tree tops, as described above, the final surface passes through the trees' bottom faces.

Figure 9 shows an example of the boundary surface separating the crowns from the trunks, estimated from actual forest point cloud data. As shown, B-spline fitting can estimate such a curvilinear surface. Setting the tree bottom face determined by the B-spline fitting as the boundary, we defined the points above it as the crown region and those below it as the trunk region.

Fig. 9. Example of boundary plane of crown and trunk 3D points via B-spline fitting.

Finally, we detected each tree top as the local maximum point in the height direction within the region diameter D. If the region diameter D is too large and contains two or more trees, the top of the shorter tree may fail to be detected. To avoid this, we set the region diameter D to the minimum distance between trees in the forest. Using the abovementioned processes, we extracted the tree tops from the acquired 3D point cloud.
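To make the surface-fitting idea above concrete, the sketch below substitutes scipy's `SmoothBivariateSpline` for the iterative point-distance-minimizing B-spline fit described in the paper (the smoothing factor `s` loosely plays the role of the error budget); the terrain points and the tree-top coordinates are synthetic, and all names here are illustrative. It also shows the vertical height as the difference between a tree-top elevation and the fitted terrain surface.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Hypothetical terrain points: a tilted plane with small noise.
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 100, (2, 400))
z = 0.1 * x + 0.05 * y + rng.normal(0.0, 0.05, 400)

# Smoothing-spline surface as a stand-in for the iterative
# point-distance-minimizing B-spline fit; s bounds the residual sum.
terrain = SmoothBivariateSpline(x, y, z, kx=3, ky=3, s=400 * 0.05**2)

# Vertical height of a hypothetical tree top at (x0, y0, z_top):
# the vertical line from the top intersected with the terrain surface.
x0, y0, z_top = 50.0, 50.0, 30.0
vertical_height = z_top - float(terrain.ev(x0, y0))
```

For the true plane, the terrain elevation at (50, 50) is 7.5 m, so the recovered vertical height should be close to 22.5 m under these assumptions.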


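A minimal sketch of the two geometric steps described above, on assumed synthetic data: `initial_plane` computes the PCA-based initial plane (third principal component as the normal, plane through the centroid), and `detect_tree_tops` finds local height maxima within the region diameter D using a KD-tree. Both function names are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def initial_plane(points):
    """Plane through the centroid whose normal is the 3rd principal
    component (least-variance direction) of the point cloud."""
    centroid = points.mean(axis=0)
    # Rows of vt are principal directions; the last has least variance.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt[2]

def detect_tree_tops(points, D=1.0):
    """Indices of points that are the highest within horizontal radius
    D/2 (region diameter D). Ties on flat areas are both reported."""
    tree = cKDTree(points[:, :2])
    tops = []
    for i, p in enumerate(points):
        neighbors = tree.query_ball_point(p[:2], D / 2.0)
        if points[neighbors, 2].max() <= p[2]:
            tops.append(i)
    return np.array(tops)
```

On a toy cloud with two cone-like trees a few meters apart, `detect_tree_tops` returns exactly the two apex points; the O(N·k) loop is for clarity, not speed.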

4.3. Trunk Detection

In forest measurement using ground-based laser, tree trunks are detected by fitting columns or circular arcs to the entire-tree 3D point cloud [26]. In forest measurement by UAV, however, the laser hits the tree trunks from the sky at angles too shallow to generate sufficient reflection intensity for acquiring trunk points. Moreover, because the laser beam is obstructed by the tree crowns, the trunk point clouds become even sparser. Hence, the abovementioned measuring method is not suitable for detecting tree trunks here.

Therefore, in this study, we segmented the entire-tree point cloud into crown-region and trunk-region points, and additionally segmented the trunk-region points into one segment per trunk based on Euclidean cluster extraction [27]. Euclidean cluster extraction is a representative clustering method that classifies adjacent points within a radius ε into the same segment. As described in the preceding section, the crown points were separated from the trunk points by setting the B-spline curvilinear surface as the boundary. In cases where a tree was extremely low compared with its neighbors, its bottom face might not completely correspond to the boundary surface, so crown points remained in the trunk region. Fig. 10 shows crown points remaining in the trunk region; points in the crown and trunk regions are indicated in gray and black, respectively.

Fig. 10. Segmentation of trunk points including crown points.

Therefore, exploiting the characteristic of needle-leaved trees that their trunks grow vertically, we processed the point cloud such that its density in the height direction becomes virtually dense, so that adjacent trunks can be separated appropriately into different segments. Fig. 11 shows the flow of the proposed segmentation method.

Fig. 11. Flow of segmentation of trunk 3D points (input trunk point cloud → voxel grid filter → vertical compression and clustering → vertical decompression).

First, to prevent the horizontally dense crown points remaining in the trunk region from being merged into the same segment as an adjacent trunk, we downsampled the trunk-region points using a voxel grid filter. We used the minimum distance D between trees in the measuring range as the downsampling grid size, the largest size that still prevents adjacent trunk points from falling into the same grid cell and being filtered together.

Subsequently, the points aligned in the vertical direction were compressed by scaling their vertical coordinates by 1/d so that the points of one trunk fall within the clustering radius of the same segment. We set the compression coefficient d to satisfy Eq. (1), such that the compressed vertical distance between points is smaller than the segmentation radius ε while the horizontal distance between adjacent trunks remains larger than ε:

D/d < ε < D, ∴ d > D/ε . . . . . . . . . . . . . . (1)

The Euclidean cluster extraction of the vertically compressed point cloud enables trunk points that are sparse in the height direction to be gathered into the same segment. After the trunk points were segmented, their vertical coordinates were decompressed (multiplied by d), completing the trunk detection.

In the proposed method, it is important to segment the points into crown-region and trunk-region points in advance. Fig. 12 shows an example of segmenting the entire-tree point cloud with the proposed method. If the trunks are detected by voxel downsampling and vertical compression without first separating the crown points from the trunk points, some points remain between adjacent trunks after the vertical compression, and the segmentation fails to separate the trunks from each other. The trunks can be detected accurately by separating the crown-region points from the trunk-region points and thereby reducing the crown points remaining in the trunk region.

Fig. 12. Examples of trunk segmentation failures.
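The trunk segmentation flow above (voxel grid filter, vertical compression by 1/d with d > D/ε from Eq. (1), Euclidean clustering with radius ε, decompression) can be sketched as follows. The helper names are hypothetical, and a simple stack-based single-linkage clustering stands in for PCL-style Euclidean cluster extraction.

```python
import numpy as np
from scipy.spatial import cKDTree

def voxel_downsample(points, grid):
    """Replace all points in each grid cell with their centroid."""
    keys = np.floor(points / grid).astype(int)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()
    counts = np.bincount(inv).astype(float)
    out = np.zeros((inv.max() + 1, 3))
    for k in range(3):
        out[:, k] = np.bincount(inv, weights=points[:, k]) / counts
    return out

def segment_trunks(points, D=1.0, eps=0.5):
    """Vertical compression + Euclidean clustering; returns one label
    per point. Compression coefficient chosen per Eq. (1): d > D/eps."""
    d = 2.0 * D / eps            # any d > D/eps satisfies Eq. (1)
    compressed = points.copy()
    compressed[:, 2] /= d        # squash each trunk into a compact blob
    tree = cKDTree(compressed)
    labels = np.full(len(points), -1)
    cur = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = cur
        stack = [i]
        while stack:                       # flood-fill within radius eps
            j = stack.pop()
            for k in tree.query_ball_point(compressed[j], eps):
                if labels[k] == -1:
                    labels[k] = cur
                    stack.append(k)
        cur += 1
    # `points` is untouched, so no explicit decompression is needed.
    return labels
```

On two synthetic vertical trunks 2 m apart with 1-m vertical point spacing, the clustering yields exactly one label per trunk; in practice `voxel_downsample(points, D)` would be applied first, as in the text.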



4.4. Estimating the Numbers, Locations, and Vertical Heights of Trees

We estimated the number, location, and vertical height of the trees by combining the detection results for the tree tops and trunk segments. First, each tree top was associated with a trunk segment by nearest-neighbor search via Delaunay triangulation [28]. Subsequently, we counted a tree top and a trunk segment as the same tree if the residual of their horizontal coordinates was within the minimum distance D between trees. Furthermore, if only the tree top or only the trunk segment was detected for a tree, it was still counted as a single tree. Hence, we can detect trees of inferior height whose tops are obstructed by the crowns of other trees, as well as trees whose trunk points cannot be acquired in dense areas.

As the horizontal position of a tree is defined by its breast-height position at 1.2 m above the ground, when the trunk segment is successfully detected, the point nearest to 1.2 m above the ground should ideally be selected from the detected trunk segment as the tree's estimated position. In this study, however, assuming that the trees stand nearly upright, we regarded the center of gravity of the trunk segment's points as the tree's horizontal position.

As the vertical height is defined by the vertical distance from the top of the tree to the ground surface, we calculated the vertical distance from the point determined as each tree top to the terrain polygon.

Using the abovementioned processes, we extracted the numbers, locations, and vertical heights of trees from forest 3D point clouds acquired by a small UAV equipped with lidar.

5. Experiments

5.1. Overview

On September 5, 2017, we conducted forest measuring experiments in the mountainous area of Mount Kanmuri in Hatsukaichi City, Hiroshima Prefecture. Fig. 13 shows an aerial photograph of the experimental environment. The measurement object was an artificial forest of Japanese cedar trees approximately 25 m in vertical height; they were planted around 1965 and qualitatively thinned to approximately 600 trees/ha. In a measurement area of approximately 200 m × 200 m, the UAV automatically flew approximately 60 m above the ground at a speed of 2.0 m/s for 9 min 51 s along the path indicated in white in Fig. 13. As reference data for the measured trees, we used the locations, numbers, and vertical heights measured by ground-based laser in 2014 as true values. We evaluated the proposed method by comparing the UAV-borne measurements with the reference data over an area of 100 m × 100 m, indicated by the black square in Fig. 13. The ground-based laser measures trees with accuracies of 0.01 m in position and 1.1 m in vertical height.

Fig. 13. UAV flight path and reference data acquired using ground-based laser scanner.

5.2. Experimental Results

5.2.1. 3D Point Clouds

Figure 13 also shows the laser point cloud acquired in the experiments (white points). The number of acquired points was approximately 30 million, and the point cloud density was approximately 1,300 points/m². Fig. 14 shows an example of the 3D point cloud of a single tree extracted from the entire point cloud.

Fig. 14. Examples of 3D point cloud of single tree acquired using proposed UAV 3D mapping system.

The tree crown points were extremely dense compared with the trunk points. In regions with dense trees, the points of the trunk below the tree crown and those of the ground were sparse. It was difficult to apply



any cylindrical model, as used in the ground laser measurements, to trunk point clouds that were discontinuous owing to the different tree densities.

5.2.2. Detection Results of Tree Tops and Trunks
Figure 15 shows the tree top point clouds detected using the local maximization method after separating the point clouds by the B-spline curvilinear surface that constitutes the boundary between the tree crown and trunk regions. The minimum distance between trees D in the verification area was 1.0 m. Fig. 16 shows the point clouds in the trunk region created by removing the tree crown point clouds and terrain point clouds, followed by segmentation performed using the proposed method. Because the maximum diameter of the trunks to be measured was 0.7 m, we set the radius ε of the distance between points placed in the same segment to 0.8 m and the depth d to 10. These detection results prove that the proposed method can detect tree tops and trunks from 3D point clouds with uneven densities.

Fig. 15. Detection of tree top from crown point cloud.

Fig. 16. Extraction and segmentation results of tree trunk point cloud.

5.3. Evaluations
5.3.1. Number of Trees
We evaluated the detection of the number of trees in terms of the following three criteria: correct detection, in which the number of trees is correctly detected; misdetection, in which trees that are not present are detected as trees; and undetection, in which trees that are present are not detected. Table 1 shows the evaluation results for the detection of trees from the tree top only, from the trunk segment only, and from both the tree top and the trunk, as in the proposed method. Fig. 17 shows the tree locations measured via ground lidar, along with the tree top positions and trunk positions detected using the proposed method.

Among the 514 trees within the 100 m × 100 m area, approximately 60 trees were undetected by the estimation from the tree top information alone or from the trunk information alone. Meanwhile, the proposed method left only 28 trees undetected, which represents an improved detection rate. Among the total number of trees within the reference area, 94.6% were correctly detected. The undetected trees were either trees of inferior height whose crowns were positioned lower than those of the surrounding trees, or trees that were located lower than the boundary set for separating the tree top from the trunk, so that their top point clouds remained in the trunk region. Most of the trees undetected from both the tree top and trunk segments by the proposed method were short and obstructed by other trees; therefore, their point clouds were not sufficiently acquired. Approximately 2.7% of the total trees were misdetected; this occurs when the tree crown point clouds are not sufficiently acquired in a sparse point cloud region and other point clouds within the local searching distance D [m] are scarce.

5.3.2. Location
Among the three trees whose trunk segments were undetected in the evaluation area, the tree top of one tree was detected; hence, we were able to calculate its location. Except for this one tree, whose location was calculated from the tree top, the proposed method utilized the trunk segment's position. We compared the 3D tree locations with the true values acquired by the ground laser and obtained a locational root mean square error (RMSE) of 0.3 m.

5.3.3. Vertical Height
We evaluated the vertical heights only for trees whose tops were detected. As the ground laser measured heights above sea level, including the geoid height, whereas our measurement values were calculated as ellipsoidal heights in the GPS coordinate system, we added the geoid height of the measurement area to the measurement values to obtain heights above sea level. Fig. 18 shows the histogram of the vertical height residuals between the estimated height h_est and the actual measured height h_ref. The average height residual was 1.34 m, the standard deviation was 0.94 m, and the RMSE was 2.28 m.

6. Discussion
The 3D point clouds acquired by the small UAV were dense for the tree crowns and sparse for the trunks because the laser beam was obstructed by the tree crowns. The proposed method segmented the acquired 3D point clouds into tree crown, trunk, and terrain data, extracted tree information from each dataset, and combined such tree information to obtain improved




detection rates. Because most of the undetected tree point clouds belonged to short trees that were obstructed by other trees, the density of the 3D point clouds can be increased, and the detection accuracy further improved, by lowering the UAV's flying altitude and setting its flight path such that lidar data can be acquired from many different angles.

Table 1. Number of detected trees and detection rates.

                     Reference   Top    Trunk   Proposed method (top and trunk)   Detection rate [%]
  Total              514         464    482     510                               99.2
  Correct detection  –           458    474     486                               94.6
  Undetection        –           56     60      28                                5.4
  Misdetection       –           6      8       14                                2.7

Fig. 17. Comparison of estimated locations of trees using the proposed method (○: reference, ×: tree top, +: trunk; axes X [m] and Y [m], from -50 to 50 m).

The proposed method can be used to estimate tree locations with an accuracy sufficient for forest management. As the trunk point clouds are low in density, we calculated the tree locations from the center of gravity of the point clouds segmented as trunk point clouds. If the density of the 3D point clouds can be improved as described above, the location estimation accuracy can be further improved by applying a trunk model, which can also be used for bent trees. Furthermore, improvements in the point cloud density will enable us to estimate tree trunk shapes, a task that used to be difficult to perform without using ground-based lidar.

The low accuracy in measuring the vertical heights of trees may be attributed to the reference data acquired by the ground laser, which often measures the vertical heights at positions lower than the tree tops when it cannot obtain unobstructed views of them. Moreover, as the reference data represent actual values measured more than three years earlier, the Japanese cedar trees measured in this study appear to have grown taller. The annual growth of Japanese cedar, which varies significantly with soil conditions, forest age, tree density, and tree height, is estimated at 0.4–1.3 m. Hence, the vertical heights measured using the proposed method in 2017 were, on average, 1.34 m higher than those measured using the ground laser in 2014.

Fig. 18. Histogram of vertical height residuals.

In 2014, an entire woodland of approximately 6 ha, which included the area investigated in this study, was measured; this task was performed by two persons and required two days to complete. Generally, the work efficiency of measuring forests by laser on the ground is approximately 2.5 ha per day. Ground-based measuring efficiency also depends on the environment; in particular, on sloped ground, the work efficiency declines significantly. The proposed UAV-based method measured the 1 ha measurement area by flying for approximately 15 min, which would otherwise require half a day. Compared with the ground-based laser measurement method, using a UAV reduced the required operation hours by a factor of approximately 24. Hence, the proposed method is superior in terms of work efficiency and productivity.
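The detection rates discussed above follow directly from the counts in Table 1, each expressed as a percentage of the 514 reference trees. A minimal cross-check of that arithmetic (the counts are taken from Table 1; the function and variable names are our own illustration, not from the paper):

```python
# Cross-check of Table 1 detection rates (counts from the paper).
REFERENCE_TREES = 514  # ground-lidar reference count in the 100 m x 100 m area

def rate(count: int, total: int = REFERENCE_TREES) -> float:
    """Detection rate as a percentage of the reference tree count."""
    return round(100.0 * count / total, 1)

correct = rate(486)       # proposed method, correctly detected
undetected = rate(28)     # proposed method, undetected
misdetected = rate(14)    # proposed method, misdetected
total_detected = rate(510)

print(correct, undetected, misdetected, total_detected)  # → 94.6 5.4 2.7 99.2
```

The computed values reproduce the 94.6%, 5.4%, 2.7%, and 99.2% figures reported in Table 1.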




7. Conclusion
Herein, we proposed a system for efficiently measuring detailed tree information in a forest environment using a UAV equipped with lidar. Forest measurements are primarily aimed at estimating tree volumes for thinning and harvesting, as well as at clarifying forest boundaries by tree locations. We proposed a method for extracting the locations, numbers, and vertical heights of trees from 3D point clouds acquired using a UAV equipped with lidar. The point clouds acquired by the small UAV were dense for the tree crowns and sparse for the trunks because the laser beam was obstructed by the tree crowns. This made it difficult to extract single-tree information from the UAV-acquired 3D point clouds, which differ greatly in character from those acquired by a conventional ground-based laser. Therefore, in this study, we segmented the forest 3D point clouds into three regions with different point cloud densities: tree crown, trunk, and terrain. Subsequently, we processed the three regions separately to extract the target information. We compared the measurement results obtained in an actual forest environment using the ground laser with those obtained using the proposed method. The proposed method correctly detected 94.6% of the 514 trees present in an area of 100 m × 100 m. The RMSEs of the tree locations and vertical heights were 0.3 m and 2.3 m, respectively. This proves that the proposed method measures single-tree information with an accuracy sufficient for forest management.
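The locational RMSE quoted above is the root mean square of the 3D Euclidean distances between each estimated tree position and its paired ground-laser reference. A minimal sketch of that computation (our own illustration; the function name, variable names, and toy coordinates are assumptions, not from the paper):

```python
import math

def location_rmse(estimated, reference):
    """RMSE of 3D Euclidean distances between paired tree locations.

    estimated, reference: equal-length sequences of (x, y, z) tuples [m],
    paired one-to-one (i-th estimate vs. i-th reference tree).
    """
    if len(estimated) != len(reference):
        raise ValueError("point lists must be paired one-to-one")
    squared = [
        (ex - rx) ** 2 + (ey - ry) ** 2 + (ez - rz) ** 2
        for (ex, ey, ez), (rx, ry, rz) in zip(estimated, reference)
    ]
    return math.sqrt(sum(squared) / len(squared))

# Toy usage: two trees, each offset 0.3 m from its reference -> RMSE = 0.3 m.
est = [(0.3, 0.0, 10.2), (5.0, 4.7, 12.0)]
ref = [(0.0, 0.0, 10.2), (5.0, 5.0, 12.0)]
print(round(location_rmse(est, ref), 3))  # → 0.3
```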




Name:
Taro Suzuki
Affiliation:
Chief Researcher, Future Robotics Technology Center, Chiba Institute of Technology
Address:
2-17-1 Tsudanuma, Narashino, Chiba 275-0016, Japan
Brief Biographical History:
2015- Assistant Professor, Waseda University
2019- Future Robotics Technology Center, Chiba Institute of Technology
Main Works:
• T. Suzuki, "Time-Relative RTK-GNSS: GNSS Loop Closure in Pose Graph Optimization," IEEE Robotics and Automation Letters, Vol.5, No.3, pp. 4735-4742, July 2020.
Membership in Academic Societies:
• Japan Society of Mechanical Engineers (JSME)
• Robotics Society of Japan (RSJ)
• Society of Instrument and Control Engineers (SICE)

Name:
Shunichi Shiozawa
Affiliation:
Division Manager, Terra Drone Corporation
Address:
2-14-13 Shibuya, Shibuya-ku, Tokyo 150-0002, Japan
Brief Biographical History:
2016 Received Bachelor of Engineering degree from Waseda University
2018 Received Master of Engineering degree from Waseda University
2018- Terra Drone Corporation

Name:
Atsushi Yamaba
Affiliation:
Chief Researcher, Forestry Research Center, Hiroshima Prefectural Technology Research Institute
Address:
4-6-1 Toukaichi-higashi, Miyoshi, Hiroshima 728-0013, Japan
Brief Biographical History:
1996- Forestry Technical Staff, Hiroshima Prefectural Government
2000 Received Ph.D. from Hiroshima University
2004- Researcher, Hiroshima Prefectural Forestry Research Center
Main Works:
• "Forest Ownership Patterns Impacting on Landscape Structure of Vegetation in a Mountainous Farm Village, Western Japan," S. K. Hong and N. Nakagoshi (Eds.), "Landscape Ecology for Sustainable Society," pp. 309-319, Springer, 2017.
Membership in Academic Societies:
• Japanese Forest Society (JFS)
• Japan Forest Engineering Society (JFES)

Name:
Yoshiharu Amano
Affiliation:
Professor, Department of Applied Mechanics and Aerospace Engineering, Waseda University
Address:
17 Kikui-cho, Shinjuku-ku, Tokyo 162-0044, Japan
Brief Biographical History:
2000-2002 Assistant Professor, Research Institute for Science and Engineering, Waseda University
2002-2007 Associate Professor, Research Institute for Science and Engineering, Waseda University
2008- Professor, Faculty of Science and Engineering, Waseda University
2008- Visiting Professor, École Polytechnique Fédérale de Lausanne (EPFL)
Main Works:
• Development of automated energy management system for smart grid through digital communication standards
• Development of autonomous sensing systems of 3D environment for smart society
Membership in Academic Societies:
• Society of Instrument and Control Engineers (SICE)
• American Society of Mechanical Engineers (ASME)
• Japan Society of Mechanical Engineers (JSME)
• Japan Forest Engineering Society (JFES)
