
This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.

IEEE GEOSCIENCE AND REMOTE SENSING LETTERS

Progressive Band Selection Processing of Hyperspectral Image Classification

Meiping Song, Chunyan Yu, Member, IEEE, Hongye Xie, and Chein-I Chang, Life Fellow, IEEE

Abstract—This letter introduces a new approach to hyperspectral image classification (HSIC), called progressive band selection processing of hyperspectral image classification (PBSP-HSIC), which performs classification in multiple stages in the sense that each stage performs HSIC progressively according to a specifically selected band subset. Interestingly, PBSP-HSIC offers a rare view of how different classes are classified in progressive stages, which has never been explored in the past. The experimental results also show that PBSP-HSIC performs better than HSIC using full bands.

Index Terms—Band selection (BS), class classification priority (CCP), hyperspectral image classification (HSIC), p-ary Huffman coding tree (HCT), progressive band selection processing of hyperspectral image classification (PBSP-HSIC).

Manuscript received August 11, 2019; revised October 26, 2019; accepted November 11, 2019. The work of M. Song was supported by the National Nature Science Foundation of China under Grant 61601077. The work of C. Yu was supported by the National Nature Science Foundation of Liaoning Province under Grant 20170540095. (Corresponding author: Chunyan Yu.)
M. Song, C. Yu, and H. Xie are with the Center for Hyperspectral Imaging in Remote Sensing (CHIRS), Information and Technology College, Dalian Maritime University, Dalian 116026, China (e-mail: smping@163.com; yucy@dlmu.edu.cn; 109942371@qq.com).
C.-I Chang is with the Remote Sensing Signal and Image Processing Laboratory, Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250 USA (e-mail: cchang@umbc.edu).
Color versions of one or more of the figures in this letter are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/LGRS.2019.2953525
1545-598X © 2019 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.

I. INTRODUCTION

MANY new approaches to hyperspectral image classification (HSIC) have been developed in recent years, such as spectral-spatial HSIC [1], integration of locality-sensitive discriminant analysis with group sparse representation for HSIC [2], multiscale superpixel-level subspace-based support vector machines for HSIC [3], noise-robust HSIC via multiscale total variation [4], fusion of multiple edge-preserving operations for HSIC [5], a subpixel detection approach to HSIC [6], a mixed-pixel classification approach to HSIC [7], and a statistical detection theory approach to HSIC [8]. Since various classes generally have their own unique spectral characteristics responding to different wavelengths, it is important to take advantage of the bands that are crucial to individual classes for their classification. To address this issue, this letter develops a new approach, called progressive band selection processing of hyperspectral image classification (PBSP-HSIC). Although many band selection (BS) methods have been developed for HSIC, the proposed PBSP-HSIC is quite different from existing BS methods. For example, Liu et al. [9] used spectral partitioning to perform BS, which is not used in PBSP-HSIC. Most importantly, two key ideas distinguish PBSP-HSIC from all existing BS-based HSIC techniques. One is that all the BS techniques developed for HSIC are carried out in a one-shot operation, in the sense that all classes are classified by a classifier simultaneously, not progressively. The other is that the progressive classification profile resulting from PBSP-HSIC allows users to see how a class is classified incorrectly to begin with but is eventually corrected in subsequent stages, or the other way around. Such a scenario cannot be observed in one-shot classification, which only shows the final classification results with no regard to what happens during classification. As a result, when a misclassification error occurs, there is no clue as to how it happens and which class causes it. This is particularly important for environmental monitoring, where contaminated areas may migrate slowly. The HSIC approaches currently considered in the literature do not take this into account.

To realize PBSP-HSIC, the main idea is to design a criterion that can measure the class classification priority (CCP) for each class of interest, so that all the classes can be prioritized according to their priority scores calculated by CCP. Then, CCP can be further used to calculate CCP probabilities, which serve two purposes. One is to find a desired band subset for each class for its own classification. The other is to navigate classification to be performed progressively based on CCP-ranked classes along with their CCP-selected band subsets.

First of all, we define a criterion derived from the signal-to-noise ratio (SNR) to calculate CCP scores for all classes in terms of probabilities, referred to as CCP probabilities. The higher the CCP probability assigned to a class, the higher the priority of the class to be classified. To classify the various classes progressively, a p-ary Huffman code is specifically designed to encode all the classes as codewords according to their CCP-calculated probabilities, where p is the number of codeword alphabets used for encoding. Thus, the longer the codeword assigned to a class, the lower the priority of the class to be classified. These p-ary Huffman codewords are then used to construct a CCP-based p-ary Huffman coding tree (HCT), which serves as a guide to navigate progressive classification, where each layer of the p-ary HCT is considered a stage that performs classification of the classes specified by the codewords at that particular layer. Accordingly, classification can be carried out in a top-down fashion, stage by stage, using a p-ary HCT from the highest CCP level to the lowest. The depth of the tree is the number of stages required for progressive classification. Since different classes are classified in different stages, each stage requires a different set of selected bands

to respond to the different classes to be classified at that stage. To address this issue, a recently developed BS method for HSIC [10], called the class signature-constrained background-suppressed (CSCBS) approach, is used to further develop class signature-constrained band prioritization-based band selection (CSCBP-BS), which selects appropriate bands for each stage. Finally, a new iterative classification technique, called iterative linearly constrained minimum variance (ILCMV), also developed in [7], is used as the classifier, since it was shown to perform better than many other existing classifiers. Therefore, ILCMV is used to derive progressive band selection processing of ILCMV (PBSP-ILCMV), which implements a CCP-constructed p-ary HCT using bands selected by CSCBP-BS to perform PBSP-HSIC. CSCBP-BS is chosen for BS because both ILCMV and CSCBS are derived from the same concept of LCMV [11].

Finally, it should be noted that PBSP is very different from progressive BS [12], because the latter does not possess the capability of progressively selecting bands to perform classification of different classes via an HCT.

II. CCP AND p-ARY HCT CONSTRUCTION

Assume that there are M classes of interest, {C_i}_{i=1}^{M}, and that μ_i is the sample mean of the ith class, C_i. We can use the global sample correlation matrix R to define a CCP criterion as follows:

ρ_R^CCP(C_i) = μ_i^T R^{-1} μ_i,  with R = (1/N) Σ_{j=1}^{N} r_j r_j^T    (1)

where {r_j}_{j=1}^{N} are the N data sample vectors. This is indeed the spectral angle between μ_i and R^{-1} μ_i. Now, if we further consider R^{-1} μ_i as the ith class-suppressed mean and assume that the data are zero-mean and uncorrelated with R = σ² I, where σ² is the variance and I is the identity matrix, then (1) becomes μ_i^T R^{-1} μ_i = ‖R^{-1/2} μ_i‖² = ‖μ_i‖²/σ², which turns out to be the SNR. For this reason, CCP can be considered an extended SNR. Thus, the higher the SNR, the higher the priority.

Using (1), we can calculate the CCP probability of the ith class, C_i, as follows:

p_i^CCP = p^CCP(C_i) = ρ_R^CCP(C_i) / Σ_{j=1}^{M} ρ_R^CCP(C_j)    (2)

which can be used to prioritize C_i according to the magnitude of the mean μ_i matched with its suppressed class mean, ‖R^{-1/2} μ_i‖. This implies that the higher the value of p^CCP(C_i), the less chance C_i has of being misclassified. By taking advantage of (2), the well-known optimal variable-length coding method in information theory, Huffman coding [13], is then used to construct a coding tree for progressive HSIC.

Suppose that the source alphabet space of a source S is specified by the classes of interest, {C_i}_{i=1}^{M}, i.e., each class is specified by a source alphabet, with the source alphabet probabilities {p_i^CCP = p^CCP(C_i)}_{i=1}^{M} given in (2). Now, assume that the codeword alphabet space used to encode S is given by Σ_p = {0, 1, 2, ..., p − 1}. Then, each of the codewords assigned to {C_i}_{i=1}^{M} is actually a p-ary representation. It should be noted that p determines the p-class classification at each layer of a p-ary coding tree. In other words, for each layer of a constructed p-ary coding tree except the last, there are p internal nodes in that particular layer, each of which corresponds to one subset of classes ranked by CCP scores, for p-class classification. The depth of the coding tree is given by d = ⌈log_p M⌉. Since PBSP-HSIC is progressive, the best value of p is the largest value that makes the depth d as small as possible while keeping d greater than or equal to 2.

Constructing a p-ary HCT with p > 2 is not trivial, because the p-ary HCT may not be complete, in the sense that there may not be sufficient codewords to fill all nodes and leaves of the tree. In this case, unencoded nodes must be leaves and must not be internal nodes.

Let q be the number of source alphabets in the first set to be combined, reducing the original probabilities {p_1, p_2, ..., p_n} to {p_1^(1), p_2^(1), ..., p_{n^(1)}^(1)}. According to [13, eqs. (11.2) and (11.3), p. 338], q must satisfy the following constraints:

q ∈ {2, 3, ..., p}  and  n ≡ q mod (p − 1)    (3)

which means that n = m(p − 1) + q for some nonnegative integer m. The dummy variables are then {x_1, x_2, ..., x_{p−q}}, and in this case n^(1) = m(p − 1) + 1. As an example, for n = 6 and p = 3, n can be expressed by (3) as n = 2 · 2 + 2 with m = 2 and q = 2. This implies that the number of dummy variables is p − q = 3 − 2 = 1.

III. PBSP-HSIC VIA A p-ARY HCT

As shown in Section II, a CCP-constructed p-ary HCT can navigate HSIC to be performed at each internal node in a progressive manner, where CSCBP-BS developed in [10] is used to find an appropriate band subset for each layer of the p-ary HCT for stage-by-stage classification. The resulting HSIC is called PBSP-HSIC, which implements four key steps: 1) calculation of CCP; 2) construction of a p-ary HCT; 3) CSCBP-BS to select appropriate band subsets; and 4) ILCMV developed in [7] to perform multiclass classification at each stage. Detailed step-by-step implementations of these four steps are described in Algorithm 1.

It should be noted that the key idea is (4), which takes advantage of the ILCMV classification maps produced by PBSP to correct misclassification errors (as will be illustrated by the experiment in Fig. 3). Fig. 1 depicts a graphical diagram of PBSP-HSIC implementing ILCMV using a constructed p-ary HCT in conjunction with bands selected by CSCBP-BS. For example, at the 1st layer, corresponding to the 1st stage, three leaves highlighted by red circles are classified, while a large circle highlighted in blue is a subtree representing undesired classes which will not be classified at this particular stage. At the last stage, all the leaves are single classes and are classified all together to complete the classification.

IV. EXPERIMENTS

The image used for the experiments is the Purdue Indiana Indian Pines scene shown in Fig. 2(a), with its ground truth in Fig. 2(b), available on the website [14]. It is a well-known Airborne Visible Infrared Imaging Spectrometer (AVIRIS)

Fig. 1. Graphical diagram of implementing PBSP-HSIC.
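The CCP-based HCT construction depicted in Fig. 1 can be sketched in a few lines. The following Python snippet is a minimal illustration, not the authors' implementation: it pads the class probabilities with zero-probability dummy symbols so that every merge combines exactly p nodes (the padding count agrees with the p − q dummy variables of (3)), then repeatedly merges the p least probable nodes and reads the p-ary codewords off the resulting tree.

```python
import heapq

def pary_huffman(probs, p):
    """Build p-ary Huffman codewords for class probabilities `probs`.
    Zero-probability dummy symbols are added so that repeated p-way
    merges end in a single root; dummies never receive codewords."""
    n = len(probs)
    pad = (1 - n) % (p - 1)                    # dummy symbols, cf. eq. (3)
    heap = [(q, i) for i, q in enumerate(probs)]
    heap += [(0.0, n + j) for j in range(pad)]
    heapq.heapify(heap)
    children, next_id = {}, n + pad
    while len(heap) > 1:
        group = [heapq.heappop(heap) for _ in range(p)]   # p least probable
        children[next_id] = [node for _, node in group]
        heapq.heappush(heap, (sum(q for q, _ in group), next_id))
        next_id += 1
    codes = {}
    def walk(node, prefix):                    # read codewords off the tree
        if node in children:
            for digit, child in enumerate(children[node]):
                walk(child, prefix + str(digit))
        elif node < n:                         # skip dummy leaves
            codes[node] = prefix
    walk(heap[0][1], "")
    return codes

# the letter's Section II example: n = 6 source symbols, p = 3 -> one dummy
codes = pary_huffman([0.3, 0.25, 0.2, 0.1, 0.1, 0.05], 3)
```

As in the letter, low-probability classes sink to longer codewords (deeper layers, later stages), while high-probability classes surface near the root; the digit assignment within each merge is arbitrary in Huffman coding, so the exact codewords may differ from those in Table I.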

TABLE I
Priority Probabilities of CCP of 16 Classes in Purdue's Indian Pines Calculated by (1) and (2) With R Including BKG
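The CCP probabilities of the kind tabulated in Table I follow directly from (1) and (2). A minimal NumPy sketch (synthetic data and illustrative names, not the Indian Pines statistics):

```python
import numpy as np

def ccp_probabilities(X, class_means):
    """CCP per (1)-(2): rho_i = mu_i^T R^{-1} mu_i with the global
    sample correlation matrix R = (1/N) sum_j r_j r_j^T, normalized
    into probabilities over the M classes."""
    N = X.shape[0]
    R = (X.T @ X) / N                    # (L, L) sample correlation matrix
    R_inv = np.linalg.inv(R)
    # rho_i = mu_i^T R^{-1} mu_i for every class mean at once
    rho = np.einsum("ij,jk,ik->i", class_means, R_inv, class_means)
    return rho / rho.sum()               # eq. (2): CCP probabilities

# synthetic example: 1000 pixel vectors, 5 bands, 3 class means
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
means = rng.normal(size=(3, 5))
p_ccp = ccp_probabilities(X, means)
ranking = np.argsort(-p_ccp)             # highest CCP = highest priority
```

The resulting ranking feeds the Huffman encoding of Section II; in the letter's experiments M = 16 and, as noted in Section IV, the probabilities turn out to be inversely related to class size.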

Algorithm 1 PBSP-HSIC
1. Use virtual dimensionality (VD) [15] to determine n_BS.
2. Use (2) to construct a p-ary HCT.
3. At each layer, there are terminal nodes, which are leaves, and internal nodes, which are roots of subtrees. In this layer, only the leaves, which correspond to single classes, are classified, while the internal nodes are passed on to the next layer.
4. Find the mean of each class at each leaf. In the meantime, also find the means of all classes in the subtrees at the internal nodes, obtained by averaging all class means in the subtrees.
5. At the kth layer, use CSCBP-BS to find an appropriate band subset.
6. Apply ILCMV to perform multiclass classification at this layer as a stage process. Let the resulting LCMV classification map be denoted by LCMV-map^(k).
7. Find

   ILCMV-map^(k) = LCMV-map^(k) ∪ ((LCMV-map^(k))^c ∩ LCMV-map^(k−1))    (4)

   where (LCMV-map^(k))^c is the complement set of LCMV-map^(k) and ILCMV-map^(1) = LCMV-map^(1).
8. Continue the same process until it reaches the last layer of the p-ary HCT.

Fig. 2. Purdue's Indiana Indian Pines scene. (a) Band 186 (2162.56 nm). (b) Ground truth map.

image scene and has a size of 145 × 145 pixel vectors. It was recorded in June 1992 with 220 bands, including water absorption bands (bands 104-108, 150-163, and 220). The number of bands, n_BS, required to be selected for this scene was estimated by VD [15] to be 18.

First of all, the sample means {μ_i}_{i=1}^{16} of all 16 classes were used to calculate the CCP probabilities via (1). It should be noted that there is no need for training samples for ILCMV, since ILCMV uses the class means obtained from the ground truth to constrain the classes of interest rather than training samples. Table I tabulates the 16-class CCP probabilities {p_i^CCP = p^CCP(C_i)}_{i=1}^{16} along with their assigned priorities. These calculated CCP probabilities were then used to produce the four-ary Huffman code found in parentheses in Table I and to construct the four-ary HCT in Fig. 1, with p = 4, m = q = 4, and p − q = 0, where p = 4 was chosen as the largest value that makes the depth d as small as possible while keeping d greater than or equal to 2. Interestingly, the CCP tabulated in Table I is inversely proportional to class size. That is, the four classes with the highest priorities are those with sample sizes less than 100, as opposed to the two classes with the least priorities, which correspond to the two largest classes, class 11 and class 2, each with more than 1000 data samples.

At each layer of the four-ary HCT in Fig. 1, only the leaves marked in red are codewords assigned to classes which will be classified at that particular layer, while the yellow internal nodes are codewords representing subtrees, which will be further processed at the next layer. In this case, each of the leaves corresponds to one particular class. On the other hand, the number of yellow internal nodes actually determines the number of times that classification must be performed by ILCMV for p classes. For example, in Fig. 1, there are five yellow nodes, which determine that

Fig. 3. Three layers/stages processed by PBSP-ILCMV using four-ary HCT and ILCMV without PBSP for Purdue Indian Pines.
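The stage-by-stage fusion that produces the maps in Fig. 3 is the set operation (4): keep the pixels newly labeled at stage k and, on the complement of the stage-k map, retain the labels carried over from the previous stage. Read as label arrays, this is a one-line sketch (the array names and toy values are illustrative; 0 denotes "not yet classified"):

```python
import numpy as np

def fuse_stage_maps(prev_map, new_map):
    """Eq. (4) on label maps: where the stage-k map assigns a class,
    take it; elsewhere (its complement) keep the previous labels."""
    return np.where(new_map != 0, new_map, prev_map)

# toy 1-D "maps" (the real ones are 145 x 145 for Indian Pines)
stage1 = np.array([1, 0, 0, 16, 0, 0])     # stage 1: classes 1 and 16
stage2 = np.array([0, 9, 7, 0, 0, 13])     # stage 2: later-priority classes
fused = fuse_stage_maps(stage1, stage2)    # -> [1, 9, 7, 16, 0, 13]
```

Iterating this fusion down the HCT yields ILCMV-map^(3), the final classification; a pixel relabeled at a later stage is exactly the kind of correction highlighted in the zoom-in windows of Fig. 3.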

TABLE II
Bands Selected by CSCBP-BS for Each Subset of Classes of the Purdue Indiana Indian Pines

ILCMV must be performed progressively five times, and each time the classes at the leaves highlighted in red are classified. At the 1st layer, two leaves specified by class 1 and class 16, assigned codewords 0 and 1, are classified first, and two yellow internal nodes, assigned codewords 2 and 3, represent two subtrees that specify two joint sets of classes, {7, 9, 13, 8, 5, 15} and {6, 4, 12, 14, 10, 3, 2, 11}, respectively, both of which are further processed at the 2nd layer. At the 2nd layer, six leaves specified by six classes, 7, 9, 13, 8, 5, 15, assigned codewords 20, 21, 22, 30, 31, 32, are classified, whereas the two yellow internal nodes, assigned codewords 23 and 33, represent two subtrees that specify two joint sets of classes, {6, 4, 12, 14} and {10, 3, 2, 11}, respectively, to be processed at the next layer. Finally, at the 3rd layer, a total of eight leaves specified by eight classes, 12, 6, 4, 14, 10, 3, 2, 11, assigned codewords 230, 231, 232, 233, 330, 331, 332, 333, with four leaves from each subtree, remain to be classified to complete PBSP-HSIC. Then, CSCBP-BS was implemented to find the desired bands listed in Table II with n_BS = 18. Now, using the four-ary HCT constructed in Fig. 1 as well as the bands selected by CSCBP-BS in Table II, both ILCMV and PBSP-ILCMV were implemented, where the parameters used by ILCMV are the same as in [7].

Fig. 3 describes the three-stage PBSP-ILCMV using the four-ary HCT stage by stage, along with ILCMV without PBSP in a one-shot operation. As shown in Fig. 3, PBSP-ILCMV performed classification in three stages: classes 1 and 16 were classified in the 1st stage, shown by ILCMV-map^(1) = LCMV-map^(1); classes 7, 9, 13, 8, 5, 15 in the 2nd stage, shown by LCMV-map^(2) along with

ILCMV-map^(2) = ILCMV-map^(1) ∪ ((LCMV-map^(2))^c ∩ LCMV-map^(1))

which combined the classification results of the eight classes {1, 16, 7, 9, 13, 8, 5, 15}; and classes 6, 4, 12, 14, 10, 3, 2, 11 in the 3rd and last stage, shown by LCMV-map^(3) along with the final classification ILCMV-map^(3) given by

ILCMV-map^(3) = ILCMV-map^(2) ∪ ((LCMV-map^(3))^c ∩ LCMV-map^(2)).

Fig. 3 offers at least two immediate benefits of PBSP-HSIC. One is to track changes in classified data samples. For example, in the 2nd stage, there were 11 pixels of the 3rd class, C3, misclassified into class 9, as shown in a zoom-in window of ILCMV^(2), but these misclassified pixels were later corrected, as shown in the 3rd-stage classification by ILCMV^(3). This scenario cannot be observed in one-shot classification. The other is to allow users to calculate classification accuracy and precision rates progressively. To see how this works, Table III calculates P_A (accuracy) for each of the 16 classes produced by ILCMV without PBSP and by PBSP-ILCMV using the four-ary HCT in Fig. 3, where PBSP-ILCMV produced higher P_A values than ILCMV without PBSP for every class except class 13.

Table IV also calculates P_OA^SR, P_OA^CCP, P_PR^SR, P_PR^CCP, and P_AA, where PBSP-ILCMV performed better than ILCMV without PBSP by producing higher values of P_OA^SR, P_AA^SR, and P_PR^SR, where average accuracy (AA), overall accuracy (OA), and precision (PR) are defined in [8] with weights given by the sample ratio (SR) and the CCP probabilities.

Also, as shown in Table IV, PBSP-ILCMV actually produced a P_OA^CCP better than P_OA^SR by 2%, with P_PR^CCP nearly the same as P_PR^SR, differing by 8 × 10^−4. The most important results in Table IV are the last two columns, which calculate P_OA^CCP and P_PR^CCP progressively according to CCP for each of the stages. For example, in the 1st stage, P_OA^CCP and P_PR^CCP for classifying classes 1 and 16 were 99.28% and 100%, respectively, and

TABLE III
P_A Calculated From the Classification Results of Purdue's Data

TABLE IV
P_OA^SR, P_AA^SR, P_PR^SR, P_OA^CCP, and P_PR^CCP Calculated by ILCMV Without PBSP and Progressive Layer/Stage-by-Layer/Stage PBSP-ILCMV for Purdue's Data
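One plausible reading of the weighted measures reported in Table IV (the exact definitions are in [8]; the weighting shown here is an assumption for illustration only): overall accuracy is a weighted sum of per-class accuracies, with weights taken either as sample ratios (SR) or as CCP probabilities.

```python
import numpy as np

def weighted_overall_accuracy(per_class_acc, weights):
    """Weighted OA: sum_i w_i * PA_i with weights normalized to 1.
    This weighting scheme is an assumed reading of [8], shown only
    to illustrate how SR- and CCP-weighted scores can differ."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(np.dot(w, per_class_acc))

pa = np.array([0.99, 0.95, 0.80])        # illustrative per-class accuracies
sizes = np.array([50, 100, 1000])        # illustrative class sample counts
sr_oa = weighted_overall_accuracy(pa, sizes)             # SR weights
ccp_oa = weighted_overall_accuracy(pa, [0.5, 0.3, 0.2])  # CCP weights
```

Since CCP upweights small, high-priority classes (which PBSP classifies accurately in early stages), a CCP-weighted score can exceed the SR-weighted one, consistent with the P_OA^CCP versus P_OA^SR gap reported below.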

95.25% and 100% for classes 7, 9, 13, 5, 8, 15, respectively, for the 2nd stage. At the final stage (i.e., the 3rd stage), P_OA^CCP and P_PR^CCP for classifying the last eight classes, 12, 6, 4, 14, 10, 3, 2, 11, were 96.25% and 99.99%, respectively. For overall performance, P_OA^CCP = 98.19% and P_PR^CCP = 99.91%, which were higher than ILCMV by 3% in P_OA and nearly the same in P_PR. These experiments demonstrated the benefits of using PBSP-ILCMV for HSIC. The same experiments were also performed on other image scenes [14], such as Salinas and University of Pavia, and their conclusions are similar. Due to limited space, those results are not included here.

V. CONCLUSION

This letter develops PBSP-HSIC, which allows users to progressively process classes in multiple stages, where each stage performs classification of selected classes according to their priorities calculated by CCP. The calculated CCP probabilities are then used to construct a p-ary HCT, which further navigates PBSP-HSIC to perform stage-by-stage classification progressively, where each stage performs the classification of the p-class subsets located at its corresponding layer of the p-ary HCT. With the CCP-calculated probabilities factored into P_OA and P_PR, PBSP-HSIC provides progressive classification profiles of class subsets in multiple stages, a task which cannot be offered by traditional HSIC. Although the PBSP-HSIC proposed in this letter focuses on a specific classifier and BS method, its idea can also be applied to other classifiers and BS techniques.

REFERENCES

[1] P. Ghamisi et al., "New frontiers in spectral-spatial hyperspectral image classification: The latest advances based on mathematical morphology, Markov random fields, segmentation, sparse representation, and deep learning," IEEE Geosci. Remote Sens. Mag., vol. 6, no. 3, pp. 10-43, Sep. 2018.
[2] H. Yu, L. Gao, W. Li, Q. Du, and B. Zhang, "Locality sensitive discriminant analysis for group sparse representation-based hyperspectral imagery classification," IEEE Geosci. Remote Sens. Lett., vol. 14, no. 8, pp. 1358-1362, Aug. 2017.
[3] H. Yu, L. Gao, W. Liao, B. Zhang, A. Pižurica, and W. Philips, "Multiscale superpixel-level subspace-based support vector machines for hyperspectral image classification," IEEE Geosci. Remote Sens. Lett., vol. 14, no. 11, pp. 2142-2146, Nov. 2017.
[4] P. Duan, X. Kang, S. Li, and P. Ghamisi, "Noise-robust hyperspectral image classification via multi-scale total variation," IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens., vol. 12, no. 6, pp. 1948-1962, Jun. 2019.
[5] P. Duan, X. Kang, S. Li, P. Ghamisi, and J. A. Benediktsson, "Fusion of multiple edge-preserving operations for hyperspectral image classification," IEEE Trans. Geosci. Remote Sens., to be published.
[6] B. Xue et al., "A subpixel target detection approach to hyperspectral image classification," IEEE Trans. Geosci. Remote Sens., vol. 55, no. 9, pp. 5093-5114, Sep. 2017.
[7] C. Yu, B. Xue, M. Song, Y. Wang, S. Li, and C.-I Chang, "Iterative target-constrained interference-minimized classifier for hyperspectral classification," IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens., vol. 11, no. 4, pp. 1095-1117, Apr. 2018.
[8] C.-I Chang, "Statistical detection theory approach to hyperspectral image classification," IEEE Trans. Geosci. Remote Sens., vol. 57, no. 4, pp. 2057-2074, Apr. 2019.
[9] Y. Liu, J. Li, P. Du, A. Plaza, X. Jia, and X. Zhang, "Class-oriented spectral partitioning for remotely sensed hyperspectral image classification," IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens., vol. 10, no. 2, pp. 691-711, Feb. 2017.
[10] C. Yu, M. Song, Y. Wang, and C.-I Chang, "Class signature-constrained background-suppressed approach to band selection for classification of hyperspectral images," IEEE Trans. Geosci. Remote Sens., vol. 57, no. 1, pp. 14-31, Jan. 2019.
[11] C.-I Chang, Hyperspectral Imaging: Techniques for Spectral Detection and Classification. Norwell, MA, USA: Kluwer, 2003.
[12] K.-H. Liu, H.-C. Chien, M.-H. Lu, and S.-Y. Chen, "Progressive sample processing of band selection for hyperspectral image transmission," Remote Sens., vol. 10, no. 3, p. 367, 2018.
[13] R. J. McEliece, Theory of Information Coding. Cambridge, U.K.: Cambridge Univ. Press, 2004.
[14] Hyperspectral Remote Sensing Scenes. [Online]. Available: http://www.ehu.eus/ccwintco/index.php?title=Hyperspectral_Remote_Sensing_Scenes
[15] C.-I Chang, "A review of virtual dimensionality for hyperspectral imagery," IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens., vol. 11, no. 4, pp. 1285-1305, Apr. 2018.
