Band Selection For Hyperspectral Images Based On Parallel Particle Swarm Optimization Schemes
Abstract— Greedy modular eigenspaces (GME) have been developed for the band selection of hyperspectral images (HSI). GME attempts to greedily select uncorrelated feature sets from HSI. Unfortunately, it is hard for GME to find the optimal set by greedy operations except through exhaustive iterations, whose long execution time has been the major drawback in practice. Accordingly, finding an optimal (or near-optimal) solution is very expensive. In this study we present a novel parallel mechanism, referred to as parallel particle swarm optimization (PPSO) band selection, to overcome this disadvantage. It makes use of a new particle swarm optimization scheme, a well-known method for solving optimization problems, to develop an effective parallel feature extraction for HSI. The proposed PPSO improves the computational speed by using parallel computing techniques, including the compute unified device architecture (CUDA) of the graphics processing unit (GPU), the message passing interface (MPI) and open multi-processing (OpenMP). These parallel implementations can fully utilize the significant parallelism of the proposed PPSO to create a set of near-optimal GME modules on each parallel node. The experimental results demonstrate that PPSO can significantly reduce the computational load and provide a more reliable quality of solution compared to GME. The effectiveness of the proposed PPSO is evaluated using MODIS/ASTER airborne simulator (MASTER) HSI for band selection during the Pacrim II campaign.

I. INTRODUCTION

High-dimensional spectral imaging benefits from advances in sensor technologies and a growing number of spectral bands. High-dimensional remote sensing data obtained from multispectral, hyperspectral and even ultraspectral sensors generally provide rich spectral information for data analysis. Such data cover a tremendous range of applications, from satellite remote sensing imaging and surveillance monitoring systems to industrial product inspections and medical imaging examinations. Researchers worldwide report difficulties arising from the intrinsic complexity of these data. Consequently, determining the most useful and valuable information has become essential. In response, a technique known as greedy modular eigenspaces (GME) [1] has been developed for the band selection of hyperspectral datasets. GME attempts to greedily select a near-optimal unique band (feature) set from hyperspectral images (HSI). Unfortunately, it is hard to find the optimal set by the greedy reordering operations of GME except through exhaustive iterations. The long execution time of these exhaustive iterations has been the major drawback in practice. Accordingly, finding a near-optimal solution is very expensive.

Recently, a new simulated annealing band selection (SABS) approach [2] based on GME was reported to improve the performance of GME. It selects sets of non-correlated hyperspectral bands according to the simulated annealing (SA) algorithm and utilizes the inherent separability of different classes in HSI to reduce dimensionality. As a common technique in metallurgy, SA denotes the slow-cooling melt behavior in the formation of hardened metals. It is a general-purpose optimization technique which can find optimal or near-optimal solutions. SABS is an optimization technique based on SA to select an optimal set of feature bands from HSI.

However, the long execution time of SABS has been the major drawback in practice. Numerous studies have been devoted to parallel SA algorithms. Due to the constraint that a single Markov chain (MC) must be adjusted in the parallel mechanism, only limited SA parallelism was exploited [3]. Recently, a new parallel SA method, referred to as a parallel simulated annealing (PSA) band selection approach [4], was introduced to overcome this disadvantage. It improves the computational speed by using parallel computing techniques. It allows multiple MCs to be traced simultaneously and fully utilizes the significant parallelism embedded in SABS to create a set of PSA modules on each parallel node.

In this paper we further present a novel parallel band selection approach, referred to as parallel particle swarm optimization (PPSO), for HSI. The approach is based on the particle swarm optimization (PSO) [5] scheme. PSO is a relatively new swarm intelligence algorithm originally designed to solve nonlinear function optimization and neural network training problems [5]. It is inspired by the social behavior of organisms such as bird flocking. The PSO scheme is currently among the best-known methods for solving heuristic optimization problems in swarm intelligence [6]. These properties make PPSO a powerful method for solving optimization problems. Compared to PSA, PPSO […] The PSO velocity and position updates are

v_{id}(t+1) = u \cdot v_{id}(t) + c_1 \cdot \mathrm{rand}() \cdot (p_{id} - x_{id}(t)) + c_2 \cdot \mathrm{rand}() \cdot (p_{gd} - x_{id}(t)),   (1)
x_{id}(t+1) = x_{id}(t) + v_{id}(t+1),   (2)

where u is the inertia weight, c_1 and c_2 are the acceleration coefficients, p_{id} is the personal best of particle i in dimension d, and p_{gd} is the global best in dimension d.
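The per-particle update of Eqs. (1)–(2) can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's parallel implementation; the function name `pso_step` and the list-based encoding of a particle are assumptions, while the default parameters c1 = c2 = 2 and inertia weight u = 0.4 follow the values shown in the PPSO flowchart (Fig. 2).

```python
import random

def pso_step(x, v, p_best, g_best, u=0.4, c1=2.0, c2=2.0):
    """One PSO update for a single particle, per Eqs. (1)-(2):
    v_id(t+1) = u*v_id(t) + c1*rand()*(p_id - x_id(t)) + c2*rand()*(p_gd - x_id(t))
    x_id(t+1) = x_id(t) + v_id(t+1)
    x, v, p_best, g_best are equal-length lists over the dimensions d.
    """
    new_v = [u * vi
             + c1 * random.random() * (pb - xi)
             + c2 * random.random() * (gb - xi)
             for xi, vi, pb, gb in zip(x, v, p_best, g_best)]
    new_x = [xi + vi for xi, vi in zip(x, new_v)]
    return new_x, new_v
```

In a full swarm, this step would run inside the particle and dimension loops of Fig. 2, with `p_best` refreshed whenever a particle improves its own cost and `g_best` set to the best `p_best` across the swarm.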
[Figure: flowchart — START; set c1 = c2 = 2, the initial inertia weight to 0.4 with multiplier u ∈ [1.0001, 1.0005] (e.g. 1.0002), and cost = cost_0; for t = 1 to the number of generations T, loop over particles i = 1 … #_of_particles I: if E > 0, update P_gd, otherwise keep P_id(t+1) = P_id(t); set P_gd(t+1) = best of P_id(t); END.]

Fig. 2. The flowchart of the proposed PPSO scheme.

[Figure: (a.) DRR (%) and (b.) VCA of PPSO and PSA plotted against thresholds of CC from 0.75 to 0.92.]

Fig. 3. The comparisons between PPSO and PSA for (a.) DRR and (b.) VCA with different thresholds of CC.

particle and dimension loops. The initial statuses are also […]

cost = \sum_{l=1}^{n_k} m_l^2 \times c_2,   (3)

[Figure: CE (1.1–1.3) of PPSO and PSA plotted against thresholds of CC.]

Fig. 4. The efficiency comparisons between PPSO and PSA with different thresholds of CC.
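The cost function of Eq. (3) sums the squared module sizes over the n_k GME modules, scaled by c_2. A minimal sketch follows; the names `module_cost` and `module_sizes` and the default value of `c2` are illustrative assumptions, not identifiers from the paper.

```python
def module_cost(module_sizes, c2=1.0):
    """Eq. (3): cost = sum over the n_k modules of m_l^2 * c_2,
    where module_sizes[l] = m_l is the number of bands in module l."""
    return sum(m * m for m in module_sizes) * c2
```

Under this reading, grouping bands into many small modules is cheaper than a few large ones, since the cost grows quadratically in each module size.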
[Figure: cost (150–230) versus number of times (0–50) for NIPSO, PEPSO and APSO (left) and for PPSO and PSA (right).]

Fig. 5. The cost comparisons among NIPSO, PEPSO and APSO with 50 operations.

Fig. 6. The cost comparisons between PPSO and PSA with 50 operations.