This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/JSEN.2015.2478655, IEEE Sensors Journal
Abstract—Image fusion is the process of generating a single, more informative image from a set of source images. Its major applications are in navigation and military operations, where infrared and visible sensors are used to capture complementary images of the targeted scene. The complementary information in these source images has to be integrated into a single image by a fusion algorithm. The aim of any fusion method is to transfer the maximum amount of information from the source images to the fused image with minimum information loss, while introducing as few artifacts as possible. In this context, we propose a new edge-preserving image fusion method for infrared and visible sensor images. Anisotropic diffusion is used to decompose the source images into approximation and detail layers. The final detail and approximation layers are computed with the help of the Karhunen-Loeve (KL) transform and weighted linear superposition, respectively. The fused image is generated as the linear combination of the final detail and approximation layers. The performance of the proposed algorithm is assessed with the help of Petrovic metrics, and its results are compared with traditional and recent image fusion algorithms. The results reveal that the proposed method outperforms the existing methods.
Index Terms—Anisotropic diffusion, KL transform, edge preserving, imaging sensors.
I. INTRODUCTION
SINGLE-sensor image capture is insufficient to provide complete information about a targeted scene because of sensor system limitations. Multiple captures, using either a single sensor or multiple sensors, are essential to obtain in-depth knowledge about the scene. However, the information from these several captures should be integrated into a single image. Image fusion is the process of merging useful information from several captures of a particular scene into a single image, such that the fused image provides more knowledge about the scene than any individual source capture. Digital photography [24], remote sensing [2], concealed weapon detection [5], helicopter navigation aids [22], medical imaging, navigation and military applications [11], [14], [9], [15], [8] are among the areas in need of image fusion.
In all these applications, image fusion is used to extract information that is not available in any individual source image. In particular, IR-visible image fusion plays a vital role in military and navigation applications such as accurate target identification. Under bad weather conditions, for example after rain or during winter, images captured by visible sensors alone are not sufficient to describe a situation. Visible sensors capture background content such as vegetation and soil in great detail, whereas infrared sensors provide information about foreground objects such as weapons, enemies, and vehicle movements. For the detection and localization of a target, and to improve situational awareness, the information from both infrared and visible images therefore needs to be fused into a single image.
Several fusion methods have been proposed in the literature. Among them, multi-scale decomposition methods (pyramid, wavelet) [16], [22] and data-driven methods [13], [12] are the most successful. However, these methods may introduce artifacts into the fused image. To overcome these problems, optimization-based fusion schemes [24], [25] have been proposed. These methods take multiple iterations to find the optimal solution (the fused image) and, because of these iterations, may over-smooth the fused image.
In addition, edge-preserving image fusion schemes are becoming popular. These methods employ an edge-preserving smoothing filter or process for the purpose of fusion. Popular methods in this class are based on the guided image filter [10], the weighted least squares filter [3], [6], the bilateral filter [20], the cross bilateral filter [8], and 3-D anisotropic diffusion [7]. Most of these methods decompose each source image into base and detail layers; the manipulated base layer, the manipulated detail layer, or both together are then combined to obtain the fused image. Bilateral filter and cross bilateral filter fusion methods produce gradient reversal artifacts in the fused image, whereas the guided image fusion method produces halo effects.
An efficient image fusion scheme should possess three properties.
1) It has to transfer most of the useful information from the source images into the fused image.
2) It should not lose useful information of the source imagery in the fusion process.
3) It should not introduce any artifacts or extraneous information into the fused image.
A new anisotropic diffusion based image fusion (ADF) method using the KL transform is proposed to address the problems of the existing methods while keeping the above properties in mind. Each source image is filtered by the anisotropic diffusion process to extract base and detail layers, and the useful information from the base and detail layers is integrated into the fused image.
1530-437X (c) 2015 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See
http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
Fig. 1. Block diagram of the proposed ADF method: anisotropic diffusion decomposes the source images into base layers; subtracting the base layers from the source images gives the detail layers; detail-layer fusion and base-layer fusion are followed by a linear combination that yields the fused image.

II. ANISOTROPIC DIFFUSION

Anisotropic diffusion smooths an image while preserving its edges according to the partial differential equation

\frac{\partial I}{\partial t} = c(x, y, t)\, \Delta I + \nabla c \cdot \nabla I,    (1)

where c(x, y, t) is the flux function (rate of diffusion), \Delta is the Laplacian operator, \nabla is the gradient operator, and t is the time, scale, or iteration.

We can also term (1) the heat equation. The forward-time-central-space (FTCS) scheme is used to solve this equation. The solution of this PDE is

I^{t+1}_{i,j} = I^{t}_{i,j} + \lambda \left[ c_N \nabla_N I + c_S \nabla_S I + c_E \nabla_E I + c_W \nabla_W I \right]^{t}_{i,j}.    (2)

In (2), I^{t+1}_{i,j} is the coarser-resolution image at scale t + 1, which depends on the previous coarser-scale image I^{t}_{i,j}. \lambda is a stability constant satisfying 0 \le \lambda \le 1/4. \nabla_N, \nabla_S, \nabla_E and \nabla_W are the nearest-neighbor differences in the north, south, east and west directions respectively. They are defined as

\nabla_N I_{i,j} = I_{i-1,j} - I_{i,j}, \quad \nabla_S I_{i,j} = I_{i+1,j} - I_{i,j}, \quad \nabla_E I_{i,j} = I_{i,j+1} - I_{i,j}, \quad \nabla_W I_{i,j} = I_{i,j-1} - I_{i,j}.    (3)

The conduction coefficients are updated at every iteration as a function of the local gradient:

c^{t}_{N\,i,j} = g(\|\nabla_N I^{t}_{i,j}\|), \quad c^{t}_{S\,i,j} = g(\|\nabla_S I^{t}_{i,j}\|), \quad c^{t}_{E\,i,j} = g(\|\nabla_E I^{t}_{i,j}\|), \quad c^{t}_{W\,i,j} = g(\|\nabla_W I^{t}_{i,j}\|).    (4)

Perona and Malik [17] proposed two flux functions:

g(\nabla I) = \frac{1}{1 + \left( \|\nabla I\| / k \right)^2},    (5)

g(\nabla I) = e^{-\left( \|\nabla I\| / k \right)^2},    (6)

where the constant k controls the sensitivity of the diffusion to edges.
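As a concrete illustration, the FTCS update of (2) with the exponential flux function (6) can be written in a few lines of NumPy. This is a minimal sketch, not the authors' reference implementation; the function name is ours, and the default parameter values (t = 10, k = 30, lambda = 0.15) are the ones used in the free-parameter study of Section IV.

```python
import numpy as np

def anisotropic_diffusion(img, niter=10, k=30.0, lam=0.15):
    """Perona-Malik diffusion: FTCS update of Eq. (2) using the
    exponential flux function of Eq. (6)."""
    I = img.astype(np.float64).copy()
    for _ in range(niter):
        # Nearest-neighbour differences of Eq. (3); borders get zero flux.
        dN = np.zeros_like(I)
        dS = np.zeros_like(I)
        dE = np.zeros_like(I)
        dW = np.zeros_like(I)
        dN[1:, :] = I[:-1, :] - I[1:, :]
        dS[:-1, :] = I[1:, :] - I[:-1, :]
        dE[:, :-1] = I[:, 1:] - I[:, :-1]
        dW[:, 1:] = I[:, :-1] - I[:, 1:]
        # Conduction coefficients of Eq. (4) with g from Eq. (6).
        cN = np.exp(-(dN / k) ** 2)
        cS = np.exp(-(dS / k) ** 2)
        cE = np.exp(-(dE / k) ** 2)
        cW = np.exp(-(dW / k) ** 2)
        # FTCS update of Eq. (2); lam must satisfy 0 <= lam <= 1/4.
        I += lam * (cN * dN + cS * dS + cE * dE + cW * dW)
    return I
```

The base layer of a source image I is then the diffused output, and the detail layer is the residual I minus the base layer, as used in the methodology below. Because g decays with gradient magnitude, strong edges diffuse little while low-contrast noise is smoothed away.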
III. METHODOLOGY
Let the source images \{I_n(x, y)\}_{n=1}^{N} be of size p \times q and co-registered. These images are passed through the edge-preserving smoothing anisotropic diffusion process to obtain the base layers:

B_n(x, y) = \mathrm{aniso}(I_n(x, y)),    (7)

where B_n(x, y) is the n-th base layer and \mathrm{aniso}(I_n(x, y)) represents the anisotropic diffusion process applied to the n-th source image (refer to Section II for details). The detail layers are obtained by subtracting the base layers from the source images.
D_n(x, y) = I_n(x, y) - B_n(x, y).    (8)

Fig. 2. (a) IR image (b) Visible image (c) Base layer of IR image (d) Base layer of visible image (e) Detail layer of IR image (f) Detail layer of visible image.

The detail layers are fused with the help of the KL transform. Let \lambda_1 and \lambda_2 be the eigenvalues of the covariance matrix of the two detail layers, and let \lambda_{max} = \max(\lambda_1, \lambda_2). If X_{max} is the eigenvector corresponding to \lambda_{max}, then KL_1 and KL_2 are given by

KL_1 = \frac{X_{max}(1)}{\sum_i X_{max}(i)},    (9)

KL_2 = \frac{X_{max}(2)}{\sum_i X_{max}(i)}.    (10)

The final detail layer is the KL-weighted superposition of the individual detail layers:

D(x, y) = \sum_{n=1}^{N} KL_n\, D_n(x, y).    (11)

The final base layer is the weighted linear superposition of the individual base layers:

B(x, y) = \sum_{n=1}^{N} w_n B_n(x, y),    (12)

where

\sum_n w_n = 1 \quad \text{and} \quad 0 \le w_n \le 1.    (13)

Fig. 3. (a) Final detail layer (11), (b) final base layer (12) of the IR-visible image data set.
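The layer-fusion step of (9)-(13) can be sketched in NumPy as follows. This is an illustrative sketch, not the authors' code: the function names are ours, it assumes two pre-computed base/detail layer pairs, and the equal base-layer weights w_n = 0.5 are our own example choice satisfying (13).

```python
import numpy as np

def kl_weights(d1, d2):
    """KL-transform weights of Eqs. (9)-(10) for two detail layers."""
    # Covariance matrix of the two detail layers, each flattened to a vector.
    C = np.cov(np.stack([d1.ravel(), d2.ravel()]))
    eigvals, eigvecs = np.linalg.eigh(C)    # eigenvalues in ascending order
    x_max = eigvecs[:, np.argmax(eigvals)]  # eigenvector of the largest eigenvalue
    x_max = x_max / x_max.sum()             # normalise so the weights sum to 1
    return x_max[0], x_max[1]

def fuse_layers(b1, b2, d1, d2, w=(0.5, 0.5)):
    """Combine base and detail layers into the fused image."""
    kl1, kl2 = kl_weights(d1, d2)
    D = kl1 * d1 + kl2 * d2   # final detail layer, Eq. (11)
    B = w[0] * b1 + w[1] * b2 # final base layer, Eq. (12)
    return B + D              # linear combination of final layers
```

Because the weights are normalised by the sum of the eigenvector's components, the result is invariant to the arbitrary sign an eigen-solver may attach to the eigenvector.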
Fig. 4. Analysis of Q^{XY/F}, L^{XY/F}, N^{XY/F} and N_k^{XY/F} for change of k (IR-visible and MMW-visible data sets).
Fig. 5. Analysis of Q^{XY/F}, L^{XY/F}, N^{XY/F} and N_k^{XY/F} for change of t (IR-visible and MMW-visible data sets).
Fig. 6. Analysis of Q^{XY/F}, L^{XY/F}, N^{XY/F} and N_k^{XY/F} for change of \lambda (IR-visible and MMW-visible data sets).
Fig. 7. (a) IR image. (b) Visible image. (c) [9]. (d) [21]. (e) [15]. (f) [8]. (g) [10]. (h) ADF method.
Fig. 8. (a) Fusion score Q^{XY/F}, (b) fusion loss L^{XY/F}, (c) fusion artifacts N^{XY/F}, (d) fusion artifacts N_k^{XY/F} for the IR-visible data set.
Fig. 9. (a) Visible image. (b) MMW image. (c) [9]. (d) [21]. (e) [15]. (f) [8]. (g) [10]. (h) ADF method.
C. Fusion metrics
Gradient-based objective fusion metrics, termed Petrovic metrics, are considered for an in-depth evaluation of the proposed method.
Fig. 10. (a) Fusion score Q^{XY/F}, (b) fusion loss L^{XY/F}, (c) fusion artifacts N^{XY/F}, (d) fusion artifacts N_k^{XY/F} for the MMW-visible data set.
These metrics are related as

Q^{XY/F} + L^{XY/F} + N^{XY/F} = 1,    (14)

Q^{XY/F} + L^{XY/F} + N_k^{XY/F} = 1.    (15)

TABLE I
SUMMATION OF PETROVIC METRICS (14), (15) FOR IR AND VISIBLE IMAGES.

Sum    [9]      [21]     [15]     [8]      [10]     ADF
(14)   1.0323   1.0093   1.0088   1.2293   1.0077   1.0042
(15)   1.0000   1.0000   1.0000   1.0000   1.0000   1.0000

TABLE II
SUMMATION OF PETROVIC METRICS (14), (15) FOR MMW AND VISIBLE IMAGES.

Sum    [9]      [21]     [15]     [8]      [10]     ADF
(14)   1.0226   1.0017   1.0010   1.0340   1.0020   1.0003
(15)   1.0000   1.0000   1.0000   1.0000   1.0000   1.0000
For more details about these fusion metrics one may refer to [23], [19], [18]. Note that for better performance the Q^{XY/F} value should be high and the L^{XY/F}, N^{XY/F} and N_k^{XY/F} values should be low.
D. Effect of free parameters on the ADF method
Here, the effect of the free parameters on the performance of the ADF method is evaluated with the help of the Petrovic fusion metrics. In this method, each source image is filtered using the anisotropic diffusion process, whose degree of smoothing depends on the parameters k, t and \lambda. While inspecting the effect of k we set t = 10 and \lambda = 0.15. When examining the influence of t on ADF, the remaining parameters are taken as \lambda = 0.15 and k = 30. Similarly, for \lambda, the free parameters are taken as t = 10 and k = 30. The effect of k on the ADF algorithm for both data sets (IR-visible, MMW-visible) is illustrated in Fig. 4, which demonstrates the behavior of Q^{XY/F}, L^{XY/F}, N^{XY/F} and N_k^{XY/F} with respect to the change of k. We observe that Q^{XY/F} is almost constant after k = 20 for both image data sets; the same observation holds for L^{XY/F}. The metrics N^{XY/F} and N_k^{XY/F} are almost constant for any value of k.
TABLE III
COMPUTATIONAL TIME IN SECONDS OF DIFFERENT IMAGE FUSION ALGORITHMS ON BOTH IR-VISIBLE AND MMW-VISIBLE DATA SETS.

Data set       [9]      [21]      [15]     [10]     [8]       ADF
IR-visible     1.3868   0.29026   0.2504   0.2840   15.3090   0.6629
MMW-visible    2.4166   0.3572    0.4149   0.5643   31.4187   1.2446
REFERENCES
[1] www.imagefusion.org.
[2] Shaohui Chen, Renhua Zhang, Hongbo Su, Jing Tian, and Jun Xia. SAR and multispectral image fusion using generalized IHS transform based on a trous wavelet and EMD decompositions. IEEE Sensors Journal, 10(3):737-745, 2010.
[3] Zeev Farbman, Raanan Fattal, Dani Lischinski, and Richard Szeliski. Edge-preserving decompositions for multi-scale tone and detail manipulation. In ACM Transactions on Graphics (TOG), volume 27, page 67. ACM, 2008.
[4] Rafael C. Gonzalez. Digital Image Processing. Pearson Education India, 2009.
[5] A. Jameel, A. Ghafoor, and M. M. Riaz. Adaptive compressive fusion for visible/IR sensors. IEEE Sensors Journal, 14(7):2230-2231, July 2014.
[6] Yong Jiang and Minghui Wang. Image fusion using multiscale edge-preserving decomposition based on weighted least squares filter. IET Image Processing, 8(3):183-190, 2014.
[7] Fatih Kahraman, C. Deniz Mendi, and M. Gokmen. Image frame fusion using 3D anisotropic diffusion. In 23rd International Symposium on Computer and Information Sciences, ISCIS'08, pages 1-6. IEEE, 2008.
[8] B. K. Shreyamsha Kumar. Image fusion based on pixel significance using cross bilateral filter. Signal, Image and Video Processing, pages 1-12, 2013.
[9] B. K. Shreyamsha Kumar. Multifocus and multispectral image fusion based on pixel significance using discrete cosine harmonic wavelet transform. Signal, Image and Video Processing, 7(6):1125-1143, 2013.
[10] Shutao Li, Xudong Kang, and Jianwen Hu. Image fusion with guided filtering. IEEE Transactions on Image Processing, 22(7):2864-2875, 2013.
[11] Shutao Li and Bin Yang. Hybrid multiresolution method for multisensor multimodal image fusion. IEEE Sensors Journal, 10(9):1519-1526, 2010.
[12] Junli Liang, Yang He, Ding Liu, and Xianju Zeng. Image fusion using higher order singular value decomposition. IEEE Transactions on Image Processing, 21(5):2898-2909, 2012.
[13] David Looney and Danilo P. Mandic. Multiscale image fusion using complex extensions of EMD. IEEE Transactions on Signal Processing, 57(4):1626-1630, 2009.
[14] Nikolaos Mitianoudis and Tania Stathaki. Optimal contrast correction for ICA-based fusion of multimodal images. IEEE Sensors Journal, 8(12):2016-2026, 2008.
[15] V. P. S. Naidu. Image fusion technique using multi-resolution singular value decomposition. Defence Science Journal, 61(5):479-484, 2011.
[16] Gonzalo Pajares and Jesus Manuel De La Cruz. A wavelet-based image fusion tutorial. Pattern Recognition, 37(9):1855-1872, 2004.
[17] Pietro Perona and Jitendra Malik. Scale-space and edge detection using anisotropic diffusion. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(7):629-639, 1990.
[18] Vladimir Petrovic. Multisensor pixel-level image fusion. University of Manchester, 2001.
[19] Vladimir Petrovic and Costas Xydeas. Objective image fusion performance characterisation. In Tenth IEEE International Conference on Computer Vision, ICCV 2005, volume 2, pages 1866-1871. IEEE, 2005.
[20] Georg Petschnigg, Richard Szeliski, Maneesh Agrawala, Michael Cohen, Hugues Hoppe, and Kentaro Toyama. Digital photography with flash and no-flash image pairs. ACM Transactions on Graphics (TOG), 23(3):664-672, 2004.
[21] Oliver Rockinger. Image sequence fusion using a shift-invariant wavelet transform. In International Conference on Image Processing, volume 3, pages 288-291. IEEE, 1997.
[22] Oliver Rockinger. Multiresolution-Verfahren zur Fusion dynamischer Bildfolgen. dissertation.de, 1999.
[23] Parul Shah, Shabbir N. Merchant, and Uday B. Desai. Multifocus and multispectral image fusion based on pixel significance using multiresolution decomposition. Signal, Image and Video Processing, 7(1):95-109, 2013.
[24] Rui Shen, Irene Cheng, Jianbo Shi, and Anup Basu. Generalized random walks for fusion of multi-exposure images. IEEE Transactions on Image Processing, 20(12):3634-3646, 2011.
[25] Min Xu, Hao Chen, and Pramod K. Varshney. An image fusion approach based on Markov random fields. IEEE Transactions on Geoscience and Remote Sensing, 49(12):5116-5127, 2011.