Steerable Wavelet Based Fusion Model for High-Resolution Remotely Sensed Image Enhancement
Abstract—High-resolution imagery shows more detail information in the panchromatic (PAN) band and more spectral information in the multi-spectral (MS) bands. This paper develops a new Steerable Wavelet Transform-Intensity Hue Saturation (STWT-IHS) fusion model that considers the texture detail information of the different directions of the PAN band. The IHS transform of the multi-spectral image first gives the Intensity, Hue and Saturation bands. Thereafter, the PAN band and the Intensity band are decomposed into different scales, including direction-based detail scales. The fused Intensity band is then obtained by reconstruction from the high-frequency components of the PAN band and the low-frequency components of the Intensity image, and the IHS inverse transform generates the final fused MS image. After presenting the STWT principle, the fusion scheme is developed from STWT and IHS theory. The PAN band and MS bands 1, 2, 3 of QUICKBIRD are used to assess the quality of the fusion model. After selecting an appropriate decomposition level, the fusion scheme gives a fused image, which is compared with generally used fusion algorithms (IHS, Mallat and SWT). The objectives of image fusion include enhancing the visibility of the image and improving the spatial resolution and the spectral information of the original image. For assessing the quality of an image after fusion, information entropy and standard deviation are applied to assess the spatial details of the fused images, and the correlation coefficient, bias index and warping degree measure the distortion between the original and fused images in terms of spectral information. Of all the tested fusion algorithms, the best result is obtained when STWT-IHS is used.

Keywords- Steerable Wavelet; Image Enhancement

I. INTRODUCTION

With the development of remote sensing, the spatial, spectral and temporal resolution of images has greatly improved[1]. Remote sensing images are available from different sensors, and how to make the best use of the information from these multi-source and multi-resolution images is one of the significant issues in remote sensing applications[2]. Image fusion effectively enhances the information available in these data and supplies a basis for target recognition and information extraction.

So far, many fusion schemes between high-resolution panchromatic images and low-resolution multi-spectral images have been proposed. Traditional fusion schemes include IHS[3], PCA[4][5], and so on. Multi-resolution fusion models have recently been widely used for image fusion, as the pyramid structure corresponds well to the human visual system. However, orthogonal wavelets lack translation invariance, especially for two-dimensional signals. The "steerable" pyramid wavelet overcomes this drawback by transforming with a tight frame, which decomposes the signal with a class of arbitrary-orientation filters generated by linear combinations of a set of basis filters[6]. A new image fusion model is proposed by integrating steerable wavelet transform theory and IHS theory, and experimental verification of the model on high-resolution imagery is given in this paper.

II. THEORETICAL PRINCIPLE

A. Wavelet Transform(WT)

A decomposition framework can be given with the wavelet transform to decompose an image into a number of new images, each with a different degree of resolution[7]. The continuous wavelet transform of a one-dimensional function f(x) ∈ L²(R) can be expressed as[8]

W_f(a, b) = ⟨f, ψ_{a,b}⟩ = ∫_{−∞}^{+∞} f(x) ψ_{a,b}(x) dx    (1)

The wavelet basis functions ψ_{a,b}(x) are dilations and translations of the mother wavelet ψ(x):

ψ_{a,b}(x) = (1/√a) ψ((x − b)/a)    (2)

where a, b ∈ R; a is the dilation or scaling factor and b is the translation factor. For every scale a and location b, the wavelet coefficient W_f(a, b) represents the information contained in f(x) at that scale and position. The original signal can be exactly reconstructed from the wavelet coefficients by

f(x) = (1/C_ψ) ∫_0^∞ ∫_{−∞}^{+∞} W_f(a, b) ψ_{a,b}(x) db (da/a²)    (3)

where C_ψ is the normalizing factor of the mother wavelet.
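As a numerical illustration of Eqs. (1) and (2), a single wavelet coefficient can be approximated by discretizing the inner product on a sampled grid. The Mexican-hat mother wavelet and the test signal below are illustrative choices of this sketch, not ones prescribed by the paper.

```python
import numpy as np

def mexican_hat(t):
    # Mexican-hat (Ricker) mother wavelet -- an illustrative choice of psi
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

def cwt_coefficient(f, x, a, b, psi=mexican_hat):
    """Approximate Eq. (1), W_f(a,b) = <f, psi_{a,b}>, on a sampled grid,
    with psi_{a,b}(x) = (1/sqrt(a)) * psi((x - b)/a) as in Eq. (2)."""
    dx = x[1] - x[0]
    psi_ab = psi((x - b) / a) / np.sqrt(a)   # dilated and translated wavelet
    return np.sum(f * psi_ab) * dx           # Riemann-sum approximation

x = np.linspace(-10.0, 10.0, 2001)
f = np.exp(-x**2)                            # a smooth bump centred at x = 0
coeffs = [cwt_coefficient(f, x, a=1.0, b=b) for b in (-5.0, 0.0, 5.0)]
```

As expected from the interpretation of W_f(a, b) above, the coefficient magnitude peaks where the translated wavelet overlaps the bump, i.e. at b = 0.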
©978-1-4244-5824-0/$26.00 2010 IEEE V3-512
May 12, 2010 17:39 RPS : Trim Size: 8.50in x 11.00in (IEEE) icfcc2010-lineup˙vol-3: F1563
The discrete wavelet transform can be carried out with two different algorithms, the Mallat algorithm and the 'à trous' algorithm. The multiresolution analysis based on Mallat's algorithm uses an orthonormal basis, but the transform is not shift-invariant, which can cause problems in applications[7]. More details on the Mallat algorithm and its application to image enhancement are widely discussed[7,8,9].

B. 'à trous' algorithm

In contrast to Mallat's algorithm, the 'à trous' algorithm allows a shift-invariant discrete wavelet decomposition.

C. Steerable Wavelet Transform(STWT)

The steerable pyramid is defined in the Fourier domain, where the subbands are polar-separable. The frequency tiling of the single-stage subband decomposition is shown in Figure 1. For the case of n = 1, {B_k(ω), k = 0, 1} are band-pass oriented filters, H_0(ω) is a non-oriented high-pass filter and L_1(ω) is a narrowband low-pass filter[10].
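A minimal sketch of the shift-invariant 'à trous' decomposition mentioned above, using the commonly chosen B3-spline smoothing kernel with zeros ("holes") inserted at each level. The kernel choice and the scipy dependency are assumptions of this sketch, not details fixed by the paper.

```python
import numpy as np
from scipy.ndimage import convolve

B3 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0   # B3-spline smoothing taps

def atrous_kernel(level):
    # insert 2**level - 1 zeros ("holes") between the taps at each level
    k = np.zeros((B3.size - 1) * 2**level + 1)
    k[::2**level] = B3
    return k

def atrous_decompose(img, levels):
    """Shift-invariant 'a trous' decomposition: no downsampling, so every
    wavelet plane keeps the full image size.  Returns the detail planes
    w_1..w_J and the final low-pass approximation c_J."""
    planes, c = [], np.asarray(img, dtype=float)
    for j in range(levels):
        k = atrous_kernel(j)
        smooth = convolve(c, np.outer(k, k), mode='reflect')
        planes.append(c - smooth)   # detail removed by this smoothing step
        c = smooth
    return planes, c

img = np.random.default_rng(0).random((64, 64))
planes, residual = atrous_decompose(img, levels=3)
```

By construction the planes telescope: summing all detail planes and the residual reconstructs the input exactly, which is the additivity property that makes wavelet-plane substitution fusion possible.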
[Volume 3] 2010 2nd International Conference on Future Computer and Communication V3-513
Figure 3. The steerable wavelet decomposition and synthesis scheme (analysis filters H_0(−ω), L_0(−ω), B_0(−ω)…B_K(−ω) and L_1(−ω) with 2↓ downsampling, mirrored by the synthesis filters with 2↑ upsampling)

D. Image fusion scheme based on STWT

This paper proposes an IHS-STWT based high-resolution image fusion model (Figure 2). The MS band images are first co-registered and resized to the panchromatic image with an interpolation method. Thereafter, the Intensity (I), Hue (H) and Saturation (S) images are obtained by the IHS transform of the multi-spectral RGB image. The I band and the PAN image are fused based on steerable wavelet transform theory, and the new I band is used to perform the IHS inverse transform. The steerable pyramid performs a polar-separable decomposition in the frequency domain, thus allowing independent representation of scale and orientation. The proposed algorithm shows advantages compared with the Mallat-based and 'à trous'-based algorithms.

Figure 4. Image fusion scheme based on steerable wavelet transformation

III. QUANTITY EVALUATION OF THE FUSED IMAGE

The evaluation methods include visual evaluation and quantitative evaluation. The standard of qualitative evaluation is how well the multi-spectral characteristic information and the high-resolution spatial information are maintained. For assessing the quality of an image after fusion, information entropy and standard deviation are applied to assess the spatial details of the fused images, and the correlation coefficient, bias index and warping degree measure the distortion between the original and fused images in terms of spectral information.

IV. EXPERIMENTAL RESULTS

The QUICKBIRD image of Xuzhou taken on 20 Dec. 2005 is selected for testing the proposed fusion algorithm; a 1024×1024-pixel subset is used. Before the fusion process, the MS bands were interpolated to the 0.6 m pixel size of the PAN band. The decomposition scale is set to 2, since the resolution of the original MS bands is 2.4 m. First, the PAN band and the intensity band obtained by the IHS transform of MS bands 3(R), 2(G) and 1(B) are decomposed into two scales with four orientations by the STWT of Figure 1. This gives one high-frequency component, one low-frequency component and eight orientation bands over the two scales. The low-frequency band of the PAN decomposition is then substituted with the low-frequency band of the Intensity band to obtain the fused Intensity band. Finally, the fused MS image is obtained by the IHS inverse transform.

As shown in Figure 3, the spatial texture differs among the four selected decomposition orientations (B10 to B13) for the same tested region, and different detail textures appear at the two levels (e.g. compare B10 with B20). The high-frequency band (H1) of level 1 contains more detail texture information than the other images, and the L2 image contains most of the low-frequency information.

Figure 4 shows the I band fused from the different images. The spatial resolution is enhanced in all cases, but it is hard to tell the difference among the fused I bands of the different fusion schemes by visual evaluation. However, the quantitative evaluation given in Table II shows that the STWT-IHS fusion algorithm gives the best result and Mallat-IHS the worst among the three multi-resolution fusion algorithms. That is, the spatial information enhanced in the fused I band is carried into the RGB image by the IHS inversion.
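Common definitions of the indices used in the quantitative evaluation (information entropy, correlation coefficient, bias index, warping degree) can be sketched as below. The paper does not reproduce its exact formulas, so these are the usual textbook variants; the epsilon guard in the bias index is an implementation assumption.

```python
import numpy as np

def entropy(img, bins=256):
    # SHE: Shannon information entropy of the grey-level histogram
    counts, _ = np.histogram(img, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def correlation_coefficient(orig, fused):
    # COC: linear correlation between the original and fused band
    a = orig - orig.mean()
    b = fused - fused.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2)))

def bias_index(orig, fused, eps=1e-12):
    # DEI: mean relative deviation of the fused band from the original
    return float(np.mean(np.abs(fused - orig) / (np.abs(orig) + eps)))

def warping_degree(orig, fused):
    # DII: mean absolute difference between original and fused band
    return float(np.mean(np.abs(fused - orig)))

band = np.random.default_rng(2).random((64, 64))
```

The standard deviation (STD) is simply the band's img.std(). For an unchanged band COC is 1 while DEI and DII are 0, which is the sense in which higher COC and lower DEI/DII indicate less spectral distortion.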
Figure 8. Local region of the fused RGB composite image: (a) IHS fusion, (b) Mallat-IHS fusion, (c) SWT-IHS fusion, (d) STWT-IHS fusion

TABLE II. QUANTITATIVE EVALUATION OF FUSION SCHEMES

SHE (information entropy)
            RGB image   IHS       Mallat-IHS   SWT-IHS   STWT-IHS
Band1(B)    3.6244      5.2160    5.3088       5.3182    5.3238
Band2(G)    4.0252      5.3144    5.3923       5.4010    5.4099
Band3(R)    4.0757      5.2955    5.3753       5.3830    5.3891

STD (standard deviation)
            RGB image   IHS       Mallat-IHS   SWT-IHS   STWT-IHS
Band1(B)    64.8315     70.1017   70.5725      69.9478   68.8558
Band2(G)    65.9355     69.3329   70.5725      69.9986   69.0055
Band3(R)    62.5792     67.1118   68.2436      67.6258   66.4250

COC (correlation coefficient with the original MS band)
            IHS       Mallat-IHS   SWT-IHS   STWT-IHS
Band1(B)    0.7814    0.8925       0.9022    0.9222
Band2(G)    0.7576    0.8939       0.9035    0.9239
Band3(R)    0.7237    0.8707       0.8810    0.9045

DEI (bias index)
            IHS       Mallat-IHS   SWT-IHS   STWT-IHS
Band1(B)    0.5214    0.3043       0.2892    0.2589
Band2(G)    0.9168    0.4504       0.4343    0.3931
Band3(R)    1.0441    0.5016       0.4835    0.4407

DII (warping degree)
            IHS       Mallat-IHS   SWT-IHS   STWT-IHS
Band1(B)    31.5076   21.9252      20.7449   18.4342
Band2(G)    33.7126   22.3920      21.2247   18.8410
Band3(R)    35.1400   23.9138      22.7004   20.2596

V. CONCLUSION

It is necessary to develop appropriate fusion models for high-resolution image information enhancement. An image fusion model based on STWT and IHS theory is first presented in this paper. The fusion between the QUICKBIRD PAN band and MS bands gives more spatial and spectral information, and the visual and quantitative evaluation shows that the proposed fusion algorithm is the best of the four tested fusion models. The presented method can also be applied to high-resolution images from other sensors and to other application objectives.

ACKNOWLEDGMENT

This work is supported in part by the Specialized Research Fund for the Doctoral Program of Higher Education of China (No. 200802901516, 200802900501), in part by the National Natural Science Foundation of China (No. 40774010 and 40904004), and by the Project Sponsored by the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry.

REFERENCES

[1] Zhong Z Y and Chen Y, "On application of wavelet transformation to multisource information fusion," Acta Geodaetica et Cartographica Sinica, vol. 31, 2002, pp. 56-60.
[2] Wang H H, Peng J X and Wu W, "Remote sensing image fusion using wavelet packet transform," Journal of Image and Graphics, vol. 7, no. 9, 2002, pp. 932-936.
[3] Pohl C. and Van Genderen J. L., "Multisensor image fusion in remote sensing: concepts, methods and applications," Int. J. Remote Sensing, vol. 19, no. 5, 1998, pp. 823-854.
[4] González-Audícana M., Saleta J. L. and García Catalán R., "Fusion of multispectral and panchromatic images using improved IHS and PCA mergers based on wavelet decomposition," IEEE Transactions on Geoscience and Remote Sensing, vol. 42, no. 6, pp. 1291-1299.
[5] Pohl C. and Van Genderen J. L., "Multisensor image fusion in remote sensing: concepts, methods and applications," Int. J. Remote Sensing, vol. 19, no. 5, 1998, pp. 823-854.
[6] Freeman W. T. and Adelson E. H., "The design and use of steerable filters," IEEE Trans. on PAMI, vol. 13, Sept. 1991, pp. 891-906.
[7] Núñez J., Otazu X., Fors O. and Prades A., "Multiresolution-based image fusion with additive wavelet decomposition," IEEE Transactions on Geoscience and Remote Sensing, vol. 37, no. 3, 1999, pp. 1204-1211.
[8] González-Audícana M., Otazu X., Fors O. and Seco A., "Comparison between Mallat's and the 'à trous' discrete wavelet transform based algorithms for the fusion of multispectral and panchromatic images," International Journal of Remote Sensing, vol. 26, no. 3, 2005, pp. 595-614.
[9] Yocky D. A., "Image merging and data fusion by means of the discrete two-dimensional wavelet transform," Journal of the Optical Society of America, vol. 12, 1995, pp. 1834-1841.
[10] Unser M., "Texture classification and segmentation using wavelet frames," IEEE Trans. Image Processing, vol. 4, no. 9, 1995, pp. 1549-1560.
[11] Karasaridis A. and Simoncelli E., "A filter design technique for steerable pyramid image transforms," Proc. ICASSP-96, May 7-10, Atlanta, GA.
[12] Wang J, Zhang J X, Liu Z J and Gao J X, "High resolution image merging based on EMD," Journal of Remote Sensing, vol. 11, Jan. 2007, pp. 55-61.