
Fusion of Optical and Microwave Remote Sensing
data for snow cover mapping
G. VENKATARAMAN1, BIKASH C. MAHATO1, S. RAVI1 AND Y. S. RAO1
COL. P. MATHUR2, SNEHMANI2
1 Centre of Studies in Resources Engineering, Indian Institute of Technology Bombay, India
2 Research and Development Centre, Snow and Avalanche Study Establishment, Chandigarh, India
[email protected]
Key words: Fusion, LISS-III, Radarsat, Snow Cover.
Abstract - Optical and microwave remote sensing data are complementary to each other, and hence the fusion of these data can help improve classification accuracy. In this paper IRS LISS-III data and Radarsat-1 SAR data are fused using a Bayesian formulation of data fusion. For this purpose the SAR image is modeled using a multiplicative autoregressive random field model. The synthesized SAR image is fused with the IRS LISS-III image using a model that incorporates a transition probability to allow for temporal ambiguity between the two acquisitions. The fusion technique is used to improve the classification accuracy of snow-related features in the Himalayan region of India.
1. INTRODUCTION
Data fusion is the seamless integration of data from
disparate sources. Optical and microwave remote sensing are complementary to each other because their imaging characteristics differ. Microwaves are capable of penetrating the atmosphere under virtually all conditions, and microwave reflections or emissions from earth materials bear no direct relationship to their counterparts in the visible or thermal portions of the spectrum (Lillesand and Kiefer, 2000). A surface that appears rough in the visible portion of the spectrum may appear smooth to microwaves: minute variations in surface roughness barely affect the imaging mechanism in the microwave region but strongly affect reflection at optical wavelengths. These characteristics make microwaves well suited to imaging areas in the Himalayan region.
Moreover, microwave backscatter is highly sensitive to the dielectric properties of the imaged object; hence the presence of even a small amount of liquid water in snow greatly affects the radar return. The viewing geometry also plays a major role in acquiring information about ground objects: because of the side-looking geometry of the radar antenna, the high-relief terrain of the Himalayan region leaves a major part of the radar image under shadow and layover. The IRS LISS-III sensor can partly recover the information lost under radar shadow
and layover. The multi-spectral LISS-III images provide good discrimination between various land-use/land-cover classes, whereas Radarsat-1 SAR operates at a single microwave frequency and therefore discriminates poorly between such classes. In this work an IRS LISS-III image and a Radarsat-1 SAR image have been fused following the concept of a Bayesian formulation of data fusion. The fusion model incorporates the temporal nature of the two sensors' acquisitions.
0-7803-8742-2/04/$20.00 (c) 2004 IEEE
2. STUDY AREA
The study area, covering the Beaskund glacier and its neighborhood, lies between latitudes 32°15′ and 32°25′ N and longitudes 77°0′ and 77°15′ E. During the winter season the whole area remains under snow cover. Snow conditions in the Himalayan region are totally different from those in polar snow-covered regions: even after fresh snowfall, the wet snow class can be found in the terrain. Hence changes between snow classes are quite possible in this terrain.
3. METHODOLOGY
In the fusion technique, the classification results of the optical and SAR data are fused using a linear combination; therefore a classification system has to be developed for each of the IRS LISS-III and Radarsat-1 SAR images. The LISS-III image has been modeled using a multivariate normal distribution, and a Gaussian maximum-likelihood classification algorithm has been used to classify the LISS-III image data.
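The Gaussian maximum-likelihood rule assigns each pixel to the class whose multivariate normal density, estimated from training samples, is largest. A minimal sketch of this decision rule (the function name and the example class statistics are illustrative, not the paper's estimates):

```python
import numpy as np

def gml_classify(pixels, means, covs):
    """Assign each pixel to the class with the highest multivariate
    normal log-likelihood (Gaussian maximum-likelihood rule).

    pixels : (N, B) array of B-band pixel vectors
    means  : list of per-class (B,) mean vectors
    covs   : list of per-class (B, B) covariance matrices
    """
    scores = []
    for mu, cov in zip(means, covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        # log-density up to a constant: -0.5 * (log|C| + d' C^{-1} d)
        mahal = np.einsum("ni,ij,nj->n", d, inv, d)
        scores.append(-0.5 * (logdet + mahal))
    return np.argmax(np.stack(scores, axis=1), axis=1)
```

With four LISS-III bands, `pixels` would be an (N, 4) array and each class would contribute a 4-vector mean and a 4×4 covariance.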
Radar images contain some degree of speckle noise. Several researchers have found that radar returns corrupted by speckle, as well as returns with speckle averaged out by noncoherent integration, fit lognormal statistics (Einstein, 1982). An image can be thought of as an estimate of the power spectrum of the aperture voltage, and the positivity of lognormal data is consistent with image intensity being a power signal
(Frankot and Chellappa, 1987). Hence a model for the spatial correlation that fits Radarsat SAR imagery has to be applied to the SAR imagery. If the data can be transformed to Gaussian statistics with an invertible point non-linearity, then it can be modeled by extending the well-understood results from Gaussian random fields with linear spatial interaction (Frankot and Chellappa, 1987). Lognormal random fields with multiplicative spatial interaction are a special case of such transformed Gaussian random fields and are of particular interest in radar image processing.
2554
Authorized licensed use limited to: INDIAN INSTITUTE OF TECHNOLOGY BOMBAY. Downloaded on July 2, 2009 at 03:17 from IEEE Xplore. Restrictions apply.
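The log transform is the invertible point non-linearity in question: it turns the multiplicative spatial interaction into a linear autoregressive one, which can be fitted by ordinary least squares. A sketch of such a parameter estimate (the 3-pixel neighbour set is an assumed choice, since the paper does not specify the model support):

```python
import numpy as np

def mar_features(intensity, eps=1e-6):
    """Fit a multiplicative autoregressive (MAR) model by least squares.

    Taking logarithms turns the multiplicative model into a linear AR
    model on a Gaussian field: each log-pixel is regressed on its west,
    north and north-west neighbours.  Returns (theta, mu, sigma2): three
    AR coefficients, the log-mean and the residual variance -- five
    values, matching the five texture bands combined in Fig. 1.
    """
    y = np.log(intensity + eps)          # invertible point non-linearity
    mu = y.mean()
    z = y - mu
    target = z[1:, 1:].ravel()           # interior pixels
    X = np.stack([z[1:, :-1].ravel(),    # west neighbour
                  z[:-1, 1:].ravel(),    # north neighbour
                  z[:-1, :-1].ravel()],  # north-west neighbour
                 axis=1)
    theta, *_ = np.linalg.lstsq(X, target, rcond=None)
    sigma2 = (target - X @ theta).var()
    return theta, mu, sigma2
```

Computed over a sliding window, these per-window parameters become the textural feature layers described below.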
In this work we have modeled the SAR image using a
multiplicative autoregressive random field (MAR) model
(Solberg, 1994). The estimated parameters, viz. θ, µy and σ², have been combined in a single image to form a multi-layer SAR image (Fig. 1), and a maximum-likelihood
classification scheme has been used to classify the SAR
image. This fusion technique follows the concept of
Bayesian formulation for data fusion. The fusion technique
adopted in this work incorporates the temporal aspect of the
two images. The ambiguity between the SAR and LISS-III
image data acquired at different times has been removed by
introducing a penalty term that includes the transition
probability. In this work optical IRS (LISS-III) image data
of 24th January 2002 has been fused with the microwave
(Radarsat-1) image data of 6th January 2002. Prior to fusion, the two images have been orthorectified to a common projection system (Polyconic projection, Everest datum) using the orthorectification and OrthoBase modules of ERDAS Imagine software, which incorporate the DEM of the study area, and then co-registered. Our main aim in fusing optical
data with SAR data is to improve the information content in
the final classified output. Because of its multi-spectral characteristics, the original LISS-III image is modeled by a multivariate normal distribution and classified using a maximum-likelihood classifier. The SAR image has been modeled using a multiplicative autoregressive random field (MAR) model whose parameters are estimated by least squares. The five textural features corresponding to the five estimated parameters (three θ coefficients, µy and σ²) were combined to generate a multi-layer image (Fig. 1), which follows a multivariate normal distribution and was classified using a
maximum likelihood classifier. The accuracy assessment
for both the classifications was carried out individually and
these accuracies were used later in the fusion model. Finally
a comparison was made between the fused image and the
two individually classified images.
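The Bayesian fusion step described above can be sketched as follows: the class probabilities from the earlier SAR classification are propagated through a transition matrix (the penalty term allowing for snow-state changes between the two dates) before being combined with the optical evidence. The formulation and the matrix values here are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def fuse(p_opt, p_sar, trans):
    """Fuse per-pixel class probabilities from the two classifiers.

    p_opt : (K,) class probabilities from the optical (LISS-III) image
    p_sar : (K,) class probabilities from the earlier SAR image
    trans : (K, K) transition matrix, trans[i, j] = P(class j at the
            optical date | class i at the SAR date)
    The SAR evidence is first propagated through the transition matrix
    and then combined with the optical evidence assuming conditional
    independence of the two sensors (illustrative formulation).
    """
    p_sar_now = p_sar @ trans    # allow class changes since the SAR date
    joint = p_opt * p_sar_now
    return joint / joint.sum()

# Hypothetical 3-class example (dry, moist, wet snow): mutual transitions
# between moist and wet snow are likely, dry -> wet much less so.
trans = np.array([[0.80, 0.15, 0.05],
                  [0.10, 0.60, 0.30],
                  [0.05, 0.25, 0.70]])
```

An identity transition matrix would force both dates to agree on the class, whereas off-diagonal mass tolerates the mutual transitions between snow classes discussed in the results.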
Fig. 1. Synthesized SAR image generated by combining the five layers, displayed with layers 3, 4 and 5 in B, G and R respectively.
Fig. 2. IRS LISS-III image classified with all four bands using the ML classification scheme
Class Name    Histogram (pixels)
Dry Snow         1647540
Moist Snow       3807992
Wet Snow         5413332
Vegetation        263548
Shadow           2670948
Ridges                 0
Table 1. Class statistics of classified IRS LISS-III image
Class Name    Histogram (pixels)
Dry Snow         2721936
Moist Snow       3871073
Wet Snow         1871127
Vegetation       1674074
Shadow           2540569
Ridges           1124581
Table 2. Class statistics of classified Radarsat-1 SAR image
Fig. 3. Radarsat-1 SAR image classified with the three θ bands, mean and variance using the ML classification scheme
Class Name    Histogram (pixels)
Dry Snow         2175772
Moist Snow       4185220
Wet Snow         5786913
Vegetation        639582
Shadow           1015873
Ridges                 0
Table 3. Class statistics of fused image
Fig. 4. Output image generated by fusing IRS
LISS-III image with Radarsat-1 SAR image
4. RESULTS
The classified IRS LISS-III image of 24th January 2002 and the Radarsat-1 SAR image of 6th January 2002 are presented in figures 2 and 3 respectively, and the final fused image is shown in figure 4. The features classified include dry snow, moist snow, wet snow, vegetation, shadow and ridges. Owing to corner reflection, a high radar return has been observed in the SAR image and classified as ridges; this feature is present only in the classified SAR image. Snow is a rapidly changing ground feature, so changes in the snow-cover classes are commonly observed even over the short gap between the two acquisition dates (6th January 2002 and 24th January 2002). Therefore the transition probability matrix of changes between the different pattern classes has been designed accordingly to improve the classification accuracy.
The class statistics of the classified LISS-III, Radarsat-1 and fused images are presented in tables 1, 2 and 3 respectively. Comparison of these data reveals that the shadow area has been greatly reduced in the fused image, from 19.35% of the total area in the LISS-III image to 7.36%. The dry snow cover area has increased from 11.9% in the LISS-III image to 15.8% in the fused image, indicating that dry snow pixels under the optical shadow class have been recovered in the fused image. Compared to the LISS-III image there is also an increment of 2.7% in both the moist snow and wet snow cover areas in the fused image, due to the mutual transition between these two classes and to the recovery of LISS-III shadow pixels into these classes. Compared to the classified SAR image, the dry snow covered area has decreased by 3.9%, while the moist snow and wet snow cover areas have increased by 2.3% and 28.3% respectively in the fused image, indicating that there was no fresh snowfall between 6th January 2002 and 24th January 2002. This has been corroborated by the field data.
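These percentages follow directly from the pixel histograms in the tables; all three classified images cover the same total of 13,803,360 pixels. A quick check (class names and counts as in tables 1 and 3):

```python
# Class pixel counts taken from Tables 1 and 3 (LISS-III and fused image).
liss = {"Dry Snow": 1647540, "Moist Snow": 3807992, "Wet Snow": 5413332,
        "Vegetation": 263548, "Shadow": 2670948, "Ridges": 0}
fused = {"Dry Snow": 2175772, "Moist Snow": 4185220, "Wet Snow": 5786913,
         "Vegetation": 639582, "Shadow": 1015873, "Ridges": 0}

total = sum(liss.values())  # 13,803,360 pixels in every image

def pct(counts, name):
    """Class area as a percentage of the total scene area."""
    return round(100.0 * counts[name] / total, 2)

print(pct(liss, "Shadow"), pct(fused, "Shadow"))      # 19.35 7.36
print(pct(liss, "Dry Snow"), pct(fused, "Dry Snow"))  # 11.94 15.76
```

Rounded to one decimal these match the quoted figures (shadow 19.35% to 7.36%, dry snow 11.9% to 15.8%).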
Misclassification has been observed between the vegetation and wet snow classes in the SAR image, and between the dry snow and moist snow classes in the LISS-III image; these misclassifications have been rectified to a great extent in the fused image. The overall classification accuracies were 63% for the SAR image, 80% for the IRS LISS-III image and 93% for the fused image.
5. CONCLUSIONS
• Fusion has considerably improved the classification accuracy compared to the individual classifications of the SAR and IRS LISS-III images.
• Shadow effects have been considerably reduced in the fused image.
• Misclassification of some classes in the SAR and IRS LISS-III images has been greatly rectified in the fused image.
ACKNOWLEDGEMENTS
This work forms part of a collaborative research project sponsored by the Department of Science and Technology, Govt. of India. The authors are thankful to the Director, IIT Bombay and the Director, SASE, Chandigarh for their continuous support and encouragement.
REFERENCES:
Einstein, T.H. (1982), "Effect of frequency averaging on estimation of clutter statistics used in setting CFAR detection thresholds", MIT Lincoln Lab., TT-60, AD A131947.
Frankot, R.T. and Chellappa, R. (1987), "Lognormal Random-Field Models and Their Applications to Radar Image Synthesis", IEEE Trans. Geosci. & Rem. Sens., Vol. GE-25, No. 2, pp. 195-207.
Lillesand, T.M. and Kiefer, R.W. (2000), Remote Sensing and Image Interpretation, 4th Edition, John Wiley & Sons, Singapore, 724p.
Solberg, A.H.S., Jain, A.K. and Taxt, T. (1994), "Multisource Classification of Remotely Sensed Data: Fusion of Landsat TM and SAR Images", IEEE Trans. Geosci. & Rem. Sens., Vol. 32, No. 4, pp. 768-778.