STUDY OF REMOTE SENSING IMAGE FUSION AND ITS APPLICATION IN IMAGE CLASSIFICATION


Wu Wenbo, Yao Jing, Kang Tingjun
School of Geomatics, Liaoning Technical University, 123000, Zhonghua Street, Fuxin, China - yaojing1124@163.com
Corresponding author: Yao Jing, School of Geomatics, Liaoning Technical University, 123000, 359 mailbox, Zhonghua Street, Fuxin, China; yaojing1124@163.com; telephone: 13941830365

Commission VII, WG VII/6

KEY WORDS: Landsat, Data Fusion, Spectral Distortion, Classification Accuracy, Algorithms Evaluation

ABSTRACT:

Data fusion is a formal framework that expresses the means and tools for combining data originating from different sources. It aims at obtaining information of greater quality; the exact definition of "greater quality" depends on the application. Satellite remote sensing image fusion has been a hot research topic in remote sensing processing. In this study the multispectral images were matched to the ETM+ panchromatic image, with the registration error controlled within 0.3 pixels. Smoothing Filter-based Intensity Modulation (SFIM), High Pass Filter (HPF), modified Brovey, multiplication, IHS and Principal Component Analysis (PCA) transform methods were used for the fusion experiments, and several parameters were used to evaluate the quality of the fused images. Representative features were selected from the fused and original images to analyse the impact of each fusion method. The results reveal that all six methods introduce spectral distortion: HPF and SFIM are the two best at retaining the spectral information of the original images, while PCA is the worst. In the process of remote sensing data fusion, different methods have different impacts on the fused images. Supervised and unsupervised classification experiments show that all the fused images contain more high spatial frequency information than the original images, and that the SFIM transform is the best method for retaining the spectral information of the original image.

1. INTRODUCTION

The specific objectives of fusion are to improve the spatial resolution, improve the geometric precision, enhance the capability of displaying features, improve classification accuracy, enhance the capability of change detection, and replace or repair defective data [1]. For a long time, however, remote sensing image fusion has mainly been used to enhance visual interpretation and has rarely been used to improve classification. The main reasons are as follows [2]: (1) image fusion is mostly based on images from different satellites; because of the differences in sensor parameters and acquisition dates, as well as the inevitable registration error, the classification results of the fused images are unsatisfactory; (2) although the same sensor system provides images of different spatial resolutions, their low spatial resolution leads to a poor classification effect; (3) an unreasonable fusion algorithm or classification method causes the classification to fail. In this paper, the panchromatic and multispectral bands of Landsat ETM+ images are fused in order to study the fusion of different spatial resolutions within the same sensor system together with the classification technology, and to evaluate the effect of each fusion method on land use classification.

2. THE CHOICE OF DATA SOURCES AND SUMMARY OF THE PRE-PROCESSING

This paper uses Landsat 7 ETM+ panchromatic and multispectral images of August 2001; the study area is Shenyang. There are many feature types in this area; the main ones include rice, dry land, forest, water bodies, and villages and towns.
2.1 Bands Selection

Band combination is a key step of the fusion technique. Band combination optimization must follow two principles: first, the selected bands should have clear physical significance and lie in different parts of the spectrum, that is, the correlation between the bands should be small; second, the bands carrying the largest amount of information should be chosen [3]. In this paper the correlation coefficient matrix and the OIF index are calculated and the band combinations are ranked accordingly (Table 1 and Table 2). Table 2 shows that the OIF index of the band combination ETM+ 3, 4, 5 is the largest, and Table 1 shows that the correlation coefficients among bands 3, 4 and 5 are the smallest, so bands 3, 4 and 5 are chosen for the fusion experiment (a short computational sketch of the OIF ranking follows).
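The paper does not write out the OIF formula. As a point of reference, the sketch below assumes the standard Optimum Index Factor definition (the sum of the three band standard deviations divided by the sum of the absolute pairwise correlation coefficients) and uses NumPy in place of the ERDAS/Matlab tools used in the study; the array names and random test data are purely illustrative.

```python
import itertools
import numpy as np

def oif(bands):
    """Optimum Index Factor of three 2-D band arrays.

    OIF = (sum of band standard deviations) /
          (sum of absolute pairwise correlation coefficients),
    i.e. the standard definition; this is an assumption, since the paper
    does not spell the formula out.
    """
    flat = np.vstack([b.ravel().astype(np.float64) for b in bands])
    corr = np.corrcoef(flat)
    denom = abs(corr[0, 1]) + abs(corr[0, 2]) + abs(corr[1, 2])
    return flat.std(axis=1).sum() / denom

# Hypothetical usage: random arrays stand in for ETM+ bands 1-5 and 7.
bands = {k: np.random.randint(0, 256, (512, 512)).astype(float) for k in (1, 2, 3, 4, 5, 7)}
scores = [(combo, oif([bands[k] for k in combo]))
          for combo in itertools.combinations(sorted(bands), 3)]
for combo, value in sorted(scores, key=lambda item: item[1]):
    print(combo, round(value, 3))   # combinations ranked from smallest to largest OIF
```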

2.2 Image Registration

The essence of registration is, on the basis of the geometric correction of the remote sensing images, to adopt a geometric transform that brings the images into the same coordinate system [4]. Image registration includes relative registration and absolute registration [8]; the general requirement is that the error be controlled within a single pixel for the high-resolution image, or within 10%-20% of the pixel size for the low-resolution multispectral image [5]. In this study the relative registration method is used: the panchromatic band of the ETM+ image is taken as the reference, and forty-six control points are selected to perform the geometric correction of the multispectral images. In order to reduce the loss of spectral information caused by resampling, the nearest neighbour method is used for resampling. The registration accuracy is controlled within 0.3 pixel, which meets the registration accuracy requirement.

        ETM+1   ETM+2   ETM+3   ETM+4   ETM+5   ETM+7
ETM+1   1
ETM+2   0.952   1
ETM+3   0.846   0.927   1
ETM+4   0.626   0.611   0.436   1
ETM+5   0.645   0.719   0.712   0.690   1
ETM+7   0.678   0.780   0.864   0.465   0.891   1

Table 1. Correlation coefficient matrix

Band combination   OIF index   Rank      Band combination   OIF index   Rank
1,2,3              12.832      1         1,3,4              22.605      11
1,2,7              16.739      2         1,5,7              22.724      12
1,2,4              17.229      3         3,5,7              22.840      13
2,3,7              18.043      4         2,4,5              23.918      14
1,2,5              18.359      5         1,4,5              24.316      15
1,3,7              19.160      6         2,4,7              24.858      16
1,3,5              19.693      7         1,4,7              25.724      17
2,3,5              20.596      8         4,5,7              27.442      18
2,5,7              21.314      9         3,4,7              29.209      19
2,3,4              22.169      10        3,4,5              29.230      20

Table 2. OIF index of the different band combinations (ordered from smallest to largest)

3. COMMON FUSION METHODS

In this study all the fusion methods are pixel-level methods. Pixel-level fusion works directly on the original image information; it loses the least information, so its accuracy is the highest, but its data transmission and processing load is the largest [6].

3.1 Fusion Methods

3.1.1 High-pass Filter Fusion Method: The high-pass filter fusion method superimposes the high-frequency components of the high-resolution panchromatic image on the low-resolution multispectral image to obtain a multispectral image with enhanced spatial resolution. The formula is as follows:

    F_k(i, j) = M_k(i, j) + HPH(i, j)                                    (1)

where F_k(i, j) is the fused value of band k at pixel (i, j), M_k(i, j) is the multispectral value of band k at pixel (i, j), and HPH(i, j) is the high-frequency information of the high-resolution panchromatic image.

3.1.2 SFIM Fusion Method: The full name of the SFIM transform is Smoothing Filter-based Intensity Modulation [8]; it is a brightness transformation based on a smoothing filter. The formula of this algorithm is as follows:

    SFIM_ij = low_ij * high_j / mean_j,    i = 1, 2, 3                   (2)

where SFIM_ij is the fused image generated by this algorithm, i is the band index and j indexes the row and column; low is the low-resolution image, here the multispectral bands of ETM+; high is the high-resolution image, i.e. the panchromatic band of ETM+; and mean is the simulated low-resolution image, obtained by low-pass filtering the panchromatic band. This study uses the 5x5 low-pass filter operator of ERDAS 8.7.
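The following is a minimal NumPy/SciPy sketch of formulas (1) and (2), assuming the multispectral bands have already been co-registered and resampled to the panchromatic grid. The 5x5 box filter merely stands in for the high-pass and low-pass operators used in the study (ERDAS 8.7), and the array names are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def hpf_fuse(ms, pan, size=5):
    """High-pass filter fusion, formula (1): F_k = M_k + HPH.

    ms  : (bands, rows, cols) multispectral image resampled to the pan grid.
    pan : (rows, cols) panchromatic image.
    HPH is taken as pan minus a smoothed pan; the 5x5 box filter is an
    assumption, not the exact filter used in the paper.
    """
    hph = pan.astype(np.float64) - uniform_filter(pan.astype(np.float64), size)
    return ms.astype(np.float64) + hph[None, :, :]

def sfim_fuse(ms, pan, size=5):
    """SFIM fusion, formula (2): SFIM = low * high / mean.

    `mean` is the simulated low-resolution image, i.e. the panchromatic band
    smoothed with a 5x5 low-pass filter (a box filter stands in for the
    ERDAS 8.7 operator used in the study).
    """
    mean = uniform_filter(pan.astype(np.float64), size)
    mean = np.where(mean == 0, 1e-6, mean)          # avoid division by zero
    return ms.astype(np.float64) * pan / mean

# Hypothetical usage with random data in place of the co-registered ETM+ bands.
pan = np.random.rand(256, 256) * 255
ms = np.random.rand(3, 256, 256) * 255              # bands 3, 4, 5
fused_hpf, fused_sfim = hpf_fuse(ms, pan), sfim_fuse(ms, pan)
```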

3.1.3 Brovey Transform Fusion Method: The Brovey fusion formula is as follows:

    M_ij = low_ij / (sum_{i=1..n} low_ij) * high_j                       (3)

where M_ij is the fused image, n is the number of bands, and the denominator is the summation of the three ETM+ multispectral bands.

3.1.4 The Product Transform Fusion Method: The product fusion method is a simple algebraic operation: the pixel values of the two images are multiplied to obtain the fused image. In this paper the square root of the product of the pixel values is taken as the fused pixel value; the fusion formula is as follows:

    ML_ij = (XS_ij * PN_ij)^(1/2)                                        (4)

where ML_ij is the fused pixel value, XS_ij is the pixel value of the multispectral image and PN_ij is the pixel value of the panchromatic image.

3.1.5 IHS Transform Fusion Method: At present the IHS transform model mainly comprises the spherical, cylindrical, triangular and hexcone transforms. For fusion the differences between the four models are not obvious, but the spherical transform compares best [9]. In this study the IHS transform method is used for the fusion experiment, and the fusion algorithm is programmed in Matlab.

3.1.6 Principal Component Analysis (PCA) Fusion Method: The fused image contains both the high spatial resolution and the high spectral resolution characteristics of the original images and preserves their high-frequency content; the individual features in the fused image are clearer and the spectral information is richer. The PCA transform also overcomes the limitation of the IHS transform, which can fuse only three bands at a time: PCA can fuse more than three multispectral bands.

3.2 Fusion Image Evaluation Parameters

The commonly used fusion quality evaluation parameters are the mean, standard deviation, average gradient, information entropy, and correlation coefficient (a computational sketch is given after the figure list below).

4. EXPERIMENTS AND ANALYSIS

The various fusion algorithms were programmed with the ERDAS Model module and Matlab. The fused images are displayed with bands 5, 4 and 3 assigned to R, G and B; the fusion results are shown in Figures 1-6:

Figure 1. HPF fused image, bands 5,4,3    Figure 2. SFIM fused image, bands 5,4,3    Figure 3. Brovey fused image, bands 5,4,3
Figure 4. ML fused image, bands 5,4,3     Figure 5. PCA fused image, bands 5,4,3     Figure 6. IHS fused image, bands 5,4,3
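Before the fused images are compared in Table 3, here is a minimal sketch of how the evaluation parameters of Section 3.2 could be computed per band with NumPy. The paper does not define the entropy binning or the average-gradient formula, so a 256-bin grey-level histogram and the common sqrt((dx^2 + dy^2)/2) gradient are assumed; the variable names and test data are illustrative.

```python
import numpy as np

def band_stats(band, reference=None, bins=256):
    """Mean, standard deviation, entropy, average gradient and (optionally)
    correlation coefficient with a reference band, as in Table 3.

    Entropy uses a 256-bin grey-level histogram; the average gradient is the
    mean of sqrt((dx^2 + dy^2) / 2) over the image. Both are common choices,
    assumed here because the paper does not spell them out.
    """
    band = band.astype(np.float64)
    hist, _ = np.histogram(band, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -(p * np.log2(p)).sum()

    dx = np.diff(band, axis=1)[:-1, :]              # horizontal differences
    dy = np.diff(band, axis=0)[:, :-1]              # vertical differences
    avg_gradient = np.sqrt((dx ** 2 + dy ** 2) / 2.0).mean()

    stats = {"mean": band.mean(), "std": band.std(),
             "entropy": entropy, "avg_gradient": avg_gradient}
    if reference is not None:
        stats["corr"] = np.corrcoef(band.ravel(),
                                    reference.ravel().astype(np.float64))[0, 1]
    return stats

# Hypothetical usage: compare a fused band against the original XS band.
xs_band = np.random.rand(256, 256) * 255
fused_band = xs_band + np.random.rand(256, 256) * 10
print(band_stats(fused_band, reference=xs_band))
```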

4.1 Parameter Statistics of the Fused Images

The original multispectral image is denoted XS and the panchromatic image PAN. The evaluation parameters are shown in Table 3:

Image          Band   Mean      Std. dev.   Entropy   Corr. with XS   Avg. gradient   Corr. with Pan
XS             5      75.183    21.675      6.0952    1               4.0176          0.6541
               4      71.941    15.599      6.0627    1               2.8171          0.7284
               3      66.227    18.682      6.0598    1               3.2773          0.1661
HPF fused      5      75.113    22.141      6.3509    0.9455          6.6193          0.7214
               4      71.871    15.884      6.2881    0.9636          6.5239          0.7956
               3      66.157    21.041      6.2971    0.9843          5.6477          0.2985
SFIM fused     5      75.161    22.966      6.291     0.9548          8.0911          0.7365
               4      71.810    17.516      6.2199    0.9603          7.3351          0.8139
               3      65.971    19.383      6.2274    0.9537          7.2359          0.3127
ML fused       5      100.228   23.161      6.2391    0.9277          8.1029          0.8883
               4      98.153    19.267      6.2595    0.9433          7.6337          0.9140
               3      93.150    19.167      6.0762    0.8425          7.952           0.6674
Brovey fused   5      47.925    15.982      5.6371    0.7847          9.9214          0.9543
               4      47.965    21.041      6.2393    0.9732          9.1894          0.6699
               3      40.908    9.767       5.1429    0.7844          9.266           0.7208
IHS fused      5      73.574    14.842      5.9055    0.8231          6.5412          0.8970
               4      71.761    22.728      6.4812    0.8786          6.8252          0.7000
               3      64.028    21.48       5.8278    0.8157          6.5109          0.4894
PCA fused      5      136.353   28.614      6.9469    0.6441          10.3735         0.9613
               4      134.02    13.613      6.9428    0.9546          9.396           0.7738
               3      137.837   17.966      5.9238    0.2372          9.6198          0.9734

Table 3. Evaluation parameters of the original and fused images

From the parameters of Table 3 we can see that:
(1) ranked by definition (sharpness) in descending order, the fusion methods are: PCA > Brovey > ML > SFIM > MIHS > IHS > HPF;
(2) ranked by spectral fidelity in descending order: HPF > SFIM > Brovey > MIHS > ML > IHS > PCA;
(3) ranked by entropy in descending order: PCA > MIHS > HPF > IHS > SFIM > ML > Brovey.

4.2 Feature Identification Accuracy of the Fused Images

Different fusion methods have different impacts on the image. Image recognition uses the spectral and structural characteristics of different features to identify information; therefore the spectral and texture information of the image is of great significance for interpreting targets [10]. In order to verify the influence of the various fusion methods on classification accuracy, the different experimental data sets are processed with the same unsupervised classification procedure and a classification accuracy test is made; the fused image with the highest accuracy is then selected for supervised classification.

4.2.1 Research Methods: Classification is performed with the maximum likelihood classifier; 256 ground check points are selected at random, and an accuracy test is made for the thematic maps of the XS image and the fused images to obtain the total accuracy and the Kappa index (a computational sketch is given after Table 4).

4.2.2 Accuracy Test of Unsupervised Classification: From the comparative data in Table 4 we find that, except for the PCA fusion, the classification accuracies of the fused images are significantly higher than that of the unfused image. The likely reason is that the PCA fused image has the worst spectral distortion, which leads to the lower classification accuracy. In descending order of classification accuracy: SFIM > HPF > ML > Brovey > XS > IHS > PCA.

Type             XS       HPF fused   ML fused   Brovey fused   PCA fused   SFIM fused   IHS fused
Total accuracy   77.34%   81.25%      80.47%     78.52%         67.97%      84.38%       76.95%
Kappa index      0.6799   0.7468      0.7298     0.6809         0.5271      0.7810       0.6454

Table 4. Comparative data of unsupervised classification accuracy
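A minimal sketch of how the total accuracy and Kappa index reported in Tables 4 and 5 could be computed from the 256 check points, assuming the standard confusion-matrix and Cohen's Kappa definitions; the random labels below are placeholders, not the study's data.

```python
import numpy as np

def accuracy_and_kappa(reference, predicted, n_classes):
    """Overall accuracy and Kappa index from reference/predicted class labels.

    Uses the standard Cohen's Kappa: (p_o - p_e) / (1 - p_e), where p_o is the
    observed agreement and p_e the chance agreement derived from the
    confusion-matrix marginals.
    """
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for r, p in zip(reference, predicted):
        cm[r, p] += 1
    total = cm.sum()
    p_o = np.trace(cm) / total                       # overall (total) accuracy
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
    kappa = (p_o - p_e) / (1.0 - p_e)
    return p_o, kappa

# Hypothetical usage: 256 random check points, 5 land-use classes.
rng = np.random.default_rng(0)
ref = rng.integers(0, 5, 256)
pred = np.where(rng.random(256) < 0.8, ref, rng.integers(0, 5, 256))
acc, kappa = accuracy_and_kappa(ref, pred, 5)
print(f"total accuracy = {acc:.2%}, Kappa = {kappa:.4f}")
```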

4.2.3 Supervised Classification Accuracy Evaluation: Supervised classification is performed on the original image and on the SFIM fused image, using bands 5, 4 and 3 chosen in the band selection, and the accuracy of the classification is evaluated. The classification accuracy results are shown in Table 5.

Type             XS       SFIM fused
Total accuracy   82.81%   90.23%
Kappa index      0.7668   0.8673

Table 5. Comparative data of supervised classification accuracy

Comparing the total accuracy and Kappa index of the two images shows that the supervised classification accuracy and Kappa index of the SFIM fused image are much higher than those of the XS image. Through the accuracy analysis of the unsupervised and supervised classifications, the character of the five algorithms above can be broadly summarized: on the basis of ETM+ image classification, the SFIM fused image is selected as the basic image; the Brovey fused image, with its small information content, leads to lower classification accuracy; the HPF fused image, owing to its better spectral fidelity and richer high-frequency information, can be used as an auxiliary image for visual interpretation; the PCA and ML fused images contain a large amount of high-frequency information, which has a certain value for extracting the internal structure of the city.

5. CONCLUSIONS AND OUTLOOK

This paper introduces the basic concepts and theory of image fusion and discusses a variety of fusion methods. It summarizes the quantitative evaluation criteria of the mean, standard deviation, correlation coefficient, entropy, average gradient and so on, and uses them to measure and compare each fusion algorithm, obtaining a number of useful conclusions.

However, this study performs the fusion analysis with the panchromatic and multispectral images of a single satellite system, Landsat-7 ETM+. As different types of sensors have different data characteristics, fusion and its evaluation across sensors will be more complex, and further studies on those specific issues remain to be done.

REFERENCES

[1] Zhao Yingshi, 2003. Remote sensing application principles and methods. Beijing: Science Press, pp. 253-254.
[2] Xu Hanqiu, 2005. Study on Data Fusion and Classification of Landsat 7 ETM+ Imagery. Journal of Remote Sensing, 9(2), pp. 186-194.
[3] Shi Yongjun, 2003. Forest remote sensing classification technology research, taking the northwest mountain area of Zhejiang as an example. Zhejiang University, pp. 36-50.
[4] Solberg S., Jain A.K., Taxt T., 1996. A Markov random field model for classification of multisource satellite imagery. IEEE Transactions on Geoscience and Remote Sensing, 34(1), pp. 100-113.
[5] Wang Wenjie, 2006. Image fusion technology based on wavelet transform. Chinese Academy of Sciences, master's thesis.
[6] Zheng Yongsheng, Wang Renli, 1999. Dynamic monitoring of remote sensing. Beijing: PLA Publishing House.
[7] Jia Yonghong, 2005. Multi-source remote sensing data fusion technology. Beijing: Mapping Press.
[8] Smara Y., Belhadj-Aissa A., Sansal B., 1998. Multisource ERS-1 and Optical Data for Vegetal Cover Assessment and Monitoring in a Semi-arid Region of Algeria. International Journal of Remote Sensing, 19(18), pp. 3551-3568.
[9] Musa M.K.A., Hussin Y.A., 2000. Multi-data Fusion for Sustainable Forest Management: A Case Study from the Northern Part of Selangor, Malaysia. International Archives of Photogrammetry and Remote Sensing, 33(7), pp. 71-80.
[10] Pan Xueqin, 2005. The comparison and analysis of data fusion of remote sensing. Kaifeng: Henan University.

APPENDIX

Supported by the Ministry of Education Doctor Spot Foundation (20050147002) and the College of Liaoning Province Emphasis Laboratory Item (20060370).
