Sensor Calibration - Application


Scene header (Landsat 7 ETM+ metadata):

Station Identifier: ASN                          Day/Night: DAY
WRS Path: 76                                     WRS Row: 036
Date Acquired: 2/04/2004                         Cloud Cover: 0.00
Start Time: 2004:03:08::38.047370                Stop Time: 2004:03:08:2:0.0
Sun Elevation: 6.03442                           Sun Azimuth: 33.32486
Scene Center Latitude: 34.840 (34 3'0.64"N)      Scene Center Longitude: 33.03270 (33 0'7.72"E)
Corner Upper Left Latitude: 3.460 (3 32'47.40"N)     Corner Upper Left Longitude: 32.22620 (32 3'34.32"E)
Corner Upper Right Latitude: 3.2470 (3 4'44.2"N)     Corner Upper Right Longitude: 34.2770 (34 7'.72"E)
Corner Lower Left Latitude: 33.70 (33 4'6.2"N)       Corner Lower Left Longitude: 3.7870 (3 47'.00"E)
Corner Lower Right Latitude: 33.62070 (33 37'4.2"N)  Corner Lower Right Longitude: 33.840 (33 4'0.84"E)

Raw data are converted from digital numbers (DN) to units of spectral radiance using the following equation:

    L_λ = ((LMAX_λ - LMIN_λ) / QCALMAX) × QCAL + LMIN_λ

Where:
    LMIN_λ  = spectral radiance at the minimum quantised and calibrated data digital number
    LMAX_λ  = spectral radiance at the maximum quantised and calibrated data digital number
    QCAL    = the quantised calibrated pixel value, i.e. QCAL = DN
    QCALMAX = the maximum quantised calibrated pixel value (255 for 8-bit ETM+ data)

The Gain setting for Landsat 7 (High or Low Gain) depends on the surface cover types of the earth and the sun angle.

The scene header also reports the gain setting (High or Low) for each band: Gain Band 1 to Gain Band 8, with Band 6 listed separately as VCID 1 and VCID 2 (VCID 2: H in this scene). The LMIN and LMAX values for each band and gain are taken from the spectral radiance range table:

Table 11.2  ETM+ Spectral Radiance Range, watts/(meter squared * ster * µm)

                 Before July 1, 2000                     After July 1, 2000
              Low Gain         High Gain             Low Gain         High Gain
Band        LMIN    LMAX     LMIN    LMAX          LMIN    LMAX     LMIN    LMAX
  1         -6.2   297.5     -6.2   194.3          -6.2   293.7     -6.2   191.6
  2         -6.0   303.4     -6.0   202.4          -6.4   300.9     -6.4   196.5
  3         -5.0   235.5     -5.0   158.6          -5.0   234.4     -5.0   152.9
  4         -5.1   235.0     -5.1   157.5          -5.1   241.1     -5.1   157.4
  5         -1.0    47.70    -1.0    31.76         -1.0    47.57    -1.0    31.06
  6          0.0    17.04     3.2    12.65          0.0    17.04     3.2    12.65
  7         -0.35   16.60    -0.35   10.932        -0.35   16.54    -0.35   10.80
  8         -5.0   244.00    -5.0   158.40         -4.7   243.1     -4.7   158.3

Consider in our example that DN = 100; the resulting radiance is L_ts = .60784 W/(m² * sr * µm).

A reduction in between-scene variability is achieved through normalisation for solar irradiance, by converting spectral radiance to reflectance using the following equation:

    ρ_ts = (π × L_ts × d²) / (ESUN_λ × cos θ_0)

Where:
    ρ_ts   = reflectance at the top of the atmosphere
    L_ts   = spectral radiance at the sensor's aperture
    d      = Earth-Sun distance in astronomical units
    ESUN_λ = solar irradiance at the top of the atmosphere
    θ_0    = solar zenith angle
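This DN-to-radiance conversion is straightforward to script. The following is a minimal Python/NumPy sketch (not part of the original lecture); the dictionary holds the After-July-1-2000 constants from Table 11.2, and the band and gain chosen in the example call are purely illustrative.

```python
import numpy as np

# (LMIN, LMAX) in W m-2 sr-1 um-1, After July 1, 2000 (Table 11.2)
ETM_RADIANCE_RANGE = {
    (1, "low"): (-6.2, 293.7),  (1, "high"): (-6.2, 191.6),
    (2, "low"): (-6.4, 300.9),  (2, "high"): (-6.4, 196.5),
    (3, "low"): (-5.0, 234.4),  (3, "high"): (-5.0, 152.9),
    (4, "low"): (-5.1, 241.1),  (4, "high"): (-5.1, 157.4),
    (5, "low"): (-1.0, 47.57),  (5, "high"): (-1.0, 31.06),
    (6, "low"): (0.0, 17.04),   (6, "high"): (3.2, 12.65),
    (7, "low"): (-0.35, 16.54), (7, "high"): (-0.35, 10.80),
    (8, "low"): (-4.7, 243.1),  (8, "high"): (-4.7, 158.3),
}

def dn_to_radiance(dn, band, gain, qcal_max=255.0):
    """Convert ETM+ digital numbers to spectral radiance (W m-2 sr-1 um-1)."""
    lmin, lmax = ETM_RADIANCE_RANGE[(band, gain)]
    dn = np.asarray(dn, dtype=float)
    return (lmax - lmin) / qcal_max * dn + lmin

# Example: a single DN of 100 in Band 6 (thermal), low gain
print(dn_to_radiance(100, band=6, gain="low"))   # ~6.68
```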

Table 11.4  Earth-Sun Distance in Astronomical Units
(Julian day = the number of days elapsed since the start of the year)

Julian Day  Dist.    Julian Day  Dist.    Julian Day  Dist.    Julian Day  Dist.    Julian Day  Dist.
     1     0.9832        74     0.9945       152     1.0140       227     1.0128       305     0.9925
    15     0.9836        91     0.9993       166     1.0158       242     1.0092       319     0.9892
    32     0.9853       106     1.0033       182     1.0167       258     1.0057       335     0.9860
    46     0.9878       121     1.0076       196     1.0165       274     1.0011       349     0.9843
    60     0.9909       135     1.0109       213     1.0149       288     0.9972       365     0.9833

For the example scene: Jan. 31 + Febr. 29 + March 31 + April 12 gives Julian day = 103, so d = 1.0025.

A reduction in between-scene variability is achieved through normalisation for solar irradiance, by converting spectral radiance to reflectance using the following equation:

    ρ_ts = (π × L_ts × d²) / (ESUN_λ × cos θ_0)

Where:
    ρ_ts   = reflectance at the top of the atmosphere
    L_ts   = spectral radiance at the sensor's aperture
    d      = Earth-Sun distance in astronomical units, interpolated from Table 11.4 on the basis of the Julian day
    ESUN_λ = solar irradiance at the top of the atmosphere
    θ_0    = solar zenith angle

The elevation angle is the angular height of the sun in the sky, measured from the horizontal. The elevation is 0° at sunrise and 90° when the sun is directly overhead (which occurs, for example, at the equator on the spring and autumn equinoxes). The zenith angle is similar to the elevation angle, but it is measured from the vertical rather than from the horizontal, so that zenith angle = 90° - elevation angle.
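Interpolating d for an arbitrary Julian day is a one-liner with NumPy. A minimal sketch, assuming simple linear interpolation between the tabulated days (the lecture does not specify the interpolation method):

```python
import numpy as np

# Table 11.4: Julian day vs Earth-Sun distance (astronomical units)
JULIAN_DAY = np.array([1, 15, 32, 46, 60, 74, 91, 106, 121, 135, 152, 166,
                       182, 196, 213, 227, 242, 258, 274, 288, 305, 319,
                       335, 349, 365])
DISTANCE_AU = np.array([0.9832, 0.9836, 0.9853, 0.9878, 0.9909, 0.9945,
                        0.9993, 1.0033, 1.0076, 1.0109, 1.0140, 1.0158,
                        1.0167, 1.0165, 1.0149, 1.0128, 1.0092, 1.0057,
                        1.0011, 0.9972, 0.9925, 0.9892, 0.9860, 0.9843,
                        0.9833])

def earth_sun_distance(julian_day):
    """Linearly interpolate the Earth-Sun distance (AU) for a given Julian day."""
    return float(np.interp(julian_day, JULIAN_DAY, DISTANCE_AU))

# Example: 12 April of a leap year -> Julian day 31 + 29 + 31 + 12 = 103
print(round(earth_sun_distance(103), 4))   # ~1.0025
```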

Taking the Sun Elevation value from the scene header, the solar zenith angle is obtained as:

    solar zenith angle = 90° - solar elevation angle

Table 11.3  ETM+ Solar Spectral Irradiances

Band    ESUN_λ (W/(m² * µm))    ESUN_λ (W/(cm² * µm))
  1        1969.000                 0.1969
  2        1840.000                 0.1840
  3        1551.000                 0.1551
  4        1044.000                 0.1044
  5         225.700                 0.02257
  7          82.07                  0.008207
  8        1368.000                 0.1368

With L_ts, d, ESUN_λ and θ_0 known, the reflectance ρ_ts = (π × L_ts × d²) / (ESUN_λ × cos θ_0) can be evaluated; for the example pixel the resulting reflectance is ρ = 0.402782.

The thermal band of Landsat 7 ETM+ (Band 6) is converted from spectral radiance to temperature. The conversion equation is:

    T = K2 / ln(K1 / L_λ + 1)

Where:
    T   = effective at-satellite temperature in Kelvin
    K1  = calibration constant = 666.09 W/(m² * sr * µm)
    K2  = calibration constant = 1282.71 Kelvin
    L_λ = spectral radiance in W/(m² * sr * µm)
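The reflectance step can be scripted directly from the equation above. A minimal sketch, using the ESUN values from Table 11.3; the helper name and the example radiance, distance and sun elevation in the call are illustrative assumptions, not values from the lecture:

```python
import numpy as np

# Table 11.3: ETM+ mean solar exoatmospheric irradiance, W m-2 um-1
ESUN = {1: 1969.0, 2: 1840.0, 3: 1551.0, 4: 1044.0,
        5: 225.7, 7: 82.07, 8: 1368.0}

def toa_reflectance(radiance, band, earth_sun_dist_au, sun_elevation_deg):
    """Convert at-sensor spectral radiance to top-of-atmosphere reflectance."""
    radiance = np.asarray(radiance, dtype=float)
    theta_zenith = np.radians(90.0 - sun_elevation_deg)   # zenith = 90 deg - elevation
    return (np.pi * radiance * earth_sun_dist_au ** 2) / (ESUN[band] * np.cos(theta_zenith))

# Example: Band 4 radiance of 90 W m-2 sr-1 um-1, d = 1.0025 AU, 45 deg sun elevation
print(toa_reflectance(90.0, band=4, earth_sun_dist_au=1.0025, sun_elevation_deg=45.0))
```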

K1 = 666.09 W/(m² * sr * µm) and K2 = 1282.71 Kelvin.

The thermal band of the Landsat 7 satellite has both a Low Gain and a High Gain setting; the corresponding Band 6 LMIN and LMAX values under each gain are those given in Table 11.2 above.

In our example, consider DN = 100: the resulting effective at-satellite temperature is T = 278.3 K ≈ 5.3 °C.

Atmospheric Effects
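Chaining the two thermal-band steps (DN to radiance, then radiance to temperature) gives a short script. A minimal sketch, assuming Band 6 low gain and the /255 form of the calibration equation used above:

```python
import numpy as np

K1 = 666.09    # W m-2 sr-1 um-1, ETM+ Band 6 calibration constant
K2 = 1282.71   # Kelvin, ETM+ Band 6 calibration constant

def band6_temperature(dn, lmin=0.0, lmax=17.04, qcal_max=255.0):
    """Effective at-satellite temperature (K) for ETM+ Band 6 (low gain by default)."""
    dn = np.asarray(dn, dtype=float)
    radiance = (lmax - lmin) / qcal_max * dn + lmin   # DN -> spectral radiance
    return K2 / np.log(K1 / radiance + 1.0)           # radiance -> temperature

t = band6_temperature(100)
print(round(float(t), 2), "K =", round(float(t) - 273.15, 2), "degC")   # ~278.1 K
```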

[Figures: images of 7th May 200, 4th July and 21st August, each shown prior to and after atmospheric correction.]

Atmospheric effects

Difference between areas classified as water before and after atmospheric correction:

Image acquisition     Areas of water (hectares)
                      Prior to atmospheric correction    After atmospheric correction
7th May 200                        8.                               43.6
4th July 200                      84.6                             684.6
21 August 200                     82.                                3.7
22 September 200                 243.                               22.8

Image Processing

[Figure: spectral characteristics of different materials - reflectance (%) against wavelength (0.4 to 2.6 µm) for vegetation, dry bare soil and clear water.]

Image Classification: Visual Interpretation, Unsupervised Classification, Supervised Classification.

Visual interpretation: class identification. Identification of the classes of ground cover to be detected is essential prior to image classification (from a desk study or other sources). Visual interpretation consists of the visual identification of image features and their assignment into classes. This is enabled by the use of time-series images, in which crops can be discriminated based on changes during their growth stages.

Visual interpretation of the images is initially carried out to identify broad classes such as water, soil and vegetation.

[Figure: 24th February 2000 image in false colour presentation - Band 6 (thermal) projected on red, Band 4 (near-infrared) projected on green and Band 2 (green) projected on blue.]

The images are presented in false colour (Bands 2, 4, 6): the red colour indicates high-temperature areas, the green vegetation and the black water. In the following images, vegetation, soil and water are easily discriminated and marked, and field patterns are also clearly identified.

[Figure: 7th July 2000 false colour image with water, soil and field patterns annotated.]

[Figure: false colour presentation of the 7th July 2000 image - Band 6 (thermal) projected on red, Band 4 (near-infrared) projected on green and Band 2 (green) projected on blue - with water, spilled water and soil annotated.]

[Figure: false colour presentation of the September 2000 image, same band combination, with water and soil annotated.]

Classification is the process of sorting pixels into a finite number of individual classes, or categories of data, based on their data file values.

[Example land use map legend: salty soil; winter wheat followed by alfalfa; summer wheat followed by alfalfa; alfalfa; early-cultivated rice; rice; water irrigation system; open water; unclassified.]

There are two classification methods: unsupervised classification and supervised classification.

Unsupervised Classification

Unsupervised classification is a computer-automated process in which pixels are assigned to classes based on their spectral characteristics. It is used when little is known about the data, and giving meaning to the classified results depends on the analyst. The way the pixels are assigned into classes depends on the clustering algorithm used. The most popular unsupervised classification algorithm is ISODATA (Iterative Self-Organising Data Analysis Technique). ISODATA assigns pixels into clusters based on the minimum Euclidean spectral distance. In the first iteration, the means of the clusters are arbitrarily determined; new cluster means are determined after each iteration, and the pixels are assigned to the nearest cluster centre at every iteration. This is repeated until the number of iterations or the threshold specified by the user is reached.

Euclidean Spectral Distance

Using Pythagoras' theorem, in two-dimensional feature space the distance between points p(x, y) and q(x, y) is:

    d_qp = sqrt((p_x - q_x)² + (p_y - q_y)²)

Similarly, in three-dimensional space, for p(x, y, z) and q(x, y, z), the equation becomes:

    d_qp = sqrt((p_x - q_x)² + (p_y - q_y)² + (p_z - q_z)²)

Feature selection example:
Deep clear water body (W): very low reflectance in the near-infrared waveband, low reflectance in the visible red.
Vigorous vegetation (V): very high reflectance in the near-infrared waveband, low reflectance in the visible red.
The red and near-infrared wavebands might therefore be selected as the features on which the classification is to be based.
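The iterative reassignment described above can be illustrated with a short script. The sketch below is a simplified ISODATA-style clustering (essentially k-means on the spectral values); the full ISODATA algorithm also splits and merges clusters, which is omitted here, and all names and the toy data are illustrative.

```python
import numpy as np

def spectral_distance(p, q):
    """Euclidean spectral distance between pixel vectors (any number of bands)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sqrt(np.sum((p - q) ** 2, axis=-1))

def isodata_like(pixels, n_clusters, max_iter=20, tol=1e-3, seed=0):
    """pixels: (n_pixels, n_bands) array. Returns (labels, cluster_means)."""
    pixels = np.asarray(pixels, float)
    rng = np.random.default_rng(seed)
    # First iteration: cluster means chosen arbitrarily (here, random pixels)
    means = pixels[rng.choice(len(pixels), n_clusters, replace=False)]
    for _ in range(max_iter):
        # Assign every pixel to the nearest cluster centre
        dists = spectral_distance(pixels[:, None, :], means[None, :, :])
        labels = np.argmin(dists, axis=1)
        # Recompute cluster means; stop when they no longer move
        new_means = np.array([pixels[labels == k].mean(axis=0) if np.any(labels == k)
                              else means[k] for k in range(n_clusters)])
        if np.max(np.abs(new_means - means)) < tol:
            break
        means = new_means
    return labels, means

# Example: cluster a toy set of (red, NIR) pixel values into 2 classes
toy = np.array([[0.05, 0.02], [0.06, 0.03], [0.04, 0.45], [0.05, 0.50]])
labels, means = isodata_like(toy, n_clusters=2)
print(labels, means)
```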

[Figure: feature space plot of visible red (Band 3 in Landsat 7 ETM+) against near infrared (Band 4 in Landsat 7 ETM+), showing the class centres W (water) and V (vegetation) and the pixel points a, b, c, d.]

Points a, b, c, d represent pixel values (reflectance values after the sensor calibration). We can choose a decision rule such as: points will be labelled as members of the group whose centre is closest in feature space to the point concerned. To do this, we consider each point separately (e.g. point a) and determine the Euclidean distance from that point to each of the centres V and W. Points that are closer to V are labelled as vegetation (V), while points closer to W are labelled as water (W).

The result of the classification described is a matrix whose elements are numerical pixel labels. For example, the pixels that are classified as water are given the number 2, and the pixels that are classified as vegetation are given the number 1. The analyst can then choose a colour to associate with each class, e.g. green for all pixels labelled 1 and blue for all pixels labelled 2. In this way, we produce a colour-coded thematic map of the image.

In the first iteration, the means of the clusters are arbitrarily determined; new cluster means are determined after each iteration, and the pixels are assigned to the nearest cluster centre at every iteration. This is repeated until the number of iterations or the threshold specified by the user is reached. Apart from the "shortest distance to centre" decision rule used for this example, other decision rules exist.
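The "shortest distance to centre" decision rule maps directly onto a few lines of NumPy. A minimal sketch (illustrative names and toy data) that labels each pixel of a two-band (red, NIR) image with 1 for vegetation and 2 for water, following the numbering used above:

```python
import numpy as np

def minimum_distance_classify(image, centres, class_ids):
    """
    image:     (rows, cols, n_bands) array of calibrated reflectances
    centres:   (n_classes, n_bands) class centres in feature space
    class_ids: numerical label for each centre, e.g. [1, 2] for vegetation, water
    Returns a (rows, cols) matrix of pixel labels (the thematic map).
    """
    pixels = image.reshape(-1, image.shape[-1]).astype(float)
    dists = np.linalg.norm(pixels[:, None, :] - np.asarray(centres, float)[None, :, :], axis=2)
    labels = np.asarray(class_ids)[np.argmin(dists, axis=1)]
    return labels.reshape(image.shape[:-1])

# Toy 2x2 image with (red, NIR) reflectance per pixel
img = np.array([[[0.05, 0.50], [0.04, 0.02]],
                [[0.06, 0.45], [0.05, 0.03]]])
V = [0.05, 0.48]   # vegetation centre: low red, high NIR
W = [0.05, 0.02]   # water centre:      low red, very low NIR
thematic_map = minimum_distance_classify(img, centres=[V, W], class_ids=[1, 2])
print(thematic_map)   # 1 = vegetation, 2 = water
```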

Unsupervised Classification. Number of Classes used Four class unsupervised classification on 7 th May 200 Class : water Class 2: vegetation Class 3: sparse vegetation and soil Class 4: soil and salty soil Six class unsupervised classification on 7 th May 200 Class : water Class 2: vegetation stage Class 3: vegetation stage 2 Class 4: vegetation stage 3 Class : sparse vegetation and soil Class 6: soil and salty soil Four class unsupervised classification on 4 th July 200 2

Water is classified as the first class independent of the number of classes specified.

Six-class unsupervised classification on 4th July 200.

Unsupervised Classification: 1. Number of classes used. The number of classes specified significantly affects the classification.

Unsupervised Classification: 2. Single or time-series image?

[Figure: bar charts of the percentage of pixels changing class (jumping or dropping) as a result of six-class unsupervised classification on 7th May 200 and 4th July 200, and on 4th July 200 and 21st August 200.]

Unsupervised Classification: 2. Single or time-series image?

The use of a single image for land use classification can result in a non-representative land use map, especially when crop types are being identified. Crops growing at different times will not appear in a single image, and different crops at similar growth stages and under similar conditions (e.g. both in soil or both in water) will have similar reflectance and may not be distinguishable.

Unsupervised Classification

Unsupervised classification is not a precise method for image classification: the number of classes specified significantly affects the result, and the labelling of the classes depends on the analyst and is therefore subjective. Unsupervised classification cannot effectively be used for discriminating crops, since it assigns pixels to classes based only on their radiometric characteristics, and different crops have similar radiometric characteristics when they are at similar growing stages. On the other hand, unsupervised classification proved effective in identifying water.