RAPID RECONNAISSANCE AND RESPONSE (R3) THERMAL ALGORITHMS

by Nathalie Vedie

Technical documentation to support the detailed design of the R3 Mission

Center for Space Systems
Georgia Institute of Technology
Spring 2010

Lead Designer ______________________ Date ________
Project Systems Engineer ____________ Date ________
Project Manager _____________________ Date ________

REVISIONS

Description               Date        Approval
Initial Release           4/29/08
PDR Update                7/12/09
Detailed Design Update    12/17/09
CDR Update                2/12/10
End of Spring semester    4/28/10

GEORGIA TECH CENTER FOR SPACE SYSTEMS

ABSTRACT

Thermal Algorithms Subsystem Technical Design Document
by Nathalie Vedie
Principal Investigator: Professor David Spencer, Georgia Tech

An overview of the thermal algorithms utilized in the R3 mission is presented. The R3 satellite will carry thermal algorithms to process images taken by the thermal imager. The algorithms identify features based on whether a feature's size matches the intended feature's area and whether the temperature of the feature falls within a certain range. The center of each matching feature can be calculated and validated onboard the satellite. The algorithms are also able to detect edges for various purposes.

TABLE OF CONTENTS

LIST OF FIGURES
LIST OF TABLES
ACKNOWLEDGMENTS
NOMENCLATURE
INTRODUCTION
CHAPTER 1: REQUIREMENTS FLOW-DOWN
CHAPTER 2: SUBSYSTEM ARCHITECTURE
    Trade Studies and Design Drivers
        Trade Study 1
        Trade Study 2
    Subsystem Description and Components
CHAPTER 3: SUBSYSTEM ENGINEERING
    Algorithms
    Resource Budget
CHAPTER 4: OPERABILITY
    Commands
    Telemetry
CHAPTER 5: INTEGRATION AND TESTING
    Requirement Verification
    Integrated Tests
CHAPTER 6: DEVELOPMENT SCHEDULE
    Detailed Design
    Integration & Test
CHAPTER 7: FLIGHT RULES
CHAPTER 8: RISK AREAS AND OPEN ITEMS
    Risk Areas
    Open Items
BIBLIOGRAPHY

LIST OF FIGURES

Figure 1: Blobber Algorithm Flowchart
Figure 2: Sobel Mask for vertical direction
Figure 3: Sobel Mask for horizontal direction
Figure 4: Prewitt Mask for vertical direction
Figure 5: Prewitt Mask for horizontal direction
Figure 6: First derivative approach flowchart
Figure 7: Laplace Operator's Mask
Figure 8: Second derivative approach flowchart
Figure 9: Charleston, SC original image
Figure 10: Sobel operator
Figure 11: Prewitt operator
Figure 12: Laplace operator
Figure 13: Ocracoke, NC original image
Figure 14: Blobber algorithm results
Figure 15: Cape Hatteras, NC original image
Figure 16: Blobber algorithm results
Figure 17: Blobber algorithm + Sobel edge detection results
Figure 18: Blobber algorithm + Sobel edge detection results
Figure 19: Stone Mountain, GA and its centroid (ASTER image, 90 m/pixel resolution)
Figure 20: Charleston, SC airport and its centroid (ASTER image, 90 m/pixel resolution)

LIST OF TABLES

Table I: Requirements Flow-Down

ACKNOWLEDGMENTS

The authors wish to express sincere appreciation to Professor David Spencer (Principal Investigator) and Annalisa Bracco (Associate Professor in Oceanography in the EAS department of the Georgia Institute of Technology) for their assistance in the preparation of this design and documentation. Special thanks also to the entire R3 design team, whose tireless diligence and cooperative spirit were vital during the early programming phase of this undertaking.

NOMENCLATURE

Blobs - contiguous features found in the output of the blobber algorithm; used to identify features that might pass the identification criteria (intensity and area)
Blobber Algorithm - algorithm used to find features of interest on the R3 mission
Edge Detection Algorithm - algorithm used to detect edges via gradients, supplementing the blobber algorithm
Kernel - a matrix used to operate on a pixel and its surrounding pixels to determine various characteristics of the pixel in question
Laplace Operator - the basis for the edge detection algorithm that uses a second derivative approach to detect edges
Mask - see Kernel
Pass - a flight over a feature of interest by the R3 satellite
Pass-failure - when a pass either fails to image the feature due to some obstruction or fails to downlink to the ground station
Prewitt Operator - the basis for one of the two edge detection algorithms that use the first derivative approach to detect edges; see Sobel Operator
Script - a tool used to run the algorithms multiple times automatically, in order to analyze the algorithm itself rather than the code, and to generate results automatically
Sobel Operator - the basis for one of the two edge detection algorithms that use the first derivative approach to detect edges; see Prewitt Operator
Thermal Imaging - the science of analyzing thermal images taken by a thermal imager
Thermal Imager - the instrument used to generate, or take, thermal images
Thermal Variance - the difference in temperature that manifests as a difference in pixel intensity in a thermal image

INTRODUCTION

The R3 satellite project began with eight students attending the Georgia Institute of Technology in January 2009. The goal of the R3 project is to win a secondary payload launch opportunity through the Air Force Research Laboratory's sixth University Nanosat Program competition (UNP-6). The mission objectives are to demonstrate same-pass tasking and innovative on-board processing algorithms by detecting a thermal feature of known signature, calculating its coordinates, and downlinking them to the Georgia Tech Ground Station in the same pass; and to correlate the radiation environment in space with its effect on a non-radiation-hardened uncooled microbolometer. As a secondary payload, the satellite must be robust enough to successfully complete its mission in any orbit from 500-1000 km and at any inclination. It must also meet all UNP-6 program constraints, including staying within a size and mass envelope of 50 cm x 50 cm x 60 cm and 50 kg. The R3 satellite shall have a thermal algorithms subsystem in order to process captured images of interest and downlink information about these images to the ground station. This detailed design document explains the logic behind the algorithms and determines which algorithm yields the desired results in the most efficient manner.

CHAPTER 1: REQUIREMENTS FLOW-DOWN

The requirements for the thermal algorithms flow down directly from the R3 mission statement. The requirements flow-down presented in Table I details the mission statement, the Mission Objectives, the Mission Success Criteria, and the Subsystem Requirements.

Table I: Requirements Flow-Down

Mission Objectives
MO-3: R3 shall acquire thermal images from low Earth orbit, and utilize onboard image processing algorithms to detect and geolocate thermal features having specified signatures.

Mission Success Criteria
MSC-4 (source: MO-3): The R3 mission shall utilize onboard image processing algorithms to detect features having specified signatures.
MSC-5 (source: MO-3): The R3 mission shall geolocate the thermal features using onboard processes.

Thermal Algorithms Requirements
ALG-1 (source: MSC-4): On-board image processing algorithms shall be capable of identifying thermal features within images.
ALG-1.1 (source: ALG-1): The image processing algorithm to be applied to a particular thermal image shall be identified via ground command.
ALG-1.1.1 (source: ALG-1.1): The "blobber" algorithm shall be capable of identifying thermal features having a specified range of intensity, covering a specified range of contiguous pixel area.
ALG-1.1.2 (source: ALG-1.1): The "edge detection" algorithm shall be capable of detecting areas within thermal images exhibiting thermal gradients within specified ranges.
ALG-1.2: The image processing algorithms shall downlink an error message if they cannot process the image correctly.
ALG-2 (source: MSC-5): The image processing algorithms shall be capable of calculating the centroid pixel of identified thermal features.

CHAPTER 2: SUBSYSTEM ARCHITECTURE

TRADE STUDIES AND DESIGN DRIVERS

The thermal algorithms used on the R3 satellite detect thermal variations in order to identify the shapes and edges of objects of interest. The blobber algorithm can be used to identify the center of a feature. The edge detection algorithms can be used to various ends, from detecting eddies to detecting changes in the shape of a particular feature over multiple passes. Two design drivers are described below: the blobber algorithm and the edge detection algorithm.

Blobber Algorithm

The blobber algorithm was created specifically to screen a grayscale image and find certain features (blobs) based on intensity thresholds and area limitations. After all blobs that fit the intensity and area criteria are found, their centers can be calculated and downlinked to the ground station, or the coordinates of each pixel belonging to a blob can be obtained.

Edge Detection Algorithm

The edge detection algorithms have many uses. In addition to supplementing the blobber algorithm, they can be used to detect eddies and currents in the water, such as the Gulf Stream, or to detect river outflows. The data produced by these algorithms can be used either to confirm that a detected blob is indeed the feature we were looking for, or to track ocean currents and detect any shifts. We can also track changes over the years by comparing the results of the edge detection algorithm applied to two images taken on different dates.

Trade Study: Edge Detection Algorithm

There are two approaches to edge detection: the first derivative approach and the second derivative approach. The first derivative method computes the gradient and looks for maxima in the gradient to detect edges. Two operators employ the first derivative approach: the Sobel operator and the Prewitt operator. The second derivative method, in contrast, computes the rate of change of the gradient and locates where that rate crosses zero, thereby detecting an edge. The operator utilizing the second derivative approach is the Laplace operator.

Sobel Operator and Prewitt Operator

The Sobel operator and the Prewitt operator use the first derivative method to detect edges in grayscale images. Both apply two kernels (masks/matrices) to find the changes in the horizontal direction and the vertical direction. The operator then uses those values to calculate the direction of the intensity change in the original grayscale image and creates a new image representing the edges. The difference between the two operators is slight; the Sobel operator's output is a bit brighter than the Prewitt operator's, due to the different sets of kernels employed by each operator.

Laplace Operator

The Laplace operator (or Laplacian operator) calculates the second derivative of the given image in order to find the rate of change of the gradient. This allows the operator to determine whether a pixel is likely to be part of an edge. This method is often much faster than the Sobel and Prewitt operators, mainly because the Laplace operator uses only one kernel to determine the information it needs to create the final image, rather than computing that information after applying two kernels to the image.

SUBSYSTEM DESCRIPTION AND COMPONENTS

Since this subsystem deals with algorithms, the expected results and their explanation are given in the Algorithms section.

CHAPTER 3: SUBSYSTEM ENGINEERING

ALGORITHMS

The blobber algorithm locates the desired blobs in a series of eight steps. A graphical representation and a code sketch of the procedure follow the discussion.

1. Initialize all matrices to zero. All of these matrices are the same size as the input image.
2. Screen the original image to create an intensity screen. In this step, all pixels that meet the intensity threshold are given a value of 255 while all others are assigned 0. The result is a splattering of white on black.
3. Build the blobs. Looking at each individual pixel on the intensity screen, if the pixel has at least one neighbor (another pixel with the value 255 in one of the 8 squares surrounding it), the pixel is kept as belonging to a blob. Otherwise the pixel is discarded and assigned the value 0.
4. Number the blobs in order to operate on them. Start labeling pixels; if a pixel in the surroundings is already labeled with a number lower than the current label but different from zero, apply the lower-numbered label.
5. Repeat step 4 in order to fix pixels that do not yet carry the label corresponding to their actual blob number.
6. Calculate the area of each individual blob.
7. Eliminate the blobs that do not meet the specified area criteria.
8. Finally, for the blobs that have passed the area screening, calculate their centers as possible matches for the features in question.

Figure 1: Blobber Algorithm Flowchart
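The flight implementation of this procedure is written in Matlab and ported to C (see Chapter 6). The following is a minimal Python/NumPy sketch of the eight steps above, provided for illustration only; the function name, the intensity and area bound arguments, and the iterative label-propagation used for steps 4-5 are choices made here, not taken from the flight code.

```python
import numpy as np

def blobber(image, i_min, i_max, a_min, a_max):
    """Illustrative sketch of the eight blobber steps (not the flight Matlab/C code)."""
    # Step 2: intensity screen -- pixels inside the intensity range become 255.
    screen = np.where((image >= i_min) & (image <= i_max), 255, 0)

    # Step 3: keep only pixels that have at least one 8-connected neighbor set.
    padded = np.pad(screen, 1)
    neighbor_count = sum(np.roll(np.roll(padded, dr, 0), dc, 1)
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                         if (dr, dc) != (0, 0))[1:-1, 1:-1]
    screen = np.where((screen == 255) & (neighbor_count > 0), 255, 0)

    # Step 4: give every kept pixel a unique provisional label (1, 2, 3, ...).
    labels = np.zeros(screen.shape, dtype=np.int64)
    labels[screen == 255] = np.arange(1, np.count_nonzero(screen) + 1)

    # Step 5: repeatedly adopt the lowest non-zero label in each 3x3 neighborhood
    # until the labels stop changing, so each blob ends up with a single label.
    big = np.iinfo(np.int64).max
    while True:
        padded = np.pad(labels, 1, constant_values=big)
        padded = np.where(padded == 0, big, padded)
        lowest = np.minimum.reduce([np.roll(np.roll(padded, dr, 0), dc, 1)
                                    for dr in (-1, 0, 1)
                                    for dc in (-1, 0, 1)])[1:-1, 1:-1]
        updated = np.where(screen == 255, np.minimum(labels, lowest), 0)
        if np.array_equal(updated, labels):
            break
        labels = updated

    # Steps 6-8: compute each blob's area, keep blobs inside the area range,
    # and return their centers (the centroid row and column of each blob).
    centers = []
    for lbl in np.unique(labels[labels > 0]):
        rows, cols = np.nonzero(labels == lbl)
        if a_min <= rows.size <= a_max:
            centers.append((int(round(rows.mean())), int(round(cols.mean()))))
    return centers
```

In this sketch the center returned in step 8 is simply the mean row and column of the blob's pixels, rounded to the nearest pixel, which is one way to realize the centroid calculation required by ALG-2.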

The two approaches to edge detection (first and second derivative) do not differ much. The main difference is the size and number of masks utilized.

FIRST DERIVATIVE APPROACH: SOBEL AND PREWITT OPERATORS

The steps for the first derivative approach are quite simple:

1. Initialize all matrices to zero. The matrices are again the same size as the original image.
2. Apply the horizontal and vertical masks to each pixel; the results are used to create horizontal and vertical gradient images. The mask sets for both the Prewitt and the Sobel operators are shown at the end of this listing.
3. Use the horizontal and vertical images to calculate the final edge detection image. Keep only the output pixels whose values fall between two thresholds. With this option we can, for example, discard strong intensity jumps between neighboring pixels and keep only smaller differences.

Figure 2: Sobel Mask for vertical direction
Figure 3: Sobel Mask for horizontal direction

Figure 4: Prewitt Mask for vertical direction
Figure 5: Prewitt Mask for horizontal direction
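Figures 2 through 5 show the masks as images. For reference, the sketch below lists the standard 3x3 Sobel and Prewitt kernels, which are assumed to match those figures, together with a direct implementation of the convolve-and-threshold loop of steps 1-3; it is an illustrative Python version, not the flight code.

```python
import numpy as np

# Standard 3x3 kernels (assumed to match the masks shown in Figures 2-5).
SOBEL_X   = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])   # horizontal change
SOBEL_Y   = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])   # vertical change
PREWITT_X = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]])
PREWITT_Y = np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]])

def first_derivative_edges(image, kx, ky, t_low, t_high):
    """Steps 1-3 above: apply both masks, combine, keep values between thresholds."""
    h, w = image.shape
    edges = np.zeros((h, w))                      # step 1: output matrix
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            window = image[r - 1:r + 2, c - 1:c + 2]
            gx = np.sum(kx * window)              # step 2: horizontal gradient image
            gy = np.sum(ky * window)              #         vertical gradient image
            g = np.hypot(gx, gy)                  # step 3: combined edge strength
            edges[r, c] = g if t_low <= g <= t_high else 0
    return edges

# Example usage with an illustrative threshold band of 50-200 intensity units:
# sobel_image = first_derivative_edges(img.astype(float), SOBEL_X, SOBEL_Y, 50, 200)
```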

Figure 6: First derivative approach flowchart

SECOND DERIVATIVE APPROACH: LAPLACE OPERATOR

The methodology for the second derivative approach is simple and efficient, mainly because the Laplace operator applies only one mask and finds the value of each pixel in the edge detection image directly, thereby reducing the number of calculations.

1. Initialize the edge detection matrix. Only one matrix is necessary for this algorithm, as the values are computed and stored directly rather than through any intermediary matrices.

2. Apply the Laplacian mask to all pixels and store the results in the final image. Once again, an output value is kept only if it falls within a certain range of intensity jumps.

Below are the Laplace operator's mask and a flowchart for the second derivative approach.

Figure 7: Laplace Operator's Mask
Figure 8: Second derivative approach flowchart
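Figure 7 shows the Laplacian mask as an image. The sketch below assumes the common 4-neighbor 3x3 Laplacian kernel, which may differ from the exact mask used, and applies steps 1-2 above in the same illustrative Python style as the first derivative sketch.

```python
import numpy as np

# Common 4-neighbor 3x3 Laplacian kernel (assumed form of the mask in Figure 7).
LAPLACE = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]])

def second_derivative_edges(image, t_low, t_high):
    """Steps 1-2 above: a single mask gives each output pixel value directly."""
    h, w = image.shape
    edges = np.zeros((h, w))                       # step 1: the only matrix needed
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            window = image[r - 1:r + 2, c - 1:c + 2]
            value = abs(np.sum(LAPLACE * window))  # step 2: apply the one kernel
            edges[r, c] = value if t_low <= value <= t_high else 0
    return edges
```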

RESULTS

Many example cases have been run to check the results of the blobber and edge detection algorithms. The edge detection algorithms are meant to detect sharp changes in intensity indicative of an edge. The main difference between the algorithms is the intensity of the detected edges. The second derivative method yields larger thermal jumps, as shown below when all three algorithms are applied to an image of the Charleston coast in South Carolina.

Figure 9: Charleston, SC original image

Figure 10: Sobel operator
Figure 11: Prewitt operator

Figure 12: Laplace operator

The blobber algorithm gives us the ability to detect contiguous features whose area and pixel intensity fall within specified ranges. This capability has been applied to detecting the Gulf Stream and its limits close to the coast. Below are some examples of tests run on images representing the Gulf Stream. This algorithm can also be applied to other gyres around the world to determine their boundaries in real time.

Figure 13: Ocracoke, NC original image
Figure 14: Blobber algorithm results

Figure 15: Cape Hatteras, NC original image
Figure 16: Blobber algorithm results

After running the blobber algorithm, we can also apply the edge detection algorithm and obtain the exact boundary:

Figure 17: Blobber algorithm + Sobel edge detection results
Figure 18: Blobber algorithm + Sobel edge detection results

CHAPTER 4: OPERABILITY

COMMANDS

The blobber algorithm needs a range of intensities and a range of areas in order to function correctly. The intensity range is related to the temperature range of the feature we want to detect. Once we have the average and the standard deviation of the picture (in Kelvin, ignoring pixels where the temperature is 0 K), we can convert the minimum and maximum temperatures to the corresponding intensities (values between 0 and 255, i.e. 8 bits) using a conversion function with an offset of +127 (one possible form is sketched below).

We ran tests to determine the minimum and maximum temperature. The algorithm selects features whose minimum temperature is equal to the average temperature T_av in the image and whose maximum is T_av + 3σ. Since the images we select will mainly contain ocean, we want to select the warmer part of the temperature range displayed in the image in order to pick out the Gulf Stream. That is how our selection of the temperature range was chosen (see the results in the Algorithms section).

The algorithm also needs a set of area ranges to screen the blobs and find a feature that meets the desired criteria. The calculation of the possible range of areas for a blob is based on the percentage of land and clouds present in the picture, the number of pixels that do not represent ground parcels (i.e. the two black bands that appear on both sides of the image), and a given precision. In the examples displayed in the Algorithms section, the chosen parameters were:

Number of pixels: 90,230
Precision: 5%
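The exact conversion function is not given here in full; the sketch below shows one plausible linear mapping consistent with the +127 offset and with the use of the image average and standard deviation, centering T_av at intensity 127 and placing T_av + 3σ near the top of the 8-bit range. The scale factor is an assumption, not the documented formula.

```python
import numpy as np

def temperature_to_intensity(temps_kelvin):
    """Hypothetical temperature-to-intensity mapping (assumed form, +127 offset).

    Pixels at 0 K are ignored when computing the image statistics, as described
    above. The image average T_av maps to 127 and T_av + 3*sigma maps near 254;
    results are clamped to the 8-bit range [0, 255].
    """
    valid = temps_kelvin[temps_kelvin > 0]
    t_av, sigma = valid.mean(), valid.std()
    intensity = 127.0 * (temps_kelvin - t_av) / (3.0 * sigma) + 127.0
    return np.clip(np.round(intensity), 0, 255).astype(np.uint8)
```

Under this assumed mapping, the selection rule above (minimum temperature T_av, maximum T_av + 3σ) would correspond to intensity thresholds of roughly 127 and 254 for the blobber.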

The calculation of the area range is then simple. The minimum size of a feature to be called a blob must be the maximum number of pixels that the land and the clouds can occupy (to make sure we select features in the water). This is calculated by taking the estimated percentage of the image occupied by land and clouds, adding the precision percentage, and multiplying the result by the total number of pixels. This gives us the minimum number of pixels that the blob must contain. The maximum number is calculated by subtracting from the total number of pixels in the image the number of pixels that do not represent any parcel on the ground (the side bands) and the minimum number of pixels occupied by the land and the clouds (calculated by taking the estimated percentage of the image occupied by land and clouds, subtracting the precision percentage, and multiplying the result by the total number of pixels). This explains why we need images in which more than half the area is covered by water. The results are shown in the Algorithms section, and a worked sketch of this calculation is given at the end of this section.

Note that the minimum of the area range is larger than the land coverage in order to make sure that the selected blobs represent data from the ocean. The temperatures on the ground and in the Gulf Stream are fairly similar, and we make sure that we do not select the land by requiring a blob bigger than the land area in the image.

The edge detection algorithms do not really need any parameters, except for the desired range of intensity jumps. They simply read in an image and return the detected edges. When used in conjunction with the blobber algorithm, shapes and features can be distinguished.
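The sketch below follows the minimum/maximum area calculation just described. Only the 90,230-pixel image size and the 5% precision come from the text above; the land/cloud coverage fraction and the side-band pixel count in the example call are hypothetical placeholder values.

```python
def blob_area_range(total_pixels, land_cloud_fraction, precision, non_ground_pixels):
    """Area range for a valid blob, following the calculation described above.

    total_pixels        - pixels in the image (90,230 in the documented examples)
    land_cloud_fraction - estimated fraction of the image covered by land and clouds
    precision           - uncertainty margin on that estimate (0.05 in the examples)
    non_ground_pixels   - pixels that do not represent ground parcels (the side bands)
    """
    # Minimum: the most pixels that land and clouds could plausibly occupy.
    min_area = int((land_cloud_fraction + precision) * total_pixels)
    # Maximum: everything except the side bands and the least land/cloud coverage.
    max_area = int(total_pixels - non_ground_pixels
                   - (land_cloud_fraction - precision) * total_pixels)
    return min_area, max_area

# Example with the documented image size and precision; the 30% land/cloud cover
# and 10,000 side-band pixels are illustrative values only.
print(blob_area_range(90230, 0.30, 0.05, 10000))   # -> (31580, 57672)
```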

TELEMETRY

The algorithms need the land percentage, estimated on the ground using a tool that will be developed if Georgia Tech wins the competition (cf. Chapter 8). This will be an input from the ground, whereas the cloud cover percentage, the standard deviation, and the average temperature of the image have to be calculated onboard. These last three are also inputs to the algorithms. The cloud cover percentage is not a required parameter, though. It is an estimate of how much cloud will obstruct real data from the ground, and it helps us calculate the area range for the blobs. In this case, however, pixels showing clouds cannot be mistaken for water, since the clouds are much colder. It is nevertheless important to be able to discard an image, and not run the algorithms, if the cloud cover is too high, since the results could be completely off. I recommend discarding the image if its cloud cover is greater than 40%.

The outputs that need to be sent to the ground consist of the coordinates of the blob or its center of brightness, depending on the type of feature we want to detect. If the algorithms could not run properly, an error message must be sent to the ground so that we know some data might not be accurate. The algorithms can still be run on the ground with the downlinked image.
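As a summary of the commands, telemetry, and flight rules described here, the sketch below outlines one possible onboard decision flow. The function signature, the algorithm selector strings, and the error messages are illustrative assumptions rather than the flight software interface; only the 40% cloud-cover limit, the ground selection of the algorithm, and the error-message behavior come from the text.

```python
def process_image(image, algorithm, cloud_fraction, run_blobber, run_edges,
                  max_cloud_fraction=0.40):
    """Illustrative onboard decision flow; not the flight software interface.

    run_blobber and run_edges are callables wrapping the algorithm stages
    sketched in Chapter 3; cloud_fraction and the image statistics are
    computed onboard, while the land percentage comes from the ground.
    """
    # Flight rule: discard heavily clouded images rather than run the algorithms.
    if cloud_fraction > max_cloud_fraction:
        return {"error": "cloud cover exceeds 40%; image discarded"}
    try:
        if algorithm == "blobber":
            result = run_blobber(image)
        elif algorithm == "edges":
            result = run_edges(image)
        else:
            # Combination selected by the operator: blobber screening first,
            # then edge detection on the screened image (cf. Figures 17-18).
            result = run_edges(run_blobber(image))
    except Exception:
        # ALG-1.2: downlink an error message if the image cannot be processed.
        return {"error": "image could not be processed correctly"}
    # ALG-2 / MSC-5: on success, downlink the blob coordinates or centroid pixel.
    return {"result": result}
```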

CHAPTER 5: INTEGRATION AND TESTING

REQUIREMENT VERIFICATION

ALG-1: On-board image processing algorithms shall be capable of identifying thermal features within images.
We ran the algorithms several times on various images to check that the Gulf Stream was detected as the expected feature. For a description of this test, refer to the Algorithms section, which explains how it works.

ALG-1.1: The image processing algorithm to be applied to a particular thermal image shall be identified via ground command.
The mission planning process selects the time at which images will be taken. This helps us determine parameters such as the land cover. The operator selects which features are to be detected and chooses which algorithm to apply: the blobber, the edge detection, or a combination of both.

ALG-1.1.1: The "blobber" algorithm shall be capable of identifying thermal features having a specified range of intensity, covering a specified range of contiguous pixel area.
We have run the algorithm on several images representing the Gulf Stream in order to check that it works properly.

ALG-1.1.2: The "edge detection" algorithm shall be capable of detecting areas within thermal images exhibiting thermal gradients within specified ranges.
We have run the algorithm on several images exhibiting different thermal gradients in order to check that it works properly.

ALG-1.2: The image processing algorithms shall downlink an error message if they cannot process the image correctly.
We have run the algorithms on several images in order to check that an error message is displayed when processing fails (for example, when no blob is found).

ALG-2: The image processing algorithms shall be capable of calculating the centroid pixel of identified thermal features.
Calculating the centroid of a blob is an option that can be selected once the blobs have been found. Several images have been tested to check that this option gives the expected results. Below are examples of blobs and their centroids.

Figure 19: Stone Mountain, GA and its centroid (ASTER image, 90 m/pixel resolution)

Figure 20: Charleston, SC airport and its centroid (ASTER image, 90 m/pixel resolution)

INTEGRATED TESTS

Blobber algorithm test

The blobber algorithm needs many inputs from other subsystems, such as Instruments, in order to find the correct blobs. This means that an end-to-end test needs to be performed once all the code has been integrated.

CHAPTER 6: DEVELOPMENT SCHEDULE

DETAILED DESIGN

By May 7, 2010, the algorithms should be finished and documented properly.

INTEGRATION & TEST

All planned tests have been performed; the Software team must now verify the results obtained when porting the code from Matlab to C.

CHAPTER 7: FLIGHT RULES

CLOUD COVER

The algorithms should not be run if the cloud cover in the image exceeds 40%.

ERROR MESSAGE

The algorithms should downlink an error message to the ground if the image was not processed correctly.

CHAPTER 8: RISK AREAS AND OPEN ITEMS

RISK AREAS

The image might not be acquired properly, or the thermal gradients might be too small to detect the Gulf Stream.

OPEN ITEMS

The second mission of the algorithms is to detect river outflows, and this is one of the open items remaining so far. The results so far have not been conclusive, but further work is still possible.

BIBLIOGRAPHY

"Edge detection." Wikipedia, the free encyclopedia. Accessed 07 Dec. 2009. <http://en.wikipedia.org/wiki/edge_detection>
"Edge Detection Tutorial." Drexel University - Unix Web Service. Accessed 07 Dec. 2009. <http://www.pages.drexel.edu/~weg22/edge.html>
"Laplace operator." Wikipedia, the free encyclopedia. Accessed 07 Dec. 2009. <http://en.wikipedia.org/wiki/laplacian>
"Prewitt." Wikipedia, the free encyclopedia. Accessed 07 Dec. 2009. <http://en.wikipedia.org/wiki/prewitt>
"Sobel operator." Wikipedia, the free encyclopedia. Accessed 07 Dec. 2009. <http://en.wikipedia.org/wiki/sobel_operator>