Comparison of Stereo Vision Techniques for Cloud-Top Height Retrieval
Anna Anzalone*, Francesco Isgrò^, Domenico Tegolo
*INAF-Istituto di Astrofisica e Fisica Cosmica di Palermo, Italy
^Dipartimento di Scienze Fisiche, Università degli Studi Federico II di Napoli, Italy
Dipartimento di Matematica ed Applicazioni, Università degli Studi di Palermo, Italy
6th International Workshop "Data Analysis in Astronomy - Livio Scarsi", Erice 2007
Goal
To retrieve cloud-top height from infrared (IR) image pairs by means of stereo vision techniques.
Preliminary results of a comparison among a selected set of classic algorithms for extracting dense disparity maps and motion fields from IR stereo image pairs of clouds.
Field of Interest
Climate and weather forecasting studies
- Cloud effects: Earth's warming and cooling
- Cloudmap2, EUSO, AUGER
Astrophysics: cosmic ray radiation at ultra-high energy
- The Earth's atmosphere is monitored by large field-of-view telescopes to detect the UV fluorescence track and Čerenkov radiation produced by EECRs interacting with the atmosphere.
- Cloud effects: attenuation, occlusion. Cloud presence can affect the maximum of the shower signal and the diffusion/transmission of the Čerenkov photons, depending on cloud altitude, cloud optical depth and shower inclination.
- Knowledge of the cloud scenario improves the measurement of the primary energy and composition of EECRs.
Current Approaches for Indirect Measurements from Space
Radiative Transfer Methods
- Brightness temperature (11μm): emitted IR radiation is converted into a temperature and compared with an atmospheric temperature profile; the lowest altitude with the same temperature is assigned as the cloud height.
- CO2 slicing method (15μm): mass above the cloud and pressure profile.
- Oxygen A-band method (0.76μm)
These need extra information (Aqua/CERES, Terra/MODIS data): ambient temperature/pressure profiles.
Computer Vision Methods
- Feature- and area-based stereo matching (ERS-2/ATSR-2, Terra/MISR data): only geometric relations.
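As a toy illustration of the brightness-temperature method above, the sketch below scans an ambient temperature profile bottom-up and assigns the lowest altitude whose temperature crosses the measured 11μm brightness temperature. Function name and profile values are hypothetical; a real retrieval uses full radiative transfer and measured profiles.

```python
def height_from_brightness_temp(bt_K, profile):
    """Assign cloud-top height as the lowest altitude whose ambient
    temperature equals the measured brightness temperature.

    profile: list of (altitude_km, temperature_K) sorted by altitude.
    Walks upward and linearly interpolates the first crossing of bt_K.
    Toy sketch only: emissivity effects and inversions are ignored.
    """
    for (z0, t0), (z1, t1) in zip(profile, profile[1:]):
        if (t0 - bt_K) * (t1 - bt_K) <= 0 and t0 != t1:
            return z0 + (z1 - z0) * (t0 - bt_K) / (t0 - t1)
    return None  # brightness temperature outside the profile range

# hypothetical profile: 288 K at the surface, cooling with altitude
profile = [(0.0, 288.0), (5.0, 255.0), (10.0, 220.0)]
```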
Operative Multiview Instruments
Height information can be retrieved from stereo images obtained from pairs of consecutive views of the same cloudy scene.
ATSR-2 (Along-Track Scanning Radiometer) on board the ESA ERS-2 satellite
- altitude: ~780 km; satellite velocity: 6.7 km/s; orbital period: about 100 minutes
- forward view: along the direction of the orbit track, at an incidence angle of 55°, as the satellite flies toward the scene
- nadir view: 120 s later, a second observation of the scene at an angle close to 0°
- inter-camera delay: 120 s
MISR (Multi-angle Imaging SpectroRadiometer) on board the NASA Terra satellite
- altitude: 705 km; 4 spectral bands (visible and near-IR)
- 9 different views of the same scene, over a total temporal range of 7 min; inter-camera delay: 45-60 s
- pixel size: 275 m; pixel depth: 14 bit
- operative algorithm for height maps; height accuracy: ± 270 m
- cloud motion is significant
(MISR: JPL image P-49081)
ATSR-2 Stereo Data
ATSR-2 cloudy image pairs:
- forward view: along the direction of the orbit track, at an incidence angle of 55°, as the satellite flies toward the scene
- nadir view: 120 s later, a second observation of the scene at an angle close to 0°
ATSR-2 data (nadir and forward views, 11μm):
- IR channel: 11μm, gridded brightness temperature data
- image size: 512x512; pixel size: 1 km; swath: ~500 km; pixel depth: 16 bit
Ancillary data: cloudiness masks to discard images that do not include a meaningful number of mostly cloudy pixels.
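The nadir/forward geometry above can be turned into a back-of-the-envelope height estimate. The sketch below assumes an idealised flat-Earth geometry and ignores cloud advection during the 120 s inter-camera delay (which the paper handles via motion fields); the function name and defaults are illustrative.

```python
import math

def cloud_height_from_disparity(disparity_px, pixel_size_km=1.0,
                                forward_angle_deg=55.0):
    """Cloud-top height (km) from the along-track nadir/forward disparity.

    Simplified geometry: a cloud at height h, seen at 55 deg in the
    forward view and at ~0 deg at nadir, is shifted on the ground by
    h * tan(55 deg), i.e. h = disparity * pixel_size / tan(55 deg).
    Ignores Earth curvature and cloud motion during the 120 s delay.
    """
    return disparity_px * pixel_size_km / math.tan(math.radians(forward_angle_deg))

# a 10-pixel disparity at 1 km/pixel corresponds to a ~7 km cloud top
```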
Parallax Effect
Parallax is the apparent displacement in the sky of an object seen from two different points of view.
With a baseline r equal to the Earth-Sun distance, the distance to the object is d = r/tan(p), where p is the parallax angle.
The bigger the distance is, the smaller the parallax angle gets.
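The slide's relation d = r/tan(p) can be checked numerically; with the Earth-Sun baseline, a parallax of one arcsecond recovers the definition of the parsec. A minimal sketch (names are illustrative):

```python
import math

AU_KM = 1.495978707e8  # baseline r: the Earth-Sun distance, in km

def distance_from_parallax(parallax_arcsec, baseline_km=AU_KM):
    """d = r / tan(p): distance to an object whose apparent position
    shifts by the parallax angle p when seen from two points a
    baseline r apart."""
    p_rad = math.radians(parallax_arcsec / 3600.0)
    return baseline_km / math.tan(p_rad)

# a parallax of 1 arcsecond over the Earth-Sun baseline gives one
# parsec, about 3.086e13 km
```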
Depth Recovery
Depth: the distance of scene points to the camera. Disparity is the apparent displacement of a scene point on the two image planes.
The problem of depth recovery can be split into two steps:
- Correspondence: find matches between consecutive frames; for each point of the left image, detect the corresponding one in the right image.
- Reconstruction: given the correspondences, retrieve the 3-D structure in the reference data system by triangulation: depth = bf/d, with baseline b, focal length f and disparity d; the cloud height h then follows from the satellite height and the recovered depth.
Depth estimation depends strongly on the disparity computation.
Matching strategies:
- Area-based: for each point, a small textured patch of the reference image is searched in the matched image according to a similarity measure; the template displacement corresponding to the best match gives the disparity value of the reference point. Dense maps; problems in uniform regions.
- Feature-based: matching between features independently selected in both images (edges, corners, ...). Sparse maps, interpolation needed, dependent on feature extraction; cloud edges are often not well defined.
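The triangulation step above reduces to one line of arithmetic. A minimal sketch, with illustrative names and units (f and d in pixels, b in km):

```python
def depth_from_disparity(baseline, focal_length, disparity):
    """Triangulation as on the slide: depth = b*f/d, with baseline b,
    focal length f and disparity d."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    return baseline * focal_length / disparity

def cloud_height(satellite_height, baseline, focal_length, disparity):
    """Cloud-top height: satellite height minus the recovered depth."""
    return satellite_height - depth_from_disparity(baseline, focal_length,
                                                   disparity)
```

Note how errors propagate: since depth is inversely proportional to disparity, a small disparity error on a distant (low-disparity) cloud produces a large depth error, which is why the paper scores the disparity maps carefully.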
Matching Algorithms
Following the taxonomy of the most common stereo matching algorithms given in the Scharstein and Szeliski (2002) review, stereo algorithms generally perform four steps:
(1) matching cost computation
(2) cost aggregation
(3) disparity computation/optimisation (an optimisation problem)
(4) disparity refinement
The actual step sequence can differ, depending on the particular algorithm considered. For this work only the first three steps are considered.
We obtained a set of 80 combinations of different steps and parameters to be compared: 80 disparity maps ... + 1.
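Steps (1)-(3) can be sketched for one simple area-based combination: absolute-difference cost, mean aggregation over a square window, and winner-take-all disparity selection. This is an illustrative sketch, not the paper's actual implementation; window size and disparity range are arbitrary, and step (4) is omitted.

```python
import numpy as np

def block_match(left, right, max_disp=8, win=3):
    """Minimal area-based matcher: steps (1)-(3) of the taxonomy."""
    h, w = left.shape
    half = win // 2
    cost = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        # (1) matching cost: absolute grey-level difference at shift d,
        #     left pixel (y, x) tested against right pixel (y, x - d)
        diff = np.abs(left[:, d:].astype(float) - right[:, :w - d])
        agg = np.full((h, w), np.inf)
        for y in range(half, h - half):
            for x in range(half, (w - d) - half):
                # (2) aggregation: mean cost over a win x win window
                agg[y, x + d] = diff[y - half:y + half + 1,
                                     x - half:x + half + 1].mean()
        cost[d] = agg
    # (3) disparity computation: winner-take-all over candidate shifts
    return np.argmin(cost, axis=0)
```

On a synthetic pair where the left image is the right image shifted by 3 pixels, interior pixels of the returned map come out at disparity 3.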
... + 1: Optical Flow Algorithm
A differential approach that finds correspondences by estimating the apparent motion (via temporal derivatives) of the scene points on the images. Optical flow: temporal changes in the image grey levels are taken as evidence of motion.
A multi-resolution approach copes with large motion vectors (large time interval): the motion computed on the coarsest images predicts the motion at the finest level.
Multiresolution optical flow estimation scheme: motion field I_forward -> I_nadir; motion field I_nadir -> I_forward; consistency check + interpolation -> final motion field.
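The consistency-check stage of the scheme can be sketched as a forward-backward test: a motion vector is kept only if following it into the other image and back returns (nearly) to the starting pixel. This is an illustrative sketch with hypothetical names and tolerance, not the paper's implementation; invalidated pixels are the "gaps" later filled by interpolation.

```python
import numpy as np

def consistency_check(flow_fwd, flow_bwd, tol=1.0):
    """Keep a flow vector at pixel p only if p -> q (forward field)
    followed by the backward field at q lands within tol pixels of p.

    flow_fwd, flow_bwd: (h, w, 2) arrays of (dx, dy) vectors.
    Returns a copy of flow_fwd with inconsistent pixels set to NaN.
    """
    h, w = flow_fwd.shape[:2]
    out = flow_fwd.astype(float).copy()
    for y in range(h):
        for x in range(w):
            dx, dy = flow_fwd[y, x]
            xq, yq = int(round(x + dx)), int(round(y + dy))
            if not (0 <= xq < w and 0 <= yq < h):
                out[y, x] = np.nan        # round trip leaves the image
                continue
            bdx, bdy = flow_bwd[yq, xq]
            if abs(dx + bdx) > tol or abs(dy + bdy) > tol:
                out[y, x] = np.nan        # inconsistent: mark as a gap
    return out
```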
The Algorithm Scheme
Implementation of the basic version of well-known area-based stereo matching algorithms.
The forward and nadir ATSR-2 images feed both the matching module and the optical flow module; each produces dense disparity maps, which are scored by the PSNR module. The best PSNR drives the assessment and final selection of the best performers.
PSNR
Disparity map: d(I1, I2). Warping back: wb(I2, d) = I'1, a reconstruction of I1, scored as PSNR(I1, I'1).
Two measures of goodness:
- PSNR between an original image I and the reconstructed one I':
  PSNR(I, I') = 20 log10( 255 / sqrt( Σ_p (I(p) - I'(p))^2 ) )
- Weighted PSNR. PGaps: the percentage of pixels where the consistency check did not validate the computed disparity value, filled by interpolation:
  WPSNR(I, I') = (1 - PGaps) PSNR(I, I')
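The two measures transcribe directly into NumPy; the sketch below follows the slide's formulas literally (error norm over the whole image, 8-bit peak value 255), with illustrative function names.

```python
import numpy as np

def psnr(I, I_rec):
    """PSNR between an 8-bit image I and its reconstruction I', as on
    the slide: 20*log10(255 / sqrt(sum_p (I(p) - I'(p))^2))."""
    err = np.sqrt(np.sum((np.asarray(I, float) - np.asarray(I_rec, float)) ** 2))
    return np.inf if err == 0 else 20.0 * np.log10(255.0 / err)

def wpsnr(I, I_rec, p_gaps):
    """Weighted PSNR: p_gaps is the fraction of pixels whose disparity
    failed the consistency check and was filled by interpolation."""
    return (1.0 - p_gaps) * psnr(I, I_rec)
```

The weighting penalises maps that look good only because many unreliable pixels were smoothed over by interpolation, which is exactly the effect discussed in the results.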
PSNR vs WPSNR Results
The fluctuations in the WPSNR graphs mean that some combinations of parameters and matching steps provide more consistent disparity maps than others. This is not highlighted by the nearly regular behaviour of the PSNR values. Similar graphs have been obtained for the other input image pairs.
(Figure: PSNR and WPSNR values for the 80+1 disparity maps obtained as combinations of different matching steps and parameters.)
Retrieved Disparity Values
Pixel-by-pixel comparison of the disparities retrieved by the 80 different matcher combinations. The different sequences of algorithms yield very similar results, both in terms of PSNR and of disparity values, as shown in the figure: for one input image pair, the distributions of the differences between pairs of disparity maps from different matcher combinations are displayed. The plot shows a blow-up of the central peak: all distributions concentrate at small difference values (70% of the image pixels).
Conclusions
The experiments show a general agreement among the disparity maps retrieved by the selected methods, and highlight a slightly better performance of the optical-flow-based method.
In the future we plan:
- to test these algorithms on data from other satellite sources;
- to validate the estimated height maps against maps provided by other types of sensors (e.g. lidar, radar).