Correlation between lidar-derived intensity and passive optical imagery

Jeremy P. Metcalf, Angela M. Kim, Fred A. Kruse, and Richard C. Olsen
Physics Department and Remote Sensing Center, Naval Postgraduate School, 833 Dyer Rd, Monterey, CA, USA

ABSTRACT

When LiDAR data are collected, intensity information is recorded for each return and can be used to produce an image resembling those acquired by passive imaging sensors. This research evaluated LiDAR intensity data to determine its potential for use as baseline imagery where optical imagery is unavailable. Two airborne LiDAR datasets, collected at different point densities and laser wavelengths, were gridded and compared with optical imagery. Optech Orion C-0 laser data were compared with the corresponding 1541 nm spectral band from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). Optech ALTM Gemini LiDAR data collected at 1064 nm were compared to the WorldView-2 (WV-2) NIR2 band (860-1040 nm). Intensity images were georegistered and spatially resampled to match the optical data. The Pearson product-moment correlation coefficient was calculated between datasets to determine similarity. Comparison of the full LiDAR datasets yielded correlation coefficients of approximately 0.5. Because LiDAR returns from vegetation are known to be highly variable, a Normalized Difference Vegetation Index (NDVI) was calculated from the optical imagery, and the intensity and optical imagery were separated into vegetation and non-vegetation categories. Comparison of the LiDAR intensity for non-vegetated areas to the optical imagery yielded coefficients greater than 0.9. These results demonstrate that LiDAR intensity data may be a useful substitute for optical imagery where only LiDAR is available.

Keywords: LiDAR, Intensity, Correlation, Orthophotography, Classification
1. INTRODUCTION

When Light Detection and Ranging (LiDAR) data are collected, intensity information is recorded for each return in addition to the usual time-of-flight information. This intensity gives a measure of the relative reflectivity of objects in the scene and, when viewed as a gridded raster image, produces an image resembling that of a passive sensor. Because LiDAR is an active sensing system, data can be collected at night or in low-light conditions, so this information can be especially useful where optical imagery is unavailable. LiDAR intensity is typically not calibrated and is therefore not useful for true radiometric measurements; it does, however, provide a visually useful product.

In this study, we investigated the correlation of LiDAR intensity with passive optical imagery, with the goal of understanding and quantifying the similarities and differences between the data products. We examined the correlation of the scenes in an overall sense, then for particular classes of materials (such as vegetation, buildings, and roads) and particular classes of LiDAR points (such as nadir scan angles and first returns). The goal of this work was to determine whether there are cases where LiDAR intensity, although uncalibrated, can reliably provide pseudo-reflectivity information, and cases where it is especially unreliable. Previous studies have demonstrated the utility of LiDAR intensity (along with spectral information from orthophotos or spectral imagery) for tasks such as image classification or computation of the Normalized Difference Vegetation Index (NDVI) [1, 2].

Laser Radar Technology and Applications XIX; and Atmospheric Propagation XI, edited by Monte D. Turner, Gary W. Kamerman, Linda M. Wasiczko Thomas, Earl J. Spillar, Proc. of SPIE Vol. 9080, 90800U, © 2014 SPIE, CCC code: 0277-786X/14/$18, doi: 10.1117/
2. DATA

2.1 LiDAR Data

LiDAR data were collected over California's Monterey Bay area by Digital Mapping Inc. (the AMBAG dataset) and by Watershed Sciences Inc. (WSI) in 2012. General information about each LiDAR dataset is given in Table 1. The two LiDAR datasets used in this study contrast strongly in their collection parameters. With its very high point density, the WSI data allow investigation of laser intensity correlation at a much finer scale than the AMBAG data. At 1064 nm, the wavelength of the laser used in the AMBAG dataset is closer to the typical wavelength ranges of visible to near-infrared imaging systems.

Table 1. LiDAR dataset descriptions.

  Dataset Name      WSI                          AMBAG
  Origin            Watershed Sciences Inc.      Digital Mapping Inc.
  Client            Naval Postgraduate School    Associated Monterey Bay Area Governments (AMBAG)
  Collection Date   Oct 2012-Nov 2012            Before Aug
  LiDAR System      Optech Orion C-0             Optech ALTM Gemini
  Wavelength        1541 nm                      1064 nm
  Parameters        4 m AGL, 60% sidelap         10 m AGL, % sidelap
  Scanning          66 kHz PRF, FOV              0 kHz PRF, 25° FOV
  Point Density     -80 pts/m² average           2-4 pts/m² average
  Posted Accuracy   7 cm vertical, cm horiz.     23 cm vertical, 35 cm horiz.

2.2 Image Data

Spectral imagery datasets were chosen for comparison with the LiDAR data. A summary of the spectral imagery used is given in Table 2.

Table 2. Spectral imagery dataset descriptions.

  Sensor              AVIRIS               WorldView-2 (WV-2)   UltraCam Eagle
  Collection Date     Sep 2011             Apr 2011             Nov 2012
  Platform            Airborne             Satellite            Airborne
  Selected Channels   1541 nm (Band 126)   860-1040 nm (NIR2)   0-600 nm (blue), nm (green), 580-7 nm (red), nm (NIR)
  Pixel Size          2.4 m                2.3 m                15 cm

2.2.1 Airborne Visible/Infrared Imaging Spectrometer (AVIRIS)

Hyperspectral Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data [3] were collected over Monterey, CA, in September 2011 with a 2.4 m spatial resolution. Atmospheric correction was performed using ACORN [4].
A 10 m by 800 m image chip was created covering the Naval Postgraduate School (NPS), Monterey, CA. These data were paired with the WSI LiDAR data. Only the band closest to 1541 nm was used in this study, as it overlapped the laser wavelength.
Figure 1. True color AVIRIS image of the Naval Postgraduate School campus, Monterey, CA.

2.2.2 WorldView-2

Multispectral WorldView-2 (WV-2) satellite imagery collected in April 2011 was paired with the AMBAG intensity data. The WV-2 image has a spatial resolution of 2.3 m and covers a 3.5 km by 2.3 km area of Monterey, CA. Only the NIR2 band was used in this study. The NIR2 band has minimum and maximum band edges at 860 and 1040 nm respectively [6].

Figure 2. True color WorldView-2 image of a portion of Monterey, CA.
2.2.3 UltraCam Eagle

Following the LiDAR acquisition, WSI collected high spatial resolution aerial imagery using the UltraCam Eagle camera. The UltraCam Eagle is a 260 megapixel, large-format digital aerial camera with four spectral bands: red, green, blue, and near infrared. The spectral responses of this sensor are 0-600 nm (blue), nm (green), 580-7 nm (red), and nm (NIR). Images were radiometrically calibrated, pan-sharpened, and then orthorectified using a ground model. The corrected images were output with a 16-bit dynamic range and a ground sample distance (GSD) of 15 cm.

Figure 3. True color UltraCam Eagle aerial orthophoto of the Naval Postgraduate School, Monterey, CA.

3. METHODS

Two separate methods were employed to evaluate the correlation between LiDAR intensity and passive optical imagery. In the first method, raster images were created from the LiDAR intensity information and the correlation to the spectral imagery was calculated on an image-to-image basis. This method was used to compare AVIRIS to the WSI LiDAR and WV-2 to the AMBAG LiDAR. In the second method, an attempt was made to avoid the uncertainties introduced by gridding the LiDAR data. The spectral imagery was overlaid on the LiDAR data, enabling a spectral value to be associated with each individual LiDAR point, and correlation was calculated on an image-to-point basis. This method was used to compare the WSI LiDAR to the UltraCam Eagle orthophotos.

For both approaches, several assumptions were made:
1. Only the intensity information contained in the first echo was considered, as it is assumed to resemble the surface energy received by passive optical sensors.
2. For hard surfaces such as buildings or roads, the first echo is typically the only echo. For objects frequently represented by multiple echoes, such as vegetation, the laser energy is often divided among the following echoes.
3. Because the behavior of laser returns over water tends to be highly variable, returns over water are ignored in this study.
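As a concrete illustration of these assumptions, the sketch below filters a synthetic point set down to first-echo, non-water returns. The field names and values are hypothetical (not the study's data or the LAS attribute names); the class codes follow the ASPRS LAS convention in which class 9 denotes water.

```python
import numpy as np

# Synthetic LiDAR point records; field names are illustrative only.
pts = np.array(
    [(0.0, 0.0, 120, 1, 6),   # building, first echo  -> kept
     (1.0, 0.0,  40, 2, 5),   # vegetation, 2nd echo  -> dropped
     (2.0, 0.0,  80, 1, 9),   # water, first echo     -> dropped
     (3.0, 0.0,  95, 1, 2)],  # ground, first echo    -> kept
    dtype=[('x', 'f8'), ('y', 'f8'), ('intensity', 'u2'),
           ('return_num', 'u1'), ('cls', 'u1')])

# Assumptions 1 and 3: keep first echoes only, ignore water returns.
keep = (pts['return_num'] == 1) & (pts['cls'] != 9)
first_returns = pts[keep]
print(first_returns['intensity'])  # [120  95]
```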
The Pearson product-moment correlation coefficient is used to evaluate similarity between datasets. Pearson's r is defined as:

    r = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sqrt{\sum_{i=1}^{n} (X_i - \bar{X})^2 \sum_{i=1}^{n} (Y_i - \bar{Y})^2}}    (1)

where X_i is image 1 at the i-th pixel, \bar{X} is the mean of image 1, Y_i is image 2 at the i-th pixel, \bar{Y} is the mean of image 2, and n is the number of pixels [7]. With this equation, correlation coefficients range from -1 to 1, where -1 is perfect negative linear correlation, 0 is uncorrelated, and 1 is perfect positive linear correlation. Additionally, the correlation coefficient is invariant to changes in scale.

3.1 Image-to-image approach

3.1.1 Create LiDAR intensity raster images

Intensity images were created from each LiDAR dataset by gridding the average first-return intensity values. The number of points averaged for each cell depends on the resulting first-return point density. The GSDs were matched to the AVIRIS and WV-2 images at 2.4 and 2.3 m respectively.

3.1.2 Classify vegetation and non-vegetation

The Normalized Difference Vegetation Index (NDVI) was used to segment the image data into vegetation and non-vegetation. NDVI is a common measure for identifying healthy vegetation in imagery containing red and near-infrared channels [5]:

    NDVI = \frac{NIR - RED}{NIR + RED}    (2)

where NIR and RED represent the spectral reflectance measured in the near-infrared and red visible spectral regions respectively. NDVI values were calculated using the NIR and RED bands of the spectral imagery. The values of an NDVI image range from -1 to 1, where pixels with values above 0.3 represent a variety of vegetation cover. Pixels having an NDVI value above 0.3 were classified as vegetation, and pixels having an NDVI value below 0.3 were classified as non-vegetation.

3.1.3 Calculate correlation

Correlation was calculated for the full intensity raster images, and for the vegetation and non-vegetation classes, for both AVIRIS/WSI and WV-2/AMBAG.
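The image-to-image procedure above (grid, segment by NDVI, correlate) can be sketched end-to-end with NumPy. The rasters below are synthetic stand-ins, not the study's data; only the formulas for Pearson's r and NDVI and the 0.3 threshold come from the text.

```python
import numpy as np

def pearson_r(a, b):
    """Pearson product-moment correlation over flattened images."""
    a = a.ravel().astype(float)
    b = b.ravel().astype(float)
    return np.sum((a - a.mean()) * (b - b.mean())) / np.sqrt(
        np.sum((a - a.mean()) ** 2) * np.sum((b - b.mean()) ** 2))

rng = np.random.default_rng(0)
red = rng.uniform(0.02, 0.4, (60, 60))                   # synthetic red band
nir = rng.uniform(0.05, 0.6, (60, 60))                   # synthetic NIR band
intensity = nir + 0.05 * rng.standard_normal((60, 60))   # gridded "intensity"

# NDVI segmentation with the 0.3 threshold used in the study.
ndvi = (nir - red) / (nir + red)
veg = ndvi > 0.3

r_overall = pearson_r(intensity, nir)
r_veg = pearson_r(intensity[veg], nir[veg])
r_nonveg = pearson_r(intensity[~veg], nir[~veg])
```

Note that because r is invariant to scale (and offset), `pearson_r(a, 5 * a + 2)` returns 1.0 up to rounding, which is why differently scaled intensity and optical images can still correlate perfectly.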
Results are given in Section 4.

3.2 Image-to-point approach

3.2.1 Overlay orthophoto on LiDAR point cloud

Spectral information was added to the WSI LiDAR dataset by spatially sampling the UltraCam Eagle aerial orthophoto at each LiDAR point location. Both datasets were collected in the same coordinate system, making the overlay process fairly simple. LiDAR points lying in optical shadow in the orthophoto were identified and removed using a simple threshold on the NIR spectral channel.

3.2.2 Classify LiDAR data

The WSI LiDAR data were delivered with ground points already classified. An automatic classification process was applied to the LiDAR data to produce building and vegetation classes. Points forming flat surfaces greater than square meters and more than 3 vertical meters above the ground were classified as buildings. Dispersed points with a minimum height of 1.3 m and a minimum radius of 2 m were classified as trees. Visual inspection of the automatic classification result revealed slight confusion between the building and vegetation classes. A large majority of the points
remaining unclassified were found to be clearly associated with the building and vegetation classes. The class confusion was corrected manually. During the correction process, an additional road class was introduced using ground points.

3.2.3 Calculate correlation

Correlation was calculated for all LiDAR points and for each of the individual classes (ground, roads, buildings, and vegetation).

4. RESULTS AND DISCUSSION

4.1 Image-to-image correlation

The results of computing the correlation coefficients for the AVIRIS/WSI and WV-2/AMBAG datasets are presented in Table 3.

Table 3. Computed correlation coefficients for the AVIRIS/WSI and WV-2/AMBAG image datasets.

               Overall   Vegetation   Non-vegetation
  AVIRIS/WSI
  WV-2/AMBAG

Comparing the overall correlation results of the two datasets, the WV-2/AMBAG correlation was higher than that of the AVIRIS/WSI test. We initially expected the AVIRIS/WSI dataset to have a much higher correlation, since the spectral response of the selected AVIRIS band directly overlapped the wavelength of the laser used in the WSI LiDAR collection. At this point, it is unclear what the major contributing factors are to the correlation values seen between datasets.

When both image datasets were separated into vegetation and non-vegetation categories, the resulting correlation coefficients improved dramatically. Despite the differences in point density, coverage area, and wavelength overlap in each test case, the correlation within each category was well above 0.9. This increase in correlation is likely explained by variations in how each surface type is generally represented in LiDAR data. Impervious surfaces are usually represented by a single return whose intensity value is more closely related to the reflectance characteristics of the surface. For vegetation, specifically vegetation producing more than one return, the intensity of the first return can be diminished depending on the number of returns, beam width, and object size.
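The improvement after separating the categories can be reproduced in miniature: two categories that are each almost perfectly linearly related internally can show weak or even negative correlation when pooled, if their mean levels differ between the two data sources (e.g., vegetation bright in the optical band but dim in intensity). A toy illustration with entirely synthetic values:

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.uniform(0.0, 1.0, 400)
v = rng.uniform(0.0, 1.0, 400)

# Non-vegetation: intensity closely tracks image brightness.
x_nonveg = u
y_nonveg = u + 0.01 * rng.standard_normal(400)

# Vegetation: also internally linear, but offset the opposite way
# between the two data sources.
x_veg = v + 2.0
y_veg = v - 2.0

def r(a, b):
    return np.corrcoef(a, b)[0, 1]

r_nonveg = r(x_nonveg, y_nonveg)                    # within-class: ~1.0
r_veg = r(x_veg, y_veg)                             # within-class: 1.0
r_pooled = r(np.concatenate([x_nonveg, x_veg]),
             np.concatenate([y_nonveg, y_veg]))     # pooled: strongly negative
```

The between-class offsets dominate the pooled covariance, so the combined coefficient collapses even though each category is internally near-perfect.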
To visualize how correlation varies between datasets, composite images were created for the AVIRIS/WSI (Figure 4) and WV-2/AMBAG (Figure 5) datasets. As seen in Figure 4, the AVIRIS and WSI intensity images appear to be aligned rather well. Buildings with split rooftops can readily be identified by bright red pixels, owing to illumination differences inherent in LiDAR and optical data collection. The image has an overall blue color because the LiDAR intensity and AVIRIS images were scaled to different ranges. Because the AVIRIS data were collected in September 2011 and the WSI LiDAR in November 2012, variations between AVIRIS brightness and LiDAR intensity can be partially explained by seasonal changes between years.
Figure 4. Composite image of AVIRIS/WSI image data, with the AVIRIS image in red and the WSI intensity image in blue and green.

Visual inspection of the WV-2/AMBAG image composite (Figure 5) reveals discrepancies likely affecting image correlation. At first glance, there appears to be a brightness gradient, with LiDAR intensity higher toward the left of the image. After reevaluating the original AMBAG LiDAR dataset, the intensity gradient was found to correspond with elevation in this area. It is possible that the AMBAG LiDAR intensity data were not properly calibrated prior to our analysis. Image alignment errors are apparent along the gradient as well: where the overall LiDAR intensity decreases, image misregistration increases. Additionally, the WV-2 and AMBAG data were collected on different dates; objects not present on both dates appear very bright red or blue.

Figure 5. Composite image of WV-2/AMBAG data, with the WV-2 NIR2 image in red and the AMBAG intensity image in blue and green.
4.2 Image-to-point correlation

The correlation results for the WSI LiDAR intensity and the UltraCam Eagle spectral bands are presented here. The WSI LiDAR data were classified manually and then fused with the high-resolution orthorectified aerial imagery. Although additional classes were created, we are mainly concerned with assessing the correlation of vegetation and impervious surfaces (ground, road, and building). Figure 6 shows the LiDAR classification and image fusion results. Table 4 gives the computed correlation coefficients for WSI intensity and the orthophoto spectral bands, and Figure 7 shows scatterplots for these data.

Figure 6. WSI LiDAR intensity scaled 0 to (top); classification result (bottom left; classes: Unclassified, Ground, Vegetation, Building, Road, Power Line); orthophoto fused with LiDAR points (bottom right).

Table 4. Correlation coefficients for WSI LiDAR intensity and optical imagery.

          All Classes   Ground   Road   Buildings   Vegetation
  Blue
  Green
  Red
  NIR
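Per-class coefficients like those in Table 4 fall out of masking the fused point attributes by class label and correlating within each mask. A sketch with synthetic values (the class codes, names, and the linear relation are illustrative assumptions, not the study's data):

```python
import numpy as np

# Hypothetical fused point attributes: per-point intensity, a sampled
# image band value, and an integer class label.
rng = np.random.default_rng(3)
n = 600
label = rng.integers(0, 4, n)            # 0=ground, 1=road, 2=building, 3=veg
intensity = rng.uniform(0.0, 1.0, n)
band = 0.9 * intensity + 0.1 * rng.uniform(0.0, 1.0, n)

# Correlate intensity against the sampled band within each class mask.
names = ['ground', 'road', 'building', 'vegetation']
for code, name in enumerate(names):
    m = label == code
    r = np.corrcoef(intensity[m], band[m])[0, 1]
    print(f'{name:10s} n={m.sum():3d} r={r:.2f}')
```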
Figure 7. Scatterplots for five classes and four spectral bands: the x-axis represents the LiDAR intensity values from 0 to and the y-axis represents the UltraCam Eagle spectral band values from 0 to 625. Note: actual intensity values range from 0 to 96, but almost all of the information is represented in the range from 0 to .

From Table 4, we see that the highest correlation coefficients for nearly all classes of the WSI LiDAR data occur with the UltraCam Eagle NIR band. Despite having positive correlation with the blue, green, and red bands, the ground class was found to be essentially uncorrelated. Unlike the other classes, the ground class contains both impervious surfaces and low vegetation. The highest intensity correlation coefficients are found within the road class; the scatterplots for this class show a much clearer linear relationship than any other class. For all bands, the vegetation class also appears to be uncorrelated: in Figure 7, the scatterplots for vegetation show almost no linearity between intensity and pixel values. From these results, it appears that the intensity associated with vegetation behaves quite differently from that of other surfaces.

Overall, the reported correlation coefficients for the WSI LiDAR and UltraCam spectral data are rather low. Several factors likely had an adverse effect on correlation: 1) although the imagery and LiDAR data were collected by WSI within the same week, the spectral response of the UltraCam Eagle does not overlap with the LiDAR laser wavelength; 2) the reflectance of healthy vegetation can be high within the range of the UltraCam NIR band but low at the laser operating wavelength of 1541 nm; and 3) the image overlay process operates in the x and y dimensions only.
This translates to the ground, road, and building classes receiving vegetation pixel values where points lie directly underneath trees. The effects of solar illumination and LiDAR scan angle were not addressed in this study.
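The x/y-only overlay described above amounts to nearest-neighbour sampling of a north-up raster at each point's horizontal position, which is exactly why points beneath tree crowns inherit canopy pixel values: the lookup ignores z. A minimal sketch (the raster geometry and helper name are assumptions for illustration, not the study's code):

```python
import numpy as np

def sample_raster_at_points(img, x, y, ul_x, ul_y, gsd):
    """Nearest-neighbour sample of a north-up raster at point (x, y).

    ul_x, ul_y are the map coordinates of the raster's upper-left corner;
    gsd is the pixel size in map units.
    """
    col = np.floor((np.asarray(x) - ul_x) / gsd).astype(int)
    row = np.floor((ul_y - np.asarray(y)) / gsd).astype(int)
    return img[row, col]

img = np.arange(16).reshape(4, 4)   # toy 4x4 raster, 1 m GSD
x = np.array([0.5, 3.5])            # two point locations (map units)
y = np.array([3.5, 0.5])            # upper-left corner at (0, 4)
print(sample_raster_at_points(img, x, y, 0.0, 4.0, 1.0))  # [ 0 15]
```

A ground or building point under a tree crown would land in the crown's pixel and receive its value, since nothing in the lookup consults the point's height.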
5. CONCLUSIONS

We presented the results of computing the Pearson product-moment correlation coefficient between corresponding LiDAR intensity data and passive optical imagery. Two parameters appear to mainly control the correlation between the LiDAR intensity images and optical data: 1) properly matching the LiDAR and optical wavelengths, and 2) separating the datasets into vegetation and non-vegetation categories. When LiDAR data acquired with a laser wavelength of 1541 nm were compared with mismatched optical data whose wavelength range does not extend to the laser wavelength, correlations were minimal (between 0 and ~0.5). When the LiDAR and optical data wavelengths were well matched, the correlations were higher (approximately 0.7 overall).

Separating vegetated from non-vegetated areas in the AVIRIS/WSI LiDAR test increased the correlation from approximately 0.5 for the full AVIRIS band to well above 0.9 for the segmented vegetation and non-vegetation categories. There was a similar but less pronounced increase in correlation in the WV-2/AMBAG LiDAR test, where an overall correlation of approximately 0.7 rose above 0.9. It is interesting to note that although the vegetation and non-vegetation pixels are each highly correlated within their own category, when considered together the correlation drops significantly.

Future work will include the summation of intensity over each pulse having multiple returns, separation of LiDAR points by scan angle to investigate BRDF effects, and the use of full-waveform LiDAR collected with a green laser.

6. ACKNOWLEDGEMENTS

This paper describes selected research results from the project "Remote Sensing for Improved Earthquake Response" supported by the Science and Technology (S&T) Directorate, Department of Homeland Security (DHS).

REFERENCES

[1] Causey, R., Kehoe, J. and Slatton, K. C.,
"Airborne laser intensity measurements for vegetation studies: A comparison to passive imagery techniques," Adaptive Signal Processing Laboratory (ASPL), University of Florida, ASPL Report No. Rep_ , (2005).
[2] Bandyopadhyay, M., van Aardt, J. A. N. and Cawse-Nicholson, K., "Classification and extraction of trees and buildings from urban scenes using discrete return LiDAR and aerial color imagery," Proceedings of SPIE Vol. 8731, (2013).
[3] Green, R., "AVIRIS and Related 21st Century Imaging Spectrometers for Earth and Space Science," in High Performance Computing in Remote Sensing, Chapman and Hall/CRC Press, (2007).
[4] IMSPEC LLC, "ACORN 6 User's Manual," IMSPEC LLC, 121 p., ().
[5] Tucker, C. J., "Red and photographic infrared linear combinations for monitoring vegetation," Remote Sensing of Environment, 8, 127-150, (1979).
[6] Satellite Imaging Corporation, "Spectral Response for DigitalGlobe WorldView 1 and WorldView 2 Earth Imaging Instruments," (accessed 28 April 2014).
[7] Pearson, K., "Notes on regression and inheritance in the case of two parents," Proceedings of the Royal Society of London, 58, 240-242, (1895).
More informationTesting Hyperspectral Remote Sensing Monitoring Techniques for Geological CO 2 Storage at Natural Seeps
Testing Hyperspectral Remote Sensing Monitoring Techniques for Geological CO 2 Storage at Natural Seeps Luke Bateson Clare Fleming Jonathan Pearce British Geological Survey In what ways can EO help with
More informationLocating the Shadow Regions in LiDAR Data: Results on the SHARE 2012 Dataset
Locating the Shadow Regions in LiDAR Data: Results on the SHARE 22 Dataset Mustafa BOYACI, Seniha Esen YUKSEL* Hacettepe University, Department of Electrical and Electronics Engineering Beytepe, Ankara,
More informationEVOLUTION OF POINT CLOUD
Figure 1: Left and right images of a stereo pair and the disparity map (right) showing the differences of each pixel in the right and left image. (source: https://stackoverflow.com/questions/17607312/difference-between-disparity-map-and-disparity-image-in-stereo-matching)
More informationThe Gain setting for Landsat 7 (High or Low Gain) depends on: Sensor Calibration - Application. the surface cover types of the earth and the sun angle
Sensor Calibration - Application Station Identifier ASN Scene Center atitude 34.840 (34 3'0.64"N) Day Night DAY Scene Center ongitude 33.03270 (33 0'7.72"E) WRS Path WRS Row 76 036 Corner Upper eft atitude
More informationIntegrated analysis of Light Detection and Ranging (LiDAR) and Hyperspectral Imagery (HSI) data
Integrated analysis of Light Detection and Ranging (LiDAR) and Hyperspectral Imagery (HSI) data Angela M. Kim, Fred A. Kruse, and Richard C. Olsen Naval Postgraduate School, Remote Sensing Center and Physics
More informationLidar Sensors, Today & Tomorrow. Christian Sevcik RIEGL Laser Measurement Systems
Lidar Sensors, Today & Tomorrow Christian Sevcik RIEGL Laser Measurement Systems o o o o Online Waveform technology Stand alone operation no field computer required Remote control through wireless network
More informationLeica - Airborne Digital Sensors (ADS80, ALS60) Update / News in the context of Remote Sensing applications
Luzern, Switzerland, acquired with GSD=5 cm, 2008. Leica - Airborne Digital Sensors (ADS80, ALS60) Update / News in the context of Remote Sensing applications Arthur Rohrbach, Sensor Sales Dir Europe,
More informationBy Colin Childs, ESRI Education Services. Catalog
s resolve many traditional raster management issues By Colin Childs, ESRI Education Services Source images ArcGIS 10 introduces Catalog Mosaicked images Sources, mosaic methods, and functions are used
More informationGeomatic & Information Technologies for Ports and Navigable Waterways. Expanding Our Global Opportunities
Geomatic & Information Technologies for Ports and Navigable Waterways Airborne Remote Sensing Susan Jackson Tetra Tech Geomatics BD Director Hydrographic Surveying Robert Feldpausch Tetra Tech Principal
More informationFiles Used in this Tutorial
Generate Point Clouds and DSM Tutorial This tutorial shows how to generate point clouds and a digital surface model (DSM) from IKONOS satellite stereo imagery. You will view the resulting point clouds
More informationABSTRACT. Keywords: Thermal imaging, MWIR, photogrammetry, computer vision, point cloud
Evaluation of terrestrial photogrammetric point clouds derived from thermal imagery Jeremy P. Metcalf, Richard C. Olsen, Naval Postgraduate School, 833 Dyer Rd, Monterey, CA, USA 93943 ABSTRACT Computer
More informationCopyright 2005 Society of Photo-Optical Instrumentation Engineers.
Copyright 2005 Society of Photo-Optical Instrumentation Engineers. This paper was published in the Proceedings, SPIE Symposium on Defense & Security, 28 March 1 April, 2005, Orlando, FL, Conference 5806
More informationFiles Used in this Tutorial
RPC Orthorectification Tutorial In this tutorial, you will use ground control points (GCPs), an orthorectified reference image, and a digital elevation model (DEM) to orthorectify an OrbView-3 scene that
More informationFiles Used in this Tutorial
RPC Orthorectification Tutorial In this tutorial, you will use ground control points (GCPs), an orthorectified reference image, and a digital elevation model (DEM) to orthorectify an OrbView-3 scene that
More informationA Method to Create a Single Photon LiDAR based Hydro-flattened DEM
A Method to Create a Single Photon LiDAR based Hydro-flattened DEM Sagar Deshpande 1 and Alper Yilmaz 2 1 Surveying Engineering, Ferris State University 2 Department of Civil, Environmental, and Geodetic
More informationENVI Automated Image Registration Solutions
ENVI Automated Image Registration Solutions Xiaoying Jin Harris Corporation Table of Contents Introduction... 3 Overview... 4 Image Registration Engine... 6 Image Registration Workflow... 8 Technical Guide...
More informationAnalysis Ready Data For Land
Analysis Ready Data For Land Product Family Specification Optical Surface Reflectance (CARD4L-OSR) Document status For Adoption as: Product Family Specification, Surface Reflectance, Working Draft (2017)
More information[Youn *, 5(11): November 2018] ISSN DOI /zenodo Impact Factor
GLOBAL JOURNAL OF ENGINEERING SCIENCE AND RESEARCHES AUTOMATIC EXTRACTING DEM FROM DSM WITH CONSECUTIVE MORPHOLOGICAL FILTERING Junhee Youn *1 & Tae-Hoon Kim 2 *1,2 Korea Institute of Civil Engineering
More informationCourse Outline (1) #6 Data Acquisition for Built Environment. Fumio YAMAZAKI
AT09.98 Applied GIS and Remote Sensing for Disaster Mitigation #6 Data Acquisition for Built Environment 9 October, 2002 Fumio YAMAZAKI yamazaki@ait.ac.th http://www.star.ait.ac.th/~yamazaki/ Course Outline
More informationAirborne Laser Scanning: Remote Sensing with LiDAR
Airborne Laser Scanning: Remote Sensing with LiDAR ALS / LIDAR OUTLINE Laser remote sensing background Basic components of an ALS/LIDAR system Two distinct families of ALS systems Waveform Discrete Return
More informationA SENSOR FUSION APPROACH TO COASTAL MAPPING
A SENSOR FUSION APPROACH TO COASTAL MAPPING Maryellen Sault, NOAA, National Ocean Service, National Geodetic Survey Christopher Parrish, NOAA, National Ocean Service, National Geodetic Survey Stephen White,
More informationLecture 11. LiDAR, RADAR
NRMT 2270, Photogrammetry/Remote Sensing Lecture 11 Calculating the Number of Photos and Flight Lines in a Photo Project LiDAR, RADAR Tomislav Sapic GIS Technologist Faculty of Natural Resources Management
More informationHigh resolution survey and orthophoto project of the Dosso-Gaya region in the Republic of Niger. by Tim Leary, Woolpert Inc.
High resolution survey and orthophoto project of the Dosso-Gaya region in the Republic of Niger by Tim Leary, Woolpert Inc. Geospatial Solutions Photogrammetry & Remote Sensing LiDAR Professional Surveying
More information2. POINT CLOUD DATA PROCESSING
Point Cloud Generation from suas-mounted iphone Imagery: Performance Analysis A. D. Ladai, J. Miller Towill, Inc., 2300 Clayton Road, Suite 1200, Concord, CA 94520-2176, USA - (andras.ladai, jeffrey.miller)@towill.com
More informationAIRBORNE GEIGER MODE LIDAR - LATEST ADVANCEMENTS IN REMOTE SENSING APPLICATIONS RANDY RHOADS
Place image here (10 x 3.5 ) AIRBORNE GEIGER MODE LIDAR - LATEST ADVANCEMENTS IN REMOTE SENSING APPLICATIONS RANDY RHOADS Geospatial Industry Manager HARRIS.COM #HARRISCORP Harris Company Information SECURITY
More informationLight Detection and Ranging (LiDAR)
Light Detection and Ranging (LiDAR) http://code.google.com/creative/radiohead/ Types of aerial sensors passive active 1 Active sensors for mapping terrain Radar transmits microwaves in pulses determines
More informationAn Approach for Combining Airborne LiDAR and High-Resolution Aerial Color Imagery using Gaussian Processes
Rochester Institute of Technology RIT Scholar Works Presentations and other scholarship 8-31-2015 An Approach for Combining Airborne LiDAR and High-Resolution Aerial Color Imagery using Gaussian Processes
More informationEVALUATION OF CONVENTIONAL DIGITAL CAMERA SCENES FOR THEMATIC INFORMATION EXTRACTION ABSTRACT
EVALUATION OF CONVENTIONAL DIGITAL CAMERA SCENES FOR THEMATIC INFORMATION EXTRACTION H. S. Lim, M. Z. MatJafri and K. Abdullah School of Physics Universiti Sains Malaysia, 11800 Penang ABSTRACT A study
More informationABSTRACT 1. INTRODUCTION 2. DATA
Spectral LiDAR Analysis for Terrain Classification Charles A. McIver, Jeremy P. Metcalf, Richard C. Olsen* Naval Postgraduate School, 833 Dyer Road, Monterey, CA, USA 93943 ABSTRACT Data from the Optech
More informationAdvanced Processing Techniques and Classification of Full-waveform Airborne Laser...
f j y = f( x) = f ( x) n j= 1 j Advanced Processing Techniques and Classification of Full-waveform Airborne Laser... 89 A summary of the proposed methods is presented below: Stilla et al. propose a method
More informationIntegration of airborne LiDAR and hyperspectral remote sensing data to support the Vegetation Resources Inventory and sustainable forest management
Integration of airborne LiDAR and hyperspectral remote sensing data to support the Vegetation Resources Inventory and sustainable forest management Executive Summary This project has addressed a number
More informationLiDAR Derived Contours
LiDAR Derived Contours Final Delivery June 10, 2009 Prepared for: Prepared by: Metro 600 NE Grand Avenue Portland, OR 97232 Watershed Sciences, Inc. 529 SW Third Avenue, Suite 300 Portland, OR 97204 Metro
More informationA DATA DRIVEN METHOD FOR FLAT ROOF BUILDING RECONSTRUCTION FROM LiDAR POINT CLOUDS
A DATA DRIVEN METHOD FOR FLAT ROOF BUILDING RECONSTRUCTION FROM LiDAR POINT CLOUDS A. Mahphood, H. Arefi *, School of Surveying and Geospatial Engineering, College of Engineering, University of Tehran,
More informationInSAR Operational and Processing Steps for DEM Generation
InSAR Operational and Processing Steps for DEM Generation By F. I. Okeke Department of Geoinformatics and Surveying, University of Nigeria, Enugu Campus Tel: 2-80-5627286 Email:francisokeke@yahoo.com Promoting
More informationThe 2014 IEEE GRSS Data Fusion Contest
The 2014 IEEE GRSS Data Fusion Contest Description of the datasets Presented to Image Analysis and Data Fusion Technical Committee IEEE Geoscience and Remote Sensing Society (GRSS) February 17 th, 2014
More informationQuinnipiac Post Flight Aerial Acquisition Report
Quinnipiac Post Flight Aerial Acquisition Report August 2011 Post-Flight Aerial Acquisition and Calibration Report FEMA REGION 1 Quinnipiac Watershed, Connecticut, Massachusesetts FEDERAL EMERGENCY MANAGEMENT
More informationHydrocarbon Index an algorithm for hyperspectral detection of hydrocarbons
INT. J. REMOTE SENSING, 20 JUNE, 2004, VOL. 25, NO. 12, 2467 2473 Hydrocarbon Index an algorithm for hyperspectral detection of hydrocarbons F. KÜHN*, K. OPPERMANN and B. HÖRIG Federal Institute for Geosciences
More informationMAPPS 2013 Winter Conference 2013 Cornerstone Mapping, Inc. 1
MAPPS 2013 Winter Conference 2013 Cornerstone Mapping, Inc. 1 What is Thermal Imaging? Infrared radiation is perceived as heat Heat is a qualitative measure of temperature Heat is the transfer of energy
More informationAirborne Laser Survey Systems: Technology and Applications
Abstract Airborne Laser Survey Systems: Technology and Applications Guangping HE Lambda Tech International, Inc. 2323B Blue Mound RD., Waukesha, WI-53186, USA Email: he@lambdatech.com As mapping products
More informationImagery and Raster Data in ArcGIS. Abhilash and Abhijit
Imagery and Raster Data in ArcGIS Abhilash and Abhijit Agenda Imagery in ArcGIS Mosaic datasets Raster processing ArcGIS is a Comprehensive Imagery System Integrating All Types, Sources, and Sensor Models
More informationSimulation of small-footprint full-waveform LiDAR propagation through a tree canopy in 3D
Simulation of small-footprint full-waveform LiDAR propagation through a tree canopy in 3D Angela M. Kim a, Richard C. Olsen a, Martin Béland b a Naval Postgraduate School, Remote Sensing Center and Physics
More informationAnalysis Ready Data For Land (CARD4L-ST)
Analysis Ready Data For Land Product Family Specification Surface Temperature (CARD4L-ST) Document status For Adoption as: Product Family Specification, Surface Temperature This Specification should next
More informationComparison GRASS-LiDAR modules TerraScan with respect to vegetation filtering
Comparison GRASS-LiDAR modules TerraScan with respect to vegetation filtering Sara Lucca sara.lucca@mail.polimi.it Maria Antonia Brovelli - maria.brovelli@polimi.it LiDAR system Detection system by a laser
More informationLiDAR Engineering and Design Applications. Sample Data
LiDAR Engineering and Design Applications Sample Data High density LiDAR will return points on any visible part of a structure. Modeling of Existing Structures 2 The distance between any two positions
More informationREGISTRATION OF AIRBORNE LASER DATA TO SURFACES GENERATED BY PHOTOGRAMMETRIC MEANS. Y. Postolov, A. Krupnik, K. McIntosh
REGISTRATION OF AIRBORNE LASER DATA TO SURFACES GENERATED BY PHOTOGRAMMETRIC MEANS Y. Postolov, A. Krupnik, K. McIntosh Department of Civil Engineering, Technion Israel Institute of Technology, Haifa,
More informationMapping Project Report Table of Contents
LiDAR Estimation of Forest Leaf Structure, Terrain, and Hydrophysiology Airborne Mapping Project Report Principal Investigator: Katherine Windfeldt University of Minnesota-Twin cities 115 Green Hall 1530
More informationDEEP LEARNING TO DIVERSIFY BELIEF NETWORKS FOR REMOTE SENSING IMAGE CLASSIFICATION
DEEP LEARNING TO DIVERSIFY BELIEF NETWORKS FOR REMOTE SENSING IMAGE CLASSIFICATION S.Dhanalakshmi #1 #PG Scholar, Department of Computer Science, Dr.Sivanthi Aditanar college of Engineering, Tiruchendur
More informationPOSITIONING A PIXEL IN A COORDINATE SYSTEM
GEOREFERENCING AND GEOCODING EARTH OBSERVATION IMAGES GABRIEL PARODI STUDY MATERIAL: PRINCIPLES OF REMOTE SENSING AN INTRODUCTORY TEXTBOOK CHAPTER 6 POSITIONING A PIXEL IN A COORDINATE SYSTEM The essential
More informationMunicipal Projects in Cambridge Using a LiDAR Dataset. NEURISA Day 2012 Sturbridge, MA
Municipal Projects in Cambridge Using a LiDAR Dataset NEURISA Day 2012 Sturbridge, MA October 15, 2012 Jeff Amero, GIS Manager, City of Cambridge Presentation Overview Background on the LiDAR dataset Solar
More informationSpectral Classification
Spectral Classification Spectral Classification Supervised versus Unsupervised Classification n Unsupervised Classes are determined by the computer. Also referred to as clustering n Supervised Classes
More informationEVALUATION OF WORLDVIEW-1 STEREO SCENES AND RELATED 3D PRODUCTS
EVALUATION OF WORLDVIEW-1 STEREO SCENES AND RELATED 3D PRODUCTS Daniela POLI, Kirsten WOLFF, Armin GRUEN Swiss Federal Institute of Technology Institute of Geodesy and Photogrammetry Wolfgang-Pauli-Strasse
More informationLearning Objectives LIGHT DETECTION AND RANGING. Sensing. Blacksburg, VA July 24 th 30 th, 2010 LiDAR: Mapping the world in 3-D Page 1
LiDAR: Mapping the world in 3-D Val Thomas Department of Forest Resources & Environmental Conservation July 29, 2010 Learning Objectives Part 1: Lidar theory What is lidar? How does lidar work? What are
More informationLiDAR data pre-processing for Ghanaian forests biomass estimation. Arbonaut, REDD+ Unit, Joensuu, Finland
LiDAR data pre-processing for Ghanaian forests biomass estimation Arbonaut, REDD+ Unit, Joensuu, Finland Airborne Laser Scanning principle Objectives of the research Prepare the laser scanning data for
More informationRange Imaging Through Triangulation. Range Imaging Through Triangulation. Range Imaging Through Triangulation. Range Imaging Through Triangulation
Obviously, this is a very slow process and not suitable for dynamic scenes. To speed things up, we can use a laser that projects a vertical line of light onto the scene. This laser rotates around its vertical
More informationSTEREO EVALUATION OF ALOS/PRISM DATA ON ESA-AO TEST SITES FIRST DLR RESULTS
STEREO EVALUATION OF ALOS/PRISM DATA ON ESA-AO TEST SITES FIRST DLR RESULTS Authors: Mathias Schneider, Manfred Lehner, Rupert Müller, Peter Reinartz Remote Sensing Technology Institute German Aerospace
More informationGeospatial Computer Vision Based on Multi-Modal Data How Valuable Is Shape Information for the Extraction of Semantic Information?
remote sensing Article Geospatial Computer Vision Based on Multi-Modal Data How Valuable Is Shape Information for the Extraction of Semantic Information? Martin Weinmann 1, * and Michael Weinmann 2 1 Institute
More information