A Data Fusion Platform for Supporting Bridge Deck Condition Monitoring by Merging Aerial and Ground Inspection Imagery


A Data Fusion Platform for Supporting Bridge Deck Condition Monitoring by Merging Aerial and Ground Inspection Imagery

Zhexiong Shang,1 Chongsheng Cheng,2 and Zhigang Shen, Ph.D.3

1 NH 113, Durham School of Architectural Engineering & Construction, University of Nebraska-Lincoln, Lincoln, NE, 68588; e-mail: szx0112@huskers.unl.edu
2 NH 113, Durham School of Architectural Engineering & Construction, University of Nebraska-Lincoln, Lincoln, NE, 68588; e-mail: cheng.chongsheng@huskers.unl.edu
3 NH 113, Durham School of Architectural Engineering & Construction, University of Nebraska-Lincoln, Lincoln, NE, 68588; e-mail: shen@unl.edu

ABSTRACT

UAVs have shown great efficiency in scanning bridge deck surfaces, either in a single shot or by stitching a set of overlapping still images. If potential surface defects are identified in the aerial images, subsequent ground inspections can be scheduled. This two-phase inspection procedure has shown great potential for increasing field inspection productivity. Because aerial and ground inspection images are taken at different scales, a tool that properly fuses these multi-scale images is needed to improve current bridge deck condition monitoring practice. In response to this need, a data fusion platform is introduced in this study. Using the proposed platform, multi-scale images taken by different inspection devices can be fused through geo-referencing. As part of the platform, a web-based user interface is developed to organize and visualize those images, together with inspection notes, in response to user queries. For illustration, a case study involving multi-scale optical and infrared images from a UAV and a ground inspector, and its implementation on the proposed platform, is presented.

INTRODUCTION

The bridge deck serves as the key load-carrying structure throughout the service life of a bridge. It is constantly exposed to traffic, scour, and chemicals that cause surface cracks and invisible corrosion of the reinforcing steel in the deck.
Traditional non-destructive evaluation (NDE) methods, such as visual surveying, chain dragging and hammer sounding, ground penetrating radar (GPR), impact echo (IE), and ultrasound, are labor intensive and time consuming. Following significant advances in computer vision and image processing, direct hyper-spectrum scanning (e.g., with optical and IR cameras) has demonstrated great potential for detecting both surface and subsurface defects (Oh et al. 2012). Compared to other NDE methods, the major advantages of hyper-spectrum scanning are its high-speed data collection and its ease of locating and mapping the detected defects (Graybeal et al. 2002). Following the boom of UAVs, there has been significant growth in aerial inspection of bridge decks (Ellenberg et al. 2016; Metni and Hamel 2007). UAVs have multiple advantages over traditional ground-level inspection units. Low-cost UAVs can rapidly scan bridge decks without traffic closure and can access regions that are dangerous and hard to be reached by

a human. Equipped with an onboard camera, a UAV can provide orthogonal views of the bridge deck surface that would otherwise be inaccessible. In practice, UAVs need to fly at a certain safe height above the deck surface to avoid traffic and site obstacles during routine inspections (Zink and Lovelace 2015). Recently, engineers in the NDE community have started to deploy a UAV-based two-phase inspection approach (Khan et al. 2015): in Phase 1, a rapid UAV scan covers the entire deck surface; higher-resolution inspection methods are then carried out on a subset of the deck based on the Phase 1 results and agency needs. This two-phase bridge inspection strategy has shown great potential for increasing field inspection productivity while, at the same time, reducing unnecessary cost. Because aerial and ground images are taken at different scales, fusing these images through purely image-based techniques is challenging (Fruh and Zakhor 2003). An alternative is to use a data management tool to support this multi-scale image data fusion. In this study, a multi-scale image data fusion platform is developed to support this newly emerged two-phase bridge deck inspection practice. The platform integrates the image acquisition process with the post-processing steps for data preparation. A web-based user interface is then used to query and visualize the captured images based on user interaction. A real project is used to demonstrate the practicability of this platform.

METHODOLOGY

Fig. 1 shows an overview of the proposed platform. It includes in-field data collection with both aerial and ground inspection units, image processing for bridge deck surface mapping, and a web-based user interface (UI). The web-based UI serves as a data management tool that organizes the aerial and ground inspection results to support both on-site bridge deck inspection and off-site condition assessment practice.
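The platform's two inspection phases imply a simple way of organizing image records; the sketch below illustrates one possible arrangement (all class and field names are hypothetical, not taken from the paper's implementation):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InspectionImage:
    """One geo-tagged inspection image with its on-site note."""
    path: str
    lat: float
    lon: float
    phase: int    # 1 = aerial (UAV) scan, 2 = ground follow-up
    sensor: str   # e.g. "optical" or "IR"
    note: str = ""

@dataclass
class BridgeDeck:
    """A bridge deck with all images collected across both phases."""
    name: str
    images: List[InspectionImage] = field(default_factory=list)

    def phase_images(self, phase: int) -> List[InspectionImage]:
        """Return only the images belonging to the given inspection phase."""
        return [im for im in self.images if im.phase == phase]

deck = BridgeDeck("demo deck")
deck.images.append(InspectionImage("uav_001.jpg", 40.8136, -96.7026, 1, "IR"))
deck.images.append(InspectionImage("gnd_001.jpg", 40.8136, -96.7026, 2, "optical",
                                   note="delamination confirmed by hammer sounding"))
print(len(deck.phase_images(1)), len(deck.phase_images(2)))
```

Grouping records by phase in this way would let a UI query Phase 1 mosaics and Phase 2 defect images independently, mirroring the two data layers discussed later.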
Details of each component of the platform are discussed in the following sections.

Figure 1 Overview of the data collection, image processing and the user interface of the platform

Data Collection

The data collection process is determined by the inspection phase. For Phase 1, images are captured by UAV scans along the bridge deck surface at a fixed velocity and height. The height is determined by the required spatial resolution and the field of view (FOV) of the onboard camera. Compared to an optical camera, an IR camera has a narrower FOV, so a larger distance is required to cover the entire deck width in a single shot. During aerial inspection, the onboard camera is oriented toward the ground and images are taken at short time intervals so that sufficient overlap for mosaic map generation is guaranteed. The images captured in the air are geo-tagged and stored on a MicroSD card for post-processing. Figure 2 shows an example of using a UAV (a DJI Inspire 1) to scan a bridge deck surface. When the onboard camera has a small FOV, or the UAV cannot reach the required height because of field obstacles (e.g., trusses or beams), assigning a gimbal pitch angle to the camera can still achieve full deck coverage. However, additional post-processing is then required to eliminate the camera's perspective effect.

Figure 2 (a) Field view of aerial inspection process; (b) Camera view of image acquisition progress

If potential defects are identified in the map generated from the aerial images, subsequent in-depth inspection is carried out by ground inspection units. In general, the subsequent inspection is applied only at the potential defects detected in the aerial inspection. However, for a heavily deteriorated bridge deck, full-deck inspection by inspectors or ground vehicles is in high demand.

Image Processing

Images collected during inspection are either single images that document specific defects verified in the subsequent inspection, or sets of overlapping images for bridge deck surface mapping.
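The trade-off between flight height, camera FOV, and coverage described in the data collection step can be made concrete with a quick calculation; the sketch below assumes a nadir-pointing camera, and the deck width, FOV, and pixel count are hypothetical:

```python
import math

def required_height(deck_width_m: float, hfov_deg: float) -> float:
    """Minimum flight height (m) at which a nadir-pointing camera with the
    given horizontal FOV spans the full deck width in a single shot."""
    half = math.radians(hfov_deg) / 2.0
    return deck_width_m / (2.0 * math.tan(half))

def ground_sample_distance(height_m: float, hfov_deg: float, width_px: int) -> float:
    """Ground footprint of one pixel (m/px) at the given flight height."""
    half = math.radians(hfov_deg) / 2.0
    return 2.0 * height_m * math.tan(half) / width_px

# Hypothetical numbers: 12 m wide deck, 45-degree HFOV camera, 4000 px across
h = required_height(12.0, 45.0)
gsd = ground_sample_distance(h, 45.0, 4000)
print(round(h, 1), round(gsd * 1000, 1))  # height in m, GSD in mm/px
```

The same calculation shows why a narrower-FOV IR camera must fly higher: `required_height(12.0, 30.0)` gives roughly 22.4 m versus about 14.5 m for the 45-degree optical camera.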
In this study, two image processing techniques are utilized to construct the orthogonal surface map of the bridge deck: perspective projection and image stitching. Perspective projection removes perspective effects from the inspection images. For bridge deck surface mapping, images not taken from an orthogonal view can result in misinterpretation of the real dimensions and shapes of the defects in the constructed map. This issue commonly arises in both aerial and ground inspection when the camera's shooting angle is not perpendicular to the deck surface.
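This perspective correction can be sketched as a pixel-wise remapping; the function below transcribes a common inverse perspective mapping formulation (after Mallot et al. 1991), with entirely hypothetical camera parameters:

```python
import numpy as np

def ipm_map(x, y, h, l, d, theta, gamma, alpha, m, n):
    """Pixel-wise inverse perspective mapping (after Mallot et al. 1991).
    (l, d, h): camera position in world coordinates; theta, gamma: camera
    pitch and yaw; alpha: camera aperture; (m, n): image resolution."""
    bearing = np.arctan2(y - d, x - l)  # in-plane angle to the point
    u = (np.arctan2(h * np.sin(bearing), y - d) - (theta - alpha)) / (2 * alpha / (m - 1))
    v = (bearing - (gamma - alpha)) / (2 * alpha / (n - 1))
    return u, v

# Hypothetical setup: camera above the origin (l = d = 0) with yaw gamma = 0
u, v = ipm_map(10.0, 0.0, h=30.0, l=0.0, d=0.0,
               theta=np.deg2rad(45.0), gamma=0.0,
               alpha=np.deg2rad(30.0), m=512, n=640)
print(v)  # a point straight ahead maps to the center column, (n - 1) / 2
```

With gamma = 0, a point on the camera's forward axis lands exactly on the center column of the corrected image, which is a handy sanity check when wiring such a remap into a real pipeline.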

In this study, inverse perspective mapping (IPM) is used to remove this effect by re-projecting the distorted images into an orthogonal view. IPM is a geometrical transformation that relates an incoming 2D image to the 3D world coordinate system, remapping each pixel to a different position and constructing a new 2D image plane with a corrected view (Mallot et al. 1991). The result is a new image showing a bird's-eye view of the original scene from a significant height (as shown in Fig. 3). The pixel-wise formula of IPM is presented in Equation (1):

u(x, y) = [ atan( h·sin( atan((y − d)/(x − l)) ) / (y − d) ) − (θ − α) ] / [ 2α/(m − 1) ]
v(x, y) = [ atan((y − d)/(x − l)) − (γ − α) ] / [ 2α/(n − 1) ]     (1)

where (l, d, h) is the camera position in world coordinates, (θ, γ) are the pitch and yaw angles of the camera, α is the camera aperture, and (m, n) is the image resolution. (x, y) are the pixel coordinates in the original image, and (u, v) are the pixel coordinates in the corrected image. For bridge deck inspection, the camera can be positioned above the origin of the world coordinate system (l, d = 0) with the camera yaw aligned with the front axis (γ = 0), which simplifies the calculation. Fig. 4 (a) and (b) show an example of a thermal image taken with a tilted pitch angle before and after re-projection using IPM.

Figure 3 Conceptual diagram of IPM

The next step is to stitch the corrected images in order to construct the bridge deck surface map. A wide variety of algorithmic solutions is available for stitching overlapping images. SIFT/SURF feature detectors are widely used to identify salient features within each image and match those features across adjacent images (Szeliski 2007). An image transformation is then applied to align the matched images into a uniform coordinate system. This strategy works well for most image sets in which salient features can be detected and matched. However, especially for thermal

images, salient features may not always exist, and directly applying feature-based matching can introduce significant errors. In this platform, image stitching is therefore a semi-automatic process in which feature-based automatic image matching is applied as the initial step (Brown and Lowe 2007). For images without enough salient features, a manual process takes over, using the GPS locations attached to each image as the reference for image registration. Fig. 4 (c) shows the surface map constructed from the set of corrected IR images using this semi-automatic stitching strategy. The real dimensions and shapes of the defects can then be revealed in the constructed map.

Figure 4 (a) IR image with perspective effect; (b) Image corrected with IPM; (c) Stitching the corrected images to identify real dimension of defects.

User Interface

In this section, a web-based user interface is introduced to merge and manage the images taken in the different inspection phases. The user interface is implemented with JavaScript and the Google Maps API, which synchronize images at different scales into a uniform GPS coordinate system. To support user interaction and data management, two data layers are developed in this UI: a map layer and a defects layer. Each data layer is discussed in detail below.

The map layer serves as the base holder of the UI. It provides an outline map view that allows users to access bridge decks by mouse selection in Google Maps. The surface maps constructed from aerial inspection are linked to this data layer through GPS coordinates. Because of the geometrical transformation applied to the mosaicked image set, the GPS tag of the initial image is used to localize each surface map. Callback functions are also developed in this layer to visualize the condition of individual bridge decks as well as the stored surface maps through a pop-up window. Datasets stored in the map layer are mostly used by bridge owners, who are more interested in
Dataset stored in the map layer are mostly applied by bridge owners who are more interested in 5

the overall conditions of bridge decks within a region of interest (ROI) and the general inspection results for each deck.

The defects layer is the placeholder that stores the inspection notes and images indicating specific bridge defects detected in ground-level inspection. These defects are normally captured in the surface map and further validated in the subsequent inspections. Images in this layer can be either images that directly capture surface cracks (e.g., by optical camera) or subsurface delamination (e.g., by IR camera), or images of markers representing defects detected by other sensors (e.g., GPR, ultrasound, hammer sounding). For individual images, the geotag attached to each image is used to locate the defect's position on the map.

Figure 5 Web-based User Interface: (a) Map view to indicate general conditions of bridges within the ROI; (b) Clicking each flag shows the pop-up window of the conditions as well as the surface map of the selected bridge deck; (c) Zooming in the map shows the locations of bridge deck defects (i.e. delamination) with the captured image and description from the subsequent inspection; (d) Street view of bridge deck defects (i.e. surface crack) for assisting in identifying defect locations on-site.

Five bridges located near Lincoln, Nebraska were surveyed by the Nebraska Department of Transportation (NDOT). Inspection images, including aerial images captured by UAV and ground hammer-sounding results documented with a handheld camera, as well as the on-site notes, were uploaded to the platform after the field inspection period. In this section, the bridge deck with significant structural deficiency (shown as a red flag in Fig. 5) is chosen to demonstrate the functionality and practical usage of the UI. In Fig. 5 (a), the general conditions of bridge decks within the region of interest (ROI) are presented in a 2D map view, where the condition of each bridge is illustrated with a colored flag.
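Placing geotagged images and defect markers on the map amounts to converting GPS coordinates into local metric offsets; a minimal sketch of one common approach (the reference coordinates below are hypothetical, not actual survey data):

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def local_offset(lat0, lon0, lat, lon):
    """East/north offsets (m) of (lat, lon) from a reference (lat0, lon0),
    using an equirectangular approximation -- adequate over the extent of
    a single bridge deck."""
    east = EARTH_RADIUS_M * math.radians(lon - lon0) * math.cos(math.radians(lat0))
    north = EARTH_RADIUS_M * math.radians(lat - lat0)
    return east, north

# Two hypothetical geotags about 11 m apart along a north-south deck
east, north = local_offset(40.8136, -96.7026, 40.8137, -96.7026)
print(round(north, 1))  # ~11.1 m
```

The small-area approximation keeps the math simple; over a few hundred meters the error is negligible compared to consumer GPS noise, which, as noted below, dominates the placement accuracy.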
Clicking a flag on the map opens a pop-up window in which the general condition and the constructed surface map of the bridge deck are presented. Fig. 5 (b)

shows an example in which the surface maps captured by UAV are displayed. In this example, three surface maps are presented: a surface map constructed from the Phase 1 aerial inspection using a digital camera (top), a surface map constructed from the Phase 1 aerial inspection using an IR camera (middle), and a surface map constructed from the Phase 2 ground inspection using hammer sounding (bottom). By zooming the map view in to meter level, individual defects identified with the handheld camera and by hammer sounding during the Phase 2 inspection are automatically displayed. Clicking each marker in the zoomed view displays the image and description of the identified defect. Fig. 5 (c) shows an example in which the locations, comments, and defect-focused inspection images are presented. To support on-site defect identification, the platform also provides a function that allows users to localize defects in the street view (as shown in Fig. 5 (d)). This street view can help engineers on-site identify the locations of subsurface defects (e.g., delamination) without manually drawing reference coordinates on the deck surface. Note that the accuracy of the street-view placement is largely determined by the precision of the GPS signal and the currency of the Google Maps imagery. Advanced image processing and sensor fusion algorithms can increase this accuracy; however, discussing such technologies is beyond the scope of this study.

SUMMARY

In this study, a web-based data fusion platform is introduced to support merging multi-scale bridge deck inspection images through geo-referencing. The platform is compatible with the newly emerged UAV-enabled, two-phase bridge deck inspection practice. The data collection, image processing, and visualization processes of using the proposed platform in such practice are discussed.
With increasing flight capability and onboard camera resolution, the role of UAVs in structural inspection practice is growing. In future studies, the authors aim to expand the proposed platform to support data management for other UAV-enabled inspection practices, such as pavements or tunnels.

ACKNOWLEDGEMENT

The image acquisition process received support from the Nebraska Department of Transportation through facilitating data collection and sharing the ground evaluation results.

REFERENCES

Brown, M., and Lowe, D. G. (2007). "Automatic panoramic image stitching using invariant features." International Journal of Computer Vision, 74(1), 59-73.

Ellenberg, A., Kontsos, A., Moon, F., and Bartoli, I. (2016). "Bridge deck delamination identification from unmanned aerial vehicle infrared imagery." Automation in Construction.

Fruh, C., and Zakhor, A. (2003). "Constructing 3D city models by merging aerial and ground views." IEEE Computer Graphics and Applications, 23(6), 52-61.

Graybeal, B. A., Phares, B. M., Rolander, D. D., Moore, M., and Washer, G. (2002). "Visual inspection of highway bridges." Journal of Nondestructive Evaluation, 21(3), 67-83.

Khan, F., Ellenberg, A., Mazzotti, M., Kontsos, A., Moon, F., Pradhan, A., and Bartoli, I. (2015). "Investigation on Bridge Assessment Using Unmanned Aerial Systems." Proc., Structures Congress 2015, ASCE, 404-413.

Mallot, H. A., Bülthoff, H. H., Little, J., and Bohrer, S. (1991). "Inverse perspective mapping simplifies optical flow computation and obstacle detection." Biological Cybernetics, 64(3), 177-185.

Metni, N., and Hamel, T. (2007). "A UAV for bridge inspection: Visual servoing control law with orientation limits." Automation in Construction, 17(1), 3-10.

Oh, T., Kee, S.-H., Arndt, R. W., Popovics, J. S., and Zhu, J. (2012). "Comparison of NDT methods for assessment of a concrete bridge deck." Journal of Engineering Mechanics, 139(3), 305-314.

Szeliski, R. (2007). "Image alignment and stitching: A tutorial." Foundations and Trends in Computer Graphics and Vision, 2(1), 1-104.

Zink, J., and Lovelace, B. (2015). "Unmanned aerial vehicle bridge inspection demonstration project."