Camera Calibration for a Robust Omni-directional Photogrammetry System


Fuad Khan (1), Michael Chapman (2), Jonathan Li (3)
(1) Immersive Media Corporation, Calgary, Alberta, Canada
(2) Ryerson University, Toronto, Ontario, Canada
(3) University of Waterloo, Waterloo, Ontario, Canada

Abstract
Immersive Media's Telemmersion system is a multi-view imaging system that captures high-resolution, full-motion, 360° omni-directional images from synchronized cameras arranged in a dodecahedron. This paper describes the transformation of the resultant imagery from aesthetically pleasing blended mosaics into mathematically rigorous projections through camera calibration. Specifically, we describe our approach to camera calibration as it relates to panoramic image construction, stereo-photogrammetry, and augmentation of the immersive environment with 3D content.

Background
Immersive Media has been producing advanced omni-directional imaging systems based on a patented dodecahedral design since 1994. With twelve symmetrical pentagonal facets, the dodecahedron is the most natural geometric division of a sphere for immersive image capture. It offers symmetrical, standardized divisions of the sphere that make the most of the image produced by each lens, yield even resolution in every direction, and allow better blending between images. The camera system comprises a Dodeca camera and a dedicated base unit for camera control, image storage, and integration with GPS and GPS-aided inertial navigation systems. By combining a rigorous approach to camera calibration with high-accuracy direct camera pose determination, we have extended the immersive experience to include modeling of, and interaction with, 3D environments and data.
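As an aside (not from the original paper), the evenness of the dodecahedral division can be quantified in a few lines of plain Python: each facet subtends one-twelfth of the sphere, and the half-angle of an equal-solid-angle cone gives a lower bound on the per-lens field of view (the actual lenses must cover the full pentagon plus blending overlap):

```python
import math

# Solid angle of the full sphere, split evenly across 12 pentagonal facets.
full_sphere = 4.0 * math.pi          # steradians
per_facet = full_sphere / 12.0       # ~1.047 sr per facet

# Half-angle of a circular cone subtending the same solid angle:
# omega = 2*pi*(1 - cos(theta))  =>  theta = acos(1 - omega / (2*pi))
theta = math.acos(1.0 - per_facet / (2.0 * math.pi))

print(f"solid angle per facet: {per_facet:.4f} sr")
print(f"equal-solid-angle cone half-angle: {math.degrees(theta):.2f} deg")
```

This works out to about 1.047 sr per facet and a cone half-angle of roughly 33.6°, which is why each component lens needs a substantially wider field of view than a conventional photogrammetric camera.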

Interior Orientation
Each of the 11 component cameras comprising the Dodeca camera (Figure 1) is calibrated to determine its orientation parameters and relative pose. A calibration target array of 50 coded, surveyed targets is used to generate calibration images, with the targets arranged so that several are simultaneously visible to adjacent sensors. Images are collected simultaneously on all sensors as the camera is rotated and translated through a calibration rig. Intrinsic and extrinsic parameters are determined in a two-stage, least-squares, self-calibrating bundle adjustment using a common set of images. In the first stage, camera characteristics (principal point offsets and distortion parameters) are determined for each sensor independently. In the second stage, sets of adjacent images captured simultaneously are processed in a bundle adjustment to determine relative position and orientation; the relative poses are weight-constrained using the knowledge that the images were acquired simultaneously by a geometrically rigid body.
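The shape of the first-stage problem can be sketched in plain Python. The polynomial radial model and parameter names below are illustrative stand-ins only (the actual Dodeca calibration uses a harmonic model, whose form the paper does not give):

```python
def project(point_cam, f, cx, cy, k1, k2):
    """Project a point in camera coordinates to pixel coordinates with a
    simple polynomial radial distortion model (illustrative only)."""
    X, Y, Z = point_cam
    x, y = X / Z, Y / Z                  # ideal (undistorted) image plane
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2     # radial distortion factor
    return (f * d * x + cx, f * d * y + cy)

def reprojection_residuals(points_cam, observed_px, params):
    """Stage-one residuals: observed minus projected, stacked. A
    self-calibrating bundle adjustment minimizes the sum of squares of
    these over the intrinsic parameters and camera poses jointly."""
    res = []
    for p, (u, v) in zip(points_cam, observed_px):
        pu, pv = project(p, *params)
        res.extend([u - pu, v - pv])
    return res
```

In the second stage the intrinsics would be held fixed and the unknowns become the six relative-pose parameters per sensor, with weighted (soft) constraints expressing that all sensors moved together as one rigid body.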

Camera Model
The multi-view Dodeca camera produces sets of synchronized images from its component cameras, which are combined into an omni-directional panoramic image. Camera calibration is conceptually divided into the interior and exterior orientation of the component sensors and of a virtual camera abstracted from the view-blending process that creates the panoramic image. The field of view of the individual Dodeca lenses is significantly larger than that of cameras used in conventional photogrammetry and machine vision applications. The wide, constant-angle curvature of these lenses presented a calibration challenge that required a non-standard camera model. Both generalized harmonic models and finite element models were found to be appropriate for camera characterization; a harmonic model was ultimately chosen on the basis of re-projection accuracy and stability. A high-precision navigation system is employed to determine exterior orientation; however, an exterior calibration is still required to align the camera reference frame to the vehicle reference frame and the individual camera elements to each other.

Panoramic Image Construction: Stitching
Panoramic images are constructed from individual sensor images through stitching. In essence, pixels from the individual cameras are projected onto a 3D surface to construct a single panoramic image (see Figure 2). The final image is then re-projected into a flattened equirectangular image so that it can be stored in conventional media formats (e.g. JPEG, AVI, and Windows Media). A panoramic image viewer projects the flattened image onto a 3D mesh from which the viewer can interactively choose a particular field of view (see Figure 3). While a perfect panoramic imaging system would approximate a point source, in practice the inter-sensor spacing is non-zero. As a result, a parallax correction must be made during stitching to properly blend the images from adjacent sensors.
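The flattening step can be sketched as a mapping from viewing rays to equirectangular pixels; the axis conventions and image layout below are assumptions for illustration, not taken from the paper:

```python
import math

def ray_to_equirect(ray, width, height):
    """Map a 3D viewing ray (from the panorama centre) to pixel
    coordinates in a flattened equirectangular image. Convention assumed
    here: +Z forward, +X right, +Y up; longitude spans [-pi, pi) across
    the width, latitude [-pi/2, pi/2] down the height."""
    x, y, z = ray
    lon = math.atan2(x, z)                                 # azimuth
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))        # elevation
    u = (lon / (2.0 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v
```

In the stitching pipeline described above, each sensor pixel is projected onto the 3D convergence surface, the surface point defines a ray from the panorama centre, and that ray indexes into the equirectangular output through a mapping of this kind.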

For aerial recording applications, a spherical convergence surface with a radius of approximately 6 m is used. For terrestrial recording, an ovoid surface is used instead, optimizing parallax at relatively close range for objects near the ground and at greater range at higher elevations (see Figure 4).
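A minimal sketch of the parallax correction implied by a convergence surface, using the spherical (aerial) case; the ovoid case would replace the fixed radius with an elevation-dependent one. Names and conventions here are illustrative:

```python
import math

def stitch_direction(sensor_pos, ray_dir, radius=6.0):
    """Parallax correction for stitching: intersect a sensor's viewing
    ray with the convergence sphere (centred on the panorama centre at
    the origin), then return the unit direction from the centre to that
    surface point. Objects lying on the sphere stitch exactly; objects
    off the sphere retain residual parallax."""
    px, py, pz = sensor_pos
    dx, dy, dz = ray_dir
    # Solve |p + t*d|^2 = radius^2 for the forward intersection t > 0.
    a = dx*dx + dy*dy + dz*dz
    b = 2.0 * (px*dx + py*dy + pz*dz)
    c = px*px + py*py + pz*pz - radius*radius
    t = (-b + math.sqrt(b*b - 4.0*a*c)) / (2.0 * a)
    sx, sy, sz = px + t*dx, py + t*dy, pz + t*dz
    n = math.sqrt(sx*sx + sy*sy + sz*sz)
    return (sx/n, sy/n, sz/n)
```

For example, a sensor offset 5 cm from the panorama centre looking straight ahead yields a blended direction tilted by the small angle that brings its ray onto the 6 m sphere, which is exactly the correction needed for adjacent images to agree at that range.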

Exterior Orientation
Data collection vehicles are equipped with high-performance GPS-aided inertial navigation systems (see Figure 5) that enable direct determination of camera pose. The navigation system generates a tightly coupled, Kalman-filtered position and orientation solution by combining a GPS sensor, a fiber-optic gyroscope, and a wheel rotation encoder. Exterior orientation parameters, consisting of a set of three lever arms and angular offsets, are determined through a bundle adjustment in which vehicle position and orientation are held fixed. An outdoor calibration field of 30 targets is used for the exterior calibration.

Photogrammetry: Feature Extraction and Augmentation
Single-frame and stereo measurements from manual registration of features have been implemented. Images from individual sensors, or stitched panoramic images, can be used to construct stereo pairs. The high-performance navigation system produces sub-meter absolute accuracy and sub-centimeter relative accuracy. These highly accurate baselines allow a single camera to be used for stereo photogrammetry by exploiting temporal base pairs, rather than multiple cameras; measurements across multiple sensors are also possible because the sensors are synchronized and calibrated.
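The temporal-base-pair idea reduces to two-ray triangulation between camera positions along the vehicle track. A minimal midpoint-method sketch (illustrative, not the paper's actual implementation):

```python
def dot(u, v):
    return u[0]*v[0] + u[1]*v[1] + u[2]*v[2]

def triangulate_midpoint(o1, d1, o2, d2):
    """Intersect (approximately) two viewing rays from camera positions
    o1 and o2, e.g. the same sensor at two times along the track.
    Returns the midpoint of the shortest segment between the rays, a
    common closed-form two-ray triangulation."""
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    w = tuple(p - q for p, q in zip(o1, o2))
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b                 # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = tuple(o + s * k for o, k in zip(o1, d1))
    p2 = tuple(o + t * k for o, k in zip(o2, d2))
    return tuple((u + v) / 2.0 for u, v in zip(p1, p2))
```

Because the INS baselines are accurate to the sub-centimeter level, the dominant error in such a temporal stereo intersection comes from image measurement and calibration residuals rather than from the baseline itself.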

The calibrated panoramic environment can be readily augmented with georeferenced content, since rendering 3D objects into a panoramic scene requires the same data as photogrammetric measurement but is considerably simpler. Figure 6 illustrates the insertion of traditional GIS data into calibrated GeoImmersive imagery captured at street level. Large-scale calibrated, georeferenced immersive environments provide unique opportunities in photogrammetry, interactivity, and the fusion of data from 2D and 3D sources. Immersive Media and its partners are actively leveraging and extending conventional photogrammetry tools to develop novel applications in a number of areas, including asset management, situational awareness, and mobile mapping.
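Rendering a georeferenced point into the calibrated panorama amounts to projecting it through the known camera pose. A simplified sketch (heading-only attitude and local east-north-up coordinates, both assumptions made for illustration; the full system would apply the complete INS attitude and lever-arm offsets):

```python
import math

def geo_to_panorama_px(obj_enu, cam_enu, heading, width, height):
    """Project a georeferenced point into a calibrated equirectangular
    panorama. obj_enu/cam_enu are local east-north-up coordinates and
    heading is the camera yaw in radians (pitch and roll omitted)."""
    e, n, u = (o - c for o, c in zip(obj_enu, cam_enu))
    rng = math.sqrt(e*e + n*n + u*u)
    az = (math.atan2(e, n) - heading) % (2.0 * math.pi)  # azimuth from forward
    el = math.asin(u / rng)                              # elevation
    px = az / (2.0 * math.pi) * width
    py = (0.5 - el / math.pi) * height
    return px, py
```

A GIS feature 10 m dead ahead of the camera, for example, lands at the left edge column (azimuth zero) on the horizon row, and features to the right wrap proportionally across the image width.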

Conclusions
The development of multi-use sensors will increase the versatility of mobile mapping platforms. An omni-directional camera that has undergone rigorous calibration offers the possibility of a compact imaging system with the capabilities of other, more space-consuming systems. Through the integration of a high-precision georeferencing system, the system is capable of producing high-accuracy 3D coordinates of features along with their attributes. The use of real-time digital signal processing hardware allows rectified, georeferenced imagery to be displayed to the user during data capture. The immersive characteristics of this system make it particularly attractive for asset management and surveillance applications as well as for traditional mobile mapping. The fusion of this data source with other sensor data, such as airborne data, suggests many further possibilities.

Bibliography
Bakstein, H. and Pajdla, T. (2002) Panoramic mosaicing with a 180-degree field of view lens. Proc. IEEE Workshop on Omnidirectional Vision, pp. 60-67.
Claus, D. and Fitzgibbon, A. W. (2005) A rational function lens distortion model for general cameras. Proc. CVPR, pp. 213-219.
Kannala, J. and Brandt, S. (2004) A generic camera calibration method for fisheye lenses. Proc. 17th International Conference on Pattern Recognition (ICPR 2004), pp. 10-13.
Kumler, J. J. and Bauer, M. (2000) Fisheye lens designs and their relative performance. Proc. Lens and Optical System Design and Engineering Conference, SPIE Annual Meeting.
Micusik, B. and Pajdla, T. (2003) Estimation of omnidirectional camera model from epipolar geometry. Proc. CVPR, pp. 485-490.
Schneider, D. and Schwalbe, E. (2005) Design and testing of mathematical models for a full-spherical camera on the basis of a rotating linear array sensor and a fisheye lens. In: Grün, A. and Kahmen, H. (eds), Optical 3D Measurement Techniques VII, Vol. 1, pp. 245-254.
Schwalbe, E. (2005) Geometric modelling and calibration of fisheye lens camera systems. Proc. 2nd Panoramic Photogrammetry Workshop, Int. Archives of Photogrammetry and Remote Sensing, Vol. 36, Part 5/W8.
Shah, S. and Aggarwal, J. (1996) Intrinsic parameter calibration procedure for a (high-distortion) fish-eye lens camera with distortion model and accuracy estimation. Pattern Recognition, 29(11), pp. 1775-1788.
Thirthala, S. and Pollefeys, M. (2005) Multi-view geometry of 1D radial cameras and its application to omnidirectional camera calibration. Proc. ICCV, pp. 1539-1546.
Zhang, Z. (2000) A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11), pp. 1330-1334.