Camera model and multiple view geometry

Before discussing how 3D information can be obtained from images, it is important to know how images are formed. First the camera model is introduced, and then some important relationships between multiple views of a scene are presented.

3.1 The camera model

In this work the perspective camera model is used. This corresponds to an ideal pinhole camera. The geometric process for image formation in a pinhole camera has been nicely illustrated by Dürer (see Figure 3.1). The process is completely determined by choosing a perspective projection center and a retinal plane. The projection of a scene point is then obtained as the intersection of a line passing through this point and the center of projection with the retinal plane.

Most cameras are described relatively well by this model. In some cases additional effects (e.g. radial distortion) have to be taken into account (see Section 3.1.5).

3.1.1 A simple model

In the simplest case, where the projection center is placed at the origin of the world frame and the image plane is the plane Z = 1, the projection process can be modeled as follows:

    x = X/Z, \qquad y = Y/Z    (3.1)

for a world point M = [X, Y, Z]^\top and the corresponding image point m = [x, y]^\top. Using the homogeneous representation of the points, a linear projection equation is obtained:

    \lambda \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} =
    \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
    \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}    (3.2)

This projection is illustrated in Figure 3.2. The optical axis passes through the center of projection and is orthogonal to the retinal plane. Its intersection with the retinal plane is defined as the principal point.

3.1.2 Intrinsic calibration

With an actual camera the focal length f (i.e. the distance between the center of projection and the retinal plane) will be different from 1; the coordinates of equation (3.2) should therefore be scaled with f to take this into account.
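As a concrete illustration of equations (3.1)-(3.2) and of the scaling with f, the following minimal numpy sketch projects a world point with the canonical [I | 0] camera; all numeric values (the point M and the focal length f) are arbitrary choices for illustration, not taken from the text.

    import numpy as np

    # Equation (3.2): projection with the centre at the origin and image plane Z = 1.
    M = np.array([0.2, -0.1, 4.0])                 # example world point (arbitrary values)
    P0 = np.hstack([np.eye(3), np.zeros((3, 1))])  # the 3x4 matrix [I | 0] of eq. (3.2)

    M_h = np.append(M, 1.0)                        # homogeneous coordinates [X, Y, Z, 1]
    m_h = P0 @ M_h                                 # lambda * [x, y, 1]
    m = m_h[:2] / m_h[2]                           # divide out the scale factor lambda

    # The same result follows directly from eq. (3.1): x = X/Z, y = Y/Z.
    assert np.allclose(m, M[:2] / M[2])

    # With a focal length f different from 1, the retinal coordinates are scaled by f.
    f = 0.05                                       # assumed focal length (e.g. 50 mm in metres)
    print(m, f * m)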

Figure 3.1: Man Drawing a Lute (The Draughtsman of the Lute), woodcut, 1525, Albrecht Dürer.

Figure 3.2: Perspective projection (world point M, image point m, retinal plane R, principal point c, focal length f, center of projection C, optical axis).

Figure 3.3: From retinal coordinates to image coordinates (pixel sizes p_x and p_y, principal point c, skew angle α, pixel axes e_x and e_y).

In addition, the coordinates in the image do not correspond to the physical coordinates in the retinal plane. With a CCD camera the relation between both depends on the size and shape of the pixels and on the position of the CCD chip in the camera. With a standard photo camera it depends on the scanning process through which the images are digitized.

The transformation is illustrated in Figure 3.3. The image coordinates are obtained through the following equation:

    \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} =
    \begin{bmatrix} 1/p_x & -\cos\alpha/(p_x \sin\alpha) & c_x \\ 0 & 1/(p_y \sin\alpha) & c_y \\ 0 & 0 & 1 \end{bmatrix}
    \begin{bmatrix} f x \\ f y \\ 1 \end{bmatrix}

where p_x and p_y are the width and the height of the pixels, c = [c_x, c_y]^\top is the principal point and α the skew angle, as indicated in Figure 3.3. Since only the ratios f/p_x and f/p_y are of importance, the simplified notations of the following equation will be used in the remainder of this text:

    K = \begin{bmatrix} f_x & s & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}    (3.3)

with f_x and f_y being the focal length measured in width and height of the pixels, and s a factor accounting for the skew due to non-rectangular pixels. The above upper triangular matrix is called the calibration matrix of the camera, and the notation K will be used for it. So the following equation describes the transformation from retinal coordinates to image coordinates:

    m = K\, m_R    (3.4)

with m_R the homogeneous retinal coordinates of equation (3.2).

For most cameras the pixels are almost perfectly rectangular and thus s is very close to zero. Furthermore, the principal point is often close to the center of the image. These assumptions can often be used, certainly to get a suitable initialization for more complex iterative estimation procedures.

For a camera with fixed optics these parameters are identical for all the images taken with the camera. For cameras which have zooming and focusing capabilities the focal length can obviously change, but also the principal point can vary. An extensive discussion of this subject can for example be found in the work of Willson [17, 171, 172, 174].
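To make equations (3.3)-(3.4) concrete, the sketch below builds a calibration matrix K from hypothetical pixel sizes, principal point and skew angle, and maps a retinal point to pixel coordinates. The skew parameterization and every numeric value are assumptions chosen for illustration; only the structure of K follows equation (3.3).

    import numpy as np

    # Hypothetical intrinsic parameters (illustrative values only).
    f = 0.05                   # focal length [m]
    p_x, p_y = 1.0e-5, 1.0e-5  # pixel width and height [m]
    c_x, c_y = 320.0, 240.0    # principal point [pixels]
    alpha = np.deg2rad(90.0)   # skew angle; 90 degrees corresponds to rectangular pixels

    # Calibration matrix K of equation (3.3); f_x, f_y are the focal length in pixels
    # and s accounts for the skew (essentially zero for rectangular pixels).
    f_x = f / p_x
    f_y = f / (p_y * np.sin(alpha))
    s = -f / (p_x * np.tan(alpha))
    K = np.array([[f_x,   s, c_x],
                  [0.0, f_y, c_y],
                  [0.0, 0.0, 1.0]])

    # Equation (3.4): homogeneous retinal coordinates to pixel coordinates.
    m_R = np.array([0.01, -0.02, 1.0])
    m = K @ m_R
    print(K)
    print(m[:2] / m[2])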

3.1.3 Camera motion

Motion of scene points can be modeled as follows:

    M' = \begin{bmatrix} R & t \\ 0^\top & 1 \end{bmatrix} M    (3.5)

with R a rotation matrix and t = [t_x, t_y, t_z]^\top a translation vector.

The motion of the camera is equivalent to an inverse motion of the scene and can therefore be modeled as

    M' = \begin{bmatrix} R^\top & -R^\top t \\ 0^\top & 1 \end{bmatrix} M    (3.6)

with R and t indicating the motion of the camera.

3.1.4 The projection matrix

Combining equations (3.2), (3.4) and (3.6), the following expression is obtained for a camera with some specific intrinsic calibration and with a specific position and orientation:

    \lambda \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} =
    K \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
    \begin{bmatrix} R^\top & -R^\top t \\ 0^\top & 1 \end{bmatrix}
    \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}    (3.7)

which can be simplified to

    \lambda m = K [\, R^\top \mid -R^\top t \,] M    (3.8)

or even \lambda m = P M. The 3×4 matrix P is called the camera projection matrix.

Using (3.8) the plane corresponding to a back-projected image line l can also be obtained. Since a point m lies on l when l^\top m = 0, and \lambda l^\top m = l^\top P M = (P^\top l)^\top M, the back-projected plane is

    L = P^\top l    (3.9)

The transformation equation for projection matrices can be obtained as described in paragraph 2.1. If the points of a calibration grid are transformed by the same transformation T as the camera, their image points should stay the same:

    \lambda m = P M = (P T^{-1})(T M) = P' M'    (3.10)

and thus

    P' = P T^{-1}    (3.11)

The projection of the outline of a quadric can also be obtained. For a line in an image to be tangent to the projection of the outline of a quadric, the corresponding plane should be on the dual quadric Q^*. Substituting equation (3.9) in (2.17), the constraint l^\top P Q^* P^\top l = 0 is obtained for l to be tangent to the outline. Comparing this result with the definition of a conic (2.10), the following projection equation is obtained for quadrics (this result can also be found in [65]):

    \lambda C^* = P Q^* P^\top    (3.12)
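As a concrete illustration of equations (3.7)-(3.9), the sketch below assembles a projection matrix P = K [R^T | -R^T t] from an assumed calibration matrix and camera pose, projects a world point, and back-projects an image line to its plane. The helper rot_z and all numeric values are made up for illustration.

    import numpy as np

    def rot_z(theta):
        """Rotation about the Z axis, used only to build an example pose."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    # Assumed intrinsics and camera pose (illustrative values).
    K = np.array([[1000.0,    0.0, 320.0],
                  [   0.0, 1000.0, 240.0],
                  [   0.0,    0.0,   1.0]])
    R = rot_z(np.deg2rad(10.0))          # orientation of the camera
    t = np.array([0.1, -0.2, 0.0])       # position of the camera

    # Projection matrix of equations (3.7)-(3.8): P = K [R^T | -R^T t].
    P = K @ np.hstack([R.T, (-R.T @ t)[:, None]])

    # Project a world point (homogeneous coordinates), eq. (3.8).
    M = np.array([0.5, 0.3, 4.0, 1.0])
    m = P @ M
    m = m / m[2]

    # Back-project an image line l to its 3D plane, eq. (3.9): L = P^T l.
    l = np.array([1.0, -2.0, 100.0])
    L = P.T @ l
    # l^T m and L^T M agree up to the projective scale factor lambda.
    print(m[:2], l @ m, L @ M)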

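The quadric projection equation (3.12) can be checked numerically for a sphere, whose dual quadric is simply the inverse of its point quadric: a plane through the camera centre that is tangent to the sphere corresponds to an image line tangent to the projected outline. The sphere, the camera and the tangent-plane construction below are all assumptions chosen so that the check is easy to set up.

    import numpy as np

    # A sphere |X - c|^2 = r^2 as a point quadric Q; its dual is Q* ~ Q^{-1}.
    c = np.array([0.0, 0.0, 5.0])
    r = 1.0
    Q = np.block([[np.eye(3),   -c[:, None]],
                  [-c[None, :], np.array([[c @ c - r**2]])]])
    Q_dual = np.linalg.inv(Q)

    # Assumed camera P = K [I | 0], with its centre at the origin.
    K = np.diag([1000.0, 1000.0, 1.0])
    P = K @ np.hstack([np.eye(3), np.zeros((3, 1))])

    # Dual conic of the outline, eq. (3.12): C* ~ P Q* P^T.
    C_dual = P @ Q_dual @ P.T

    # A plane through the camera centre and tangent to the sphere:
    # unit normal n with n . (c - 0) = r, plane L = [n, 0].
    d = c
    u = np.array([1.0, 0.0, 0.0])        # unit vector orthogonal to d
    n = (r / (d @ d)) * d + np.sqrt(1.0 - r**2 / (d @ d)) * u
    L = np.append(n, 0.0)

    # The image line l that back-projects to L satisfies P^T l = L (eq. (3.9)).
    l = np.linalg.lstsq(P.T, L, rcond=None)[0]

    # Tangency in 3D (L on the dual quadric) and in the image (l on the dual conic).
    print(L @ Q_dual @ L, l @ C_dual @ l)   # both ~ 0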
Relation between projection matrices and image homographies

The homographies that will be discussed here are collineations from P^2 to P^2. A homography H describes the transformation from one plane to another. A number of special cases are of interest, since the image is also a plane. The projection of points of a plane into an image can be described through a homography H_L. The matrix representation of this homography is dependent on the choice of the projective basis in the plane.

As an image is obtained by perspective projection, the relation between points belonging to a plane L in 3D space and their projections in the image is mathematically expressed by a homography. The matrix of this homography is found as follows. If the plane is given by L \simeq [\tilde l^\top, l_4]^\top and a point of L is represented as M \simeq [\tilde M^\top, m_4]^\top, then M belongs to L if and only if L^\top M = 0. Hence,

    m_4 = -\tilde l^\top \tilde M / l_4    (3.13)

Now if the camera projection matrix is P = [\,\tilde P \mid p\,], then the projection of M onto the image is

    m \simeq P M = [\,\tilde P \mid p\,] \begin{bmatrix} \tilde M \\ -\tilde l^\top \tilde M / l_4 \end{bmatrix}
      = (\tilde P - p\,\tilde l^\top / l_4)\, \tilde M    (3.14)

Consequently, H_L = \tilde P - p\,\tilde l^\top / l_4. Note that for the specific plane L \simeq [0, 0, 0, 1]^\top the homographies are simply given by H_L = \tilde P.

It is also possible to define homographies which describe the transfer from one image to another for points and other geometric entities located on a specific plane. The notation H_{L,ij} will be used to describe such a homography from view j to view i for a plane L. These homographies can be obtained through the relation

    H_{L,ij} = H_{L,i} H_{L,j}^{-1}

and are independent of reparameterizations of the plane (and thus also of a change of basis in P^3).

In the metric and Euclidean case P_i = K_i [\, R_i^\top \mid -R_i^\top t_i \,] and the plane at infinity is L_\infty \simeq [0, 0, 0, 1]^\top. In this case the homographies for the plane at infinity can thus be written as:

    H_{\infty,ij} = K_i R_i^\top R_j K_j^{-1}    (3.15)

where R_i^\top R_j is the rotation matrix that describes the relative orientation of the one camera with respect to the other.

In the projective and affine case one can assume that P_1 = [\, I_{3\times3} \mid 0 \,] (since in this case the calibration is unknown). In that case the homographies H_{L,1} = I_{3\times3} for all planes L, and thus H_{L,i1} = H_{L,i}. Therefore P_i can be factorized as

    P_i \simeq [\, H_i \mid e_i \,]    (3.16)

where H_i is the homography H_{L,i} for the reference plane L \simeq [0, 0, 0, 1]^\top and e_i is the projection of the center of projection of the first camera (in this case the point [0, 0, 0, 1]^\top) in image i. This point e_i is called the epipole, for reasons which will become clear in Section 3.2.1. Note that this equation can be used to obtain H_i and e_i from P_i, but that, due to the unknown relative scale factors, P_i can in general not be obtained from H_i and e_i. Observe also that in the affine case (where the plane at infinity is L_\infty \simeq [0, 0, 0, 1]^\top) this yields P_i \simeq [\, H_{\infty,i} \mid e_i \,].

Combining equations (3.14) and (3.16) one obtains

    H_{L,i} = H_i - e_i \tilde l^\top / l_4    (3.17)

This equation gives an important relationship between the homographies for all possible planes: homographies can only differ by a term of the form e_i a^\top, with a a 3-vector. This means that in the projective case the homographies for the plane at infinity are known up to three common parameters (i.e. the coefficients of L_\infty in the projective space).

Equation (3.16) also leads to an interesting interpretation of the camera projection matrix:

    \lambda m_i = P_i M = [\, H_i \mid e_i \,] \begin{bmatrix} \tilde M \\ m_4 \end{bmatrix} = H_i \tilde M + m_4 e_i    (3.18)

    m_{REF,i} = H_i \tilde M    (3.19)

    \lambda m_i = m_{REF,i} + m_4 e_i    (3.20)

In other words, a point M can thus be parameterized as being on the line through the optical center of the first camera (i.e. the point [0, 0, 0, 1]^\top) and a point in the reference plane. Since P_1 = [\, I_{3\times3} \mid 0 \,], the vector \tilde M is, up to scale, the projection m_1 of the point in the first image. This interpretation is illustrated in Figure 3.4.

Figure 3.4: A point M can be parameterized as a point M_REF in the reference plane L_REF in combination with the center of projection C_1 of the first camera. Its projection m_i in another image can then be obtained by transferring m_1 to image i according to H_i (i.e. m_{REF,i} \simeq H_i m_1) and applying the same linear combination with the projection e_i of C_1 (i.e. \lambda m_i = m_{REF,i} + m_4 e_i).
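To make equations (3.13)-(3.14) and the transfer relation H_{L,ij} = H_{L,i} H_{L,j}^{-1} concrete, the sketch below computes the homography induced by a plane for two assumed cameras and verifies that it transfers the projection of a point on that plane from the first image to the second. The helper plane_homography, the cameras, the plane and the point are illustrative assumptions.

    import numpy as np

    def plane_homography(P, L):
        """Homography H_L = P~ - p l~^T / l_4 of eq. (3.14), for P = [P~ | p]
        and a plane L = [l~^T, l_4]^T with l_4 != 0."""
        P_tilde, p = P[:, :3], P[:, 3]
        l_tilde, l_4 = L[:3], L[3]
        return P_tilde - np.outer(p, l_tilde) / l_4

    # Two assumed cameras; the first one is in the canonical form P1 = [I | 0].
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    K = np.diag([800.0, 800.0, 1.0])
    t = np.array([0.5, 0.0, 0.0])
    P2 = K @ np.hstack([np.eye(3), -t[:, None]])        # second camera, translated

    # A plane L and a 3D point constructed to lie on it via eq. (3.13).
    L = np.array([0.1, -0.2, 0.3, -1.0])
    M_tilde = np.array([1.0, 2.0, 6.0])
    m_4 = -(L[:3] @ M_tilde) / L[3]
    M = np.append(M_tilde, m_4)

    # Transfer homography from image 1 to image 2 induced by the plane:
    # H_{L,21} = H_{L,2} H_{L,1}^{-1}.
    H21 = plane_homography(P2, L) @ np.linalg.inv(plane_homography(P1, L))

    m1 = P1 @ M
    m2 = P2 @ M
    m2_t = H21 @ m1
    print(m2[:2] / m2[2], m2_t[:2] / m2_t[2])           # identical image point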

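The interpretation of equations (3.16) and (3.18)-(3.20) can also be checked numerically: with P_1 = [I | 0] and P_i = [H_i | e_i], the projection of any point lies on the image line through the transferred point H_i m_1 and the epipole e_i, so the parallax with respect to the reference plane points towards the epipole. The second camera and the point below are arbitrary illustrative values.

    import numpy as np

    # First camera in canonical form and a second, arbitrary projective camera.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.array([[900.0,  10.0, 300.0,  50.0],
                   [  5.0, 880.0, 250.0, -20.0],
                   [  0.1,   0.2,   1.0,   0.05]])

    # Factorization of eq. (3.16): P2 = [H | e], with e the epipole,
    # i.e. the projection of the first camera centre [0, 0, 0, 1]^T.
    H, e = P2[:, :3], P2[:, 3]

    # A world point with m_4 != 0 (not on the reference plane).
    M = np.array([0.4, -0.3, 2.0, 1.0])
    m1 = P1 @ M                      # equals M~ up to scale (eq. (3.18) with P1 = [I | 0])
    m2 = P2 @ M                      # equals H M~ + m_4 e, eqs. (3.18)-(3.20)

    # m2, the transferred point H m1 and the epipole e are collinear: the parallax
    # relative to the reference plane is along the line towards the epipole.
    print(np.linalg.det(np.column_stack([m2, H @ m1, e])))   # ~ 0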
3.1.5 Deviations from the camera model

The perspective camera model describes relatively well the image formation process for most cameras. However, when high accuracy is required or when low-end cameras are used, additional effects have to be taken into account.

The failures of the optical system to bring all light rays received from a point object to a single image point, or to a prescribed geometric position, should then be taken into account. These deviations are called aberrations. Many types of aberrations exist (e.g. astigmatism, chromatic aberrations, spherical aberrations, coma aberrations, curvature of field aberration and distortion aberration). It is outside the scope of this work to discuss them all. The interested reader is referred to the work of Willson [17] and to the photogrammetry literature [17].

Many of these effects are negligible under normal acquisition circumstances. Radial distortion, however, can have a noticeable effect for shorter focal lengths. Radial distortion is a non-linear displacement of image points radially to or from the center of the image, caused by the fact that objects at different angular distances from the lens axis undergo different magnifications. It is possible to cancel most of this effect by warping the image. The coordinates in the undistorted image plane, (x_u, y_u), can be obtained from the observed image coordinates (x_d, y_d) by the following equation:

    x_u = x_d (1 + K_1 r^2 + K_2 r^4), \qquad y_u = y_d (1 + K_1 r^2 + K_2 r^4)    (3.21)

where K_1 and K_2 are the first and second parameters of the radial distortion and r^2 = x_d^2 + y_d^2, with the coordinates taken relative to the center of the radial distortion. Note that it can sometimes be necessary to allow the center of radial distortion to be different from the principal point [174].

When the focal length of the camera changes (through zoom or focus) the parameters K_1 and K_2 will also vary. In a first approximation this can be modeled as follows:

    K_1(f) \propto f^{-2}, \qquad K_2(f) \propto f^{-4}    (3.22)

Due to the changes in the lens system this is only an approximation, except for digital zooms, where (3.22) should be exactly satisfied.
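A minimal sketch of the undistortion step of equation (3.21), assuming the common two-coefficient polynomial model with the distortion centred at the principal point; the helper undistort, the coefficient values and the image point are made up for illustration.

    import numpy as np

    def undistort(m_d, centre, K1, K2):
        """Map observed (distorted) image coordinates to undistorted ones, eq. (3.21);
        the displacement is radial with respect to the centre of distortion."""
        d = m_d - centre
        r2 = d @ d
        return centre + d * (1.0 + K1 * r2 + K2 * r2**2)

    # Illustrative values: distortion centred at the principal point, small coefficients.
    centre = np.array([320.0, 240.0])
    K1, K2 = -2.0e-7, 5.0e-13
    m_d = np.array([600.0, 50.0])
    print(undistort(m_d, centre, K1, K2))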