CS201 Computer Vision Camera Geometry


CS201 Computer Vision: Camera Geometry
John Magee, 25 November 2014
Slides courtesy of: Diane H. Theriault (deht@bu.edu)
Question of the Day: How can we represent the relationships between cameras and the world?

Why Camera Geometry? To relate what we see in images to what is happening in the world.

Vocabulary: center of projection, projection plane, optical axis. Canonical position: the camera center is at the origin and the Z axis is the optical axis.
The Camera Matrix: the pinhole model represented as a matrix.

Internal Camera Parameters: focal length, principal point, pixel size, skew.
The Camera Matrix in Canonical Position: maps from world to image coordinates, assuming a camera in canonical position.
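
As a sketch of how these parameters fit together (the focal length, principal point, and world point below are made-up numbers, and the function names are mine, not the lecture's), the internal parameters form an upper-triangular matrix K, and the canonical camera matrix is K [I | 0]:

```python
import numpy as np

def intrinsics(f, cx, cy, skew=0.0):
    """Internal calibration matrix K from focal length f (in pixels),
    principal point (cx, cy), and skew (square pixels assumed)."""
    return np.array([[f, skew, cx],
                     [0.0, f, cy],
                     [0.0, 0.0, 1.0]])

def canonical_camera(K):
    """Camera matrix P = K [I | 0] for a camera in canonical position."""
    return K @ np.hstack([np.eye(3), np.zeros((3, 1))])

# Project a homogeneous world point with a hypothetical camera.
K = intrinsics(f=800.0, cx=320.0, cy=240.0)
P = canonical_camera(K)
X = np.array([0.1, -0.2, 2.0, 1.0])   # a point 2 units down the optical axis
x = P @ X
x = x / x[2]                          # divide out depth: pixel coordinates
```

Dividing by the third coordinate is the perspective division: the further the point, the closer its image to the principal point.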

Camera Pose, Translation: the position of the camera center.
Binocular Stereo: use image disparity to calculate depth. With focal length f and baseline t, the two images of a point at depth Z are u1 = xf/Z and u2 = (x - t)f/Z, so Z = tf / (u1 - u2). The problem is to find the matching pair u1, u2.
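
The depth-from-disparity formula can be checked with a tiny synthetic example (the focal length, baseline, and point below are hypothetical values, not from the slides):

```python
import numpy as np

def stereo_depth(u1, u2, f, t):
    """Depth from binocular disparity: Z = t*f / (u1 - u2) for
    baseline t and focal length f."""
    return t * f / (u1 - u2)

# Synthetic check: a point at depth Z images at u1 = x*f/Z in the left
# camera and u2 = (x - t)*f/Z in the right camera.
f, t, Z, x = 500.0, 0.2, 4.0, 1.0
u1 = x * f / Z
u2 = (x - t) * f / Z
```

Note the disparity u1 - u2 = tf/Z shrinks as depth grows, which is why stereo depth estimates degrade for distant points.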

Camera Pose, Rotation: the (inverse of the) camera orientation, along with the position of the camera center.
Epipolar Geometry: the relationship between cameras.

Epipolar Lines: the relationship between points in one image and lines in the other; the epipoles. A point in one image defines a line in the other. How do we find (a matrix that gives us) this line?

Back-projection of Image Points: a point in an image defines a ray emanating out into the scene. How do you define a 3D line?

Back-projection of Image Points: the ray is the vector obtained by projecting the image point back into the scene, using the pseudo-inverse of the camera matrix (a pseudo-inverse is needed because the matrix does not have full rank). Points X(lambda) on this line are imaged in the second view as P' X(lambda).
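
This back-projection can be sketched with NumPy's pseudo-inverse; `backproject_ray` is my name for it, and the canonical camera below is just a convenient example:

```python
import numpy as np

def backproject_ray(P, x):
    """Back-project homogeneous image point x through camera P.
    Returns a point on the ray (via the pseudo-inverse of P, which has
    rank 3, not full rank 4) and the camera center C (the null vector
    of P). Points on the ray are X(lam) = pinv(P) @ x + lam * C."""
    X0 = np.linalg.pinv(P) @ x
    C = np.linalg.svd(P)[2][-1]        # right null vector of P
    return X0, C

# With the canonical camera P = [I | 0], the center is the origin.
P = np.hstack([np.eye(3), np.zeros((3, 1))])
x = np.array([2.0, 3.0, 1.0])
X0, C = backproject_ray(P, x)
```

Both returned points are homogeneous 4-vectors: X0 projects back exactly onto x, and C projects to (0, 0, 0), i.e. it has no image, as expected for the camera center.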

Fundamental Matrix: the matrix that expresses the relationship between camera images and maps image points to epipolar lines: l = F x. All epipolar lines intersect at the epipole, and the epipolar line is the cross product of two points: the epipole, and any point you can come up with that will be on the line, based on the image point.
Epipolar Line Constraint: a line is Ax + By + C = 0; given A, B, C (the entries of Fx), plugging in a point (x, y) on the line gives zero. The epipolar line constraint x'^T F x = 0 follows, since the image of X in the second view must lie on the epipolar line.
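
A small sketch of l = Fx and the line constraint. The F below is the well-known form for a rectified stereo pair (pure horizontal translation), chosen because its epipolar lines are simply the image rows; the specific coordinates are hypothetical:

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l = F x in the second image for a homogeneous
    image point x in the first image."""
    return F @ x

# F for a rectified stereo pair: epipolar lines are horizontal scanlines.
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
x = np.array([100.0, 50.0, 1.0])   # a point on row 50 of image 1
l = epipolar_line(F, x)            # line (A, B, C) with A*u + B*v + C = 0
x2 = np.array([80.0, 50.0, 1.0])   # any match must lie on row 50 of image 2
```

Plugging x2 into the line (l @ x2) gives zero, which is exactly the constraint x2^T F x = 0.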

So what? How can I use this camera geometry to find out things about the world using images? What can you do with camera geometry?
- Given 3D objects, find their image coordinates (booooorring)
- Given image points and 3D points, infer the pose of the camera
- Given image coordinates and camera matrices for multiple cameras, reconstruct the 3D point
- Perform multiple-view data association for tracking
- Infer the movement of the camera (from image correspondences only)

Estimating a Linear Transformation: rewrite the point transformation equation. How do we estimate the parameters [a b c d e f]? The tool is the DLT (Direct Linear Transformation). We will use it several times to infer things about camera geometry: to reconstruct a 3D point, to recover the pose of a camera, and to estimate the fundamental matrix.

DLT Basic Idea: set up a linear system that should equal zero (e.g., by taking a cross product of two vectors): Ah = 0. Minimize the norm ||Ah|| by taking the eigenvector of A^T A corresponding to the smallest eigenvalue.
Using DLT to compute a homography: given image correspondences x and x', we want the homography (transformation) H. Set up the cross product x' × Hx = 0, and factor out the elements of H into a vector to get a system of equations of the form Ah = 0; h is the eigenvector of A^T A with the smallest eigenvalue.
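
The homography case can be sketched as follows (the function name and the row layout of A are the standard two-equations-per-correspondence form; `np.linalg.svd` gives the eigenvector of A^T A with smallest eigenvalue as the last right singular vector of A):

```python
import numpy as np

def dlt_homography(pts1, pts2):
    """Estimate H with x' ~ H x from four or more correspondences.
    Each pair contributes two rows of A (from x' x Hx = 0); h is the
    eigenvector of A^T A with smallest eigenvalue, i.e. the last
    right singular vector of A."""
    A = []
    for (x, y), (u, v) in zip(pts1, pts2):
        A.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        A.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    h = np.linalg.svd(np.asarray(A))[2][-1]
    H = h.reshape(3, 3)
    return H / H[2, 2]                 # fix the overall (arbitrary) scale
```

Since H is only defined up to scale, dividing by H[2,2] picks a canonical representative.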

Using DLT to reconstruct a 3D point: given image points x and x' and camera matrices P and P', we want the 3D point X such that x = PX and x' = P'X. Set up the cross products x × PX = 0 and x' × P'X = 0, and factor out X to get a system of equations of the form AX = 0; X is the eigenvector of A^T A with the smallest eigenvalue. This lets us estimate the 3D world point from image points and camera matrices.
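
The same recipe in code, as a minimal sketch (the two rows per camera are the independent equations left after expanding the cross product; the function name is mine):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear triangulation by DLT: stack the constraints from
    x1 x (P1 X) = 0 and x2 x (P2 X) = 0 into A X = 0 and take the
    last right singular vector of A (x1, x2 homogeneous 3-vectors)."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    X = np.linalg.svd(A)[2][-1]
    return X / X[3]                    # back to inhomogeneous scale
```

With exact correspondences the residual of AX is zero; with noisy ones, the SVD gives the least-squares compromise between the two rays.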

Estimating the Camera Matrix: this requires a calibration object. Given 3D points X and corresponding image points [x y w], obtain the camera matrix P. Note: P is not factored in a nice way!
Factoring the Camera Matrix: matrices can be factored in many ways. From P we can read off the matrix M, and we want M factored into a rotation (orthonormal) matrix and an upper-triangular matrix K. The way to do this is called RQ decomposition:
http://en.wikipedia.org/wiki/QR_decomposition
http://en.wikipedia.org/wiki/Gram%E2%80%93Schmidt_process
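
NumPy only ships QR, not RQ, but RQ can be built from QR with a reversal permutation. This is one common sketch of the trick, not a canonical routine:

```python
import numpy as np

def rq(M):
    """RQ decomposition of a 3x3 matrix: M = K R with K upper
    triangular and R orthonormal, built from NumPy's QR."""
    P = np.fliplr(np.eye(3))           # reversal permutation (P = P^-1)
    Q, R = np.linalg.qr((P @ M).T)     # QR of the flipped, transposed M
    K = P @ R.T @ P                    # upper-triangular factor
    Rot = P @ Q.T                      # orthonormal factor
    S = np.diag(np.sign(np.diag(K)))   # fix signs: positive diagonal in K
    return K @ S, S @ Rot
```

The sign fix matters for cameras: the calibration matrix K should have positive focal lengths on its diagonal, and the compensating sign flip is absorbed into the rotation so the product is unchanged.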

Fundamental Matrix: the relationship between cameras; it maps points in one image to lines in the other image, via the epipolar line constraint x'^T F x = 0.
Estimating the Fundamental Matrix: use the epipolar line constraint and factor out the elements of F; F is the eigenvector of A^T A with the smallest eigenvalue.

Estimating the Fundamental Matrix (continued): you need to scale the image coordinates to get good results! Center the image points at the origin and scale them so their average magnitude is sqrt(2).
Essential Matrix: (one thing) we would really like is to recover the camera pose from image correspondences only. To do that, we need to factor out the camera calibration matrices K and K'.
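
Putting the constraint, the DLT solution, and the normalization together gives the normalized eight-point algorithm. This sketch follows the recipe above (one row of A per correspondence, SVD solution, rank-2 enforcement, denormalization); the function names are mine:

```python
import numpy as np

def normalize(pts):
    """Similarity transform T that centers the points at the origin
    and scales them so their average magnitude is sqrt(2)."""
    c = pts.mean(axis=0)
    scale = np.sqrt(2) / np.mean(np.linalg.norm(pts - c, axis=1))
    return np.array([[scale, 0.0, -scale * c[0]],
                     [0.0, scale, -scale * c[1]],
                     [0.0, 0.0, 1.0]])

def eight_point(pts1, pts2):
    """Estimate F from n >= 8 correspondences (n x 2 arrays): build one
    row of A per correspondence from x2^T F x1 = 0, solve Af = 0 by
    SVD, enforce rank 2, then undo the normalization."""
    T1, T2 = normalize(pts1), normalize(pts2)
    n1 = (T1 @ np.c_[pts1, np.ones(len(pts1))].T).T
    n2 = (T2 @ np.c_[pts2, np.ones(len(pts2))].T).T
    A = np.array([[u2*u1, u2*v1, u2, v2*u1, v2*v1, v2, u1, v1, 1.0]
                  for (u1, v1, _), (u2, v2, _) in zip(n1, n2)])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)  # least-eigenvalue solution
    U, s, Vt = np.linalg.svd(F)                # enforce rank 2 so all
    F = U @ np.diag([s[0], s[1], 0.0]) @ Vt    # epipolar lines meet at the epipole
    return T2.T @ F @ T1                       # denormalize
```

Without the normalization step the columns of A differ by orders of magnitude (pixel coordinates vs. 1), and the SVD solution becomes badly conditioned, which is exactly the "need to scale" warning above.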

Recovering Camera Pose from the Essential Matrix: the most important things to know are that it can be done, and that it yields four possible options, which need to be tested to find the one that places the reconstructed points in front of both cameras.
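
The four options come out of the SVD of E. This is a sketch of the standard decomposition (E is only defined up to scale and sign, which is why both translation signs and both "twisted" rotations appear):

```python
import numpy as np

def poses_from_essential(E):
    """The four candidate (R, t) poses hidden in an essential matrix
    E = [t]_x R. From the SVD, R is U W V^T or U W^T V^T and
    t = +/- u3; the correct option is found afterwards by checking
    that triangulated points land in front of both cameras."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:           # keep the factors proper rotations
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0]])
    t = U[:, 2]
    return [(U @ W @ Vt, t), (U @ W @ Vt, -t),
            (U @ W.T @ Vt, t), (U @ W.T @ Vt, -t)]
```

All four candidates reproduce the same epipolar geometry; only the front-of-camera (cheirality) test distinguishes them.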

Radial Distortion: before you can estimate anything, you need your images to be undistorted, so that straight lines in the world appear straight in the image. Cause: the aperture restricts the light reaching the image sensor at a location other than the lens. http://toothwalker.org/optics.html

Radial Distortion: the effect is complex, so we need to pick a simple model; the most common/popular is a polynomial in the radial distance. Given the coefficients, you can correct the distortion. Estimating the coefficients is a bit involved: http://www.vision.caltech.edu/bouguetj/calib_doc/
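
Given the coefficients, correcting a point means inverting the polynomial model. A minimal sketch, assuming a two-coefficient model x_d = x_u * (1 + k1*r^2 + k2*r^4) in normalized camera coordinates (real calibrations, e.g. the Bouguet toolbox, typically use more terms plus tangential distortion):

```python
import numpy as np

def undistort_point(xd, k1, k2, iters=20):
    """Invert x_d = x_u * (1 + k1*r^2 + k2*r^4) by fixed-point
    iteration, where r is the radius of the undistorted point x_u in
    normalized camera coordinates. The two-coefficient model is an
    assumption; real calibrations may use more terms."""
    xd = np.asarray(xd, dtype=float)
    xu = xd.copy()                     # initial guess: no distortion
    for _ in range(iters):
        r2 = np.sum(xu ** 2)           # radius^2 of the current estimate
        xu = xd / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return xu
```

For mild distortion the iteration converges in a handful of steps, since each pass only nudges the radius slightly.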

And here it is in song form: The Fundamental Matrix Song, http://danielwedge.com/fmatrix/