Epipolar geometry



Two-view geometry

Epipolar geometry
[Figure: 3D point X imaged at x and x' in the two views]
Baseline: the line connecting the two camera centers.
Epipolar plane: a plane containing the baseline (a 1D family of such planes).
Epipoles: the intersections of the baseline with the image planes, i.e. the projections of the other camera center.

The Epipole Photo by Frank Dellaert

Epipolar geometry (continued)
Baseline: the line connecting the two camera centers.
Epipolar plane: a plane containing the baseline (a 1D family).
Epipoles: the intersections of the baseline with the image planes, i.e. the projections of the other camera center.
Epipolar lines: the intersections of the epipolar plane with the image planes; they always come in corresponding pairs.

Example: Converging cameras

Example: Motion parallel to image plane

Example: Motion perpendicular to image plane
The epipole e has the same coordinates in both images. Points move along lines radiating from e: the focus of expansion.

Epipolar constraint
If we observe a point x in one image, where can the corresponding point x' be in the other image?

Epipolar constraint
Potential matches for x have to lie on the corresponding epipolar line l'. Potential matches for x' have to lie on the corresponding epipolar line l.

Epipolar constraint example

Epipolar constraint: Calibrated case
Assume the intrinsic and extrinsic parameters of the cameras are known. We can multiply the projection matrix of each camera (and the image points) by the inverse of the calibration matrix to get normalized image coordinates. We can also set the global coordinate system to the coordinate system of the first camera; then the projection matrix of the first camera is [I | 0].

Epipolar constraint: Calibrated case
With X' = R X + t relating the two camera frames, the vectors x', t, and R x are coplanar.

Epipolar constraint: Calibrated case
Coplanarity gives x' · [t × (R x)] = 0, i.e. x'^T E x = 0 with E = [t]_× R, the Essential Matrix (Longuet-Higgins, 1981). The vectors x', t, and R x are coplanar.

Epipolar constraint: Calibrated case
x' · [t × (R x)] = 0, i.e. x'^T E x = 0 with E = [t]_× R.
E x is the epipolar line associated with x (l' = E x); E^T x' is the epipolar line associated with x' (l = E^T x').
E e = 0 and E^T e' = 0.
E is singular (rank two) and has five degrees of freedom.
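These properties are easy to check numerically. Below is a minimal NumPy sketch using a hypothetical relative pose (the rotation angle, translation, and 3D point are made-up illustrative values): it builds E = [t]_× R and verifies that corresponding normalized image points satisfy the epipolar constraint.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x, so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Hypothetical relative pose: small rotation about the y-axis, translation along x.
theta = 0.1
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([1.0, 0.0, 0.0])

E = skew(t) @ R  # essential matrix E = [t]_x R

# Project a 3D point into both cameras in normalized coordinates:
# first camera [I | 0], second camera frame X' = R X + t.
X = np.array([0.5, -0.2, 4.0])
x = X / X[2]            # normalized homogeneous point in image 1
Xp = R @ X + t
xp = Xp / Xp[2]         # normalized homogeneous point in image 2

residual = xp @ E @ x   # epipolar constraint x'^T E x, should be ~0
```

Since x, t, and R x are coplanar by construction, the residual vanishes up to floating-point error, and E is rank two.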

Epipolar constraint: Uncalibrated case
The calibration matrices K and K' of the two cameras are unknown. We can write the epipolar constraint in terms of the unknown normalized coordinates: x̂'^T E x̂ = 0, where x = K x̂ and x' = K' x̂'.

Epipolar constraint: Uncalibrated case
x̂'^T E x̂ = 0 becomes x'^T F x = 0 with F = K'^{-T} E K^{-1}, where x̂ = K^{-1} x and x̂' = K'^{-1} x'. F is the Fundamental Matrix (Faugeras and Luong, 1992).

Epipolar constraint: Uncalibrated case
x̂'^T E x̂ = 0 becomes x'^T F x = 0 with F = K'^{-T} E K^{-1}.
F x is the epipolar line associated with x (l' = F x); F^T x' is the epipolar line associated with x' (l = F^T x').
F e = 0 and F^T e' = 0.
F is singular (rank two) and has seven degrees of freedom.
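The relation F = K'^{-T} E K^{-1} can be sketched directly in NumPy. All intrinsics and the pose below are made-up illustrative values: the sketch builds F from a known E, projects a 3D point into pixel coordinates in both views, and checks that the uncalibrated constraint x'^T F x = 0 holds and that x' lies on the epipolar line F x.

```python
import numpy as np

# Hypothetical intrinsics for the two cameras (illustrative values only).
K  = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
Kp = np.array([[750.0, 0.0, 310.0], [0.0, 750.0, 230.0], [0.0, 0.0, 1.0]])

# Hypothetical relative pose: pure translation (R = I), so E = [t]_x.
R = np.eye(3)
t = np.array([1.0, 0.0, 0.1])
E = np.array([[0.0, -t[2], t[1]],
              [t[2], 0.0, -t[0]],
              [-t[1], t[0], 0.0]]) @ R     # E = [t]_x R

F = np.linalg.inv(Kp).T @ E @ np.linalg.inv(K)   # F = K'^-T E K^-1

# Project a 3D point into both cameras to get pixel correspondences.
X  = np.array([0.3, -0.4, 5.0])
x  = K @ X
x  = x / x[2]                    # pixel point in image 1 (homogeneous)
xp = Kp @ (R @ X + t)
xp = xp / xp[2]                  # pixel point in image 2 (homogeneous)

residual = xp @ F @ x            # x'^T F x, should be ~0
l_prime  = F @ x                 # epipolar line (a, b, c) of x in image 2
on_line  = xp @ l_prime          # a u' + b v' + c, also ~0
```

The same check works with any valid E; only the scale of the residual changes with the intrinsics.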

The eight-point algorithm
With x = (u, v, 1)^T and x' = (u', v', 1)^T, minimize
sum_{i=1}^{N} (x'_i^T F x_i)^2
under the constraint F_33 = 1.

The eight-point algorithm
Meaning of the error sum_{i=1}^{N} (x'_i^T F x_i)^2: the sum of squared Euclidean distances between the points x'_i and the epipolar lines F x_i (or between the points x_i and the epipolar lines F^T x'_i), each multiplied by a scale factor.
Nonlinear approach: minimize sum_{i=1}^{N} [ d(x'_i, F x_i)^2 + d(x_i, F^T x'_i)^2 ].
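The linear formulation above can be sketched in a few lines of NumPy. One deviation from the slide: instead of fixing F_33 = 1, this sketch uses the equally common unit-norm constraint, solved by taking the SVD null vector of the design matrix. The synthetic correspondences (pose, point cloud) are made up for illustration.

```python
import numpy as np

def eight_point(x1, x2):
    """Linear eight-point estimate of F from N >= 8 correspondences.

    x1, x2: (N, 2) arrays of matched points with x2_h^T F x1_h = 0
    in homogeneous coordinates. Solves the unit-norm least-squares
    problem via SVD, then enforces rank 2."""
    A = np.array([[u2*u1, u2*v1, u2, v2*u1, v2*v1, v2, u1, v1, 1.0]
                  for (u1, v1), (u2, v2) in zip(x1, x2)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)          # null vector of A
    U, S, Vt = np.linalg.svd(F)       # enforce rank 2:
    S[2] = 0.0                        # zero the smallest singular value
    return U @ np.diag(S) @ Vt

# Synthetic (hypothetical) two-view setup in normalized coordinates.
rng = np.random.default_rng(0)
theta = 0.2
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([1.0, 0.2, 0.0])
X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(12, 3))  # points in front of both cameras
x1 = X[:, :2] / X[:, 2:]                                # first view: x ~ X
Xp = X @ R.T + t                                        # second view: X' = R X + t
x2 = Xp[:, :2] / Xp[:, 2:]

F = eight_point(x1, x2)
h = lambda p: np.append(p, 1.0)
max_residual = max(abs(h(b) @ F @ h(a)) for a, b in zip(x1, x2))
```

On noiseless data the residuals are essentially zero; with real, noisy matches the conditioning problem discussed next becomes important.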

Problem with the eight-point algorithm
Poor numerical conditioning. Can be fixed by rescaling the data.

The normalized eight-point algorithm (Hartley, 1995)
Center the image data at the origin, and scale it so that the mean squared distance between the origin and the data points is 2.
Use the eight-point algorithm to compute F from the normalized points.
Enforce the rank-2 constraint (for example, take the SVD of F and zero out the smallest singular value).
Transform the fundamental matrix back to the original units: if T and T' are the normalizing transformations in the two images, then the fundamental matrix in original coordinates is T'^T F T.
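The normalization step is a simple similarity transform per image. A sketch of that step (the function name and sample pixel coordinates are illustrative):

```python
import numpy as np

def normalizing_transform(pts):
    """3x3 similarity T that centers the points at the origin and scales
    them so the mean squared distance to the origin is 2."""
    c = pts.mean(axis=0)
    msd = np.mean(np.sum((pts - c) ** 2, axis=1))
    s = np.sqrt(2.0 / msd)
    return np.array([[s, 0.0, -s * c[0]],
                     [0.0, s, -s * c[1]],
                     [0.0, 0.0, 1.0]])

# Made-up pixel coordinates for illustration.
pts = np.array([[310.0, 240.0], [420.0, 255.0], [150.0, 400.0], [500.0, 90.0]])
T = normalizing_transform(pts)
pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
norm_pts = (pts_h @ T.T)[:, :2]

mean_sq_dist = np.mean(np.sum(norm_pts ** 2, axis=1))  # should be 2
```

After estimating F̂ from points normalized by T and T' in the two images, the result is mapped back to pixel coordinates as F = T'^T F̂ T.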

Comparison of estimation algorithms

              8-point      Normalized 8-point   Nonlinear least squares
Av. Dist. 1   2.33 pixels  0.92 pixel           0.86 pixel
Av. Dist. 2   2.18 pixels  0.85 pixel           0.80 pixel

From epipolar geometry to camera calibration
Estimating the fundamental matrix is known as weak calibration. If we know the calibration matrices of the two cameras, we can estimate the essential matrix: E = K'^T F K. The essential matrix gives us the relative rotation and translation between the cameras, i.e. their extrinsic parameters.
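The slide does not spell out how R and t are extracted from E. A standard approach (Hartley and Zisserman) decomposes E by SVD into four (R, t) candidates; the physically correct one is then selected by the cheirality test (triangulated points must lie in front of both cameras), which is not shown here. A sketch, with a made-up pose for the round-trip check:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def pose_candidates(E):
    """Four (R, t) candidates from an essential matrix.
    t is recovered only up to scale, returned as a unit vector."""
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (det = +1); E is only defined up to scale/sign.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]                       # left null vector of E
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

# Build E from a hypothetical pose and recover the candidates.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta), np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 0.0, 0.0])   # unit baseline direction
E = skew(t_true) @ R_true
cands = pose_candidates(E)

# The translation direction is recovered up to sign in every candidate.
axis_ok = all(abs(tc @ t_true) > 1 - 1e-9 for _, tc in cands)
```

Each candidate reproduces E up to sign; only the cheirality test distinguishes the true pose among the four.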