Reminder: Lecture 20: The Eight-Point Algorithm. Essential/Fundamental Matrix. E/F Matrix Summary. Computing F. Computing F from Point Matches

Reminder: Lecture 20: The Eight-Point Algorithm

F = [ -0.00310695   -0.0025646      2.96584
      -0.028094     -0.00771621    56.3813
      13.1905      -29.2007     -9999.79  ]

Readings: T&V 7.3 and 7.4

Essential/Fundamental Matrix

E/F Matrix Summary
The essential and fundamental matrices are 3x3 matrices that encode the epipolar geometry of two views.
Motivation: given a point in one image, multiplying by the essential/fundamental matrix tells us which epipolar line to search along in the second view.
Longuet-Higgins equation: p_r^T E p_l = 0 (in pixel coordinates, p_r^T F p_l = 0).
Epipolar lines: F p_l is the epipolar line of p_l in the right image; F^T p_r is the epipolar line of p_r in the left image.
Epipoles: F e_l = 0 and F^T e_r = 0.
E vs F:
- E works in film coords (calibrated cameras)
- F works in pixel coords (uncalibrated cameras)

Computing F from Point Matches
Assume that you have m correspondences. Each correspondence (p_l, p_r) satisfies p_r^T F p_l = 0.

Computing F
F is a 3x3 matrix (9 entries). Set up a HOMOGENEOUS linear system with 9 unknowns.
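To make the homogeneous system concrete: with p_l = (x_l, y_l, 1) and p_r = (x_r, y_r, 1), each constraint p_r^T F p_l = 0 is linear in the nine entries of F. Below is a minimal NumPy sketch of stacking one such row per correspondence (the function and variable names are illustrative, not code from the lecture):

import numpy as np

def build_A(pts_l, pts_r):
    # One row per correspondence; each row holds the coefficients of the
    # nine entries of F (row-major) in the scalar equation p_r^T F p_l = 0.
    A = np.zeros((len(pts_l), 9))
    for i, ((xl, yl), (xr, yr)) in enumerate(zip(pts_l, pts_r)):
        A[i] = [xr * xl, xr * yl, xr,
                yr * xl, yr * yl, yr,
                xl,      yl,      1.0]
    return A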

Computing F
Given m point correspondences. Think: how many points do we need?

How Many Points?
Unlike a homography, where each point correspondence contributes two constraints (rows in the linear system of equations), for estimating the essential/fundamental matrix each point contributes only one constraint (row), because the Longuet-Higgins / epipolar constraint is a scalar equation. Thus we need at least 8 points.

Solving Homogeneous Systems
Assume that we need the nontrivial solution of Ax = 0, with m equations and n unknowns, m >= n - 1 and rank(A) = n - 1. Hence: the Eight Point algorithm! Since the norm of x is arbitrary, we will look for a solution with norm ||x|| = 1.

Least Squares Solution: Optimization with Constraints
We want Ax as close to 0 as possible and ||x|| = 1. Define the cost written out below: it is called the LAGRANGIAN cost and λ is called the LAGRANGIAN multiplier. The Lagrangian incorporates the constraints into the cost function by introducing extra variables.
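For reference, the constrained problem and its Lagrangian cost in standard form (a reconstruction consistent with the notation above):

\min_{x}\ \|A x\|^{2} \quad \text{subject to} \quad \|x\| = 1,
\qquad
L(x,\lambda) \;=\; \|A x\|^{2} + \lambda\left(1 - \|x\|^{2}\right)
\;=\; x^{\top} A^{\top} A\, x + \lambda\left(1 - x^{\top} x\right).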

Optimization with Constraints
Taking derivatives with respect to x and λ gives A^T A x = λ x and ||x|| = 1, i.e. x is an eigenvector of A^T A with eigenvalue λ. The first equation is an eigenvector problem; the second equation is the original constraint. We want the eigenvector with the smallest eigenvalue.

Singular Value Decomposition (SVD)
We can find the eigenvectors and eigenvalues of A^T A by finding the Singular Value Decomposition of A. Any m x n matrix A can be written as the product of 3 matrices, A = U D V^T, where:
- U is m x m and its columns are orthonormal vectors
- V is n x n and its columns are orthonormal vectors
- D is m x n diagonal, and its diagonal elements are called the singular values of A, with σ_1 >= σ_2 >= ... >= σ_n >= 0

SVD Properties
The columns of U are the eigenvectors of A A^T. The columns of V are the eigenvectors of A^T A. The squares of the diagonal elements of D are the eigenvalues of A A^T and A^T A.

Computing F: The 8-Point Algorithm
With 8 correspondences, A is an 8 x 9 matrix and has rank 8. Find the eigenvector of A^T A with the smallest eigenvalue!
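In NumPy, the eigenvector of A^T A with the smallest eigenvalue can be read directly off the SVD of A; a minimal sketch continuing the illustrative helpers above:

import numpy as np

def solve_homogeneous(A):
    # numpy returns the singular values in decreasing order, so the last
    # row of Vt (last column of V) is the right singular vector for the
    # smallest singular value -- the desired eigenvector of A^T A.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]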

Algorithm EIGHT_POINT
The input is formed by m point correspondences, m >= 8.
- Construct the m x 9 matrix A.
- Find the SVD of A: A = U D V^T.
- The entries of F are the components of the column of V corresponding to the least singular value.
F must be singular (remember, it is rank 2, since it is important for it to have a left and right nullspace, i.e. the epipoles). To enforce the rank-2 constraint:
- Find the SVD of F: F = U_f D_f V_f^T
- Set the smallest singular value of F to 0 to create D_f'
- Recompute F: F = U_f D_f' V_f^T

Numerical Details
The coordinates of corresponding points can have a wide range, leading to numerical instabilities. It is better to first normalize them so they have average 0 and stddev 1, and denormalize F at the end: F = T_r^T F_n T_l, where T_l and T_r are the normalizing transforms. This is the Hartley preconditioning algorithm; it was an optional topic in Lecture 16. Go back and look if you want to know more.

A Practical Issue
How to rectify the images so that any scan-line stereo algorithm that works for simple stereo can be used to find dense matches (i.e. compute a disparity image for every pixel)? Rectification maps general epipolar lines to parallel epipolar lines.

Stereo Rectification: General Idea
Image reprojection: reproject the image planes onto a common plane parallel to the line between the optical centers. Notice, only the focal point of the camera really matters. [Seitz, UW] Apply a homography representing a virtual rotation to bring the image planes parallel with the baseline (epipoles go to infinity).
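Putting Algorithm EIGHT_POINT together with the rank-2 enforcement and the normalization step, here is a sketch under the same assumptions (build_A and solve_homogeneous are the illustrative helpers from the earlier sketches; the normalization follows the average-0 / stddev-1 recipe above, and the denormalization uses the T_l, T_r convention stated there):

import numpy as np

def normalize_points(pts):
    # Translate to zero mean and scale to roughly unit std dev;
    # return homogeneous normalized points and the 3x3 transform T.
    pts = np.asarray(pts, dtype=float)
    mean = pts.mean(axis=0)
    scale = 1.0 / (pts - mean).std()
    T = np.array([[scale, 0.0, -scale * mean[0]],
                  [0.0, scale, -scale * mean[1]],
                  [0.0, 0.0, 1.0]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def eight_point(pts_l, pts_r):
    # 1. Normalize the pixel coordinates in each image.
    pl_n, T_l = normalize_points(pts_l)
    pr_n, T_r = normalize_points(pts_r)
    # 2. Build the m x 9 matrix A and take the least-singular-value solution.
    A = build_A(pl_n[:, :2], pr_n[:, :2])
    F = solve_homogeneous(A).reshape(3, 3)
    # 3. Enforce rank 2: zero the smallest singular value of F.
    Uf, Df, Vft = np.linalg.svd(F)
    Df[2] = 0.0
    F = Uf @ np.diag(Df) @ Vft
    # 4. Denormalize so that p_r^T F p_l = 0 holds in pixel coordinates.
    return T_r.T @ F @ T_l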

General Idea, continued: Method from T&V 7.3.7
Assuming the extrinsic parameters R & T are known, compute a 3D rotation that makes conjugate epipolar lines collinear and parallel to the horizontal image axis. Remember: a rotation around the focal point of the camera is just a 2D homography in the image! Note: this method from the book assumes calibrated cameras (we can recover R, T from the E matrix). In a moment, we will see a more general approach that uses the F matrix.

Rectification involves two rotations:
- The first rotation sends the epipoles to infinity
- The second rotation makes the epipolar lines parallel
Rotate the left and right cameras with the first rotation R_1 (constructed from the translation T), rotate the right camera with the R matrix, and adjust scales in both camera references.

Build the rotation R_rect with rows e_1^T, e_2^T, e_3^T, where:
e_1 = T / ||T||
e_2 = (1 / sqrt(T_x^2 + T_y^2)) (-T_y, T_x, 0)^T
e_3 = e_1 x e_2
Here e_1 is just a unit vector along T, representing the epipole in the left image. We know how to compute this from E, from last class.

Algorithm RECTIFICATION
- Build the matrix R_rect
- Set R_l = R_rect and R_r = R R_rect
- For each left point p_l = (x, y, f)^T compute R_l p_l = (x', y', z')^T
- Compute p_l' = (f/z') (x', y', z')^T
- Repeat the above for the right camera with R_r
Verify that this homography maps e_1 to [1 0 0]^T.
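A minimal sketch of building R_rect and reprojecting points as in the algorithm above (NumPy; T is the translation recovered from E, R the rotation between the cameras, f the focal length; the function names are illustrative):

import numpy as np

def rectification_rotation(T):
    # e1 points along the baseline (the left epipole direction),
    # e2 is orthogonal to e1 and to the optical axis, e3 = e1 x e2.
    T = np.asarray(T, dtype=float)
    e1 = T / np.linalg.norm(T)
    e2 = np.array([-T[1], T[0], 0.0]) / np.hypot(T[0], T[1])
    e3 = np.cross(e1, e2)
    return np.vstack([e1, e2, e3])   # rows are e1^T, e2^T, e3^T

def reproject(p, R, f):
    # Rotate an image point p = (x, y, f) and rescale back to the plane z = f.
    x, y, z = R @ np.asarray(p, dtype=float)
    return (f / z) * np.array([x, y, z])

# Usage sketch:
# R_rect = rectification_rotation(T)
# R_l, R_r = R_rect, R @ R_rect
# p_rectified = reproject(np.array([x, y, f]), R_l, f)
# Sanity check from the slide: R_rect @ (T / np.linalg.norm(T)) ~= [1, 0, 0]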

Example: Rectified Pair
[Figures: an example stereo pair before and after rectification.]

A Better Approach
The traditional method does not work when the epipoles are in the image (for example, a camera translating forward). Good paper: "Simple and efficient rectification methods for general motion," Marc Pollefeys, R. Koch, L. Van Gool, ICCV 99.
Polar rectification (Pollefeys et al., ICCV 99):
- Polar re-parameterization around the epipoles
- Requires only (oriented) epipolar geometry
- Preserves the length of epipolar lines
- Choose Δθ so that no pixels are compressed
General idea: polar rectification around the epipoles. Works for all relative motions; guarantees minimal image size.
[Figures: original image vs. rectified image; polar rectification examples.]

Example (cont.)
[Figures: input images with the epipoles marked, and the corresponding rectified images.]

Depth Maps
[Figures: sparse and interpolated depth maps in pixel coordinates; views of a texture-mapped depth surface.]