Lecture 14: Basic Multi-View Geometry

Stereo. If I needed to find out how far a point is away from me, I could use triangulation and two views. (Figure: scene point, image plane, optical center. Graphic from Khurram Shaffique)

To see this: I have three unknowns (x, y, z) and I have four equations, since P projects to p (giving u and v) and P projects to p' (giving u' and v').

Today. For the rest of the lecture we will talk about the geometry of multiple views. To begin, we will talk about epipolar geometry. (Image from Forsyth and Ponce)

Epipoles. The epipoles are the projections of each camera's optical center into the other camera's image. (Image from Forsyth and Ponce)

Epipoles and epipolar lines. For a point P, we may not know where it projects to (p'). We do know that it lies on an epipolar line. (Image from Forsyth and Ponce)

When the cameras are calibrated. The vectors Op, O'p', and OO' are coplanar. This can be expressed as Op . [OO' x O'p'] = 0. (Image from Forsyth and Ponce)

How you can see this. The vector returned by the cross product is perpendicular to the two vectors, so it can be thought of as a normal to a plane. If a vector lies in the plane, it should also be perpendicular to that normal. (Image from Forsyth and Ponce)

Relating the two views. First, remember that we have points in two coordinate systems. We can express a rigid transformation (rotation and translation) between the two systems: a point with coordinates p' in the right camera's frame has coordinates R p' + t in the left camera's frame, where R is a rotation matrix and t is a translation vector. (Image from Forsyth and Ponce)

Now, rewrite the constraint in terms of the coordinates of the left camera. For simplicity, write t for the translation OO' and R p' for the point of the right camera expressed in the left camera's frame; the constraint becomes p . [t x (R p')] = 0.

The essential matrix. Starting with p . [t x (R p')] = 0: a cross product can be rewritten as a matrix multiplication, t x x = [t]_x x, where [t]_x is a skew-symmetric 3x3 matrix. This leads to the constraint p^T E p' = 0, where E = [t]_x R is called the essential matrix.
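As a quick illustration of how the essential matrix is built from R and t, here is a minimal NumPy sketch (the function names are mine, not from the text):

```python
import numpy as np

def skew(t):
    """Return the 3x3 skew-symmetric matrix [t]_x such that skew(t) @ x == np.cross(t, x)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential_matrix(R, t):
    """Essential matrix E = [t]_x R for the rotation R and translation t between the views."""
    return skew(t) @ R
```

For corresponding points written in each camera's own (calibrated) coordinates, p.T @ essential_matrix(R, t) @ p_prime should then be approximately zero.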

Properties of the essential matrix. It is a 3x3 matrix, it has rank 2, and it only relates the extrinsic parameters of the two views.

What if the cameras aren't calibrated? The relationship still holds, but the calibration matrices must now be taken into account. Those calibration matrices, combined with the essential matrix, give what is known as the fundamental matrix (F = K^-T E K'^-1 for intrinsic matrices K and K'). It encodes information from both the intrinsic and extrinsic parameters, and is also rank 2.

Finding the fundamental matrix. The basic algorithm is the eight-point algorithm: find 8 corresponding points in the two images. Once you have the corresponding p and p' points, the constraint p^T F p' = 0 is linear in the entries of F.

Finding the fundamental matrix. Minimize the error in this constraint over the coefficients of F. Improvements proposed by Hartley (1995): normalize the corresponding points, and after you compute the first estimate of F, enforce the rank-2 constraint using the SVD (see text). This significantly improves the results.
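A sketch of the normalized eight-point algorithm described above (NumPy; it assumes p and p_prime are N x 2 arrays of matched pixel coordinates and uses the p^T F p' = 0 convention; the function names are mine):

```python
import numpy as np

def normalize(pts):
    """Translate points to zero mean and scale so the mean distance from the origin is sqrt(2)."""
    centroid = pts.mean(axis=0)
    scale = np.sqrt(2) / np.mean(np.linalg.norm(pts - centroid, axis=1))
    T = np.array([[scale, 0, -scale * centroid[0]],
                  [0, scale, -scale * centroid[1]],
                  [0, 0, 1]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def eight_point(p, p_prime):
    """Estimate F from N >= 8 correspondences (p in the left image, p' in the right),
    using Hartley's normalization and the rank-2 SVD projection."""
    pn, T = normalize(p)
    pn_prime, T_prime = normalize(p_prime)
    # Each correspondence gives one row of the linear system A f = 0,
    # obtained by writing p^T F p' = 0 out in the nine entries of F.
    A = np.array([[u * up, u * vp, u,
                   v * up, v * vp, v,
                   up, vp, 1.0]
                  for (u, v, _), (up, vp, _) in zip(pn, pn_prime)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)           # singular vector with the smallest singular value
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0                         # enforce the rank-2 constraint
    F = U @ np.diag(S) @ Vt
    return T.T @ F @ T_prime           # undo the normalization
```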

Adding more views. The text also describes the geometry of 3 or more views. (From Forsyth and Ponce)

How do we use the fundamental matrix? It can tell us where to look for points in the other image. The quantity F^T p is a 3-vector, so (F^T p)^T p' = 0 defines a line in the other image. This tells us where to look for the point p' that corresponds to p. (Demo)
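Using F this way is just a matter of forming that line and measuring how far candidate matches lie from it; a small sketch under the same p^T F p' = 0 convention (helper names are mine):

```python
import numpy as np

def epipolar_line(F, p):
    """Given F and a pixel p = (u, v) in the left image, return the epipolar line
    F^T p in the right image as (a, b, c), meaning a*u' + b*v' + c = 0."""
    return F.T @ np.array([p[0], p[1], 1.0])

def point_line_distance(line, p_prime):
    """Perpendicular distance from a candidate match p' = (u', v') to the epipolar line."""
    a, b, c = line
    return abs(a * p_prime[0] + b * p_prime[1] + c) / np.hypot(a, b)
```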

Triangulating points. If we know the projection matrices M and M' and the locations of corresponding points, we can generate a set of linear constraints and solve them using a linear least-squares technique.
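A minimal sketch of that linear least-squares triangulation (often called the DLT method), assuming M and M_prime are the 3x4 projection matrices and p, p_prime are the matched pixel coordinates:

```python
import numpy as np

def triangulate(M, M_prime, p, p_prime):
    """Linear least-squares triangulation: each image contributes two equations in the
    homogeneous point X, from u*(M X)_3 = (M X)_1 and v*(M X)_3 = (M X)_2."""
    u, v = p
    u2, v2 = p_prime
    A = np.vstack([u * M[2] - M[0],
                   v * M[2] - M[1],
                   u2 * M_prime[2] - M_prime[0],
                   v2 * M_prime[2] - M_prime[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                         # null-space direction of A
    return X[:3] / X[3]                # back to inhomogeneous 3D coordinates
```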

But it can be easier. Suppose the image planes are parallel to each other, and also parallel to the line connecting the two optical centers (the baseline). (Figure: cameras O and O', scene point P, baseline.)

Overhead view. Now triangulation is much easier. (Figure: overhead view of P, O, and O'.)

First, some geometry: similar triangles. (Figures: a sequence of similar-triangle diagrams relating the scene point P, the optical centers O and O', the focal length f, the baseline B, the depth Z, and the disparity d; they lead to the standard relation Z = f B / d.)
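Under this rectified geometry, turning a disparity into a depth is one line; a trivial sketch of the relation above:

```python
def depth_from_disparity(d, f, B):
    """Depth from Z = f * B / d: disparity d and focal length f in pixels,
    baseline B in whatever units the depth should come out in."""
    return f * B / d
```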

Image rectification. All we need to calculate the depth is the distance between corresponding points. This would be easier to implement if the epipolar lines corresponded with scanlines. Image rectification transforms the images so they have this property.

Image rectification. For each image, compute a mapping between the image from the camera and the rectified image. (Figure: cameras O and O', scene point P, baseline.)

Intuitions on rectification. Let's look in 2D. If I change the position of my camera, then where the point projects depends on its depth. (Figure: point P.)

Intuitions on rectification. Again in 2D: after translating the camera, points at different depths project to different locations. (Figure: points P and P' at different depths.)

Intuitions on rectification. But if I just rotate the camera, then I don't need to know the depth. Notice that P' and P still project to the same point in the rotated camera.

Image rectification. I can rotate the views to be parallel to each other and to the baseline. (Figure: cameras O and O', scene point P, baseline.)

This lines up the epipolar lines with the scanlines. (Image from Loop and Zhang)

Rectifying images. The rectifying transformations can be computed using the fundamental matrix.
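One practical way to get such transformations is OpenCV's uncalibrated rectification, which takes the point matches and F and returns a homography for each image; a sketch (the variable names are assumptions, and the point-ordering convention should be checked against the OpenCV documentation):

```python
import cv2

def rectify_pair(img_left, img_right, pts_left, pts_right, F):
    """Warp two images so that epipolar lines become horizontal scanlines,
    using homographies estimated from F and N x 2 arrays of matched points."""
    h, w = img_left.shape[:2]
    ok, H_left, H_right = cv2.stereoRectifyUncalibrated(pts_left, pts_right, F, (w, h))
    if not ok:
        raise RuntimeError("rectification failed")
    rect_left = cv2.warpPerspective(img_left, H_left, (w, h))
    rect_right = cv2.warpPerspective(img_right, H_right, (w, h))
    return rect_left, rect_right
```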

Interesting connections to the fundamental matrix. After rectification, corresponding points lie on the same scanline, so the fundamental matrix should have the form (up to scale) [[0, 0, 0], [0, 0, 1], [0, -1, 0]], for which p^T F p' = v - v'.

Implication. With F in this form, the epipolar constraint reduces to v = v': the epipolar lines are horizontal.

Where are the epipoles? In this case, the epipole is at infinity. In homogeneous coordinates, this is [1 0 0]. (Figure: cameras O and O', scene point P, baseline.)

Now to the hard problem. Once we have rectified images, the hard problem is finding the corresponding points; then we can triangulate. This is known as the correspondence problem. When people say "stereo algorithm", they usually mean a correspondence algorithm.

A very simple algorithm. To find the point that matches a given point, look at a square window around that point and compare it to windows in the other image. We could measure the sum of squared differences (SSD) or the correlation.
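A sketch of this window-based search along a scanline, assuming the images are already rectified (array and function names are mine):

```python
import numpy as np

def ssd_match(left, right, row, col, half_win, max_disp):
    """For the pixel (row, col) in the rectified left image, slide a square window along
    the same scanline of the right image and return the disparity with the smallest SSD.
    Assumes (row, col) is at least half_win pixels away from the image borders."""
    patch = left[row - half_win:row + half_win + 1,
                 col - half_win:col + half_win + 1].astype(float)
    best_d, best_ssd = 0, np.inf
    for d in range(max_disp + 1):
        c = col - d                    # the match is shifted to the left in the right image
        if c - half_win < 0:
            break
        cand = right[row - half_win:row + half_win + 1,
                     c - half_win:c + half_win + 1].astype(float)
        ssd = np.sum((patch - cand) ** 2)
        if ssd < best_ssd:
            best_ssd, best_d = ssd, d
    return best_d
```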

What's wrong with this algorithm? A physically implausible set of correspondences doesn't bother it, since each match is found independently, and areas of the image without texture will be a problem for it.

To do better we need a better model of images. We can make reasonable assumptions about the surfaces in the world; usually we assume that the surfaces are smooth. We can then pose the problem of finding the corresponding points as an energy (or cost) minimization. The data term measures how well the local windows match up for different disparities.

The smoothness cost. The smoothness is usually implemented by penalizing differences in the disparity between neighboring pixels. Different penalty functions, such as x^2 or log(1 + x^2), lead to different assumptions about the surfaces. (Figure: plots of the penalty functions and their derivatives.)
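A sketch of such an energy, assuming data_cost is a precomputed H x W x D array of window-matching costs (e.g., the SSD values from above) and lam weights the smoothness term; a squared-difference penalty is used here just as one example:

```python
import numpy as np

def stereo_energy(disparity, data_cost, lam):
    """Energy of an integer disparity map: the matching cost of each pixel at its assigned
    disparity, plus lam times a smoothness term that penalizes squared disparity
    differences between horizontal and vertical neighbours."""
    rows, cols = np.indices(disparity.shape)
    data = data_cost[rows, cols, disparity].sum()
    d = disparity.astype(float)
    smooth = (np.diff(d, axis=0) ** 2).sum() + (np.diff(d, axis=1) ** 2).sum()
    return data + lam * smooth
```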

Pitfalls in stereo: foreshortening. The projected size of a patch of the surface can change significantly between the two views. (Image from Forsyth and Ponce)

Pitfalls in stereo: occlusion. We don't always see the same points in both images.

So, which algorithm works best? See www.middlebury.edu/stereo. The results are a combination of the model and the optimization algorithm. The best results are obtained from algorithms using a model known as a Markov Random Field, which will be our next topic.