Multiple View Geometry

Multiple View Geometry CS 6320, Spring 2013 Guest Lecture Marcel Prastawa adapted from Pollefeys, Shah, and Zisserman

Single view computer vision: the projective action of cameras, camera calibration, photometric stereo (geometrically single view, with multiple lightings).

Multi-view computer vision: two (or more) images, acquired either with a stereo rig consisting of two cameras (the two images are acquired simultaneously) or with a single moving camera viewing a static scene (the two images are acquired sequentially). The two scenarios are geometrically equivalent.

Examples: a stereo head; a camera on a mobile vehicle.

Imaging geometry: central projection. The camera centre, image point and scene point are collinear, so an image point back-projects to a ray in 3-space; the depth of the scene point along this ray is unknown. (Figure: camera centre, image plane, image point, scene point.)

The objective: given two images of a scene acquired by known cameras, compute the 3D position of the scene points (structure recovery). Basic principle: triangulate from corresponding image points, determining each 3D point as the intersection of two back-projected rays.

Triangulation: corresponding points are images of the same scene point. The rays back-projected from the two camera centres C and C′ intersect at the 3D scene point.

An algorithm for stereo reconstruction 1. For each point in the first image determine the corresponding point in the second image (this is a search problem) 2. For each pair of matched points determine the 3D point by triangulation (this is an estimation problem)
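
As a concrete illustration of step 2, here is a minimal sketch (Python/NumPy, not from the original slides) of linear triangulation by the direct linear transform: given two 3×4 camera matrices and a pair of corresponding homogeneous image points, it assembles the usual linear system and solves it by SVD. The function name and arguments are placeholders chosen for this sketch.

import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a single point.
    P1, P2: 3x4 camera matrices; x1, x2: homogeneous image points (3-vectors).
    Returns the homogeneous 3D point (4-vector)."""
    # Each view contributes two linear equations in the unknown point X:
    #   x * (P[2] . X) - (P[0] . X) = 0,   y * (P[2] . X) - (P[1] . X) = 0
    A = np.vstack([
        x1[0] / x1[2] * P1[2] - P1[0],
        x1[1] / x1[2] * P1[2] - P1[1],
        x2[0] / x2[2] * P2[2] - P2[0],
        x2[1] / x2[2] * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X / X[3]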

The correspondence problem: given a point x in one image, find the corresponding point in the other image (the figure shows an example with a pure translation between views). This appears to be a 2D search problem, but it is reduced to a 1D search by the epipolar constraint.

Notation: the two cameras are P and P′, and a 3D point X is imaged as x = PX in the first view and x′ = P′X in the second. Warning: for equations involving homogeneous quantities, '=' means equal up to scale.

Epipolar geometry: given an image point in one view, where is the corresponding point in the other view? A point in one view generates an epipolar line in the other view, and the corresponding point lies on this line. (Figure: camera centres C and C′ joined by the baseline, the epipole, and an epipolar line.)

Epipolar constraint: reduces the correspondence problem to a 1D search along an epipolar line.

Epipolar geometry continued: epipolar geometry is a consequence of the coplanarity of the camera centres and the scene point X. The camera centres C and C′, the corresponding points x and x′, and the scene point lie in a single plane, known as the epipolar plane.

Nomenclature: the epipolar line l′ in the right image is the image of the ray through x. The epipole e is the point of intersection of the line joining the camera centres with the image plane; this line is the baseline for a stereo rig, and the translation vector for a moving camera. The epipole is the image of the centre of the other camera: e = PC′, e′ = P′C. (Figure: left and right epipolar lines, points x and x′, epipoles e and e′, scene point X.)

The epipolar pencil: as the position of the 3D point X varies, the epipolar planes rotate about the baseline. This family of planes is known as an epipolar pencil (a pencil is a one-parameter family). All epipolar lines intersect at the epipole.

Epipolar geometry example I: parallel cameras. Epipolar geometry depends only on the relative pose (position and orientation) and internal parameters of the two cameras, i.e. the position of the camera centres and image planes. It does not depend on the scene structure (3D points external to the cameras).

Epipolar geometry example II: converging cameras. Note that epipolar lines are in general not parallel; they intersect at the epipoles e and e′.

Homogeneous notation for lines

The line l through the two points p and q is l = p × q. The intersection of two lines l and m is the point x = l × m. Example: compute the point of intersection of the two lines l and m in the figure below; the result is the point (2, 1).
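
The exact lines drawn in the figure are not recoverable from the transcription, so as an illustrative stand-in the short NumPy sketch below uses the lines x = 2 and y = 1, which do intersect at (2, 1) as stated on the slide:

import numpy as np

# Line x = 2 passes through the points (2, 0) and (2, 1):
l = np.cross([2, 0, 1], [2, 1, 1])   # line through two points: l = p x q
# Line y = 1 passes through the points (0, 1) and (1, 1):
m = np.cross([0, 1, 1], [1, 1, 1])   # line through two points

x = np.cross(l, m)                   # intersection of two lines: x = l x m
x = x / x[2]                         # dehomogenize
print(x[:2])                         # -> [2. 1.], i.e. the point (2, 1)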

Matrix representation of the vector cross product: a × b = [a]× b, where [a]× is the skew-symmetric matrix with rows (0, −a₃, a₂), (a₃, 0, −a₁), (−a₂, a₁, 0).

Example: compute the cross product of l and m

Algebraic representation of epipolar geometry: we know that the epipolar geometry defines a mapping x → l′, from a point in the first image to the corresponding epipolar line in the second image.

Derivation of the algebraic expression, outline. Step 1: for a point x in the first image, back-project a ray with camera P. Step 2: choose two points on the ray and project them into the second image with camera P′. Step 3: compute the line through the two image points using the relation l′ = p × q.

Choose camera matrices: P = K[I | 0] for the first camera, with the world coordinate frame aligned with the first camera (i.e. the first camera defines a reference frame), and P′ = K′[R | t] for the second camera, where K and K′ are the internal calibration matrices and R, t are the rotation and translation from the world to the second camera's coordinate frame.

Step 1: for a point x in the first image, back-project a ray with camera P. The point x back-projects to the ray X(Z) = (Z K⁻¹x, 1), where Z is the point's depth, since X(Z) satisfies P X(Z) = Z x, i.e. it projects to x up to scale.

Step 2: choose two points on the ray and project them into the second image with camera P′. Consider two points on the ray: Z = 0 gives the camera centre (0, 0, 0, 1), and Z = ∞ gives the point at infinity (K⁻¹x, 0). Projecting these two points into the second view gives P′(0, 0, 0, 1)ᵀ = K′t and P′(K⁻¹x, 0)ᵀ = K′RK⁻¹x.

Step 3: compute the line through the two image points using the relation l′ = p × q: l′ = (K′t) × (K′RK⁻¹x). Using the identity a × b = [a]× b, this becomes l′ = [K′t]× K′RK⁻¹ x = F x, where F = [K′t]× K′RK⁻¹ is the fundamental matrix.
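
A minimal sketch of this final formula, assuming the calibration matrices and the relative pose (R, t) of the second camera are known; the function and argument names are placeholders for this sketch, not notation from the slides:

import numpy as np

def skew(a):
    """Skew-symmetric matrix [a]_x such that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

def fundamental_from_calibration(K1, K2, R, t):
    """F = [K2 t]_x K2 R K1^-1, for cameras P = K1[I|0] and P' = K2[R|t]."""
    e2 = K2 @ t                          # epipole in the second image, e' = P'C
    F = skew(e2) @ K2 @ R @ np.linalg.inv(K1)
    return F / np.linalg.norm(F)         # F is only defined up to scale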

Example I: compute the fundamental matrix for a parallel camera stereo rig (identical cameras with focal length f, translated along the X axis). The formula reduces to F = [0 0 0; 0 0 −1; 0 1 0] up to scale, and the constraint x′ᵀ F x = 0 reduces to y = y′, i.e. raster correspondence (horizontal scan-lines).
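
A quick numeric check of this reduction (the point coordinates below are arbitrary illustrative values, not from the slide):

import numpy as np

# Fundamental matrix of a parallel (rectified) stereo rig, up to scale.
F_parallel = np.array([[0.0, 0.0, 0.0],
                       [0.0, 0.0, -1.0],
                       [0.0, 1.0, 0.0]])

x  = np.array([3.0, 2.0, 1.0])   # point (3, 2) in the first image
xp = np.array([5.0, 2.0, 1.0])   # corresponding point (5, 2), same row y = 2
print(xp @ F_parallel @ x)       # -> 0.0, because y' = y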

Geometric interpretation? (Figure: the parallel camera rig with axes X, Y, Z and focal length f.)

Example II: compute F for a forward translating camera (translation along the Z axis, focal length f).

(Figures: first and second images from the forward translating camera; the epipolar lines in each image meet at the epipole e.)

Summary: properties of the fundamental matrix. F is a 3×3 matrix of rank 2 (7 degrees of freedom); corresponding points satisfy x′ᵀ F x = 0; the epipolar lines are l′ = F x and l = Fᵀ x′; the epipoles are the null vectors, F e = 0 and Fᵀ e′ = 0.
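
A small sketch of how these properties can be used in code: map a point to its epipolar line in the other image, and recover the epipoles as the null vectors of F via SVD. The names below are placeholders for this sketch.

import numpy as np

def epipolar_line(F, x):
    """Epipolar line l' = F x in the second image for a point x in the first."""
    return F @ x

def epipoles(F):
    """Epipoles as the right and left null vectors of the rank-2 matrix F."""
    _, _, Vt = np.linalg.svd(F)
    e = Vt[-1]                    # F e = 0: epipole in the first image
    _, _, Vt2 = np.linalg.svd(F.T)
    e_prime = Vt2[-1]             # F^T e' = 0: epipole in the second image
    return e, e_prime             # homogeneous; may be at infinity (third coord 0)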

The Essential Matrix (F&P chapter 6) Algebraic setup:

The Essential Matrix: Equation

The Essential Matrix: final form. E = [t]× R, relating the images of a point in the two (calibrated) cameras given the rotation and translation between them: corresponding points in normalized image coordinates satisfy x̂′ᵀ E x̂ = 0.

Essential (E) vs Fundamental (F) matrix: F encodes both intrinsic and extrinsic parameters, while E encodes only the extrinsic ones (R and t). Computing E therefore requires knowing the calibration of both cameras; no calibration is needed for F. Both matrices map a point in one image to the corresponding epipolar line in the other, E in normalized (calibrated) coordinates and F in pixel coordinates.
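
The two matrices are related through the calibration matrices: substituting x = K x̂ and x′ = K′ x̂′ into x̂′ᵀ E x̂ = 0 gives F = K′⁻ᵀ E K⁻¹ and, conversely, E = K′ᵀ F K. A minimal sketch with placeholder names:

import numpy as np

def essential_from_fundamental(F, K1, K2):
    """E = K2^T F K1 (calibrated from uncalibrated)."""
    return K2.T @ F @ K1

def fundamental_from_essential(E, K1, K2):
    """F = K2^-T E K1^-1 (uncalibrated from calibrated)."""
    return np.linalg.inv(K2).T @ E @ np.linalg.inv(K1)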

An algorithm for stereo reconstruction 1. For each point in the first image determine the corresponding point in the second image (this is a search problem) 2. For each pair of matched points determine the 3D point by triangulation (this is an estimation problem)

Epipolar constraint: reduces the correspondence problem to a 1D search along an epipolar line.

Algebraic representation of epipolar geometry: the epipolar geometry defines a mapping x → l′, from a point in the first image to the corresponding epipolar line in the second image. The map depends only on the cameras P and P′ (not on the scene structure), and, as derived above, it is linear and can be written as l′ = F x.

Stereo correspondence algorithms

Problem statement. Given two images and their associated cameras, compute corresponding image points. Algorithms may be classified into two types: 1. Dense: compute a correspondence at every pixel. 2. Sparse: compute correspondences only for features. The methods may be top-down or bottom-up.

Top-down matching: 1. Group the model (house, windows, etc.) independently in each image. 2. Match points (vertices) between images.

Bottom-up matching: epipolar geometry reduces the correspondence search from 2D to a 1D search along corresponding epipolar lines. (Figure: the 1D correspondence problem between scene points A, B, C and their images a, b, c and a′, b′, c′.)

Stereograms Invented by Sir Charles Wheatstone, 1838

Red/green stereograms

Random dot stereograms

Autostereograms: www.magiceye.com

Correspondence algorithms. Algorithms may be top-down or bottom-up; random dot stereograms are an existence proof that bottom-up algorithms are possible. From here on we consider only bottom-up algorithms. Algorithms may be classified into two types: 1. Dense: compute a correspondence at every pixel. 2. Sparse: compute correspondences only for features.

Example image pair parallel cameras

First image

Second image

Dense correspondence algorithm, parallel camera example: epipolar lines are corresponding rasters. Search problem (geometric constraint): for each point in the left image, the corresponding point in the right image lies on the epipolar line (1D ambiguity). Disambiguating assumption (photometric constraint): the intensity neighbourhoods of corresponding points are similar across images. Measure the similarity of neighbourhood intensities by cross-correlation.

Intensity profiles Clear correspondence between intensities, but also noise and ambiguity

Normalized cross-correlation: write the two regions A and B as vectors a and b; the score is the normalized inner product a·b / (‖a‖ ‖b‖).

Cross-correlation of neighbourhood regions along the epipolar line; the intensities are first translated so that their mean is zero.
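
A minimal sketch of this zero-mean normalized cross-correlation score between two equally sized patches (NumPy; the function name and the small epsilon guard are choices made for this sketch, not from the lecture):

import numpy as np

def ncc(region_a, region_b, eps=1e-8):
    """Zero-mean normalized cross-correlation of two equally sized patches.
    Returns a score in [-1, 1]; 1 means identical up to gain and offset."""
    a = region_a.astype(float).ravel()
    b = region_b.astype(float).ravel()
    a -= a.mean()                        # translate so that the mean is zero
    b -= b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + eps))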

(Plot: a left image band, the corresponding right image band, and their cross-correlation as a function of x.)

(Plot: a target region in the left image band, the corresponding right image band, and their cross-correlation as a function of x; here the correlation does not single out the correct match.)

Why is cross-correlation such a poor measure in the second case? 1. The neighbourhood region does not have a distinctive spatial intensity distribution. 2. Foreshortening effects: a fronto-parallel surface is imaged with the same length in both views, whereas a slanting surface is imaged with different lengths.

Limitations of similarity constraint Textureless surfaces Occlusions, repetition Non-Lambertian surfaces, specularities

Results with window search. (Figure panels: data, window-based matching, ground truth.)

Sketch of a dense correspondence algorithm: for each pixel in the left image, compute the neighbourhood cross-correlation along the corresponding epipolar line in the right image; the corresponding pixel is the one with the highest cross-correlation. Parameters: the size (scale) of the neighbourhood and the search disparity. Other constraints: uniqueness, ordering, smoothness of the disparity field. Applicability: textured scenes that are largely fronto-parallel.
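
Putting the pieces together, here is a very small sketch of this algorithm for a rectified (parallel camera) pair, reusing the ncc helper sketched earlier; the window size and disparity range correspond to the parameters above, and all names are placeholders rather than code from the lecture:

import numpy as np

def dense_disparity(left, right, window=5, max_disparity=64):
    """Brute-force window matching along corresponding rasters of a rectified pair.
    left, right: greyscale images as 2D arrays of equal shape."""
    h, w = left.shape
    r = window // 2
    disparity = np.zeros((h, w))
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = left[y - r:y + r + 1, x - r:x + r + 1]
            best_score, best_d = -np.inf, 0
            # 1D search along the same raster in the right image.
            for d in range(0, min(max_disparity, x - r) + 1):
                cand = right[y - r:y + r + 1, x - d - r:x - d + r + 1]
                score = ncc(patch, cand)   # zero-mean NCC from the earlier sketch
                if score > best_score:
                    best_score, best_d = score, d
            disparity[y, x] = best_d
    return disparity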