Multi-view geometry
Structure from motion. [Figure: three cameras with poses (R1, t1), (R2, t2), (R3, t3) observing a scene. Figure credit: Noah Snavely]
Structure from motion: structure. [Figure: cameras with known poses (R1, t1), (R2, t2), (R3, t3) observing an unknown 3D point.] Structure: given known cameras and projections of the same 3D point in two or more images, compute the 3D coordinates of that point.
Structure from motion: motion. Motion: given a set of known 3D points seen by a camera, compute the camera parameters.
Structure from motion: bootstrapping the process. Given a set of 2D point correspondences in two images, compute the camera parameters.
Two-view geometry
Epipolar geometry. Baseline: the line connecting the two camera centers. Epipolar plane: a plane containing the baseline (1D family). Epipoles: intersections of the baseline with the image planes = projections of the other camera center = vanishing points of the motion direction.
The Epipole Photo by Frank Dellaert
Epipolar geometry (continued). Epipolar lines: intersections of an epipolar plane with the image planes (they always come in corresponding pairs).
Example 1: Converging cameras
Example 2: Motion parallel to the image plane
Example 3: Motion perpendicular to the image plane. The epipole coincides with the principal point and is the focus of expansion.
Motion perpendicular to image plane http://vimeo.com/48425421
Epipolar constraint. If we observe a point x in one image, where can the corresponding point x' be in the other image?
Epipolar constraint. Potential matches for x have to lie on the corresponding epipolar line l'. Potential matches for x' have to lie on the corresponding epipolar line l.
Epipolar constraint example
Epipolar constraint: Calibrated case. The intrinsic and extrinsic parameters of the cameras are known, and the world coordinate system is set to that of the first camera. Then the projection matrices are given by K[I 0] and K'[R t]. We can multiply the projection matrices (and the image points) by the inverses of the calibration matrices to get normalized image coordinates:

x_norm = K^{-1} x_pixel = [I 0] X,    x'_norm = K'^{-1} x'_pixel = [R t] X
Epipolar constraint: Calibrated case. In normalized coordinates, take the 3D point as X = (x, 1)^T, so that x = [I 0] X in the first camera and x' = [R t] X = Rx + t in the second. The vectors Rx, t, and x' are coplanar.
Epipolar constraint: Calibrated case. Coplanarity of x', t, and Rx means x' · [t × (Rx)] = 0, which can be written as x'^T [t]_× R x = 0. Recall the matrix form of the cross product:

a × b = [a]_× b, where [a]_× = [[0, −a_z, a_y], [a_z, 0, −a_x], [−a_y, a_x, 0]]

The vectors Rx, t, and x' are coplanar.
Epipolar constraint: Calibrated case. x' · [t × (Rx)] = 0 ⟹ x'^T [t]_× R x = 0 ⟹ x'^T E x = 0, where E = [t]_× R is the essential matrix (Longuet-Higgins, 1981). The vectors Rx, t, and x' are coplanar.
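The derivation above can be checked numerically. Below is a minimal sketch (not from the slides; the pose is a made-up random configuration) that builds E = [t]_× R and verifies the constraint for a synthetic point:

```python
import numpy as np

def skew(t):
    """[t]_x: the matrix with skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0,  -t[2],  t[1]],
                     [t[2],  0.0,  -t[0]],
                     [-t[1], t[0],  0.0]])

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q * np.sign(np.linalg.det(Q))        # random rotation, det = +1
t = rng.standard_normal(3)               # random translation

E = skew(t) @ R                          # essential matrix E = [t]_x R

# Synthetic 3D point in the first camera's frame (in front of the camera).
X = np.array([0.4, -0.1, 3.0])
x  = X                                   # x  = [I 0] (X, 1)^T  (normalized coords)
xp = R @ X + t                           # x' = [R t] (X, 1)^T

print(xp @ E @ x)                        # ~0: the epipolar constraint holds
```

The constraint is invariant to the scale of x and x', which is why the un-normalized depth-scaled vectors can be used directly.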
Epipolar constraint: Calibrated case. x'^T E x = 0. E x is the epipolar line associated with x (l' = E x). Recall: a line is given by ax + by + c = 0, i.e., l^T x = 0, where l = (a, b, c)^T and x = (x, y, 1)^T.
Epipolar constraint: Calibrated case. x'^T E x = 0. E x is the epipolar line associated with x (l' = E x). E^T x' is the epipolar line associated with x' (l = E^T x'). E e = 0 and E^T e' = 0. E is singular (rank two). E has five degrees of freedom.
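The epipoles fall out of the SVD of E as its right and left null vectors. A small sketch (illustrative, not from the slides): for E = [t]_× R, the second epipole e' is parallel to t, since the first camera's center projects along t in the second view.

```python
import numpy as np

def skew(t):
    return np.array([[0.0,  -t[2],  t[1]],
                     [t[2],  0.0,  -t[0]],
                     [-t[1], t[0],  0.0]])

def epipoles(E):
    """Null vectors of E: e with E e = 0, and e' with E^T e' = 0."""
    U, _, Vt = np.linalg.svd(E)
    return Vt[-1], U[:, -1]              # e (right null), e' (left null)

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q * np.sign(np.linalg.det(Q))
t = rng.standard_normal(3)

e, ep = epipoles(skew(t) @ R)
print(np.cross(ep, t / np.linalg.norm(t)))   # ~0: e' is parallel to t
```

Similarly e is parallel to R^T t, the direction of the second camera's center as seen from the first.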
Epipolar constraint: Uncalibrated case. The calibration matrices K and K' of the two cameras are unknown. We can write the epipolar constraint in terms of unknown normalized coordinates: x̂'^T E x̂ = 0, where x̂ = K^{-1} x and x̂' = K'^{-1} x'.
Epipolar constraint: Uncalibrated case. x̂'^T E x̂ = 0 ⟹ x'^T F x = 0, with F = K'^{-T} E K^{-1} and x̂ = K^{-1} x, x̂' = K'^{-1} x'. F is the fundamental matrix (Faugeras and Luong, 1992).
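The relationship F = K'^{-T} E K^{-1} can be verified numerically. In this sketch the intrinsics K and K' (focal lengths and principal points) are made up for illustration:

```python
import numpy as np

def skew(t):
    return np.array([[0.0,  -t[2],  t[1]],
                     [t[2],  0.0,  -t[0]],
                     [-t[1], t[0],  0.0]])

# Hypothetical calibration matrices (values are made up).
K  = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
Kp = np.array([[700.0, 0.0, 300.0], [0.0, 700.0, 200.0], [0.0, 0.0, 1.0]])

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q * np.sign(np.linalg.det(Q))
t = rng.standard_normal(3)

E = skew(t) @ R
F = np.linalg.inv(Kp).T @ E @ np.linalg.inv(K)   # F = K'^{-T} E K^{-1}

# Project a 3D point with P = K[I 0] and P' = K'[R t]; the pixel
# coordinates must satisfy x'^T F x = 0.
X = np.array([0.3, -0.2, 4.0])
x  = K @ X
xp = Kp @ (R @ X + t)
print(xp @ F @ x)                                # ~0
```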
Epipolar constraint: Uncalibrated case. x̂'^T E x̂ = 0 ⟹ x'^T F x = 0, with F = K'^{-T} E K^{-1}. F x is the epipolar line associated with x (l' = F x). F^T x' is the epipolar line associated with x' (l = F^T x'). F e = 0 and F^T e' = 0. F is singular (rank two). F has seven degrees of freedom.
Estimating the fundamental matrix
The eight-point algorithm. With x = (u, v, 1)^T and x' = (u', v', 1)^T, the constraint x'^T F x = 0 expands to

[u'u  u'v  u'  v'u  v'v  v'  u  v  1] · (f_11, f_12, f_13, f_21, f_22, f_23, f_31, f_32, f_33)^T = 0

Solve the homogeneous linear system using eight or more matches, then enforce the rank-2 constraint (take the SVD of F and set the smallest singular value to zero).
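A minimal NumPy sketch of this procedure (my own illustrative implementation; the synthetic check at the end uses a made-up camera configuration):

```python
import numpy as np

def eight_point(x1, x2):
    """Linear eight-point estimate of F from N >= 8 correspondences.

    x1, x2: (N, 2) arrays of matching pixel coordinates, so that
    (u', v', 1) F (u, v, 1)^T = 0 for each pair.
    """
    u, v = x1[:, 0], x1[:, 1]
    up, vp = x2[:, 0], x2[:, 1]
    # Each row: [u'u, u'v, u', v'u, v'v, v', u, v, 1].
    A = np.column_stack([up*u, up*v, up, vp*u, vp*v, vp, u, v, np.ones_like(u)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)          # null vector of A as a 3x3 matrix
    # Enforce rank 2: zero the smallest singular value.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt

# Synthetic check with a made-up two-camera configuration.
def _skew(t):
    return np.array([[0.0, -t[2], t[1]], [t[2], 0.0, -t[0]], [-t[1], t[0], 0.0]])

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q * np.sign(np.linalg.det(Q))                 # random rotation
t = rng.standard_normal(3)
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
F_true = np.linalg.inv(K).T @ _skew(t) @ R @ np.linalg.inv(K)

X = rng.uniform([-1.0, -1.0, 3.0], [1.0, 1.0, 6.0], size=(20, 3))
x1 = (K @ X.T).T
x1 = x1[:, :2] / x1[:, 2:]
x2 = (K @ (X @ R.T + t).T).T
x2 = x2[:, :2] / x2[:, 2:]

F = eight_point(x1, x2)
Fn, Ft = F / np.linalg.norm(F), F_true / np.linalg.norm(F_true)
err = min(np.linalg.norm(Fn - Ft), np.linalg.norm(Fn + Ft))
print(err)   # small: F is recovered up to scale from noise-free matches
```

With noise-free matches this already recovers F up to scale; with real, noisy data the normalized variant on the following slides should be used.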
Problem with the eight-point algorithm. The entries of the data vector [u'u  u'v  u'  v'u  v'v  v'  u  v  1] have very different magnitudes (products of pixel coordinates next to ones), which leads to poor numerical conditioning. This can be fixed by rescaling the data.
The normalized eight-point algorithm (Hartley, 1995). Center the image data at the origin, and scale it so that the mean squared distance between the origin and the data points is 2 pixels. Use the eight-point algorithm to compute F from the normalized points. Enforce the rank-2 constraint (for example, take the SVD of F and set the smallest singular value to zero). Transform the fundamental matrix back to the original units: if T and T' are the normalizing transformations in the two images, then the fundamental matrix in the original coordinates is T'^T F T.
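The centering and scaling step might look like this (an illustrative helper; the name `normalize_points` is mine):

```python
import numpy as np

def normalize_points(x):
    """Hartley normalization: translate points to zero centroid and scale
    so the mean squared distance to the origin is 2.

    x: (N, 2) array of pixel coordinates.
    Returns (normalized homogeneous points (N, 3), transform T (3, 3)).
    """
    c = x.mean(axis=0)
    msd = ((x - c) ** 2).sum(axis=1).mean()   # mean squared distance
    s = np.sqrt(2.0 / msd)
    T = np.array([[s,   0.0, -s * c[0]],
                  [0.0, s,   -s * c[1]],
                  [0.0, 0.0, 1.0]])
    x_h = np.column_stack([x, np.ones(len(x))])
    return x_h @ T.T, T

pts = np.array([[100.0, 200.0], [300.0, 250.0], [320.0, 240.0], [50.0, 400.0]])
xn, T = normalize_points(pts)
print(xn[:, :2].mean(axis=0), (xn[:, :2] ** 2).sum(axis=1).mean())  # ~[0 0], ~2.0
```

After estimating F̂ from the points normalized by T and T' in the two images, the last step above (F = T'^T F̂ T) maps it back to the original pixel coordinates.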
Seven-point algorithm. Set up a least squares system with seven pairs of correspondences and solve for the null space (two vectors) using SVD. Then solve for the linear combination of the null space vectors that satisfies det(F) = 0. Source: D. Hoiem
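A sketch of this procedure (my own implementation; rather than expanding det(F) symbolically, it fits the cubic det(a·F1 + (1 − a)·F2) through four sample values and solves for its real roots):

```python
import numpy as np

def seven_point(x1, x2):
    """Seven-point estimate: returns 1-3 candidate fundamental matrices.

    x1, x2: (7, 2) arrays of matching pixel coordinates.
    """
    u, v = x1[:, 0], x1[:, 1]
    up, vp = x2[:, 0], x2[:, 1]
    A = np.column_stack([up*u, up*v, up, vp*u, vp*v, vp, u, v, np.ones(7)])
    _, _, Vt = np.linalg.svd(A)
    F1, F2 = Vt[-1].reshape(3, 3), Vt[-2].reshape(3, 3)  # null-space basis
    # det(a*F1 + (1-a)*F2) is a cubic in a: fit it through four samples.
    a = np.array([-1.0, 0.0, 1.0, 2.0])
    d = [np.linalg.det(ai * F1 + (1 - ai) * F2) for ai in a]
    coeffs = np.polyfit(a, d, 3)
    sols = []
    for r in np.roots(coeffs):
        if abs(r.imag) < 1e-8:                 # keep real roots only
            F = r.real * F1 + (1 - r.real) * F2
            sols.append(F / np.linalg.norm(F))
    return sols

# Demo on seven exact correspondences from a made-up two-camera setup.
rng = np.random.default_rng(5)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q * np.sign(np.linalg.det(Q))
t = rng.standard_normal(3)
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])

X = rng.uniform([-1.0, -1.0, 3.0], [1.0, 1.0, 6.0], size=(7, 3))
x1 = (K @ X.T).T
x1 = x1[:, :2] / x1[:, 2:]
x2 = (K @ (X @ R.T + t).T).T
x2 = x2[:, :2] / x2[:, 2:]

sols = seven_point(x1, x2)
print(len(sols))   # 1 to 3 candidate matrices
```

Every candidate satisfies all seven epipolar constraints and has det(F) ≈ 0; disambiguating among them requires extra correspondences or a robust scheme such as RANSAC.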
Nonlinear estimation. Linear estimation minimizes the sum of squared algebraic distances between points x'_i and epipolar lines F x_i (or points x_i and epipolar lines F^T x'_i):

Σ_{i=1}^{N} (x'_i^T F x_i)^2

The nonlinear approach instead minimizes the sum of squared geometric distances:

Σ_{i=1}^{N} [ d^2(x'_i, F x_i) + d^2(x_i, F^T x'_i) ]
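The geometric distance terms are easy to write down. A minimal sketch (helper names are mine; a full nonlinear fit would pass the stacked residuals to an optimizer, e.g. scipy.optimize.least_squares):

```python
import numpy as np

def dist2_point_line(x_h, l):
    """Squared Euclidean distance from homogeneous point x_h to the line
    l = (a, b, c), i.e. {(x, y) : a x + b y + c = 0}."""
    return (l @ x_h) ** 2 / (l[0] ** 2 + l[1] ** 2)

def symmetric_epipolar_dist2(F, x1_h, x2_h):
    """One term of the nonlinear objective: d^2(x', F x) + d^2(x, F^T x')."""
    return dist2_point_line(x2_h, F @ x1_h) + dist2_point_line(x1_h, F.T @ x2_h)

# Example with pure translation along the x-axis: F = [t]_x for t = (1, 0, 0),
# so epipolar lines are horizontal and the distance is the row difference.
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
print(symmetric_epipolar_dist2(F, np.array([0.0, 0.0, 1.0]),
                               np.array([5.0, 3.0, 1.0])))  # 9 + 9 = 18.0
```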
Comparison of estimation algorithms:

              8-point      Normalized 8-point   Nonlinear least squares
Av. Dist. 1   2.33 pixels  0.92 pixel           0.86 pixel
Av. Dist. 2   2.18 pixels  0.85 pixel           0.80 pixel
The Fundamental Matrix Song http://danielwedge.com/matrix/
From epipolar geometry to camera calibration. Estimating the fundamental matrix is known as weak calibration. If we know the calibration matrices of the two cameras, we can estimate the essential matrix: E = K'^T F K. The essential matrix gives us the relative rotation and translation between the cameras, i.e., their extrinsic parameters. Alternatively, if the calibration matrices are known, the five-point algorithm can be used to estimate the relative camera pose.