MAPI Computer Vision. Multiple View Geometry


Geometry of Multiple Views: 2- and 3-view geometry. Perspective projection: λp = K p̂ = K [R t] P, where K holds the camera's internal parameters and [R t] its pose.

Epipolar Geometry. The epipolar geometry is the intrinsic projective geometry between two views. It is independent of scene structure, and depends only on the cameras' internal parameters and relative pose. Consider the images p and p' of a point P observed by two cameras with optical centers O and O'.

These five points all belong to the epipolar plane defined by the two intersecting rays OP and O'P. In particular, the point p' lies on the line l' where this plane and the retina Π' of the second camera intersect. The line l' is the epipolar line associated with the point p, and it passes through the point e' where the baseline joining the optical centers O and O' intersects Π'. Likewise, the point p lies on the epipolar line l associated with the point p', and this line passes through the intersection e of the baseline with the plane Π.

The points e and e' are called the epipoles of the two cameras. The epipole e' is the (virtual) image of the optical center O of the first camera in the image observed by the second camera, and vice versa. If p and p' are images of the same point, then p' must lie on the epipolar line associated with p.

2-Camera Geometry. Given the projection of P in one image, its projection in the other image is constrained to lie on a line, the epipolar line associated with p. This epipolar constraint plays a fundamental role in stereo vision and motion analysis.

The epipoles e and e' lie at the common intersection of the epipolar lines in each image.

Special case: frontoparallel cameras. The epipoles are at infinity.

Fundamental matrix F. The fundamental matrix is the algebraic representation of epipolar geometry. Geometric derivation: the mapping from a point in one image to the corresponding epipolar line in the other image may be decomposed into two steps. First, the point x is mapped to some point x' in the other image lying on the epipolar line l'; this point x' is a potential match for the point x. Second, the epipolar line l' is obtained as the line joining x' to the epipole e'.

In symbols:

x' = H x
l' = e' × x' = [e']× x' = [e']× H x = F x, so F = [e']× H

where [v]× denotes the skew-symmetric cross-product matrix

         [  0   -v3   v2 ]
[v]× =   [  v3   0   -v1 ]
         [ -v2   v1   0  ]

Correspondence condition (map x to l'): the fundamental matrix satisfies, for any pair of corresponding points x and x' in the two images,

x'ᵀ F x = 0, with x = (x, y, 1)ᵀ and x' = (x', y', 1)ᵀ

Written out in the entries fij of F:

x'x f11 + x'y f12 + x' f13 + y'x f21 + y'y f22 + y' f23 + x f31 + y f32 + f33 = 0

Properties of F:
Transpose: if F is the fundamental matrix of the pair of cameras (P, P'), then Fᵀ is the fundamental matrix of the pair in the opposite order (P', P).
Epipolar lines: for any point x in the first image, the corresponding epipolar line is l' = Fx. Similarly, l = Fᵀx' is the epipolar line corresponding to x' in the second image.
The epipoles: for any point x (other than e) the epipolar line l' = Fx contains the epipole e'. Thus e' satisfies e'ᵀ(Fx) = (e'ᵀF)x = 0 for all x. It follows that e'ᵀF = 0, i.e. e' is the left null-vector of F. Similarly Fe = 0, i.e. e is the right null-vector of F.
Degrees of freedom: F has seven degrees of freedom. A 3x3 homogeneous matrix has eight independent ratios; however, F also satisfies the constraint det F = 0, which removes one degree of freedom.
Correlation: F is a correlation, a projective map taking a point to a line. A point x in the first image defines a line in the second, l' = Fx, which is the epipolar line of x. If l and l' are corresponding epipolar lines, then any point x on l is mapped to the same line l'; this means there is no inverse mapping.

Epipolar line equation: l' is the line ax + by + c = 0, i.e. l' = (a, b, c)ᵀ = Fx, and x' belongs to l' exactly when x'ᵀ l' = 0. Explicitly:

a = f11 x + f12 y + f13
b = f21 x + f22 y + f23
c = f31 x + f32 y + f33
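As a concrete numeric check (a minimal sketch in NumPy; the frontoparallel F used here is the standard textbook matrix for a rectified pair, not one computed in these notes), the epipolar line l' = Fx and the constraint x'ᵀ l' = 0 can be evaluated directly:

```python
import numpy as np

# Fundamental matrix of an ideal rectified (frontoparallel) pair:
# the epipolar constraint x'^T F x = 0 reduces to y' = y.
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])

x = np.array([2.0, 3.0, 1.0])        # point in the first image (homogeneous)
l_prime = F @ x                      # epipolar line l' = Fx = (a, b, c)
print(l_prime)                       # (0, -1, 3): the horizontal line y' = 3

x_prime = np.array([5.0, 3.0, 1.0])  # a corresponding point on that line
print(x_prime @ l_prime)             # x'^T l' = x'^T F x = 0
```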

The epipolar line homography. The set of epipolar lines in each of the images forms a pencil of lines passing through the epipole. Such a pencil of lines may be considered a 1-dimensional projective space. It is clear from the figure that corresponding epipolar lines are perspectively related, so there is a homography between the pencil of epipolar lines centred at e in the first view and the pencil centred at e' in the second. A homography between two such 1-dimensional projective spaces has 3 degrees of freedom.

The 7 degrees of freedom of the fundamental matrix can thus be counted as follows: 2 for e, 2 for e', and 3 for the epipolar line homography which maps a line through e to a line through e'.

Each correspondence (x, y) and (x', y') gives one linear equation in the entries of F:

x'x f11 + x'y f12 + x' f13 + y'x f21 + y'y f22 + y' f23 + x f31 + y f32 + f33 = 0

Stacking the equations from n correspondences, and writing f = (f11, f12, f13, f21, f22, f23, f31, f32, f33)ᵀ, gives the homogeneous system A f = 0, where row i of A is

(x'i xi, x'i yi, x'i, y'i xi, y'i yi, y'i, xi, yi, 1)

For a solution to exist, matrix A must have rank at most 8, and if the rank is exactly 8, the solution is unique (up to scale) and can be found by linear methods. If the data is not exact, because of noise in the point coordinates, the rank of A may be greater than 8 (in fact equal to 9, since A has 9 columns). In this case, one finds a least-squares solution. The least-squares solution for f is the singular vector corresponding to the smallest singular value of A; this solution minimizes ||Af|| subject to the condition ||f|| = 1. The algorithm just described is the essence of a method called the 8-point algorithm for computation of the fundamental matrix.

The singularity constraint. An important property of the fundamental matrix is that it is singular, in fact of rank 2. Furthermore, the left and right null-spaces of F are generated by the vectors representing (in homogeneous coordinates) the two epipoles in the two images. If the fundamental matrix is not singular, then the computed epipolar lines are not coincident, as is demonstrated by the figure.

The matrix F found by solving the set of linear equations will not in general have rank 2, and we should take steps to enforce this constraint. The most convenient way to do this is to correct the matrix F found by the SVD solution from A: matrix F is replaced by the matrix F' that minimizes the Frobenius norm ||F - F'|| subject to the condition det F' = 0. (SVD: singular value decomposition, used to solve over-determined systems of equations.)

The 8-point algorithm for computation of the fundamental matrix may thus be formulated as two steps.
Linear solution: a solution f is obtained from the singular vector corresponding to the smallest singular value of A.
Constraint enforcement: replace F by F', the closest singular matrix to F under the Frobenius norm. This correction is done using the SVD.
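The two steps can be sketched in NumPy as follows (a sketch, not a production implementation: the coordinate normalization added by Hartley's normalized 8-point algorithm for numerical stability is omitted, and the synthetic test cameras at the end are illustrative):

```python
import numpy as np

def eight_point(x1, x2):
    """Estimate F from n >= 8 correspondences (n x 2 arrays):
    linear SVD solution, then rank-2 constraint enforcement."""
    n = len(x1)
    A = np.zeros((n, 9))
    for i, ((x, y), (xp, yp)) in enumerate(zip(x1, x2)):
        A[i] = [xp * x, xp * y, xp, yp * x, yp * y, yp, x, y, 1.0]
    # Linear solution: singular vector of the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Constraint enforcement: closest rank-2 matrix in Frobenius norm.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt

# Synthetic check: project random 3D points with two known cameras.
rng = np.random.default_rng(0)
P = rng.uniform(-1, 1, (20, 3)) + [0, 0, 5]   # points in front of the cameras
p1 = P[:, :2] / P[:, 2:]                      # camera 1 at the origin
p2 = (P - [1.0, 0, 0])[:, :2] / P[:, 2:]      # camera 2 translated along x
F = eight_point(p1, p2)
# Every correspondence should satisfy the epipolar constraint x'^T F x = 0.
h1 = np.c_[p1, np.ones(20)]
h2 = np.c_[p2, np.ones(20)]
print(np.max(np.abs(np.sum(h2 @ F * h1, axis=1))))   # ~0
```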

The 8-point algorithm assumes that we know the correspondences between the images. In general, we do not know the correspondences and must find them automatically. Solution: RANSAC (RANdom SAmple Consensus).

Automatic computation of F with RANSAC. Do k times:
» draw a set with the minimum number of correspondences;
» fit F to the set;
» count the number d of correspondences that are closer than a threshold t to the fitted epipolar lines;
» if d > dmin, recompute the fit error using all the correspondences.
Return the best fit found.
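The loop can be sketched as follows (the names fit_F, point_line_dist, and ransac_F are illustrative; the minimal fit is an unnormalized 8-point solve, and the final refit over all inliers stands in for the "recompute fit" step):

```python
import numpy as np

def fit_F(x1, x2):
    # Minimal linear 8-point fit (no normalization), rank-2 enforced.
    A = np.c_[x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
              x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
              x1[:, 0], x1[:, 1], np.ones(len(x1))]
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)
    return U @ np.diag([S[0], S[1], 0.0]) @ Vt

def point_line_dist(F, x1, x2):
    # Distance from each x' to its epipolar line l' = Fx.
    h1 = np.c_[x1, np.ones(len(x1))]
    h2 = np.c_[x2, np.ones(len(x2))]
    l = h1 @ F.T                          # rows: epipolar lines in image 2
    return np.abs(np.sum(h2 * l, axis=1)) / np.hypot(l[:, 0], l[:, 1])

def ransac_F(x1, x2, k=500, t=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(x1), bool)
    for _ in range(k):
        idx = rng.choice(len(x1), 8, replace=False)   # minimal sample
        F = fit_F(x1[idx], x2[idx])
        inliers = point_line_dist(F, x1, x2) < t
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_F(x1[best_inliers], x2[best_inliers]), best_inliers

# Demo: exact correspondences from a translated camera, plus gross outliers.
rng = np.random.default_rng(1)
P = rng.uniform(-1, 1, (30, 3)) + [0, 0, 5]
x1 = P[:, :2] / P[:, 2:]
x2 = (P - [1.0, 0, 0])[:, :2] / P[:, 2:]
x2[:5] += rng.uniform(0.05, 0.2, (5, 2))      # corrupt the first 5 matches
F, inliers = ransac_F(x1, x2, k=200, t=1e-6)
print(inliers.sum())                          # 25: the outliers are rejected
```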

Structure Computation. Back-projecting the rays from the measured image points and intersecting them: triangulation.
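A minimal linear (DLT) triangulation sketch, intersecting the two back-projected rays in a least-squares sense (the camera matrices and the 3-D point below are illustrative):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: each image point gives two rows of a
    homogeneous system whose null vector is the 3-D point."""
    A = np.array([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    X = np.linalg.svd(A)[2][-1]    # null vector of A
    return X[:3] / X[3]            # dehomogenize

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])             # camera at origin
P2 = np.hstack([np.eye(3), -np.array([[1.0, 0, 0]]).T])   # translated camera
X = np.array([0.3, -0.2, 4.0])
x1 = X[:2] / X[2]                      # projection in the first image
x2 = (X - [1, 0, 0])[:2] / X[2]        # projection in the second image
print(triangulate(P1, P2, x1, x2))     # recovers [0.3, -0.2, 4.0]
```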

Stereo Vision. Stereo vision is the process of recovering the three-dimensional location of points in the scene from their projections in images. More precisely, if we have two images I_l and I_r (from the left and right eyes), then given a pixel p_l in the left image and the corresponding pixel p_r in the right image, the coordinates (X, Y, Z) of the corresponding point in space can be computed.

Geometrically, given p_l, we know that the point P lies on the line L_l joining p_l and the left optical center C_l (this line is the viewing ray), although we do not know the distance along this line. Similarly, we know that P lies along the line L_r joining p_r and the right optical center C_r. Knowing the parameters of the cameras exactly (intrinsic and extrinsic), we can explicitly compute the parameters of L_l and L_r. Therefore, we can compute the intersection of the two lines, which is the point P. This procedure is called triangulation.

Two problems must be solved. Correspondence: given a point p_l in one image, find the corresponding point in the other image. Reconstruction: given a correspondence (p_l, p_r), compute the 3-D coordinates of the corresponding point in space, P.

Correspondence. Given p_l, finding the corresponding point p_r involves searching the right image for the location p_r such that the right image around p_r looks like the left image around p_l. Take a small window W around p_l and compare it with the right image at all possible locations; the position p_r that gives the best match is reported. The fundamental operation, therefore, is to compare the pixels in a window W(p_l) with the pixels in a window W(p_r), using e.g. the Sum of Absolute Differences, the Sum of Squared Differences, or Normalized Correlation.

Matching functions (figure).
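These matching functions, and the basic scan along a rectified row, might be sketched as follows (a sketch with illustrative images; window bounds are unchecked; SAD and SSD are minimized, normalized correlation is maximized):

```python
import numpy as np

def sad(w1, w2):   # Sum of Absolute Differences (lower = better match)
    return np.abs(w1 - w2).sum()

def ssd(w1, w2):   # Sum of Squared Differences (lower = better match)
    return ((w1 - w2) ** 2).sum()

def ncc(w1, w2):   # Normalized correlation (higher = better match),
    a = w1 - w1.mean()         # invariant to affine intensity changes
    b = w2 - w2.mean()
    return (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b))

def best_disparity(left, right, row, col, half, max_d):
    """Scan along the row of the rectified right image and return the
    disparity d = x_l - x_r with the smallest SSD between windows."""
    w_l = left[row - half: row + half + 1, col - half: col + half + 1]
    costs = []
    for d in range(max_d + 1):
        c = col - d
        w_r = right[row - half: row + half + 1, c - half: c + half + 1]
        costs.append(ssd(w_l, w_r))
    return int(np.argmin(costs))

rng = np.random.default_rng(2)
left = rng.random((20, 40))
right = np.roll(left, -3, axis=1)     # right view: content shifted by d = 3
print(best_disparity(left, right, row=10, col=20, half=2, max_d=8))  # 3
```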

Rectification. Searching along epipolar lines at arbitrary orientations is expensive; we would rather always search along the rows of the right image. Given the epipolar geometry of the stereo pair, there exists in general a transformation that maps the images into a pair of images whose epipolar lines are parallel to the image rows. This transformation is called rectification. Images are almost always rectified before searching for correspondences, in order to simplify the search. The exception is when the epipole is inside one of the images; in that case, rectification is not possible.

Given a plane P in space, there exist two homographies H_l and H_r that map each image plane onto P. That is, if p_l is a point in the left image, then the corresponding point in P is H_l p_l (in homogeneous coordinates). If we map both images to a common plane P that is parallel to the line C_l C_r, then the resulting pair of virtual (rectified) images is such that the epipolar lines are parallel.

The algorithm for rectification is then:
Select a plane P parallel to C_r C_l.
Define the left and right image coordinate systems on P.
Construct the rectification matrices H_l and H_r from P and the virtual images' coordinate systems.

Disparity. Assuming that the images are rectified, given two corresponding points p_l and p_r, the difference of their coordinates along the epipolar line, x_l - x_r, is called the disparity d. The disparity is the quantity that is directly measured from the correspondence. The corresponding 3-D point P can be computed from p_l and d, assuming that the camera parameters are known.


Recover the 3-D coordinates: two cameras with parallel image planes. Consider a point P with coordinates (X, Y, Z). P is projected to p_l and p_r in the two images. The focal length of the cameras is denoted by f, and the distance between the optical centers (the baseline) is denoted by B. Looking at the diagram, the triangles (C_l, C_r, P) and (p_l, p_r, P) are similar; therefore the ratios of their base to height are equal:

(B - (x_l - x_r)) / (Z - f) = B / Z,  which gives  Z = f B / d

This relation, Z = f B / d, is the fundamental relation of stereo: the depth is inversely proportional to the disparity. Once we know Z, the other two coordinates are derived using the standard perspective equations:

X = x_l Z / f,  Y = y_l Z / f
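A minimal sketch of this reconstruction (the focal length, baseline, and pixel coordinates below are example values):

```python
import numpy as np

def reconstruct(x_l, y_l, x_r, f, B):
    """3-D coordinates from a rectified correspondence.
    f: focal length (pixels), B: baseline, disparity d = x_l - x_r."""
    d = x_l - x_r
    Z = f * B / d
    return np.array([x_l * Z / f, y_l * Z / f, Z])

# A point at Z = 2 m seen with f = 500 px and B = 0.1 m gives d = 25 px.
print(reconstruct(x_l=100.0, y_l=40.0, x_r=75.0, f=500.0, B=0.1))
# -> [0.4, 0.16, 2.0]
```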

Commercial solutions (figures of commercial stereo systems).

How accurately can those coordinates be computed? Consider a matching pair of disparity d corresponding to a depth Z. If we make an error of one pixel (d + 1 instead of d), the depth becomes Z'. We want to evaluate ΔZ, the error in depth due to the error in disparity. Taking the derivative of Z as a function of d, we get:

|ΔZ / Δd| = f B / d² = Z² / (f B)

This final relation illustrates the fundamental trade-off between baseline, focal length, and accuracy of stereo reconstruction. For an error of 1 pixel in disparity, we get an error in Z of:

ΔZ = Z² / (f B)

Depth: the resolution of the stereo reconstruction decreases quadratically with depth. This implies a severe limitation on the applicability of stereo. If we assume sub-pixel disparity interpolation with a resolution Δd, the depth resolution becomes

ΔZ = (Z² / (f B)) Δd

but it still remains quadratic in depth.
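Plugging example numbers into ΔZ = Z²/(fB) makes the quadratic growth concrete (f and B are illustrative values):

```python
# Depth error for a 1-pixel disparity error: dZ = Z^2 / (f * B).
f, B = 500.0, 0.1               # illustrative focal length (px) and baseline (m)
for Z in (1.0, 2.0, 4.0, 8.0):
    print(Z, Z ** 2 / (f * B))  # doubling the depth quadruples the error
```

With these values the error grows from 2 cm at Z = 1 m to 1.28 m at Z = 8 m.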

Baseline: the resolution improves as the baseline increases. We would be tempted to always use as large a baseline as possible. However, matching becomes increasingly difficult as the baseline increases (for a given depth) because of the increasing amount of distortion between the left and the right images. Focal length: the resolution improves with focal length. Intuitively, this is due to the fact that, for a given image size, the density of pixels in the image plane increases as f increases; therefore, the disparity resolution is higher.

The previous discussion assumes that the viewing rays from the left and right cameras intersect exactly. That is not usually the case because of small errors in calibration. The two viewing rays pass close to each other but do not exactly intersect; the point P is then reconstructed as the point that is closest to both lines.

Sub-pixel disparity. The disparity is computed by moving a window one pixel at a time, so the disparity is known only up to one pixel. This limitation on the resolution of the disparity translates into a severe limitation on the accuracy of the recovered 3-D coordinates. One effective way to address this problem is to recover the disparity at a finer resolution by interpolating between the pixel disparities using quadratic interpolation.

Suppose that the best disparity at a pixel is obtained at d_o with a matching value (for example SSD) of S(d_o). We can obtain a second-order approximation of the (unknown) function S(d) by approximating S by a parabola. At the position d_opt corresponding to the bottom of the parabola, we have S(d_opt) <= S(d_o); therefore, d_opt is a better estimate of the disparity than d_o. The question remains how to find this approximating parabola.

Let us first translate all the disparity values so that d_o = 0. The equation of a general parabola is S(d) = a d² + b d + c. To recover the 3 parameters of the parabola we need 3 equations, which we obtain by requiring that the parabola pass through the points at disparity 0, -1, and +1:
S(0) = c
S(1) = a + b + c
S(-1) = a - b + c
Solving this, we obtain:
c = S(0)
a = (S(1) + S(-1) - 2 S(0)) / 2
b = (S(1) - S(-1)) / 2
The bottom of the parabola is obtained at d_opt such that S'(d) = 2 a d + b = 0. Therefore, the optimal disparity is d_opt = -b / (2a).
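The closed-form sub-pixel estimate can be sketched as follows (the costs are sampled from a known parabola, so the exact answer is recoverable):

```python
def subpixel_disparity(s_minus, s0, s_plus, d0):
    """Parabolic interpolation of the matching cost around the best
    integer disparity d0, given the costs at d0-1, d0, d0+1."""
    a = (s_plus + s_minus - 2.0 * s0) / 2.0
    b = (s_plus - s_minus) / 2.0
    return d0 - b / (2.0 * a)

# Costs sampled from S(d) = (d - 3.4)^2 around the integer minimum d0 = 3:
print(subpixel_disparity((2 - 3.4) ** 2, (3 - 3.4) ** 2, (4 - 3.4) ** 2, 3))
# -> 3.4
```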

Matching Confidence. Stereo matching assumes that there is enough information in the images to discriminate between different positions of the matching window. That is not the case in regions of the image in which the intensity is nearly constant. In those regions, there is not a single minimum of the matching function and the disparity cannot be computed. To detect that a disparity estimate is unreliable, we can compute the gradient of the image in the x direction, I_x, and the local average of its squared magnitude, ΣI_x². This quantity can then be used as a measure of the reliability of matching (a confidence measure).
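A sketch of this confidence measure, the window average of the squared horizontal gradient (the test image with a single vertical edge is illustrative):

```python
import numpy as np

def confidence(img, row, col, half):
    """Local average of the squared horizontal gradient: low values flag
    textureless windows where the disparity is unreliable."""
    Ix = np.diff(img.astype(float), axis=1)           # horizontal gradient
    w = Ix[row - half: row + half + 1, col - half: col + half + 1]
    return (w ** 2).mean()

img = np.zeros((20, 20))
img[:, 10:] = 100.0         # flat region on the left, an edge at column 10
print(confidence(img, 10, 3, 2) < confidence(img, 10, 9, 2))  # True
```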

Lighting Issues. A problem is that the lighting conditions may be substantially different between the left and right images, for example because of different exposures or different settings of the cameras. Because the matching function ψ measures the difference in pixel values directly, its value will be corrupted.

The normalized correlation reduces this problem. Another remedy is the Laplacian of Gaussian (LoG): smoothly varying parts of the image do not carry much information for matching; the useful information is contained in higher-frequency variations of intensity.

We therefore want to eliminate the slowly varying (low-frequency) parts of the image, which can be done using second derivatives. We also want to eliminate the high-frequency components, which correspond to noise in the image; this suggests blurring with a Gaussian filter. The combination of the two suggests the use of the Laplacian of Gaussian (LoG), previously defined in the context of edge detection. This filter is technically a band-pass filter (it removes both high and low frequencies). In practice, the images are first convolved with the LoG filter.
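A sampled LoG kernel can be built directly (a sketch, with an unnormalized kernel; making it zero-mean ensures that constant patches give zero response, which is the low-frequency suppression described above):

```python
import numpy as np

def log_kernel(sigma, size):
    """Sampled (unnormalized) Laplacian-of-Gaussian kernel."""
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1].astype(float)
    s2 = sigma * sigma
    g = np.exp(-(x * x + y * y) / (2 * s2))
    k = (x * x + y * y - 2 * s2) / (s2 * s2) * g
    return k - k.mean()          # zero-mean: flat regions map to zero

k = log_kernel(sigma=1.0, size=7)
flat = np.full((7, 7), 50.0)
print(abs((k * flat).sum()) < 1e-8)   # a constant patch gives zero response
```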


Effect of Window Size. Window size has qualitatively the same effect as smoothing. Localization is better with smaller windows; in particular, the disparity image is corrupted less near occluding edges because the match is better localized. Matching is better with larger windows because more pixels are taken into account, and there is therefore better discrimination between window positions; localization degrades as the window size increases, for the same reason. For large values of w, the minimum is flatter (lower curvature), leading to lower confidence in the matching as defined before.

Ambiguity. In many cases, several positions along the epipolar line can match the window around the left pixel. In that case, there is an ambiguity as to which point leads to the correct 3-D reconstruction. If the ambiguous matches are far apart, they correspond to very different points in space, thus leading to large errors. In practice, it is nearly impossible to eliminate all such ambiguities, especially in environments with many regular structures. The solution is generally to use more than two cameras in stereo.

More than 2 cameras: the multibaseline approach. Suppose that the three cameras are aligned. A point P then corresponds to three points in the images, aligned along the horizontal epipolar line.

The disparities d_12 and d_13 of those points, between the first image and images 2 and 3, are in general different because the baselines are different:

d_12 = f B_12 / Z,  d_13 = f B_13 / Z,  hence  1/Z = d_12 / (f B_12) = d_13 / (f B_13)

If we plot the matching curves S_12(d) not as a function of the disparity but as a function of the normalized disparity d' = d / (f B), then, assuming the matches are correct, all the curves should have the same minimum at d' = 1/Z. Instead of using S_12 and S_13 separately, we can combine them into a single matching function S(d') = S_12 + S_13 and find the minimum of S(d').
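A sketch of this combination (the cost curves are synthetic; the illustrative combined_cost resamples each curve onto a common inverse-depth axis via d = f·B/Z and sums them):

```python
import numpy as np

def combined_cost(curve12, curve13, f, B12, B13, inv_depths):
    """Resample two matching curves S12(d), S13(d) onto a common
    inverse-depth axis (d = f * B / Z) and sum them."""
    d12, c12 = curve12
    d13, c13 = curve13
    s12 = np.interp(f * B12 * inv_depths, d12, c12)
    s13 = np.interp(f * B13 * inv_depths, d13, c13)
    return s12 + s13

# Toy curves: true depth Z = 2 (d12 = 25, d13 = 50 for f = 500, B = 0.1/0.2);
# S12 has a second, ambiguous minimum at d = 45 that S13 does not share.
d = np.arange(101, dtype=float)
c12 = np.minimum((d - 25.0) ** 2, (d - 45.0) ** 2)
c13 = (d - 50.0) ** 2
inv_depths = np.linspace(0.1, 1.0, 91)
S = combined_cost((d, c12), (d, c13), 500.0, 0.1, 0.2, inv_depths)
print(inv_depths[np.argmin(S)])   # ~0.5, i.e. Z = 2: the ambiguity is resolved
```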

Using a multibaseline approach combines the advantages of both short and long baselines: the short baseline makes the matching easier but leads to poorer localization in space; the longer baseline leads to higher-precision localization but more difficult matching. Furthermore, ambiguous matches do not generally recur across multiple baselines; thus, by combining the matching functions into a single function, we get a sharper, unique peak of the matching function.