Camera system: pinhole model, calibration and reconstruction


Francesco Castaldo, Francesco A.N. Palmieri

December 22, 2013

F. Castaldo and F. A. N. Palmieri are with the Dipartimento di Ingegneria Industriale e dell'Informazione, Seconda Università degli Studi di Napoli, Aversa (CE), Italy.

Contents

1 Introduction
2 Pinhole camera model
  2.1 2D-3D geometry
  2.2 2D-2D geometry
3 Camera calibration
  3.1 Projection matrix
    3.1.1 Sensitivity of projection matrix
  3.2 Homography
    3.2.1 Sensitivity of homography
4 World point reconstruction
  4.1 3D world point reconstruction
    4.1.1 Intersection
    4.1.2 Triangulation
  4.2 2D world point reconstruction
    4.2.1 Intersection
    4.2.2 Triangulation

Chapter 1

Introduction

This report investigates the functioning of a camera, analyzing its mathematical model, the calibration stage and the world point reconstruction [3] [7] [4]. A generic camera system allows one to reconstruct the position of a point in three-dimensional (3D) space (the so-called world point), given the bi-dimensional (2D) point's projections on the cameras' image planes (also called pixel projections) [3]. However, very often we deal with objects that are constrained to move in 2D spaces, e.g. ships moving on the sea plane for target tracking [5] (Figure 1.1, left). In such cases we can simplify the camera model by assuming that the world points always lie on a plane. This situation defines a homography [3], that is, a mapping between spaces with the same number of coordinates (2D world points and 2D pixels). In the following, for each component of the camera system we address both the 2D-3D (2D pixels - 3D world points) and the 2D-2D (2D pixels - 2D world points) scenarios. The 2D-1D case is not very common and is neglected in this report.

Figure 1.1: On the left, a scenario in which the 2D-2D hypothesis holds (in target tracking for harbour scenarios the ships are bound to move on the sea plane). On the right, a scenario in which we have to use the 2D-3D model (airplanes fly in a 3D world space).

Chapter 2

Pinhole camera model

In this chapter we present the camera model for a generic setup (2D-3D), and then deal with the 2D-2D case.

2.1 2D-3D geometry

A camera is a mapping between 3D world objects and a 2D image. We consider the simplest camera model, the so-called pinhole model [3]. In Figure 2.1, point $C$, to which all viewing rays from 3D space points converge, is the origin of the Euclidean coordinate system and is called the center of projection (also called the camera center or the optical center). This model emerges as the equivalent of a classical camera setup in which the camera plane (also called image plane) is behind the camera center $C$. The image that is formed upside down on the camera plane is transposed in front of the camera center, where the image appears in its natural orientation and with the same dimensions. The plane $Z_c = f$ is the new image plane, as shown in Figure 2.1. The line that heads perpendicularly from the camera center to the image plane is called the principal axis of the camera, and the intersection $p$ of the principal axis and the image plane is called the principal point. A 3D point $\mathbf{X}_c = (X_c, Y_c, Z_c)^T$ in the Euclidean space $(X_c, Y_c, Z_c)$ centered in $C$ (called the camera standard reference frame) is mapped to the point $\mathbf{x} = (x, y)^T$ on the image plane where the viewing line of $\mathbf{X}_c$ meets the image plane. The image reference frame is centered in $p$, with $x$ and $y$ axes parallel to $X_c$ and $Y_c$. Considering the plane $(Y_c, Z_c)$ as shown in Figure 2.2, by similarity of triangles we can write $y = f Y_c / Z_c$. By the same reasoning, in the plane $(X_c, Z_c)$ we have $x = f X_c / Z_c$. Therefore the point $(X_c, Y_c, Z_c)^T$ is mapped to the point $(f X_c / Z_c, f Y_c / Z_c, f)^T$, which lies on the image plane. Ignoring the obvious third coordinate, we can write

$$\mathbf{X}_c = (X_c, Y_c, Z_c)^T \;\mapsto\; \mathbf{x} = (x, y)^T = \left( \frac{f X_c}{Z_c}, \frac{f Y_c}{Z_c} \right)^T. \tag{2.1}$$
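The mapping (2.1) can be checked with a minimal numerical sketch (NumPy assumed; the point and focal length are arbitrary illustrative values):

```python
import numpy as np

def project_pinhole(X_c, f):
    """Perspective projection of eq. (2.1): (X, Y, Z) -> (f X/Z, f Y/Z)."""
    X, Y, Z = X_c
    return np.array([f * X / Z, f * Y / Z])

# A point 4 units in front of the camera, focal length f = 2.
x = project_pinhole(np.array([2.0, 1.0, 4.0]), f=2.0)   # -> (1.0, 0.5)
```

Note that doubling all three coordinates of the world point leaves the projection unchanged, which is the depth ambiguity resolved later by multiple views.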

Figure 2.1: Pinhole camera model with image plane parallel to $(X_c, Y_c)$.

Note that we cannot write (2.1) in terms of matrix multiplications, due to the non-linearity of the mapping between the coordinates of $\mathbf{X}_c$ and $\mathbf{x}$. It is convenient (as has become standard in the literature) to define world and image points by the augmented vectors $\mathbf{X}_c = (X_c, Y_c, Z_c, 1)^T$ and $\mathbf{x} = (x, y, 1)^T$, called homogeneous coordinates [3]. Observing that

$$\begin{pmatrix} f X_c \\ f Y_c \\ Z_c \end{pmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{pmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{pmatrix}, \tag{2.2}$$

and multiplying and dividing the first member of (2.2) by $Z_c$, we can write

$$\begin{pmatrix} f X_c \\ f Y_c \\ Z_c \end{pmatrix} = Z_c \begin{pmatrix} f X_c / Z_c \\ f Y_c / Z_c \\ 1 \end{pmatrix} = Z_c \, \mathbf{x},$$

and then write

$$Z_c \, \mathbf{x} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{pmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{pmatrix}.$$

Figure 2.2: Relation of the coordinate system to the focal length.

More compactly, we have

$$\lambda \mathbf{x} = P \mathbf{X}_c, \tag{2.3}$$

where $\lambda$ is a generic scale factor that depends on the specific point $\mathbf{X}_c$. Another way to write this expression is $\mathbf{x} \simeq P \mathbf{X}_c$, where $\simeq$ denotes equality up to a scale factor. The matrix

$$P = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}$$

is called the camera projection matrix. The camera model just presented is the simplest one, and it is idealized because we have assumed that the origin of coordinates in the image plane is at the principal point and that the image plane is parallel to the $(X_c, Y_c)$ plane. However, cameras deployed in arbitrary positions and orientations need to be described in a more general reference system. A first generalization assumes that the origin of image coordinates is not at the principal point $p$. Denoting $p = (p_x, p_y)^T$, we can write

$$\mathbf{X}_c \;\mapsto\; \mathbf{x} = \left( f X_c / Z_c + p_x,\; f Y_c / Z_c + p_y \right)^T,$$

which in homogeneous coordinates is

$$\lambda \mathbf{x} = \begin{pmatrix} f X_c + Z_c p_x \\ f Y_c + Z_c p_y \\ Z_c \end{pmatrix} = P \begin{pmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{pmatrix}, \tag{2.4}$$

where

$$P = \begin{bmatrix} f & 0 & p_x & 0 \\ 0 & f & p_y & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} = K \, [I_3 \mid \mathbf{0}],$$

Figure 2.3: World to camera coordinate frames.

with

$$K = \begin{bmatrix} f & 0 & p_x \\ 0 & f & p_y \\ 0 & 0 & 1 \end{bmatrix}$$

and $I_3$ the $3 \times 3$ identity matrix. $K$ is called the camera calibration matrix. Generalizing the standard camera coordinate frame $(X_c, Y_c, Z_c)$ to generic coordinates, known as the world coordinate frame $(X, Y, Z)$, as shown in Figure 2.3, we can relate the two coordinate frames through a rotation matrix $R$ and a translation vector $\mathbf{t}$. Therefore

$$\mathbf{X}_c = R \left( \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} - \begin{pmatrix} C_x \\ C_y \\ C_z \end{pmatrix} \right) = R (\mathbf{X} - \mathbf{C}),$$

where $\mathbf{X} = (X, Y, Z)^T$ and $\mathbf{C} = (C_x, C_y, C_z)^T$ represent respectively the generic 3D point and the camera center in the world reference frame. In homogeneous coordinates we can write

$$\mathbf{X}_c = \begin{bmatrix} R & -R\mathbf{C} \\ \mathbf{0} & 1 \end{bmatrix} \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix} = \begin{bmatrix} R & -R\mathbf{C} \\ \mathbf{0} & 1 \end{bmatrix} \mathbf{X}, \tag{2.5}$$

where now $\mathbf{X}$ denotes the world point in homogeneous coordinates and $\mathbf{x} = (x, y, 1)^T$ is the generic point in homogeneous image coordinates. Therefore, putting (2.4) and (2.5) together, we have

$$\lambda \mathbf{x} = P \mathbf{X}, \tag{2.6}$$

with $P = K R \, [I_3 \mid -\mathbf{C}]$. We can decide not to show the camera center $\mathbf{C}$ explicitly and write the transformation as $\mathbf{X}_c = R \mathbf{X} + \mathbf{t}$, with $\mathbf{t} = -R\mathbf{C}$. Therefore

the $3 \times 4$ camera projection matrix is

$$P = K \, [R \mid \mathbf{t}].$$

Another element that we should take into account is the possibility of different scaling along the two image coordinates. In the model derived above we assumed that image coordinates are Euclidean, with equal scales in both axis directions. In transferring the coordinates to pixels, we need to multiply the original coordinates by scale factors $m_x$ and $m_y$, as pixels may have different equivalent dimensions in $x$ and $y$ (the number of pixels per unit distance in the $x$ and $y$ directions may differ). Matrix $K$ becomes

$$K = \begin{bmatrix} \alpha_x & 0 & x_0 \\ 0 & \alpha_y & y_0 \\ 0 & 0 & 1 \end{bmatrix},$$

with $\alpha_x = f m_x$ and $\alpha_y = f m_y$, and with $x_0 = m_x p_x$ and $y_0 = m_y p_y$ the coordinates of the principal point in pixels along the $x$ and $y$ directions. In the form derived above, the parameters contained in $K$, $\{f, \alpha_x, \alpha_y, x_0, y_0\}$, are called internal or intrinsic parameters, and the parameters in $R$ and $\mathbf{t}$ are called external or extrinsic parameters. This camera model has 10 degrees of freedom ($\{\alpha_x, \alpha_y, x_0, y_0, f\}$ form a set of four independent parameters, plus we have three for $R$ and three for $\mathbf{t}$).

It is useful to calculate the camera center $\mathbf{C} = (C_x, C_y, C_z, 1)^T$, which is the 1-dimensional right null-space of $P$, i.e. $P\mathbf{C} = 0$. Algebraically, the center may be obtained with the following equations

$$\hat{C}_x = \det([\mathbf{p}_2, \mathbf{p}_3, \mathbf{p}_4]), \quad \hat{C}_y = -\det([\mathbf{p}_1, \mathbf{p}_3, \mathbf{p}_4]), \quad \hat{C}_z = \det([\mathbf{p}_1, \mathbf{p}_2, \mathbf{p}_4]), \quad \hat{C}_t = -\det([\mathbf{p}_1, \mathbf{p}_2, \mathbf{p}_3]), \tag{2.7}$$

where $\det(\cdot)$ denotes the determinant of the matrix in brackets and $\mathbf{p}_i$, $i = 1, 2, 3, 4$, denotes the columns of $P$, and with

$$\mathbf{C} = \begin{pmatrix} C_x \\ C_y \\ C_z \\ 1 \end{pmatrix} = \begin{pmatrix} \hat{C}_x / \hat{C}_t \\ \hat{C}_y / \hat{C}_t \\ \hat{C}_z / \hat{C}_t \\ \hat{C}_t / \hat{C}_t \end{pmatrix}. \tag{2.8}$$

In many applications it is important to calculate, given a camera and a pixel $\mathbf{x}$ on its image plane, the ray on which lie the (infinitely many) 3D points that can generate the projection $\mathbf{x}$. This 3D line can be written using two known points on the ray. The first is the camera center $\mathbf{C}$; the second is the point $\mathbf{X}^+$, calculated as

$$\lambda \mathbf{X}^+ = P^+ \mathbf{x}, \tag{2.9}$$

where $P^+$ denotes the Moore-Penrose pseudo-inverse of $P$.
Therefore the 3D line can be written as

$$\mathbf{X}(\beta) = \mathbf{X}^+ + \beta \, \mathbf{C}, \tag{2.10}$$

where $\beta$ can vary arbitrarily and determines the 3D points on the ray.
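As a numerical illustration of (2.7)-(2.10), the sketch below recovers the camera centre as the right null-space of $P$ (computed here by SVD rather than by the determinant formulas, which give the same vector) and back-projects a pixel through the pseudo-inverse; $P$ and the pixel are hypothetical values chosen only for the example (NumPy assumed):

```python
import numpy as np

# A hypothetical projection matrix P = K [R | t] with K = I, R = I and
# camera centre C = (1, 2, 3), so that t = -R C.
C_true = np.array([1.0, 2.0, 3.0])
P = np.hstack([np.eye(3), (-C_true)[:, None]])

# Camera centre as the right null-space of P (P C = 0), via SVD.
_, _, Vt = np.linalg.svd(P)
C = Vt[-1]
C = C[:3] / C[3]                       # dehomogenise, cf. (2.8)

# Back-projection of a pixel x: one point on the ray of (2.9)-(2.10),
# obtained with the Moore-Penrose pseudo-inverse.
x = np.array([0.5, 0.25, 1.0])         # homogeneous pixel
X_plus = np.linalg.pinv(P) @ x

# Any point X(beta) = X+ + beta*C on the ray reprojects onto x (up to scale).
for beta in (0.0, 5.0):
    X_h = X_plus + beta * np.append(C, 1.0)
    x_rep = P @ X_h
    assert np.allclose(x_rep / x_rep[2], x)
```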

2.2 2D-2D geometry

If world points are bound to move on a plane, equation (2.6) becomes

$$\lambda \mathbf{x} = H \mathbf{X}, \tag{2.11}$$

where $\mathbf{x} = (x, y, 1)^T$, $\mathbf{X} = (X, Y, 1)^T$, and

$$H = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix}$$

is a $3 \times 3$ homography matrix. The relationship between the camera projection matrix $P$ and the homography $H$ is the following: if $P = [\mathbf{p}_1, \mathbf{p}_2, \mathbf{p}_3, \mathbf{p}_4]$, where $\mathbf{p}_i$, $i = 1, 2, 3, 4$, denotes the columns of $P$, then $H = [\mathbf{p}_1, \mathbf{p}_2, \mathbf{p}_4]$ (for world points lying on the plane $Z = 0$).
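The relationship $H = [\mathbf{p}_1, \mathbf{p}_2, \mathbf{p}_4]$ can be verified numerically: for world points on the plane $Z = 0$, dropping the third column of $P$ gives the same homogeneous pixel (the matrix below is randomly generated, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((3, 4))        # an arbitrary 3x4 projection matrix
H = P[:, [0, 1, 3]]                    # H = [p1, p2, p4]

X, Y = 2.0, -1.0
x_from_P = P @ np.array([X, Y, 0.0, 1.0])   # 3D point on the plane Z = 0
x_from_H = H @ np.array([X, Y, 1.0])        # same point as a 2D world point
assert np.allclose(x_from_P, x_from_H)
```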

Chapter 3

Camera calibration

In this chapter we address camera calibration for both projection matrices and homographies.

3.1 Projection matrix

The goal of camera calibration is to calculate the camera projection matrix $P$ defined above when the sensor is deployed in an arbitrary location. We assume we know a certain number of point correspondences $\{\mathbf{x}_i, \mathbf{X}_i\}$, $i = 1, \ldots, N$, between 3D points $\mathbf{X}_i$ and 2D image points $\mathbf{x}_i$ (projections of $\mathbf{X}_i$ on the image plane), and we are required to find the matrix $P$ such that $\lambda \mathbf{x}_i = P \mathbf{X}_i$, as shown in (2.6), for all $i$. World points may be extracted from known landmarks or objects. There are several ways to accomplish calibration. One possible approach [7] is to rewrite equation (2.6) as

$$\begin{pmatrix} \lambda x_i \\ \lambda y_i \\ \lambda \end{pmatrix} = \begin{bmatrix} p_{11} & p_{12} & p_{13} & p_{14} \\ p_{21} & p_{22} & p_{23} & p_{24} \\ p_{31} & p_{32} & p_{33} & p_{34} \end{bmatrix} \begin{pmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{pmatrix}, \tag{3.1}$$

where

$$\lambda \mathbf{x}_i = (\lambda x_i, \lambda y_i, \lambda)^T \tag{3.2}$$

and

$$P = [p_{kl}]. \tag{3.3}$$

We can rewrite (3.1) as

$$\begin{cases} x_i = \dfrac{\lambda x_i}{\lambda} = \dfrac{p_{11} X_i + p_{12} Y_i + p_{13} Z_i + p_{14}}{p_{31} X_i + p_{32} Y_i + p_{33} Z_i + p_{34}} \\[2mm] y_i = \dfrac{\lambda y_i}{\lambda} = \dfrac{p_{21} X_i + p_{22} Y_i + p_{23} Z_i + p_{24}}{p_{31} X_i + p_{32} Y_i + p_{33} Z_i + p_{34}} \end{cases} \qquad i = 1, \ldots, N. \tag{3.4}$$

We can write a homogeneous linear system using (3.4) for each known correspondence. The matrix $P$ has 11 degrees of freedom (its 12 entries are defined up to a common scale factor), therefore we need at least 6 world-image point matches; in general, using a calibration pattern, we can obtain many more correspondences and consequently calculate a more accurate solution through a least-squares method. In fact, given $N$ correspondences, we can write

$$A \mathbf{p} = 0,$$

with

$$A = \begin{bmatrix} X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -x_1 X_1 & -x_1 Y_1 & -x_1 Z_1 & -x_1 \\ 0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -y_1 X_1 & -y_1 Y_1 & -y_1 Z_1 & -y_1 \\ \vdots & & & & & & & & & & & \vdots \\ X_N & Y_N & Z_N & 1 & 0 & 0 & 0 & 0 & -x_N X_N & -x_N Y_N & -x_N Z_N & -x_N \\ 0 & 0 & 0 & 0 & X_N & Y_N & Z_N & 1 & -y_N X_N & -y_N Y_N & -y_N Z_N & -y_N \end{bmatrix}$$

and

$$\mathbf{p} = [p_{11}, p_{12}, \ldots, p_{33}, p_{34}]^T.$$

We have to solve a set of homogeneous equations in which there are more equations than unknowns. The obvious solution $\mathbf{p} = 0$ is not of interest, therefore we have to pick a constraint in order to pose the minimization problem. A reasonable constraint is $\|\mathbf{p}\| = 1$; we also observe that if $\mathbf{p}$ is a solution, then so is $k\mathbf{p}$ for any scalar $k$. Therefore the least-squares problem can be stated as finding the $\mathbf{p}$ that minimizes $\|A\mathbf{p}\|$ subject to $\|\mathbf{p}\| = 1$. As shown in [3], using the singular value decomposition (SVD) [2] of $A$, we get $A = U \Sigma V^T$, and the problem requires minimizing $\|U \Sigma V^T \mathbf{p}\|$. However, $\|U \Sigma V^T \mathbf{p}\| = \|\Sigma V^T \mathbf{p}\|$ and $\|\mathbf{p}\| = \|V^T \mathbf{p}\|$ (using the properties of the orthonormal matrices $U$ and $V$). Therefore, we have to minimize $\|\Sigma V^T \mathbf{p}\|$ subject to $\|V^T \mathbf{p}\| = 1$. Putting $\mathbf{y} = V^T \mathbf{p}$, the problem becomes minimizing $\|\Sigma \mathbf{y}\|$ subject to $\|\mathbf{y}\| = 1$. Since $\Sigma$ is a diagonal matrix with its elements in descending order, the minimizing solution is simply $\mathbf{y} = (0, 0, \ldots, 0, 1)^T$, and since $\mathbf{p} = V \mathbf{y}$, the solution is simply the last column of $V$. Therefore, if $A = [a_{ij}]$ is a $2N \times 12$ matrix, with $N > 5$, then

$$A = U D V^T \tag{3.5}$$

is the economy-size SVD of $A$, with $U = [u_{ij}]$ and $V = [v_{ij}]$ matrices of size $2N \times 12$ and $12 \times 12$ respectively ($V$ orthogonal, $U$ with orthonormal columns), and $D = [d_{ij}]$ a $12 \times 12$ diagonal matrix containing the singular values of $A$.
Using the procedure detailed above, $\mathbf{p}$ can be calculated as

$$\mathbf{p} = [v_{1,12}, v_{2,12}, v_{3,12}, v_{4,12}, v_{5,12}, v_{6,12}, v_{7,12}, v_{8,12}, v_{9,12}, v_{10,12}, v_{11,12}, v_{12,12}]^T, \tag{3.6}$$

where $v_{i,12}$, $i = 1, \ldots, 12$, denotes the elements of the last column of $V$. Rearranging $\mathbf{p}$, we get $P$. Improvements of the resulting $P$ can be achieved by implementing data normalization and iterative procedures to reduce the error [3].
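The calibration procedure of this section can be sketched as follows: a minimal direct linear transformation (DLT), without the data normalization mentioned above, checked against a synthetic camera (the matrix and points are hypothetical test values; NumPy assumed):

```python
import numpy as np

def calibrate_dlt(X_world, x_pix):
    """Estimate P from N >= 6 correspondences by solving A p = 0 (Sec. 3.1)."""
    rows = []
    for (X, Y, Z), (x, y) in zip(X_world, x_pix):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z, -x])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z, -y])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)        # p (last column of V) rearranged into P

# Synthetic check with a known camera (illustrative values).
P_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 20.0],
                   [0.0, 0.0, 1.0, 5.0]])
rng = np.random.default_rng(1)
Xw = rng.uniform(-1, 1, size=(8, 3))
proj = (P_true @ np.hstack([Xw, np.ones((8, 1))]).T).T
xp = proj[:, :2] / proj[:, 2:3]        # noiseless pixel projections
P_est = calibrate_dlt(Xw, xp)
# P is defined up to scale: normalize both before comparing.
assert np.allclose(P_est / P_est[-1, -1], P_true / P_true[-1, -1], atol=1e-5)
```

With noisy pixels the same code returns the least-squares estimate; the normalization step of [3] then improves conditioning.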

3.1.1 Sensitivity of projection matrix

In this section we assess the impact of a noisy calibration stage on reconstruction quality. We assume that the world points used for calibration have been perfectly estimated, while the pixel correspondences are corrupted by a variable amount of noise. We follow the procedure detailed in [6] [1] for projection matrices. We calculate the partial derivatives of the elements of $P = [p_{kl}]$, $k = 1, 2, 3$, $l = 1, 2, 3, 4$, with respect to the calibration world points and pixels, defined as $(X_s, Y_s, Z_s)$ and $(x_s, y_s)$, $s = 1, \ldots, N$, respectively. From (3.6) and the structure of $A$ the following relations [1] are obtained

$$\frac{\partial p_{kl}}{\partial x_s} = -X_s \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s-1,9}} - Y_s \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s-1,10}} - Z_s \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s-1,11}} - \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s-1,12}}, \tag{3.7}$$

and

$$\frac{\partial p_{kl}}{\partial y_s} = -X_s \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s,9}} - Y_s \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s,10}} - Z_s \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s,11}} - \frac{\partial v_{4(k-1)+l,12}}{\partial a_{2s,12}}, \tag{3.8}$$

since $p_{kl}$ is the $(4(k-1)+l)$-th element of $\mathbf{p}$ and $x_s$ (respectively $y_s$) enters $A$ only in row $2s-1$ (respectively $2s$), in columns 9 to 12. To compute the partial derivatives (3.7) and (3.8) we need the partial derivatives of $V$ with respect to the entries of $A$. As shown in [6], by taking the derivative of (3.5) with respect to $a_{ij}$ we obtain

$$\frac{\partial A}{\partial a_{ij}} = \frac{\partial U}{\partial a_{ij}} D V^T + U \frac{\partial D}{\partial a_{ij}} V^T + U D \frac{\partial V^T}{\partial a_{ij}}. \tag{3.9}$$

Since $V$ is an orthogonal matrix, $V^T V = I$ implies

$$\frac{\partial V^T}{\partial a_{ij}} V + V^T \frac{\partial V}{\partial a_{ij}} = \Omega_V^{ij} + \Omega_V^{ij\,T} = 0, \tag{3.10}$$

where

$$\Omega_V^{ij} = \frac{\partial V^T}{\partial a_{ij}} V = [\Omega^{ij}_{V\,k,l}] \tag{3.11}$$

is by definition a $12 \times 12$ skew-symmetric matrix. The same can be done for $U$, with $\Omega_U^{ij} = U^T \frac{\partial U}{\partial a_{ij}} = [\Omega^{ij}_{U\,k,l}]$ again a $12 \times 12$ skew-symmetric matrix. From (3.9), multiplying by $U^T$ and $V$ from the left and right respectively, we get

$$U^T \frac{\partial A}{\partial a_{ij}} V = \Omega_U^{ij} D + \frac{\partial D}{\partial a_{ij}} + D \, \Omega_V^{ij}. \tag{3.12}$$

In order to compute the derivatives of the $v_{ij}$ we need $\Omega_V^{ij}$, the elements of which can be calculated by solving a set of $2 \times 2$ linear systems of the form

$$\begin{cases} d_l \, \Omega^{ij}_{U\,k,l} + d_k \, \Omega^{ij}_{V\,k,l} = u_{ik} v_{jl} \\ d_k \, \Omega^{ij}_{U\,k,l} + d_l \, \Omega^{ij}_{V\,k,l} = -u_{il} v_{jk} \end{cases} \tag{3.13}$$

deduced from the skew-symmetry of $\Omega_U^{ij}$ and $\Omega_V^{ij}$, with $k = 1, \ldots, 12$ and $l = k+1, \ldots, 12$. Once $\Omega_V^{ij}$ has been computed, the sought derivatives of $V$ are simply

$$\frac{\partial V}{\partial a_{ij}} = -V \, \Omega_V^{ij}. \tag{3.14}$$

We use (3.14) to finally compute (3.7) and (3.8) for all the calibration pixels. These derivatives can be combined into the total differential

$$dp_{kl} = \frac{\partial p_{kl}}{\partial x_1} dx_1 + \frac{\partial p_{kl}}{\partial y_1} dy_1 + \cdots + \frac{\partial p_{kl}}{\partial x_N} dx_N + \frac{\partial p_{kl}}{\partial y_N} dy_N, \tag{3.15}$$

which sums the contributions of all the pixel errors for a single entry of $P$. The values $dx_1, dy_1, \ldots, dx_N, dy_N$ can vary according to the estimated quality of the calibration process. In this way we can measure the uncertainty of the $P$ matrix, defined in matrix form as

$$\Delta P = \begin{bmatrix} dp_{11} & dp_{12} & dp_{13} & dp_{14} \\ dp_{21} & dp_{22} & dp_{23} & dp_{24} \\ dp_{31} & dp_{32} & dp_{33} & dp_{34} \end{bmatrix} = [dp_{kl}]. \tag{3.16}$$

3.2 Homography

We now detail the procedure for finding $H$. Given the correspondences $\{\mathbf{x}_i, \mathbf{X}_i\}$, $i = 1, \ldots, N$, between 2D world points $\mathbf{X}_i$ and 2D image points $\mathbf{x}_i$, we can rewrite (2.11) as

$$\begin{pmatrix} \lambda x_i \\ \lambda y_i \\ \lambda \end{pmatrix} = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix} \begin{pmatrix} X_i \\ Y_i \\ 1 \end{pmatrix}, \tag{3.17}$$

where

$$\lambda \mathbf{x}_i = (\lambda x_i, \lambda y_i, \lambda)^T \tag{3.18}$$

and

$$H = [h_{kl}]. \tag{3.19}$$

Equation (3.17) can be rewritten as

$$\begin{cases} x_i = \dfrac{\lambda x_i}{\lambda} = \dfrac{h_{11} X_i + h_{12} Y_i + h_{13}}{h_{31} X_i + h_{32} Y_i + h_{33}} \\[2mm] y_i = \dfrac{\lambda y_i}{\lambda} = \dfrac{h_{21} X_i + h_{22} Y_i + h_{23}}{h_{31} X_i + h_{32} Y_i + h_{33}} \end{cases} \qquad i = 1, \ldots, N. \tag{3.20}$$

Finding $H$ can be reduced to the solution of the homogeneous linear system

$$A \mathbf{h} = 0, \tag{3.21}$$

with

$$A = \begin{bmatrix} X_1 & Y_1 & 1 & 0 & 0 & 0 & -x_1 X_1 & -x_1 Y_1 & -x_1 \\ 0 & 0 & 0 & X_1 & Y_1 & 1 & -y_1 X_1 & -y_1 Y_1 & -y_1 \\ \vdots & & & & & & & & \vdots \\ X_N & Y_N & 1 & 0 & 0 & 0 & -x_N X_N & -x_N Y_N & -x_N \\ 0 & 0 & 0 & X_N & Y_N & 1 & -y_N X_N & -y_N Y_N & -y_N \end{bmatrix}$$

and

$$\mathbf{h} = [h_{11}, h_{12}, h_{13}, h_{21}, h_{22}, h_{23}, h_{31}, h_{32}, h_{33}]^T. \tag{3.22}$$

We need at least 4 world-pixel matches, because each correspondence provides two independent equations and $H$ has 8 degrees of freedom. With more than 4 correspondences the set of equations is over-determined and we need an approximate solution, namely a vector $\mathbf{h}$ that minimizes a cost function. We are not interested in the obvious solution $\mathbf{h} = 0$, therefore we need an additional constraint, which can be $\|\mathbf{h}\| = 1$ ($H$ is defined up to a scale factor, therefore the value of the norm is unimportant). In this way, the solution $\mathbf{h}$ is obtained by minimizing the norm $\|A\mathbf{h}\|$ subject to $\|\mathbf{h}\| = 1$. As shown in [3], we can recover the least-squares solution by applying the singular value decomposition (SVD) [2] to $A$; $\mathbf{h}$ becomes the unit singular vector corresponding to the smallest singular value of $A$. If $A = [a_{ij}]$ is a $2N \times 9$ matrix, with $N > 4$, then, as before, $A = U D V^T$ is the economy-size SVD of $A$, with $U = [u_{ij}]$ and $V = [v_{ij}]$ matrices of size $2N \times 9$ and $9 \times 9$ respectively ($V$ orthogonal, $U$ with orthonormal columns), and $D = [d_{ij}]$ a $9 \times 9$ diagonal matrix containing the singular values of $A$. Using the procedure detailed above, $\mathbf{h}$ can be calculated as

$$\mathbf{h} = [v_{1,9}, v_{2,9}, v_{3,9}, v_{4,9}, v_{5,9}, v_{6,9}, v_{7,9}, v_{8,9}, v_{9,9}]^T, \tag{3.23}$$

where $v_{i,9}$, $i = 1, \ldots, 9$, denotes the elements of the last column of $V$. Rearranging $\mathbf{h}$, we get $H$.

3.2.1 Sensitivity of homography

We apply the same procedure detailed in [6] [1] to assess the sensitivity of the homographies. We calculate the partial derivatives of the elements of $H = [h_{kl}]$, $k = 1, 2, 3$, $l = 1, 2, 3$, with respect to the calibration world points and pixels, defined as $(X_s, Y_s)$ and $(x_s, y_s)$, $s = 1, \ldots, N$, respectively.
From (3.22) and (3.23) the following relations [1] are obtained

$$\frac{\partial h_{kl}}{\partial x_s} = -X_s \frac{\partial v_{3(k-1)+l,9}}{\partial a_{2s-1,7}} - Y_s \frac{\partial v_{3(k-1)+l,9}}{\partial a_{2s-1,8}} - \frac{\partial v_{3(k-1)+l,9}}{\partial a_{2s-1,9}}, \tag{3.24}$$

and

$$\frac{\partial h_{kl}}{\partial y_s} = -X_s \frac{\partial v_{3(k-1)+l,9}}{\partial a_{2s,7}} - Y_s \frac{\partial v_{3(k-1)+l,9}}{\partial a_{2s,8}} - \frac{\partial v_{3(k-1)+l,9}}{\partial a_{2s,9}}. \tag{3.25}$$

To compute the partial derivatives (3.24) and (3.25) we need the partial derivatives of $V$ with respect to the entries of $A$. We use the same procedure detailed for the 2D-3D case: we calculate (3.14) and then compute (3.24) and (3.25) for all the calibration pixels. These derivatives can be combined into the total differential

$$dh_{kl} = \frac{\partial h_{kl}}{\partial x_1} dx_1 + \frac{\partial h_{kl}}{\partial y_1} dy_1 + \cdots + \frac{\partial h_{kl}}{\partial x_N} dx_N + \frac{\partial h_{kl}}{\partial y_N} dy_N, \tag{3.26}$$

which sums the contributions of all the pixel errors for a single entry of $H$. The values $dx_1, dy_1, \ldots, dx_N, dy_N$ can vary according to the estimated quality of the calibration process. In this way we can measure the uncertainty of the $H$ matrix, defined in matrix form as

$$\Delta H = \begin{bmatrix} dh_{11} & dh_{12} & dh_{13} \\ dh_{21} & dh_{22} & dh_{23} \\ dh_{31} & dh_{32} & dh_{33} \end{bmatrix} = [dh_{kl}]. \tag{3.27}$$
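The homography estimation of Section 3.2 admits the same compact sketch as the projection-matrix case (again without data normalization; the matrix and points below are illustrative values, NumPy assumed):

```python
import numpy as np

def estimate_homography(X_world, x_pix):
    """Estimate H from N >= 4 correspondences by solving A h = 0 (Sec. 3.2)."""
    rows = []
    for (X, Y), (x, y) in zip(X_world, x_pix):
        rows.append([X, Y, 1, 0, 0, 0, -x*X, -x*Y, -x])
        rows.append([0, 0, 0, X, Y, 1, -y*X, -y*Y, -y])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 3)        # h is the last column of V

H_true = np.array([[1.0, 0.2, 3.0],
                   [-0.1, 1.1, -2.0],
                   [0.01, 0.02, 1.0]])     # hypothetical homography
rng = np.random.default_rng(2)
Xw = rng.uniform(-1, 1, size=(6, 2))
proj = (H_true @ np.hstack([Xw, np.ones((6, 1))]).T).T
xp = proj[:, :2] / proj[:, 2:3]            # noiseless pixel projections
H_est = estimate_homography(Xw, xp)
assert np.allclose(H_est / H_est[2, 2], H_true, atol=1e-8)
```

Perturbing the entries of `xp` and rerunning the estimate gives a direct numerical check of the sensitivities derived in Section 3.2.1.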

Chapter 4

World point reconstruction

In this chapter we address the problem of world point localization, given the projections of the point onto the cameras' image planes. As usual, we detail the procedure for both the 2D-3D and 2D-2D cases.

4.1 3D world point reconstruction

Assuming that the projection matrices $P_i$ are known for each camera $i$, we can localize a generic 3D point $\mathbf{X} = (X, Y, Z, 1)^T$ using the camera matrices and corresponding 2D points $\mathbf{x}_i = (x_i, y_i, 1)^T$ on each camera, which are the projections of the same $\mathbf{X}$ onto the image planes. A separate problem is the identification of corresponding points in different images (i.e. given one pixel in the first image, automatically find its homologous point in the second image) [7]. In this report we do not address such a problem, and in the experiments that will follow we have manually localized homologous points in pairs of images. Object identification is an important larger problem that we will address elsewhere.

4.1.1 Intersection

If only two cameras are available, the simplest solution of the problem, also known as camera intersection [4], can be stated as follows. From (2.6) applied to both cameras, we have

$$\lambda_1 \mathbf{x}_1 = P_1 \mathbf{X}, \qquad \lambda_2 \mathbf{x}_2 = P_2 \mathbf{X}.$$

This pair of equations can be rearranged in matrix form as

$$A \mathbf{X}_\lambda = 0, \tag{4.1}$$

with

$$A = \begin{bmatrix} P_1 & -\mathbf{x}_1 & \mathbf{0} \\ P_2 & \mathbf{0} & -\mathbf{x}_2 \end{bmatrix}$$

and

$$\mathbf{X}_\lambda = (X, Y, Z, 1, \lambda_1, \lambda_2)^T.$$

We can solve the system via a least-squares method (like the one presented for camera calibration) and obtain $\mathbf{X}_\lambda$, from which we extract $\mathbf{X}$ (the first 4 elements of $\mathbf{X}_\lambda$). The method can be applied to an arbitrary number of cameras: assuming we have $N$ cameras and know the $N$ 2D points $\mathbf{x}_i$, projections of the 3D point $\mathbf{X}$ that needs to be reconstructed, we again get equation (4.1), with

$$A = \begin{bmatrix} P_1 & -\mathbf{x}_1 & \mathbf{0} & \cdots & \mathbf{0} \\ P_2 & \mathbf{0} & -\mathbf{x}_2 & \cdots & \mathbf{0} \\ \vdots & & & \ddots & \\ P_N & \mathbf{0} & \cdots & \mathbf{0} & -\mathbf{x}_N \end{bmatrix}, \qquad \mathbf{X}_\lambda = (\mathbf{X}^T, \lambda_1, \lambda_2, \ldots, \lambda_N)^T.$$

4.1.2 Triangulation

A more robust method is shown in [3], and can be stated as follows. We consider the reconstruction achieved by two cameras, and then generalize to $N$ cameras. From (2.6) applied to cameras $C_1$ and $C_2$, we have

$$\lambda_1 \mathbf{x}_1 = P_1 \mathbf{X}, \qquad \lambda_2 \mathbf{x}_2 = P_2 \mathbf{X}. \tag{4.2}$$

We can rewrite (4.2) as

$$\mathbf{x}_1 \times (P_1 \mathbf{X}) = 0, \qquad \mathbf{x}_2 \times (P_2 \mathbf{X}) = 0, \tag{4.3}$$

where $\times$ denotes the vector (cross) product. The first equation of (4.3) can be written as

$$\begin{pmatrix} x_1 \\ y_1 \\ 1 \end{pmatrix} \times \begin{pmatrix} \mathbf{p}_1^T \mathbf{X} \\ \mathbf{p}_2^T \mathbf{X} \\ \mathbf{p}_3^T \mathbf{X} \end{pmatrix} = 0, \tag{4.4}$$

where $\mathbf{p}_i^T$, $i = 1, 2, 3$, denotes the rows of matrix $P_1$. The same can be done for $P_2$. Expanding the cross product in the first equation of (4.3), we get

$$\begin{pmatrix} y_1 \mathbf{p}_3^T \mathbf{X} - \mathbf{p}_2^T \mathbf{X} \\ \mathbf{p}_1^T \mathbf{X} - x_1 \mathbf{p}_3^T \mathbf{X} \\ x_1 \mathbf{p}_2^T \mathbf{X} - y_1 \mathbf{p}_1^T \mathbf{X} \end{pmatrix} = 0, \tag{4.5}$$

the generic cross product $\mathbf{a} \times \mathbf{b}$, with $\mathbf{a} = [a_1, a_2, a_3]^T$ and $\mathbf{b} = [b_1, b_2, b_3]^T$, being equal to $[a_2 b_3 - a_3 b_2, \; a_3 b_1 - a_1 b_3, \; a_1 b_2 - a_2 b_1]^T$. Ignoring the third equation (being a linear combination of the other two), and repeating the same procedure for camera $C_2$ (whose rows we denote $\mathbf{p}_i^{2T}$), we can arrange the equations as

$$\begin{bmatrix} y_1 \mathbf{p}_3^{1T} - \mathbf{p}_2^{1T} \\ \mathbf{p}_1^{1T} - x_1 \mathbf{p}_3^{1T} \\ y_2 \mathbf{p}_3^{2T} - \mathbf{p}_2^{2T} \\ \mathbf{p}_1^{2T} - x_2 \mathbf{p}_3^{2T} \end{bmatrix} \mathbf{X} = A \mathbf{X} = 0. \tag{4.6}$$

Using the same procedure detailed for camera calibration, by taking the singular value decomposition of $A$ and using the last column of $V$ (normalized by its last element), we get the solution $\mathbf{X}$. The same procedure can be applied using $N$ cameras $C_1, \ldots, C_N$. Each camera adds two rows to equation (4.6), and we get

$$\begin{bmatrix} y_1 \mathbf{p}_3^{1T} - \mathbf{p}_2^{1T} \\ \mathbf{p}_1^{1T} - x_1 \mathbf{p}_3^{1T} \\ \vdots \\ y_N \mathbf{p}_3^{NT} - \mathbf{p}_2^{NT} \\ \mathbf{p}_1^{NT} - x_N \mathbf{p}_3^{NT} \end{bmatrix} \mathbf{X} = A \mathbf{X} = 0. \tag{4.7}$$

This homogeneous equation system can be solved in the same way as equation (4.6).

4.2 2D world point reconstruction

Assuming that the homographies $H_i$ are known for each camera $i$, we can localize a generic 2D point $\mathbf{X} = (X, Y, 1)^T$ using the homographies and the corresponding pixels $\mathbf{x}_i = (x_i, y_i, 1)^T$ on each camera, which are the projections of the same $\mathbf{X}$ onto the image planes.

4.2.1 Intersection

The camera intersection method [4] can also be set up for homographies, as follows. From (2.11) applied to both cameras, we have

$$\lambda_1 \mathbf{x}_1 = H_1 \mathbf{X}, \qquad \lambda_2 \mathbf{x}_2 = H_2 \mathbf{X}.$$

This pair of equations can be rearranged in matrix form as

$$A \mathbf{X}_\lambda = 0, \tag{4.8}$$

with

$$A = \begin{bmatrix} H_1 & -\mathbf{x}_1 & \mathbf{0} \\ H_2 & \mathbf{0} & -\mathbf{x}_2 \end{bmatrix}$$

and

$$\mathbf{X}_\lambda = (X, Y, 1, \lambda_1, \lambda_2)^T.$$

We can solve the system via a least-squares method and obtain $\mathbf{X}_\lambda$, from which we extract $\mathbf{X}$ (the first 3 elements of $\mathbf{X}_\lambda$). The method can be applied to an arbitrary number of cameras: assuming we have $N$ cameras and know the $N$ 2D points $\mathbf{x}_i$, projections of the 2D world point $\mathbf{X}$ that needs to be reconstructed, we again get equation (4.8), with

$$A = \begin{bmatrix} H_1 & -\mathbf{x}_1 & \mathbf{0} & \cdots & \mathbf{0} \\ H_2 & \mathbf{0} & -\mathbf{x}_2 & \cdots & \mathbf{0} \\ \vdots & & & \ddots & \\ H_N & \mathbf{0} & \cdots & \mathbf{0} & -\mathbf{x}_N \end{bmatrix}, \qquad \mathbf{X}_\lambda = (\mathbf{X}^T, \lambda_1, \lambda_2, \ldots, \lambda_N)^T.$$
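The intersection method of (4.1)/(4.8) can be sketched in a few lines; the same stacked system covers both projection matrices and homographies, since only the block width changes (the two cameras below are hypothetical, with $K = I$; NumPy assumed):

```python
import numpy as np

def intersect(Ps, xs):
    """Reconstruct a homogeneous world point from N cameras by stacking
    the system A X_lambda = 0 of (4.1)/(4.8) and solving by SVD.
    Works for 3x4 projection matrices or 3x3 homographies alike."""
    N = len(Ps)
    d = Ps[0].shape[1]                      # 4 for P, 3 for H
    A = np.zeros((3 * N, d + N))
    for i, (P, x) in enumerate(zip(Ps, xs)):
        A[3*i:3*i+3, :d] = P
        A[3*i:3*i+3, d + i] = -x            # the -lambda_i x_i column
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1][:d]                          # first d elements of X_lambda
    return X / X[-1]                        # dehomogenised world point

# Hypothetical two-camera setup: K = I, distinct camera centres.
P1 = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [5.0]])])
X_true = np.array([0.5, -0.5, 1.0, 1.0])
xs = [P @ X_true for P in (P1, P2)]         # noiseless homogeneous pixels
X_rec = intersect([P1, P2], xs)
assert np.allclose(X_rec, X_true, atol=1e-9)
```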

4.2.2 Triangulation

We now consider camera triangulation for homographies. From (2.11) applied to cameras $C_1$ and $C_2$, we have

$$\lambda_1 \mathbf{x}_1 = H_1 \mathbf{X}, \qquad \lambda_2 \mathbf{x}_2 = H_2 \mathbf{X}. \tag{4.9}$$

We can rewrite (4.9) as

$$\mathbf{x}_1 \times (H_1 \mathbf{X}) = 0, \qquad \mathbf{x}_2 \times (H_2 \mathbf{X}) = 0, \tag{4.10}$$

where $\times$ again denotes the cross product. The first equation of (4.10) can be written as

$$\begin{pmatrix} x_1 \\ y_1 \\ 1 \end{pmatrix} \times \begin{pmatrix} \mathbf{h}_1^T \mathbf{X} \\ \mathbf{h}_2^T \mathbf{X} \\ \mathbf{h}_3^T \mathbf{X} \end{pmatrix} = 0, \tag{4.11}$$

where $\mathbf{h}_i^T$, $i = 1, 2, 3$, denotes the rows of matrix $H_1$. The same can be done for $H_2$. Expanding the cross product in the first equation of (4.10), we get

$$\begin{pmatrix} y_1 \mathbf{h}_3^T \mathbf{X} - \mathbf{h}_2^T \mathbf{X} \\ \mathbf{h}_1^T \mathbf{X} - x_1 \mathbf{h}_3^T \mathbf{X} \\ x_1 \mathbf{h}_2^T \mathbf{X} - y_1 \mathbf{h}_1^T \mathbf{X} \end{pmatrix} = 0. \tag{4.12}$$

Ignoring the third equation (being a linear combination of the other two), and repeating the same procedure for camera $C_2$ (whose rows we denote $\mathbf{h}_i^{2T}$), we can arrange the equations as

$$\begin{bmatrix} y_1 \mathbf{h}_3^{1T} - \mathbf{h}_2^{1T} \\ \mathbf{h}_1^{1T} - x_1 \mathbf{h}_3^{1T} \\ y_2 \mathbf{h}_3^{2T} - \mathbf{h}_2^{2T} \\ \mathbf{h}_1^{2T} - x_2 \mathbf{h}_3^{2T} \end{bmatrix} \mathbf{X} = A \mathbf{X} = 0. \tag{4.13}$$

Using the same procedure detailed for camera calibration, by taking the singular value decomposition of $A$ and using the last column of $V$ (normalized by its last element), we get the solution $\mathbf{X}$. The same procedure can be applied using $N$ cameras $C_1, \ldots, C_N$. Each camera adds two rows to equation (4.13), and we get

$$\begin{bmatrix} y_1 \mathbf{h}_3^{1T} - \mathbf{h}_2^{1T} \\ \mathbf{h}_1^{1T} - x_1 \mathbf{h}_3^{1T} \\ \vdots \\ y_N \mathbf{h}_3^{NT} - \mathbf{h}_2^{NT} \\ \mathbf{h}_1^{NT} - x_N \mathbf{h}_3^{NT} \end{bmatrix} \mathbf{X} = A \mathbf{X} = 0. \tag{4.14}$$

This homogeneous equation system can be solved with the same procedure defined for (4.13).
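The triangulation procedure of Sections 4.1.2/4.2.2 reduces to stacking the two rows per camera of (4.7)/(4.14) and taking the SVD. A minimal sketch, with a hypothetical two-camera setup ($K = I$, NumPy assumed):

```python
import numpy as np

def triangulate(Ps, xs):
    """Linear triangulation of Sec. 4.1.2: two rows per camera from the
    cross-product constraint x_i x (P_i X) = 0, solved by SVD."""
    rows = []
    for P, (x, y) in zip(Ps, xs):
        rows.append(y * P[2] - P[1])        # y p3^T - p2^T
        rows.append(P[0] - x * P[2])        # p1^T - x p3^T
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    X = Vt[-1]
    return X / X[-1]                        # normalize by the last element

# Hypothetical two-camera setup: K = I, distinct camera centres.
P1 = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [5.0]])])
X_true = np.array([0.5, -0.5, 1.0, 1.0])
xs = []
for P in (P1, P2):
    p = P @ X_true
    xs.append((p[0] / p[2], p[1] / p[2]))   # inhomogeneous pixels (x, y)
X_rec = triangulate([P1, P2], xs)
assert np.allclose(X_rec, X_true, atol=1e-9)
```

The same function handles the homography case of (4.14) by passing 3x3 matrices and returning a 3-vector $(X, Y, 1)^T$.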

Bibliography

[1] Agnieszka Bier and Leszek Luchowski. Error analysis of stereo calibration and reconstruction. In André Gagalowicz and Wilfried Philips, editors, Computer Vision/Computer Graphics Collaboration Techniques, volume 5496 of Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2009.

[2] G.H. Golub and C. Reinsch. Singular value decomposition and least squares solutions. Numerische Mathematik, 14(5):403-420, 1970.

[3] Richard Hartley and Andrew Zisserman. Multiple View Geometry in Computer Vision, 2nd edition. Cambridge University Press, 2004.

[4] Gérard Medioni and Sing Bing Kang. Emerging Topics in Computer Vision. Prentice Hall, 2004.

[5] F.A.N. Palmieri, F. Castaldo, and G. Marino. Harbour surveillance with cameras calibrated with AIS data. In Aerospace Conference, 2013 IEEE, pages 1-8, 2013.

[6] Théodore Papadopoulo and Manolis I. A. Lourakis. Estimating the Jacobian of the singular value decomposition: Theory and applications. In David Vernon, editor, ECCV (1), volume 1842 of Lecture Notes in Computer Science. Springer, 2000.

[7] Emanuele Trucco and Alessandro Verri. Introductory Techniques for 3-D Computer Vision. Prentice Hall, 1998.


Camera Models and Image Formation. Srikumar Ramalingam School of Computing University of Utah Camera Models and Image Formation Srikumar Ramalingam School of Computing University of Utah srikumar@cs.utah.edu VisualFunHouse.com 3D Street Art Image courtesy: Julian Beaver (VisualFunHouse.com) 3D

More information

Multiple View Geometry in Computer Vision

Multiple View Geometry in Computer Vision Multiple View Geometry in Computer Vision Prasanna Sahoo Department of Mathematics University of Louisville 1 Projective 3D Geometry (Back to Chapter 2) Lecture 6 2 Singular Value Decomposition Given a

More information

Stereo Observation Models

Stereo Observation Models Stereo Observation Models Gabe Sibley June 16, 2003 Abstract This technical report describes general stereo vision triangulation and linearized error modeling. 0.1 Standard Model Equations If the relative

More information

Recovering structure from a single view Pinhole perspective projection

Recovering structure from a single view Pinhole perspective projection EPIPOLAR GEOMETRY The slides are from several sources through James Hays (Brown); Silvio Savarese (U. of Michigan); Svetlana Lazebnik (U. Illinois); Bill Freeman and Antonio Torralba (MIT), including their

More information

Midterm Exam Solutions

Midterm Exam Solutions Midterm Exam Solutions Computer Vision (J. Košecká) October 27, 2009 HONOR SYSTEM: This examination is strictly individual. You are not allowed to talk, discuss, exchange solutions, etc., with other fellow

More information

Epipolar geometry. x x

Epipolar geometry. x x Two-view geometry Epipolar geometry X x x Baseline line connecting the two camera centers Epipolar Plane plane containing baseline (1D family) Epipoles = intersections of baseline with image planes = projections

More information

Short on camera geometry and camera calibration

Short on camera geometry and camera calibration Short on camera geometry and camera calibration Maria Magnusson, maria.magnusson@liu.se Computer Vision Laboratory, Department of Electrical Engineering, Linköping University, Sweden Report No: LiTH-ISY-R-3070

More information

55:148 Digital Image Processing Chapter 11 3D Vision, Geometry

55:148 Digital Image Processing Chapter 11 3D Vision, Geometry 55:148 Digital Image Processing Chapter 11 3D Vision, Geometry Topics: Basics of projective geometry Points and hyperplanes in projective space Homography Estimating homography from point correspondence

More information

Machine vision. Summary # 11: Stereo vision and epipolar geometry. u l = λx. v l = λy

Machine vision. Summary # 11: Stereo vision and epipolar geometry. u l = λx. v l = λy 1 Machine vision Summary # 11: Stereo vision and epipolar geometry STEREO VISION The goal of stereo vision is to use two cameras to capture 3D scenes. There are two important problems in stereo vision:

More information

Lecture 9: Epipolar Geometry

Lecture 9: Epipolar Geometry Lecture 9: Epipolar Geometry Professor Fei Fei Li Stanford Vision Lab 1 What we will learn today? Why is stereo useful? Epipolar constraints Essential and fundamental matrix Estimating F (Problem Set 2

More information

calibrated coordinates Linear transformation pixel coordinates

calibrated coordinates Linear transformation pixel coordinates 1 calibrated coordinates Linear transformation pixel coordinates 2 Calibration with a rig Uncalibrated epipolar geometry Ambiguities in image formation Stratified reconstruction Autocalibration with partial

More information

Perception and Action using Multilinear Forms

Perception and Action using Multilinear Forms Perception and Action using Multilinear Forms Anders Heyden, Gunnar Sparr, Kalle Åström Dept of Mathematics, Lund University Box 118, S-221 00 Lund, Sweden email: {heyden,gunnar,kalle}@maths.lth.se Abstract

More information

Stereo Vision. MAN-522 Computer Vision

Stereo Vision. MAN-522 Computer Vision Stereo Vision MAN-522 Computer Vision What is the goal of stereo vision? The recovery of the 3D structure of a scene using two or more images of the 3D scene, each acquired from a different viewpoint in

More information

Epipolar Geometry and Stereo Vision

Epipolar Geometry and Stereo Vision Epipolar Geometry and Stereo Vision Computer Vision Jia-Bin Huang, Virginia Tech Many slides from S. Seitz and D. Hoiem Last class: Image Stitching Two images with rotation/zoom but no translation. X x

More information

3D Reconstruction with two Calibrated Cameras

3D Reconstruction with two Calibrated Cameras 3D Reconstruction with two Calibrated Cameras Carlo Tomasi The standard reference frame for a camera C is a right-handed Cartesian frame with its origin at the center of projection of C, its positive Z

More information

Stereo Image Rectification for Simple Panoramic Image Generation

Stereo Image Rectification for Simple Panoramic Image Generation Stereo Image Rectification for Simple Panoramic Image Generation Yun-Suk Kang and Yo-Sung Ho Gwangju Institute of Science and Technology (GIST) 261 Cheomdan-gwagiro, Buk-gu, Gwangju 500-712 Korea Email:{yunsuk,

More information

MERGING POINT CLOUDS FROM MULTIPLE KINECTS. Nishant Rai 13th July, 2016 CARIS Lab University of British Columbia

MERGING POINT CLOUDS FROM MULTIPLE KINECTS. Nishant Rai 13th July, 2016 CARIS Lab University of British Columbia MERGING POINT CLOUDS FROM MULTIPLE KINECTS Nishant Rai 13th July, 2016 CARIS Lab University of British Columbia Introduction What do we want to do? : Use information (point clouds) from multiple (2+) Kinects

More information

Homogeneous Coordinates. Lecture18: Camera Models. Representation of Line and Point in 2D. Cross Product. Overall scaling is NOT important.

Homogeneous Coordinates. Lecture18: Camera Models. Representation of Line and Point in 2D. Cross Product. Overall scaling is NOT important. Homogeneous Coordinates Overall scaling is NOT important. CSED44:Introduction to Computer Vision (207F) Lecture8: Camera Models Bohyung Han CSE, POSTECH bhhan@postech.ac.kr (",, ) ()", ), )) ) 0 It is

More information

Camera Calibration Using Line Correspondences

Camera Calibration Using Line Correspondences Camera Calibration Using Line Correspondences Richard I. Hartley G.E. CRD, Schenectady, NY, 12301. Ph: (518)-387-7333 Fax: (518)-387-6845 Email : hartley@crd.ge.com Abstract In this paper, a method of

More information

An idea which can be used once is a trick. If it can be used more than once it becomes a method

An idea which can be used once is a trick. If it can be used more than once it becomes a method An idea which can be used once is a trick. If it can be used more than once it becomes a method - George Polya and Gabor Szego University of Texas at Arlington Rigid Body Transformations & Generalized

More information

Structure from Motion and Multi- view Geometry. Last lecture

Structure from Motion and Multi- view Geometry. Last lecture Structure from Motion and Multi- view Geometry Topics in Image-Based Modeling and Rendering CSE291 J00 Lecture 5 Last lecture S. J. Gortler, R. Grzeszczuk, R. Szeliski,M. F. Cohen The Lumigraph, SIGGRAPH,

More information

Computer Vision. Geometric Camera Calibration. Samer M Abdallah, PhD

Computer Vision. Geometric Camera Calibration. Samer M Abdallah, PhD Computer Vision Samer M Abdallah, PhD Faculty of Engineering and Architecture American University of Beirut Beirut, Lebanon Geometric Camera Calibration September 2, 2004 1 Computer Vision Geometric Camera

More information

Computer Vision: Lecture 3

Computer Vision: Lecture 3 Computer Vision: Lecture 3 Carl Olsson 2019-01-29 Carl Olsson Computer Vision: Lecture 3 2019-01-29 1 / 28 Todays Lecture Camera Calibration The inner parameters - K. Projective vs. Euclidean Reconstruction.

More information

Multiple Views Geometry

Multiple Views Geometry Multiple Views Geometry Subhashis Banerjee Dept. Computer Science and Engineering IIT Delhi email: suban@cse.iitd.ac.in January 2, 28 Epipolar geometry Fundamental geometric relationship between two perspective

More information

METRIC PLANE RECTIFICATION USING SYMMETRIC VANISHING POINTS

METRIC PLANE RECTIFICATION USING SYMMETRIC VANISHING POINTS METRIC PLANE RECTIFICATION USING SYMMETRIC VANISHING POINTS M. Lefler, H. Hel-Or Dept. of CS, University of Haifa, Israel Y. Hel-Or School of CS, IDC, Herzliya, Israel ABSTRACT Video analysis often requires

More information

Agenda. Rotations. Camera models. Camera calibration. Homographies

Agenda. Rotations. Camera models. Camera calibration. Homographies Agenda Rotations Camera models Camera calibration Homographies D Rotations R Y = Z r r r r r r r r r Y Z Think of as change of basis where ri = r(i,:) are orthonormal basis vectors r rotated coordinate

More information

CS 664 Slides #9 Multi-Camera Geometry. Prof. Dan Huttenlocher Fall 2003

CS 664 Slides #9 Multi-Camera Geometry. Prof. Dan Huttenlocher Fall 2003 CS 664 Slides #9 Multi-Camera Geometry Prof. Dan Huttenlocher Fall 2003 Pinhole Camera Geometric model of camera projection Image plane I, which rays intersect Camera center C, through which all rays pass

More information

Week 2: Two-View Geometry. Padua Summer 08 Frank Dellaert

Week 2: Two-View Geometry. Padua Summer 08 Frank Dellaert Week 2: Two-View Geometry Padua Summer 08 Frank Dellaert Mosaicking Outline 2D Transformation Hierarchy RANSAC Triangulation of 3D Points Cameras Triangulation via SVD Automatic Correspondence Essential

More information

Geometry of image formation

Geometry of image formation eometry of image formation Tomáš Svoboda, svoboda@cmp.felk.cvut.cz Czech Technical University in Prague, Center for Machine Perception http://cmp.felk.cvut.cz Last update: November 3, 2008 Talk Outline

More information

Rigid Body Motion and Image Formation. Jana Kosecka, CS 482

Rigid Body Motion and Image Formation. Jana Kosecka, CS 482 Rigid Body Motion and Image Formation Jana Kosecka, CS 482 A free vector is defined by a pair of points : Coordinates of the vector : 1 3D Rotation of Points Euler angles Rotation Matrices in 3D 3 by 3

More information

Vision Review: Image Formation. Course web page:

Vision Review: Image Formation. Course web page: Vision Review: Image Formation Course web page: www.cis.udel.edu/~cer/arv September 10, 2002 Announcements Lecture on Thursday will be about Matlab; next Tuesday will be Image Processing The dates some

More information

Calibrating a Structured Light System Dr Alan M. McIvor Robert J. Valkenburg Machine Vision Team, Industrial Research Limited P.O. Box 2225, Auckland

Calibrating a Structured Light System Dr Alan M. McIvor Robert J. Valkenburg Machine Vision Team, Industrial Research Limited P.O. Box 2225, Auckland Calibrating a Structured Light System Dr Alan M. McIvor Robert J. Valkenburg Machine Vision Team, Industrial Research Limited P.O. Box 2225, Auckland New Zealand Tel: +64 9 3034116, Fax: +64 9 302 8106

More information

Elements of Computer Vision: Multiple View Geometry. 1 Introduction. 2 Elements of Geometry. Andrea Fusiello

Elements of Computer Vision: Multiple View Geometry. 1 Introduction. 2 Elements of Geometry. Andrea Fusiello Elements of Computer Vision: Multiple View Geometry. Andrea Fusiello http://www.sci.univr.it/~fusiello July 11, 2005 c Copyright by Andrea Fusiello. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike

More information

Hartley - Zisserman reading club. Part I: Hartley and Zisserman Appendix 6: Part II: Zhengyou Zhang: Presented by Daniel Fontijne

Hartley - Zisserman reading club. Part I: Hartley and Zisserman Appendix 6: Part II: Zhengyou Zhang: Presented by Daniel Fontijne Hartley - Zisserman reading club Part I: Hartley and Zisserman Appendix 6: Iterative estimation methods Part II: Zhengyou Zhang: A Flexible New Technique for Camera Calibration Presented by Daniel Fontijne

More information

Multi-View Geometry Part II (Ch7 New book. Ch 10/11 old book)

Multi-View Geometry Part II (Ch7 New book. Ch 10/11 old book) Multi-View Geometry Part II (Ch7 New book. Ch 10/11 old book) Guido Gerig CS-GY 6643, Spring 2016 gerig@nyu.edu Credits: M. Shah, UCF CAP5415, lecture 23 http://www.cs.ucf.edu/courses/cap6411/cap5415/,

More information

Camera Calibration. Schedule. Jesus J Caban. Note: You have until next Monday to let me know. ! Today:! Camera calibration

Camera Calibration. Schedule. Jesus J Caban. Note: You have until next Monday to let me know. ! Today:! Camera calibration Camera Calibration Jesus J Caban Schedule! Today:! Camera calibration! Wednesday:! Lecture: Motion & Optical Flow! Monday:! Lecture: Medical Imaging! Final presentations:! Nov 29 th : W. Griffin! Dec 1

More information

CHAPTER 3. Single-view Geometry. 1. Consequences of Projection

CHAPTER 3. Single-view Geometry. 1. Consequences of Projection CHAPTER 3 Single-view Geometry When we open an eye or take a photograph, we see only a flattened, two-dimensional projection of the physical underlying scene. The consequences are numerous and startling.

More information

Flexible Calibration of a Portable Structured Light System through Surface Plane

Flexible Calibration of a Portable Structured Light System through Surface Plane Vol. 34, No. 11 ACTA AUTOMATICA SINICA November, 2008 Flexible Calibration of a Portable Structured Light System through Surface Plane GAO Wei 1 WANG Liang 1 HU Zhan-Yi 1 Abstract For a portable structured

More information

Structure from Motion. Prof. Marco Marcon

Structure from Motion. Prof. Marco Marcon Structure from Motion Prof. Marco Marcon Summing-up 2 Stereo is the most powerful clue for determining the structure of a scene Another important clue is the relative motion between the scene and (mono)

More information

Camera model and calibration

Camera model and calibration and calibration AVIO tristan.moreau@univ-rennes1.fr Laboratoire de Traitement du Signal et des Images (LTSI) Université de Rennes 1. Mardi 21 janvier 1 AVIO tristan.moreau@univ-rennes1.fr and calibration

More information

Jacobian of Point Coordinates w.r.t. Parameters of General Calibrated Projective Camera

Jacobian of Point Coordinates w.r.t. Parameters of General Calibrated Projective Camera Jacobian of Point Coordinates w.r.t. Parameters of General Calibrated Projective Camera Karel Lebeda, Simon Hadfield, Richard Bowden Introduction This is a supplementary technical report for ACCV04 paper:

More information

Projective geometry, camera models and calibration

Projective geometry, camera models and calibration Projective geometry, camera models and calibration Subhashis Banerjee Dept. Computer Science and Engineering IIT Delhi email: suban@cse.iitd.ac.in January 6, 2008 The main problems in computer vision Image

More information

Epipolar Geometry Prof. D. Stricker. With slides from A. Zisserman, S. Lazebnik, Seitz

Epipolar Geometry Prof. D. Stricker. With slides from A. Zisserman, S. Lazebnik, Seitz Epipolar Geometry Prof. D. Stricker With slides from A. Zisserman, S. Lazebnik, Seitz 1 Outline 1. Short introduction: points and lines 2. Two views geometry: Epipolar geometry Relation point/line in two

More information

Index. 3D reconstruction, point algorithm, point algorithm, point algorithm, point algorithm, 263

Index. 3D reconstruction, point algorithm, point algorithm, point algorithm, point algorithm, 263 Index 3D reconstruction, 125 5+1-point algorithm, 284 5-point algorithm, 270 7-point algorithm, 265 8-point algorithm, 263 affine point, 45 affine transformation, 57 affine transformation group, 57 affine

More information

Computer Vision. Coordinates. Prof. Flávio Cardeal DECOM / CEFET- MG.

Computer Vision. Coordinates. Prof. Flávio Cardeal DECOM / CEFET- MG. Computer Vision Coordinates Prof. Flávio Cardeal DECOM / CEFET- MG cardeal@decom.cefetmg.br Abstract This lecture discusses world coordinates and homogeneous coordinates, as well as provides an overview

More information

Partial Calibration and Mirror Shape Recovery for Non-Central Catadioptric Systems

Partial Calibration and Mirror Shape Recovery for Non-Central Catadioptric Systems Partial Calibration and Mirror Shape Recovery for Non-Central Catadioptric Systems Abstract In this paper we present a method for mirror shape recovery and partial calibration for non-central catadioptric

More information

Multi-view geometry problems

Multi-view geometry problems Multi-view geometry Multi-view geometry problems Structure: Given projections o the same 3D point in two or more images, compute the 3D coordinates o that point? Camera 1 Camera 2 R 1,t 1 R 2,t 2 Camera

More information

3D Geometry and Camera Calibration

3D Geometry and Camera Calibration 3D Geometry and Camera Calibration 3D Coordinate Systems Right-handed vs. left-handed x x y z z y 2D Coordinate Systems 3D Geometry Basics y axis up vs. y axis down Origin at center vs. corner Will often

More information

Structure from motion

Structure from motion Structure from motion Structure from motion Given a set of corresponding points in two or more images, compute the camera parameters and the 3D point coordinates?? R 1,t 1 R 2,t R 2 3,t 3 Camera 1 Camera

More information

Index. 3D reconstruction, point algorithm, point algorithm, point algorithm, point algorithm, 253

Index. 3D reconstruction, point algorithm, point algorithm, point algorithm, point algorithm, 253 Index 3D reconstruction, 123 5+1-point algorithm, 274 5-point algorithm, 260 7-point algorithm, 255 8-point algorithm, 253 affine point, 43 affine transformation, 55 affine transformation group, 55 affine

More information

C18 Computer Vision. Lecture 1 Introduction: imaging geometry, camera calibration. Victor Adrian Prisacariu.

C18 Computer Vision. Lecture 1 Introduction: imaging geometry, camera calibration. Victor Adrian Prisacariu. C8 Computer Vision Lecture Introduction: imaging geometry, camera calibration Victor Adrian Prisacariu http://www.robots.ox.ac.uk/~victor InfiniTAM Demo Course Content VP: Intro, basic image features,

More information

Camera model and calibration

Camera model and calibration Camera model and calibration Karel Zimmermann, zimmerk@cmp.felk.cvut.cz (some images taken from Tomáš Svoboda s) Czech Technical University in Prague, Center for Machine Perception http://cmp.felk.cvut.cz

More information

Projector Calibration for Pattern Projection Systems

Projector Calibration for Pattern Projection Systems Projector Calibration for Pattern Projection Systems I. Din *1, H. Anwar 2, I. Syed 1, H. Zafar 3, L. Hasan 3 1 Department of Electronics Engineering, Incheon National University, Incheon, South Korea.

More information

CS231M Mobile Computer Vision Structure from motion

CS231M Mobile Computer Vision Structure from motion CS231M Mobile Computer Vision Structure from motion - Cameras - Epipolar geometry - Structure from motion Pinhole camera Pinhole perspective projection f o f = focal length o = center of the camera z y

More information

Stereo Matching for Calibrated Cameras without Correspondence

Stereo Matching for Calibrated Cameras without Correspondence Stereo Matching for Calibrated Cameras without Correspondence U. Helmke, K. Hüper, and L. Vences Department of Mathematics, University of Würzburg, 97074 Würzburg, Germany helmke@mathematik.uni-wuerzburg.de

More information

MAPI Computer Vision. Multiple View Geometry

MAPI Computer Vision. Multiple View Geometry MAPI Computer Vision Multiple View Geometry Geometry o Multiple Views 2- and 3- view geometry p p Kpˆ [ K R t]p Geometry o Multiple Views 2- and 3- view geometry Epipolar Geometry The epipolar geometry

More information

ECE 470: Homework 5. Due Tuesday, October 27 in Seth Hutchinson. Luke A. Wendt

ECE 470: Homework 5. Due Tuesday, October 27 in Seth Hutchinson. Luke A. Wendt ECE 47: Homework 5 Due Tuesday, October 7 in class @:3pm Seth Hutchinson Luke A Wendt ECE 47 : Homework 5 Consider a camera with focal length λ = Suppose the optical axis of the camera is aligned with

More information

Computer Vision I - Appearance-based Matching and Projective Geometry

Computer Vision I - Appearance-based Matching and Projective Geometry Computer Vision I - Appearance-based Matching and Projective Geometry Carsten Rother 05/11/2015 Computer Vision I: Image Formation Process Roadmap for next four lectures Computer Vision I: Image Formation

More information

Module 4F12: Computer Vision and Robotics Solutions to Examples Paper 2

Module 4F12: Computer Vision and Robotics Solutions to Examples Paper 2 Engineering Tripos Part IIB FOURTH YEAR Module 4F2: Computer Vision and Robotics Solutions to Examples Paper 2. Perspective projection and vanishing points (a) Consider a line in 3D space, defined in camera-centered

More information

Computer Vision Projective Geometry and Calibration. Pinhole cameras

Computer Vision Projective Geometry and Calibration. Pinhole cameras Computer Vision Projective Geometry and Calibration Professor Hager http://www.cs.jhu.edu/~hager Jason Corso http://www.cs.jhu.edu/~jcorso. Pinhole cameras Abstract camera model - box with a small hole

More information

Camera models and calibration

Camera models and calibration Camera models and calibration Read tutorial chapter 2 and 3. http://www.cs.unc.edu/~marc/tutorial/ Szeliski s book pp.29-73 Schedule (tentative) 2 # date topic Sep.8 Introduction and geometry 2 Sep.25

More information

3-D D Euclidean Space - Vectors

3-D D Euclidean Space - Vectors 3-D D Euclidean Space - Vectors Rigid Body Motion and Image Formation A free vector is defined by a pair of points : Jana Kosecka http://cs.gmu.edu/~kosecka/cs682.html Coordinates of the vector : 3D Rotation

More information

Stereo II CSE 576. Ali Farhadi. Several slides from Larry Zitnick and Steve Seitz

Stereo II CSE 576. Ali Farhadi. Several slides from Larry Zitnick and Steve Seitz Stereo II CSE 576 Ali Farhadi Several slides from Larry Zitnick and Steve Seitz Camera parameters A camera is described by several parameters Translation T of the optical center from the origin of world

More information

Unit 3 Multiple View Geometry

Unit 3 Multiple View Geometry Unit 3 Multiple View Geometry Relations between images of a scene Recovering the cameras Recovering the scene structure http://www.robots.ox.ac.uk/~vgg/hzbook/hzbook1.html 3D structure from images Recover

More information

1 Projective Geometry

1 Projective Geometry CIS8, Machine Perception Review Problem - SPRING 26 Instructions. All coordinate systems are right handed. Projective Geometry Figure : Facade rectification. I took an image of a rectangular object, and

More information

CSE 252B: Computer Vision II

CSE 252B: Computer Vision II CSE 252B: Computer Vision II Lecturer: Serge Belongie Scribe: Haowei Liu LECTURE 16 Structure from Motion from Tracked Points 16.1. Introduction In the last lecture we learned how to track point features

More information

Last lecture. Passive Stereo Spacetime Stereo

Last lecture. Passive Stereo Spacetime Stereo Last lecture Passive Stereo Spacetime Stereo Today Structure from Motion: Given pixel correspondences, how to compute 3D structure and camera motion? Slides stolen from Prof Yungyu Chuang Epipolar geometry

More information

A Compact Algorithm for Rectification of Stereo Pairs

A Compact Algorithm for Rectification of Stereo Pairs Machine Vision and Applications manuscript No. (will be inserted by the editor) A Compact Algorithm for Rectification of Stereo Pairs Andrea Fusiello Emanuele Trucco Alessandro Verri Received: 25 February

More information

Structure from motion

Structure from motion Structure from motion Structure from motion Given a set of corresponding points in two or more images, compute the camera parameters and the 3D point coordinates?? R 1,t 1 R 2,t 2 R 3,t 3 Camera 1 Camera

More information

CSE 252B: Computer Vision II

CSE 252B: Computer Vision II CSE 252B: Computer Vision II Lecturer: Serge Belongie Scribe: Jayson Smith LECTURE 4 Planar Scenes and Homography 4.1. Points on Planes This lecture examines the special case of planar scenes. When talking

More information

Camera Geometry II. COS 429 Princeton University

Camera Geometry II. COS 429 Princeton University Camera Geometry II COS 429 Princeton University Outline Projective geometry Vanishing points Application: camera calibration Application: single-view metrology Epipolar geometry Application: stereo correspondence

More information

Degeneracy of the Linear Seventeen-Point Algorithm for Generalized Essential Matrix

Degeneracy of the Linear Seventeen-Point Algorithm for Generalized Essential Matrix J Math Imaging Vis 00 37: 40-48 DOI 0007/s085-00-09-9 Authors s version The final publication is available at wwwspringerlinkcom Degeneracy of the Linear Seventeen-Point Algorithm for Generalized Essential

More information

Structure from motion

Structure from motion Multi-view geometry Structure rom motion Camera 1 Camera 2 R 1,t 1 R 2,t 2 Camera 3 R 3,t 3 Figure credit: Noah Snavely Structure rom motion? Camera 1 Camera 2 R 1,t 1 R 2,t 2 Camera 3 R 3,t 3 Structure:

More information

Outline. ETN-FPI Training School on Plenoptic Sensing

Outline. ETN-FPI Training School on Plenoptic Sensing Outline Introduction Part I: Basics of Mathematical Optimization Linear Least Squares Nonlinear Optimization Part II: Basics of Computer Vision Camera Model Multi-Camera Model Multi-Camera Calibration

More information