Fisheye Camera's Intrinsic Parameter Estimation Using Trajectories of Feature Points Obtained from Camera Rotation


Akihiko Hishigi, Yuki Tanaka, Gakuto Masuyama, and Kazunori Umeda

Abstract: This paper proposes a simple method of estimating a fisheye camera's intrinsic parameters without calibration targets. The method exploits the trajectories of feature points in the scene, obtained by rotating the camera around the vertical axis. Because natural feature points are used for calibration, no dedicated calibration target is required. The intrinsic parameters of a real fisheye camera are estimated, and the validity of the method is verified by perspective projection of a distorted image using the estimated parameters. In addition, the results of perspective projection using the proposed method and a conventional method are compared.

I. INTRODUCTION

Fisheye cameras are widely used as sensors to acquire information from the external world. A fisheye camera has a wide angle of view (almost 180 [deg]) and can measure a wide range at once. It is therefore effective for building sensor systems that require wide-range measurement at low cost, such as driver assistance systems and monitoring systems. However, images obtained from a fisheye camera have intractable distortions, so the intrinsic parameters must be estimated accurately to remove the distortion by perspective projection. The intrinsic parameters represent the individual differences of each lens, and the accuracy of their estimation affects subsequent image-processing results. Many studies have estimated the intrinsic parameters of fisheye cameras [1]-[6]. However, these methods are often cumbersome because they require a special target. In Scaramuzza's method [7], a checkerboard pattern must be presented repeatedly to calibrate the fisheye camera.
Kannala and Brandt [8] estimated the intrinsic parameters of a fisheye camera from a single shot of each pose of the calibration board. However, that method is also cumbersome because of the size of the calibration board (2 × 3 [m²]). In general, the quality of the calibration depends on how, and how many times, the target is shown. Therefore, this paper develops a marker-free calibration method. As shown in Figure 1, the method estimates the intrinsic parameters of a fisheye camera using the trajectories of feature points obtained by rotating the camera; the camera is rotated around the vertical axis through its optical center. Note that a trajectory in this paper is defined as the set of coordinates of a feature point, excluding the temporal information that the word usually implies. A brief overview of the proposed method is given in [9]. In this paper, we present the details of the method with two minor modifications: the camera model and the evaluation function. Furthermore, the method is verified by simulation and by an experiment with actual equipment, and the estimation results are compared with an existing approach.

II. INTRINSIC PARAMETERS OF THE FISHEYE CAMERA

A. Model of the fisheye camera

This paper uses the generic omnidirectional camera model proposed by Scaramuzza [7]. Figure 2 shows an outline of the camera model. A 3D position P = [X Y Z]^T is related to the 2D image position p = [u v]^T and the image center p0 = [u0 v0]^T by

P = [X Y Z]^T ∝ [u − u0, v − v0, f(ρ)]^T,  (1)

where the origin of the image coordinate system is the upper-left corner and ∝ denotes equivalence up to scale. ρ is the distance from the image center p0 to the 2D position p = [u v]^T:

ρ = √((u − u0)² + (v − v0)²).  (2)

f(ρ) is a polynomial in ρ:

f(ρ) = a0 + a1·ρ + a2·ρ² + a3·ρ³ + a4·ρ⁴ + ⋯.  (3)

In this paper, f(ρ) is a fourth-order polynomial. Including the image center, the intrinsic parameters to be estimated are

I = [a0 a1 a2 a3 a4 u0 v0]^T.  (4)

Akihiko Hishigi and Yuki Tanaka are with the Course of Precision Engineering, School of Science and Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo, Japan. Gakuto Masuyama and Kazunori Umeda are with the Department of Precision Mechanics, Faculty of Science and Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo, Japan (e-mail: {masuyama, umeda}@mech.chuo-u.ac.jp).

Fig. 1. Estimation of intrinsic parameters [9]
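Under this model, a pixel corresponds to a viewing ray through the single effective viewpoint. The back-projection of eqs. (1)-(3) can be sketched as follows; the function name and the parameter values in the example are illustrative placeholders, not calibrated values.

```python
import numpy as np

def pixel_to_ray(u, v, intrinsics):
    """Back-project a pixel to a unit 3D ray direction, following the
    generic omnidirectional model of eqs. (1)-(3).
    intrinsics = (a0, a1, a2, a3, a4, u0, v0); illustrative values only."""
    a0, a1, a2, a3, a4, u0, v0 = intrinsics
    du, dv = u - u0, v - v0
    rho = np.hypot(du, dv)  # distance from the image center, eq. (2)
    # fourth-order polynomial f(rho), eq. (3)
    f_rho = a0 + a1*rho + a2*rho**2 + a3*rho**3 + a4*rho**4
    ray = np.array([du, dv, f_rho])  # eq. (1), up to scale
    return ray / np.linalg.norm(ray)
```

At the image center (ρ = 0) the ray reduces to the optical axis, since the first two components vanish and the third is a0.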

Fig. 2. Camera model
Fig. 3. 3D coordinates
Fig. 4. Shape differences between the two cameras: (a) camera 1, (b) camera 2

B. Projection of the 3D point

In this paper, the 3D point P is represented by azimuth α and elevation β, utilizing the rotation of the camera, as shown in Figure 3. From eq. (1), P is projected onto the image as

p = [u v]^T = [tanα · f(ρ) + u0,  √(tan²α + 1) · tanβ · f(ρ) + v0]^T,  (5)

where ρ is the real solution of eq. (6), with f(ρ) given by the polynomial (3). From eq. (1) and Figure 3, f(ρ) satisfies

ρ / f(ρ) = √(X² + Y²) / Z = √(tan²α + (tan²α + 1) tan²β).  (6)

III. THE ESTIMATION METHOD

This method assumes that the fisheye camera is installed so that the following conditions are satisfied:
- The rotation axis of the camera passes through the optical center.
- The optical axis and the rotation axis cross one another perpendicularly.

Trajectories are captured by rotating the camera. These trajectories contain information about the azimuth α, the elevation β, and the intrinsic parameters of the camera. Figure 4 shows the shapes of trajectories simulated with different intrinsic parameters. Our idea is to estimate the intrinsic parameters from the geometric shapes of the trajectories, which reflect the differences between the intrinsic parameters. The intrinsic parameters of the fisheye camera are estimated by minimizing an evaluation function whose arguments are the intrinsic parameters and the 3D coordinates of the observation points. The flow of the estimation is as follows.

A. Definition of the evaluation function

First, the following constraint is obtained by eliminating α from eq. (5):

v_ri − v0 − √((u_fi − u0)² + f²(ρ)) · tanβ = 0,  (7)

where u_fi is obtained from the observation point p_fi = [u_fi v_fi]^T. We can then calculate the re-projected point p_ri = [u_ri v_ri]^T, where u_ri = u_fi and v_ri is the solution of eq. (7).
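Equations (5) and (6) can be turned into a small projection routine: for a given α and β, solve ρ = c · f(ρ), where c is the right-hand side of eq. (6), for its positive real root, then apply eq. (5). The following sketch assumes a positive-f(ρ) convention and uses illustrative parameter values; it is not the authors' implementation.

```python
import numpy as np

def project_alpha_beta(alpha, beta, intrinsics):
    """Project a direction given by azimuth alpha and elevation beta
    onto the fisheye image via eqs. (5)-(6). Illustrative sketch."""
    a0, a1, a2, a3, a4, u0, v0 = intrinsics
    ta, tb = np.tan(alpha), np.tan(beta)
    c = np.sqrt(ta**2 + (ta**2 + 1.0) * tb**2)  # RHS of eq. (6)
    if c == 0.0:                                # on the optical axis
        return u0, v0
    # rho = c*f(rho) rearranged to polynomial form:
    # c*a4*rho^4 + c*a3*rho^3 + c*a2*rho^2 + (c*a1 - 1)*rho + c*a0 = 0
    coeffs = [c*a4, c*a3, c*a2, c*a1 - 1.0, c*a0]
    roots = np.roots(coeffs)
    # take the smallest non-negative real root as the physical solution
    rho = min(r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0)
    f_rho = a0 + a1*rho + a2*rho**2 + a3*rho**3 + a4*rho**4
    u = ta * f_rho + u0                          # eq. (5), u component
    v = np.sqrt(ta**2 + 1.0) * tb * f_rho + v0   # eq. (5), v component
    return u, v
```

By construction the result satisfies eq. (6): the distance of the projected pixel from the center, divided by f(ρ), equals c.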
The evaluation function E is defined as the sum of the squared differences between the observation points p_fi and the re-projection points p_ri:

E = Σ_{i=1}^{N} (v_fi − v_ri)²,  (8)

where N is the number of observation points. E is computed from the elevation β of each observation point, the intrinsic parameters I, and p_fi. The intrinsic parameters I are estimated by an optimization technique, and the elevation angle β of each feature point is estimated by iterative calculation, as described in the next section. In this paper, a modified Powell method [10] is used to minimize the evaluation function.

B. Flow of the estimation

Figure 5 shows the flow of the estimation. First, the intrinsic parameters are initialized: appropriate initial values are given to a0-a4, and u0 and v0 are calculated from the symmetry of each trajectory (details of the initialization are described in Section III.C). Next, the elevation angle of each point in a trajectory is solved from eq. (7) while the intrinsic parameters are kept fixed, and the average of these elevation angles is taken as the β of the trajectory. The intrinsic parameters are then estimated so as to minimize the evaluation function using the obtained β. By iterating this procedure, the intrinsic parameters and the elevations are updated alternately. Note that the estimation of a0-a4 is repeated while increasing the number of intrinsic parameters; we found that this heuristic improves the stability of the estimation process.

Fig. 5. Flow of estimation: Start → set I → estimate a0 → estimate a0, a2 → estimate a0, a2, a3 → estimate a0, a2, a3, a4 → (loop 1) → estimate u0, v0 → (loop 2) → (loop 3) → End
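The constraint (7) and the evaluation function (8) can be sketched in code: v_ri is obtained by fixed-point iteration on eq. (7) with u_ri = u_fi held fixed (ρ depends on v_ri, which makes the constraint implicit), and E sums the squared vertical residuals. Parameter values in the example are illustrative, and the modified Powell minimization itself is not reproduced here.

```python
import numpy as np

def reproject_v(u_f, v_init, beta, intr, iters=50):
    """Solve eq. (7) for v_ri by fixed-point iteration,
    keeping u_ri = u_fi fixed. Illustrative sketch."""
    a0, a1, a2, a3, a4, u0, v0 = intr
    v = v_init
    for _ in range(iters):
        rho = np.hypot(u_f - u0, v - v0)      # eq. (2) at the re-projected point
        f = a0 + a1*rho + a2*rho**2 + a3*rho**3 + a4*rho**4
        v = v0 + np.sqrt((u_f - u0)**2 + f**2) * np.tan(beta)  # eq. (7)
    return v

def evaluation_E(intr, betas, points):
    """Eq. (8): sum of squared vertical residuals over all observation
    points; `points` is a list of (u_fi, v_fi, trajectory_id)."""
    return sum((v_f - reproject_v(u_f, v_f, betas[tid], intr))**2
               for u_f, v_f, tid in points)
```

For an observation that is exactly consistent with the model, the residual vanishes, so E is zero; the outer optimizer varies the intrinsic parameters to drive E toward zero for real observations.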

Fig. 6. Calculation of v0
Fig. 7. An example of artificial feature points

C. Initializing values

Initial values affect the accuracy of the estimated results. a0-a4 are almost the same among fisheye cameras, so we set their initial values empirically; the manual initial guess for these parameters can be very rough and is therefore not difficult. An initial value of u0 is determined using trajectories approximated by quadratic curves: u0 is set to the average of u at the extremum of each quadratic curve. However, u at an extremum tends to deviate when a trajectory is nearly linear; therefore, we may exclude such trajectories from the initialization by comparing v at the endpoints of each trajectory. v0 is calculated from the correspondence between the curvature and the vertex coordinate: the quadratic coefficient and v at the extremum of each trajectory are plotted on a 2D plane. Figure 6 shows an example of the plot, in which the horizontal axis is v at the extremum and the vertical axis is the quadratic coefficient. We use points positioned near the center of the fisheye image for a linear approximation, and v0 is determined from the approximated line by substituting 0 for the quadratic coefficient.

IV. SIMULATION

When the trajectories of feature points are captured with an actual camera, ideal trajectories cannot be obtained because of various factors, such as camera installation errors and feature-point detection and tracking errors. To verify the accuracy of the proposed method and the influence of these errors, a simulation was conducted using an artificial image.

A. Setting of the environment

An artificial image was produced under the assumption that the optical axis of the camera is rotated around the optical center while remaining parallel to the horizontal plane. To verify the influence of the measurement error of the feature points, normally distributed artificial noise was added to u_fi and v_fi independently.
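The image-center initialization of Section III.C can be sketched as follows: fit each trajectory with a quadratic v = a·u² + b·u + c, average the extremum u for u0, then extrapolate the line through the (vertex v, coefficient a) points to a = 0 for v0. The sketch omits the filtering of nearly linear trajectories, and the helper name is hypothetical.

```python
import numpy as np

def init_center(trajectories):
    """Initialize the image center (u0, v0) from feature-point
    trajectories, each a list of (u, v) pairs. Illustrative sketch."""
    coeffs, vertex_u, vertex_v = [], [], []
    for traj in trajectories:
        u = np.array([p[0] for p in traj])
        v = np.array([p[1] for p in traj])
        a, b, c = np.polyfit(u, v, 2)          # quadratic fit of the trajectory
        uv = -b / (2.0 * a)                    # u at the extremum
        coeffs.append(a)
        vertex_u.append(uv)
        vertex_v.append(np.polyval([a, b, c], uv))
    u0 = float(np.mean(vertex_u))              # average of extremum u
    # linear fit of quadratic coefficient vs. vertex v (Fig. 6)
    slope, intercept = np.polyfit(vertex_v, coeffs, 1)
    v0 = -intercept / slope                    # v where the coefficient vanishes
    return u0, v0
```

A trajectory passing through the image center is straight (zero quadratic coefficient), which is why the a = 0 intercept recovers v0.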
The mean was 0.0, and the standard deviation σ was 0.0, 0.5, 1.0, or 2.0 [pixel]. We performed 10 experiments for each setting. Figure 7 shows an example of an artificial image used in the experiment; there were 820 observation points in each image. The initial values of a0-a4 are shown in Table I. To estimate the initial value of v0 stably, four trajectories near the image center were used.

B. Experiment results

The estimated value of each intrinsic parameter and the final value of the evaluation function are shown in Table II. Figure 8 shows the observation points and re-projected points at the end of the experiment; only the upper-right corner of the image is shown. Each blue dot is an observation point, and each red rectangle is a re-projection point. As Table II shows, the difference between the average and the true value of the image center was within 1 [pixel] in every experiment; thus stable convergence was confirmed even when the measurement error was large. However, the evaluation function was not 0 even when the standard deviation was 0.0 [pixel]. The reason is considered to be that the evaluation function converges to a local optimum owing to the calculation of the elevation angle. It is difficult to verify the accuracy of a0-a4 directly. Instead, we compared the trajectories of the re-projected points obtained under the following conditions:
I. a0-a4: estimated values; u0, v0: true values.
II. a0-a4: true values; u0, v0: true values.
The trajectories were assessed by the average distance between each pair of re-projected points. The results are shown in Figure 9, where the horizontal axis is the σ of the noise, the vertical axis is the average distance between re-projected points, and the error bars represent the standard deviation. When σ was 1.0 [pixel] and 2.0 [pixel], a large re-projection error was observed once in 10 trials; these cases were excluded from the estimated results as estimation failures.
As Figure 9 shows, when the standard deviation was 0.0 [pixel], the average re-projection error was 0.17 [pixel] per point. As Table II shows, the re-projection error was large despite the small value of the evaluation function; again, the reason is considered to be convergence to a local optimum. When the standard deviation was 2.0 [pixel], the average re-projection error was 1.38 [pixel]. Further improvements in accuracy and stability are required for practical use; in particular, the convergence to a local optimum must be addressed. We are considering adding another constraint to the optimization process to avoid premature convergence.

TABLE I. Initial intrinsic parameters a0, a1, a2, a3, a4.

TABLE II. Average value and standard deviation of 10 experiments: true value and Ave./S.D. for σ = 0.0, 0.5, 1.0, 2.0 of a0, a1, a2 (× 10^4), a3 (× 10^7), a4 (× 10^9), u0 [pixel], v0 [pixel], and E [pixel²].

Fig. 8. Examples of the observation points and re-projection points of each experiment: (a) σ = 0.0 [pixel] (E = 32.70 [pixel²]), (b) σ = 0.5 [pixel], (c) σ = 1.0 [pixel], (d) σ = 2.0 [pixel].

Fig. 9. Re-projection error (average re-projection error [pixel] versus standard deviation σ of the sensing error).

Fig. 10. Re-projection error for the previous evaluation function (average re-projection error [pixel] versus standard deviation σ of the sensing error): (a) current camera model, (b) previous camera model.

C. Comparison with the previous camera model and evaluation function

To validate the differences between the camera models, we applied the previous evaluation function [9] to the current camera model and to the previous camera model. The results are shown in Figure 10. As the figure shows, the previous camera model is more accurate than the current one. However, the previous camera model was unstable, and the evaluation function failed to converge several times: in 10 trials per standard deviation, 3, 2, and 3 failures occurred for σ = 0.5, 1.0, and 2.0 [pixel], respectively. With the current camera model, on the other hand, the estimation never diverged. To summarize, the current model's estimation is easier (it does not require the azimuth angle) and more stable than the previous model's, but less accurate. By comparing Figure 9 and Figure 10(a), we can validate the differences between the evaluation functions: the current evaluation function is more accurate, while its stability is slightly inferior, since a few failures occurred for the current evaluation function, as mentioned in Section IV.B.

V. INTRINSIC PARAMETER ESTIMATION EXPERIMENT WITH ACTUAL EQUIPMENT

A. Generation of feature point trajectories

Figure 11 shows the fisheye camera mounted on a tripod. The CCD camera is a Point Grey Research Dragonfly2. The fisheye lens is a SPACE TV 1634M; its focal length is 1.6 [mm]. The experiment with actual equipment used images captured by rotating the mounted camera. The trajectory of each feature point was obtained by sequentially matching feature points between two successive images. In this paper, the AKAZE feature [11] was used for detection and matching; AKAZE gives feature locations with sub-pixel accuracy. The Euclidean distance between feature points matched across two successive images is used to judge whether a match is successful: if the distance exceeds a threshold, the pair is regarded as incorrectly matched. The camera was rotated manually, and images were captured continuously at regular intervals. Figure 12 depicts the transitions of the upper-left corner of the whiteboard. It is desirable to obtain dense feature points by tracking between successive images; to estimate the intrinsic parameters stably and accurately, we chose full trajectories manually. Figure 13 shows the environment of the experiment, and the trajectories generated by tracking are shown in Figure 14.

Fig. 11. Fisheye camera mounted on a tripod
Fig. 12. Capturing feature point trajectories
Fig. 13. Environment of the experiment
Fig. 14. Feature point trajectories

TABLE III. Intrinsic parameters estimated by the proposed method and the existing method: a0, a1, a2 (× 10^4), a3 (× 10^7), a4 (× 10^9), u0 [pixel], v0 [pixel].

B. Conditions of intrinsic parameter estimation

The initial values of a0-a4 are the same as those in Table I. The image center was calculated by the method explained in Section III.C.

C. Results of the experiment

Figure 15 shows the re-projected points obtained with the estimated intrinsic parameters. As the figure shows, the minimization of the evaluation function was generally performed correctly. Table III shows the estimated parameters together with those of the existing method. Since the true values of the intrinsic parameters cannot be known, the evaluation was performed by perspective projection of the fisheye image using the estimated intrinsic parameters, and the result was compared with the existing method. Using Scaramuzza's method [6], the intrinsic parameters were calculated for the same fisheye camera; twenty images of the checkerboard pattern were used for the estimation. The input image to be converted is shown in Figure 16, and the perspective projections using the results of each technique are shown in Figure 17. As Figure 17(a) shows, the lines of the checkerboard pattern appear straight; compared with Figure 17(b), the distortion has been removed to the same extent. Therefore, the estimated intrinsic parameters can be considered approximately correct. We also conducted a quantitative evaluation by fitting lines to the lattice points of the checkerboard pattern after transforming the fisheye image into a perspective image with the estimated intrinsic parameters. We extracted 10 vertical lines and 7 horizontal lines from Figure 17 and evaluated the deviation of the lattice points from the fitted lines. The average standard deviation was 0.35 [pixel] for the proposed method and 0.16 [pixel] for the conventional method. Although the deviation is sufficiently small for the proposed method, its estimation accuracy is inferior to that of the conventional method.
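The straightness evaluation can be sketched as a line fit followed by the standard deviation of point-to-line distances. The total-least-squares fit below is an assumption of this sketch; the paper does not specify its line-fitting procedure.

```python
import numpy as np

def line_fit_std(points):
    """Fit a line to lattice points of the rectified checkerboard and
    return the standard deviation of the perpendicular deviations.
    Illustrative sketch using a total-least-squares fit."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # the best-fit line direction is the principal axis of the points;
    # the last right singular vector is the unit normal to that line
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    d = centered @ normal          # signed perpendicular distances
    return float(np.std(d))
```

Collinear lattice points give a deviation of zero, so smaller values indicate straighter lines and hence better distortion removal.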
We consider that there is room for further improvement in accuracy. The slight remaining distortion is thought to be caused by the installed camera violating the presuppositions. As for the discrepancy between the optical center and the rotation axis, its influence can be reduced by measuring distant feature points. Another cause of the slight distortion is considered to be the uneven distribution of trajectories: only trajectories tracked completely successfully were used for estimation in this experiment. Accuracy would be further improved by also using non-full trajectories, in which tracking fails partway and then succeeds again.

Fig. 15. Re-projected points
Fig. 16. Input image
Fig. 17. Results of perspective projection: (a) proposed method, (b) conventional method

VI. CONCLUSION

This paper proposed a method of estimating a fisheye camera's intrinsic parameters without calibration targets. An experiment was conducted using actual equipment; as a result, the minimization of the evaluation function was performed correctly, and an image with the distortion removed could be obtained by perspective projection with the estimated parameters. In future work, further accuracy and stability improvements will be pursued by solving the remaining problems: the discrepancy between the optical center and the rotation axis, and the uneven distribution of trajectories.

REFERENCES
[1] Z. Zhang, "A Flexible New Technique for Camera Calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 11, pp. 1330-1334, 2000.
[2] P. Carr, Y. Sheikh, and I. Matthews, "Point-less Calibration: Camera Parameters from Gradient-based Alignment to Edge Images," Proceedings of the 2012 IEEE Workshop on Applications of Computer Vision (WACV '12), 2012.
[3] C. Hughes, P. Denny, M. Glavin, and E. Jones, "Equidistant Fish-Eye Calibration and Rectification by Vanishing Point Extraction," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 32, No. 12, 2010.
[4] H. Bakstein and T. Pajdla, "Panoramic Mosaicing with a 180° Field of View Lens," Proceedings of the 3rd Workshop on Omnidirectional Vision (OMNIVIS '02), pp. 60-67, 2002.
[5] L. Heng, B. Li, and M. Pollefeys, "CamOdoCal: Automatic Intrinsic and Extrinsic Calibration of a Rig with Multiple Generic Cameras and Odometry," Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), 2013.
[6] M. Rufli, D. Scaramuzza, and R. Siegwart, "Automatic Detection of Checkerboards on Blurred and Distorted Images," Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2008), 2008.
[7] D. Scaramuzza, A. Martinelli, and R. Siegwart, "A Toolbox for Easily Calibrating Omnidirectional Cameras," Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2006), 2006.
[8] J. Kannala and S. S. Brandt, "A Generic Camera Model and Calibration Method for Conventional, Wide-Angle, and Fish-Eye Lenses," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, No. 8, pp. 1335-1340, 2006.
[9] Y. Tanaka, G. Masuyama, and K. Umeda, "Intrinsic Parameter Estimation of a Fisheye Camera Using Trajectories of Feature Points when Rotating the Camera," Proceedings of the 6th International Conference on Advanced Mechatronics (ICAM2015), 2015.
[10] M. J. D. Powell, "An Efficient Method for Finding the Minimum of a Function of Several Variables without Calculating Derivatives," Computer Journal, Vol. 7, No. 2, pp. 155-162, 1964.
[11] P. F. Alcantarilla, J. Nuevo, and A. Bartoli, "Fast Explicit Diffusion for Accelerated Features in Nonlinear Scale Spaces," Proceedings of the British Machine Vision Conference (BMVC), pp. 1-11, 2013.


More information

Ground Plane Motion Parameter Estimation For Non Circular Paths

Ground Plane Motion Parameter Estimation For Non Circular Paths Ground Plane Motion Parameter Estimation For Non Circular Paths G.J.Ellwood Y.Zheng S.A.Billings Department of Automatic Control and Systems Engineering University of Sheffield, Sheffield, UK J.E.W.Mayhew

More information

Moving Object Detection Using a Stereo Camera Mounted on a Moving Platform

Moving Object Detection Using a Stereo Camera Mounted on a Moving Platform Special Issue on Smart Sensing SICE Journal of Control, Measurement, and System Integration, Vol 10, No 5, pp 344 349, September 2017 Moving Object Detection Using a Stereo Camera Mounted on a Moving Platform

More information

ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW

ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW Thorsten Thormählen, Hellward Broszio, Ingolf Wassermann thormae@tnt.uni-hannover.de University of Hannover, Information Technology Laboratory,

More information

Calibrating an Overhead Video Camera

Calibrating an Overhead Video Camera Calibrating an Overhead Video Camera Raul Rojas Freie Universität Berlin, Takustraße 9, 495 Berlin, Germany http://www.fu-fighters.de Abstract. In this section we discuss how to calibrate an overhead video

More information

Project Title: Welding Machine Monitoring System Phase II. Name of PI: Prof. Kenneth K.M. LAM (EIE) Progress / Achievement: (with photos, if any)

Project Title: Welding Machine Monitoring System Phase II. Name of PI: Prof. Kenneth K.M. LAM (EIE) Progress / Achievement: (with photos, if any) Address: Hong Kong Polytechnic University, Phase 8, Hung Hom, Kowloon, Hong Kong. Telephone: (852) 3400 8441 Email: cnerc.steel@polyu.edu.hk Website: https://www.polyu.edu.hk/cnerc-steel/ Project Title:

More information

Employing a Fish-Eye for Scene Tunnel Scanning

Employing a Fish-Eye for Scene Tunnel Scanning Employing a Fish-Eye for Scene Tunnel Scanning Jiang Yu Zheng 1, Shigang Li 2 1 Dept. of Computer Science, Indiana University Purdue University Indianapolis, 723 W. Michigan St. Indianapolis, IN46202,

More information

Camera calibration for miniature, low-cost, wide-angle imaging systems

Camera calibration for miniature, low-cost, wide-angle imaging systems Camera calibration for miniature, low-cost, wide-angle imaging systems Oliver Frank, Roman Katz, Christel-Loic Tisse and Hugh Durrant-Whyte ARC Centre of Excellence for Autonomous Systems University of

More information

CSCI 5980: Assignment #3 Homography

CSCI 5980: Assignment #3 Homography Submission Assignment due: Feb 23 Individual assignment. Write-up submission format: a single PDF up to 3 pages (more than 3 page assignment will be automatically returned.). Code and data. Submission

More information

Flexible Calibration of a Portable Structured Light System through Surface Plane

Flexible Calibration of a Portable Structured Light System through Surface Plane Vol. 34, No. 11 ACTA AUTOMATICA SINICA November, 2008 Flexible Calibration of a Portable Structured Light System through Surface Plane GAO Wei 1 WANG Liang 1 HU Zhan-Yi 1 Abstract For a portable structured

More information

Calibration of Fish-Eye Stereo Camera for Aurora Observation

Calibration of Fish-Eye Stereo Camera for Aurora Observation Calibration of Fish-Eye Stereo Camera for Aurora Observation Yoshiki Mori 1, Atsushi Yamashita 2, Masayuki Tanaka 3, Ryuho Kataoka 3 Yoshizumi Miyoshi 4, Toru Kaneko 1, Masatoshi Okutomi 3, Hajime Asama

More information

Calibration method for a central catadioptric-perspective camera system

Calibration method for a central catadioptric-perspective camera system 514 J. Opt. Soc. Am. A / Vol. 9, No. 11 / November 01 He et al. Calibration method for a central catadioptric-perspective camera system Bingwei He, 1, * Zhipeng Chen, 1 and Youfu Li 1 School of Mechanical

More information

A New Method and Toolbox for Easily Calibrating Omnidirectional Cameras

A New Method and Toolbox for Easily Calibrating Omnidirectional Cameras A ew Method and Toolbox for Easily Calibrating Omnidirectional Cameras Davide Scaramuzza 1 and Roland Siegwart 1 1 Swiss Federal Institute of Technology Zurich (ETHZ) Autonomous Systems Lab, CLA-E, Tannenstrasse

More information

Catadioptric camera model with conic mirror

Catadioptric camera model with conic mirror LÓPEZ-NICOLÁS, SAGÜÉS: CATADIOPTRIC CAMERA MODEL WITH CONIC MIRROR Catadioptric camera model with conic mirror G. López-Nicolás gonlopez@unizar.es C. Sagüés csagues@unizar.es Instituto de Investigación

More information

Vision Review: Image Formation. Course web page:

Vision Review: Image Formation. Course web page: Vision Review: Image Formation Course web page: www.cis.udel.edu/~cer/arv September 10, 2002 Announcements Lecture on Thursday will be about Matlab; next Tuesday will be Image Processing The dates some

More information

Stereo Image Rectification for Simple Panoramic Image Generation

Stereo Image Rectification for Simple Panoramic Image Generation Stereo Image Rectification for Simple Panoramic Image Generation Yun-Suk Kang and Yo-Sung Ho Gwangju Institute of Science and Technology (GIST) 261 Cheomdan-gwagiro, Buk-gu, Gwangju 500-712 Korea Email:{yunsuk,

More information

Equidistant Fish-Eye Calibration and Rectification by Vanishing Point Extraction

Equidistant Fish-Eye Calibration and Rectification by Vanishing Point Extraction IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL. 32, NO. X, XXXXXXX 2010 1 Equidistant Fish-Eye Calibration and Rectification by Vanishing Point Extraction Ciarán Hughes, Member, IEEE,

More information

Jump Stitch Metadata & Depth Maps Version 1.1

Jump Stitch Metadata & Depth Maps Version 1.1 Jump Stitch Metadata & Depth Maps Version 1.1 jump-help@google.com Contents 1. Introduction 1 2. Stitch Metadata File Format 2 3. Coverage Near the Poles 4 4. Coordinate Systems 6 5. Camera Model 6 6.

More information

DEVELOPMENT OF REAL TIME 3-D MEASUREMENT SYSTEM USING INTENSITY RATIO METHOD

DEVELOPMENT OF REAL TIME 3-D MEASUREMENT SYSTEM USING INTENSITY RATIO METHOD DEVELOPMENT OF REAL TIME 3-D MEASUREMENT SYSTEM USING INTENSITY RATIO METHOD Takeo MIYASAKA and Kazuo ARAKI Graduate School of Computer and Cognitive Sciences, Chukyo University, Japan miyasaka@grad.sccs.chukto-u.ac.jp,

More information

Occluded Facial Expression Tracking

Occluded Facial Expression Tracking Occluded Facial Expression Tracking Hugo Mercier 1, Julien Peyras 2, and Patrice Dalle 1 1 Institut de Recherche en Informatique de Toulouse 118, route de Narbonne, F-31062 Toulouse Cedex 9 2 Dipartimento

More information

Estimation of Camera Motion with Feature Flow Model for 3D Environment Modeling by Using Omni-Directional Camera

Estimation of Camera Motion with Feature Flow Model for 3D Environment Modeling by Using Omni-Directional Camera Estimation of Camera Motion with Feature Flow Model for 3D Environment Modeling by Using Omni-Directional Camera Ryosuke Kawanishi, Atsushi Yamashita and Toru Kaneko Abstract Map information is important

More information

Outline. ETN-FPI Training School on Plenoptic Sensing

Outline. ETN-FPI Training School on Plenoptic Sensing Outline Introduction Part I: Basics of Mathematical Optimization Linear Least Squares Nonlinear Optimization Part II: Basics of Computer Vision Camera Model Multi-Camera Model Multi-Camera Calibration

More information

Range Imaging Through Triangulation. Range Imaging Through Triangulation. Range Imaging Through Triangulation. Range Imaging Through Triangulation

Range Imaging Through Triangulation. Range Imaging Through Triangulation. Range Imaging Through Triangulation. Range Imaging Through Triangulation Obviously, this is a very slow process and not suitable for dynamic scenes. To speed things up, we can use a laser that projects a vertical line of light onto the scene. This laser rotates around its vertical

More information

REFINEMENT OF COLORED MOBILE MAPPING DATA USING INTENSITY IMAGES

REFINEMENT OF COLORED MOBILE MAPPING DATA USING INTENSITY IMAGES REFINEMENT OF COLORED MOBILE MAPPING DATA USING INTENSITY IMAGES T. Yamakawa a, K. Fukano a,r. Onodera a, H. Masuda a, * a Dept. of Mechanical Engineering and Intelligent Systems, The University of Electro-Communications,

More information

Perspective Projection Describes Image Formation Berthold K.P. Horn

Perspective Projection Describes Image Formation Berthold K.P. Horn Perspective Projection Describes Image Formation Berthold K.P. Horn Wheel Alignment: Camber, Caster, Toe-In, SAI, Camber: angle between axle and horizontal plane. Toe: angle between projection of axle

More information

ADS40 Calibration & Verification Process. Udo Tempelmann*, Ludger Hinsken**, Utz Recke*

ADS40 Calibration & Verification Process. Udo Tempelmann*, Ludger Hinsken**, Utz Recke* ADS40 Calibration & Verification Process Udo Tempelmann*, Ludger Hinsken**, Utz Recke* *Leica Geosystems GIS & Mapping GmbH, Switzerland **Ludger Hinsken, Author of ORIMA, Konstanz, Germany Keywords: ADS40,

More information

(Geometric) Camera Calibration

(Geometric) Camera Calibration (Geoetric) Caera Calibration CS635 Spring 217 Daniel G. Aliaga Departent of Coputer Science Purdue University Caera Calibration Caeras and CCDs Aberrations Perspective Projection Calibration Caeras First

More information

Accurate and Dense Wide-Baseline Stereo Matching Using SW-POC

Accurate and Dense Wide-Baseline Stereo Matching Using SW-POC Accurate and Dense Wide-Baseline Stereo Matching Using SW-POC Shuji Sakai, Koichi Ito, Takafumi Aoki Graduate School of Information Sciences, Tohoku University, Sendai, 980 8579, Japan Email: sakai@aoki.ecei.tohoku.ac.jp

More information

Centre for Digital Image Measurement and Analysis, School of Engineering, City University, Northampton Square, London, ECIV OHB

Centre for Digital Image Measurement and Analysis, School of Engineering, City University, Northampton Square, London, ECIV OHB HIGH ACCURACY 3-D MEASUREMENT USING MULTIPLE CAMERA VIEWS T.A. Clarke, T.J. Ellis, & S. Robson. High accuracy measurement of industrially produced objects is becoming increasingly important. The techniques

More information

DERIVING PEDESTRIAN POSITIONS FROM UNCALIBRATED VIDEOS

DERIVING PEDESTRIAN POSITIONS FROM UNCALIBRATED VIDEOS DERIVING PEDESTRIAN POSITIONS FROM UNCALIBRATED VIDEOS Zoltan Koppanyi, Post-Doctoral Researcher Charles K. Toth, Research Professor The Ohio State University 2046 Neil Ave Mall, Bolz Hall Columbus, OH,

More information

A COMPREHENSIVE SIMULATION SOFTWARE FOR TEACHING CAMERA CALIBRATION

A COMPREHENSIVE SIMULATION SOFTWARE FOR TEACHING CAMERA CALIBRATION XIX IMEKO World Congress Fundamental and Applied Metrology September 6 11, 2009, Lisbon, Portugal A COMPREHENSIVE SIMULATION SOFTWARE FOR TEACHING CAMERA CALIBRATION David Samper 1, Jorge Santolaria 1,

More information

Camera Models and Image Formation. Srikumar Ramalingam School of Computing University of Utah

Camera Models and Image Formation. Srikumar Ramalingam School of Computing University of Utah Camera Models and Image Formation Srikumar Ramalingam School of Computing University of Utah srikumar@cs.utah.edu VisualFunHouse.com 3D Street Art Image courtesy: Julian Beaver (VisualFunHouse.com) 3D

More information

Partial Calibration and Mirror Shape Recovery for Non-Central Catadioptric Systems

Partial Calibration and Mirror Shape Recovery for Non-Central Catadioptric Systems Partial Calibration and Mirror Shape Recovery for Non-Central Catadioptric Systems Abstract In this paper we present a method for mirror shape recovery and partial calibration for non-central catadioptric

More information

CIS 580, Machine Perception, Spring 2015 Homework 1 Due: :59AM

CIS 580, Machine Perception, Spring 2015 Homework 1 Due: :59AM CIS 580, Machine Perception, Spring 2015 Homework 1 Due: 2015.02.09. 11:59AM Instructions. Submit your answers in PDF form to Canvas. This is an individual assignment. 1 Camera Model, Focal Length and

More information

METRIC PLANE RECTIFICATION USING SYMMETRIC VANISHING POINTS

METRIC PLANE RECTIFICATION USING SYMMETRIC VANISHING POINTS METRIC PLANE RECTIFICATION USING SYMMETRIC VANISHING POINTS M. Lefler, H. Hel-Or Dept. of CS, University of Haifa, Israel Y. Hel-Or School of CS, IDC, Herzliya, Israel ABSTRACT Video analysis often requires

More information

A Study on the Distortion Correction Methodology of Vision Sensor

A Study on the Distortion Correction Methodology of Vision Sensor , July 2-4, 2014, London, U.K. A Study on the Distortion Correction Methodology of Vision Sensor Younghoon Kho, Yongjin (James) Kwon 1 Abstract This study investigates a simple and effective vision calibration

More information

EE368 Project: Visual Code Marker Detection

EE368 Project: Visual Code Marker Detection EE368 Project: Visual Code Marker Detection Kahye Song Group Number: 42 Email: kahye@stanford.edu Abstract A visual marker detection algorithm has been implemented and tested with twelve training images.

More information

Expanding gait identification methods from straight to curved trajectories

Expanding gait identification methods from straight to curved trajectories Expanding gait identification methods from straight to curved trajectories Yumi Iwashita, Ryo Kurazume Kyushu University 744 Motooka Nishi-ku Fukuoka, Japan yumi@ieee.org Abstract Conventional methods

More information

ECE Digital Image Processing and Introduction to Computer Vision. Outline

ECE Digital Image Processing and Introduction to Computer Vision. Outline ECE592-064 Digital Image Processing and Introduction to Computer Vision Depart. of ECE, NC State University Instructor: Tianfu (Matt) Wu Spring 2017 1. Recap Outline 2. Modeling Projection and Projection

More information

Realtime Omnidirectional Stereo for Obstacle Detection and Tracking in Dynamic Environments

Realtime Omnidirectional Stereo for Obstacle Detection and Tracking in Dynamic Environments Proc. 2001 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems pp. 31-36, Maui, Hawaii, Oct./Nov. 2001. Realtime Omnidirectional Stereo for Obstacle Detection and Tracking in Dynamic Environments Hiroshi

More information

Dense 3-D Reconstruction of an Outdoor Scene by Hundreds-baseline Stereo Using a Hand-held Video Camera

Dense 3-D Reconstruction of an Outdoor Scene by Hundreds-baseline Stereo Using a Hand-held Video Camera Dense 3-D Reconstruction of an Outdoor Scene by Hundreds-baseline Stereo Using a Hand-held Video Camera Tomokazu Satoy, Masayuki Kanbaray, Naokazu Yokoyay and Haruo Takemuraz ygraduate School of Information

More information

calibrated coordinates Linear transformation pixel coordinates

calibrated coordinates Linear transformation pixel coordinates 1 calibrated coordinates Linear transformation pixel coordinates 2 Calibration with a rig Uncalibrated epipolar geometry Ambiguities in image formation Stratified reconstruction Autocalibration with partial

More information

Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene

Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene Buyuksalih, G.*, Oruc, M.*, Topan, H.*,.*, Jacobsen, K.** * Karaelmas University Zonguldak, Turkey **University

More information

Fully Automatic Endoscope Calibration for Intraoperative Use

Fully Automatic Endoscope Calibration for Intraoperative Use Fully Automatic Endoscope Calibration for Intraoperative Use Christian Wengert, Mireille Reeff, Philippe C. Cattin, Gábor Székely Computer Vision Laboratory, ETH Zurich, 8092 Zurich, Switzerland {wengert,

More information

Projector Calibration for Pattern Projection Systems

Projector Calibration for Pattern Projection Systems Projector Calibration for Pattern Projection Systems I. Din *1, H. Anwar 2, I. Syed 1, H. Zafar 3, L. Hasan 3 1 Department of Electronics Engineering, Incheon National University, Incheon, South Korea.

More information

Tracking Under Low-light Conditions Using Background Subtraction

Tracking Under Low-light Conditions Using Background Subtraction Tracking Under Low-light Conditions Using Background Subtraction Matthew Bennink Clemson University Clemson, South Carolina Abstract A low-light tracking system was developed using background subtraction.

More information

A Low Power, High Throughput, Fully Event-Based Stereo System: Supplementary Documentation

A Low Power, High Throughput, Fully Event-Based Stereo System: Supplementary Documentation A Low Power, High Throughput, Fully Event-Based Stereo System: Supplementary Documentation Alexander Andreopoulos, Hirak J. Kashyap, Tapan K. Nayak, Arnon Amir, Myron D. Flickner IBM Research March 25,

More information

Mirror Based Framework for Human Body Measurement

Mirror Based Framework for Human Body Measurement 362 Mirror Based Framework for Human Body Measurement 1 Takeshi Hashimoto, 2 Takayuki Suzuki, 3 András Rövid 1 Dept. of Electrical and Electronics Engineering, Shizuoka University 5-1, 3-chome Johoku,

More information

Camera Parameters Estimation from Hand-labelled Sun Sositions in Image Sequences

Camera Parameters Estimation from Hand-labelled Sun Sositions in Image Sequences Camera Parameters Estimation from Hand-labelled Sun Sositions in Image Sequences Jean-François Lalonde, Srinivasa G. Narasimhan and Alexei A. Efros {jlalonde,srinivas,efros}@cs.cmu.edu CMU-RI-TR-8-32 July

More information

CALIBRATION BETWEEN DEPTH AND COLOR SENSORS FOR COMMODITY DEPTH CAMERAS. Cha Zhang and Zhengyou Zhang

CALIBRATION BETWEEN DEPTH AND COLOR SENSORS FOR COMMODITY DEPTH CAMERAS. Cha Zhang and Zhengyou Zhang CALIBRATION BETWEEN DEPTH AND COLOR SENSORS FOR COMMODITY DEPTH CAMERAS Cha Zhang and Zhengyou Zhang Communication and Collaboration Systems Group, Microsoft Research {chazhang, zhang}@microsoft.com ABSTRACT

More information

Calibration of a rotating multi-beam Lidar

Calibration of a rotating multi-beam Lidar The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Calibration of a rotating multi-beam Lidar Naveed Muhammad 1,2 and Simon Lacroix 1,2 Abstract

More information

PANORAMA-BASED CAMERA CALIBRATION

PANORAMA-BASED CAMERA CALIBRATION PANORAMA-BASED CAMERA CALIBRATION Bertrand Cannelle, Nicolas Paparoditis, Olivier Tournaire Université Paris-Est, Institut Géographique National, MATIS Laboratory 73 Avenue de Paris, 94165 Saint-Mandé

More information

Calibration of a fish eye lens with field of view larger than 180

Calibration of a fish eye lens with field of view larger than 180 CENTER FOR MACHINE PERCEPTION CZECH TECHNICAL UNIVERSITY Calibration of a fish eye lens with field of view larger than 18 Hynek Bakstein and Tomáš Pajdla {bakstein, pajdla}@cmp.felk.cvut.cz REPRINT Hynek

More information

3D-OBJECT DETECTION METHOD BASED ON THE STEREO IMAGE TRANSFORMATION TO THE COMMON OBSERVATION POINT

3D-OBJECT DETECTION METHOD BASED ON THE STEREO IMAGE TRANSFORMATION TO THE COMMON OBSERVATION POINT 3D-OBJECT DETECTION METHOD BASED ON THE STEREO IMAGE TRANSFORMATION TO THE COMMON OBSERVATION POINT V. M. Lisitsyn *, S. V. Tikhonova ** State Research Institute of Aviation Systems, Moscow, Russia * lvm@gosniias.msk.ru

More information

Evaluating Measurement Error of a 3D Movable Body Scanner for Calibration

Evaluating Measurement Error of a 3D Movable Body Scanner for Calibration Evaluating Measurement Error of a 3D Movable Body Scanner for Calibration YU-CHENG LIN Department of Industrial Engingeering and Management Overseas Chinese University No. 100 Chiaokwang Road, 408, Taichung

More information

Self-calibration of an On-Board Stereo-vision System for Driver Assistance Systems

Self-calibration of an On-Board Stereo-vision System for Driver Assistance Systems 4-1 Self-calibration of an On-Board Stereo-vision System for Driver Assistance Systems Juan M. Collado, Cristina Hilario, Arturo de la Escalera and Jose M. Armingol Intelligent Systems Lab Department of

More information

A Simple Interface for Mobile Robot Equipped with Single Camera using Motion Stereo Vision

A Simple Interface for Mobile Robot Equipped with Single Camera using Motion Stereo Vision A Simple Interface for Mobile Robot Equipped with Single Camera using Motion Stereo Vision Stephen Karungaru, Atsushi Ishitani, Takuya Shiraishi, and Minoru Fukumi Abstract Recently, robot technology has

More information

Improving Vision-Based Distance Measurements using Reference Objects

Improving Vision-Based Distance Measurements using Reference Objects Improving Vision-Based Distance Measurements using Reference Objects Matthias Jüngel, Heinrich Mellmann, and Michael Spranger Humboldt-Universität zu Berlin, Künstliche Intelligenz Unter den Linden 6,

More information

A motion planning method for mobile robot considering rotational motion in area coverage task

A motion planning method for mobile robot considering rotational motion in area coverage task Asia Pacific Conference on Robot IoT System Development and Platform 018 (APRIS018) A motion planning method for mobile robot considering rotational motion in area coverage task Yano Taiki 1,a) Takase

More information

Automatic free parking space detection by using motion stereo-based 3D reconstruction

Automatic free parking space detection by using motion stereo-based 3D reconstruction Automatic free parking space detection by using motion stereo-based 3D reconstruction Computer Vision Lab. Jae Kyu Suhr Contents Introduction Fisheye lens calibration Corresponding point extraction 3D

More information

Photometric Stereo with Auto-Radiometric Calibration

Photometric Stereo with Auto-Radiometric Calibration Photometric Stereo with Auto-Radiometric Calibration Wiennat Mongkulmann Takahiro Okabe Yoichi Sato Institute of Industrial Science, The University of Tokyo {wiennat,takahiro,ysato} @iis.u-tokyo.ac.jp

More information

3D DEFORMATION MEASUREMENT USING STEREO- CORRELATION APPLIED TO EXPERIMENTAL MECHANICS

3D DEFORMATION MEASUREMENT USING STEREO- CORRELATION APPLIED TO EXPERIMENTAL MECHANICS 3D DEFORMATION MEASUREMENT USING STEREO- CORRELATION APPLIED TO EXPERIMENTAL MECHANICS Dorian Garcia, Jean-José Orteu École des Mines d Albi, F-81013 ALBI CT Cedex 09, France Dorian.Garcia@enstimac.fr,

More information

NAME VCamera camera model representation

NAME VCamera camera model representation NAME VCamera camera model representation SYNOPSIS #include void VRegisterCameraType (void); extern VRepnKind VCameraRepn; VCamera camera; ld... -lvcam -lvista -lm -lgcc... DESCRIPTION

More information

Homogeneous Coordinates. Lecture18: Camera Models. Representation of Line and Point in 2D. Cross Product. Overall scaling is NOT important.

Homogeneous Coordinates. Lecture18: Camera Models. Representation of Line and Point in 2D. Cross Product. Overall scaling is NOT important. Homogeneous Coordinates Overall scaling is NOT important. CSED44:Introduction to Computer Vision (207F) Lecture8: Camera Models Bohyung Han CSE, POSTECH bhhan@postech.ac.kr (",, ) ()", ), )) ) 0 It is

More information

Omni Stereo Vision of Cooperative Mobile Robots

Omni Stereo Vision of Cooperative Mobile Robots Omni Stereo Vision of Cooperative Mobile Robots Zhigang Zhu*, Jizhong Xiao** *Department of Computer Science **Department of Electrical Engineering The City College of the City University of New York (CUNY)

More information

Intrinsic and Extrinsic Camera Parameter Estimation with Zoomable Camera for Augmented Reality

Intrinsic and Extrinsic Camera Parameter Estimation with Zoomable Camera for Augmented Reality Intrinsic and Extrinsic Camera Parameter Estimation with Zoomable Camera for Augmented Reality Kazuya Okada, Takafumi Taketomi, Goshiro Yamamoto, Jun Miyazaki, Hirokazu Kato Nara Institute of Science and

More information

Miniature faking. In close-up photo, the depth of field is limited.

Miniature faking. In close-up photo, the depth of field is limited. Miniature faking In close-up photo, the depth of field is limited. http://en.wikipedia.org/wiki/file:jodhpur_tilt_shift.jpg Miniature faking Miniature faking http://en.wikipedia.org/wiki/file:oregon_state_beavers_tilt-shift_miniature_greg_keene.jpg

More information

Optic Flow and Basics Towards Horn-Schunck 1

Optic Flow and Basics Towards Horn-Schunck 1 Optic Flow and Basics Towards Horn-Schunck 1 Lecture 7 See Section 4.1 and Beginning of 4.2 in Reinhard Klette: Concise Computer Vision Springer-Verlag, London, 2014 1 See last slide for copyright information.

More information

INDOOR PTZ CAMERA CALIBRATION WITH CONCURRENT PT AXES

INDOOR PTZ CAMERA CALIBRATION WITH CONCURRENT PT AXES INDOOR PTZ CAMERA CALIBRATION WITH CONCURRENT PT AXES Jordi Sanchez-Riera, Jordi Salvador, Josep R. Casas Image Processing Group, UPC Technical University of Catalonia Jordi Girona 1-3, edifici D5, 08034

More information

(a) (b) (c) Fig. 1. Omnidirectional camera: (a) principle; (b) physical construction; (c) captured. of a local vision system is more challenging than

(a) (b) (c) Fig. 1. Omnidirectional camera: (a) principle; (b) physical construction; (c) captured. of a local vision system is more challenging than An Omnidirectional Vision System that finds and tracks color edges and blobs Felix v. Hundelshausen, Sven Behnke, and Raul Rojas Freie Universität Berlin, Institut für Informatik Takustr. 9, 14195 Berlin,

More information