Calibration of a Different Field-of-view Stereo Camera System using an Embedded Checkerboard Pattern


Pathum Rathnayaka, Seung-Hae Baek and Soon-Yong Park
School of Computer Science and Engineering, Kyungpook National University, Daegu, South Korea
bandarapathum@yahoo.com, eardrops@naver.com, sypark@knu.ac.kr

Keywords: Heterogeneous Camera System, Field-of-view, Stereo, Camera Calibration, Relative Pose Estimation, Embedded Checkerboard Pattern, Image Rectification.

Abstract: Knowing the correct relative pose between cameras is the first and most important step in a stereo camera system, and it has been of interest in many computer vision experiments. Much work has addressed stereo systems with largely common fields-of-view, while a few advanced feature-point-based methods have been presented for systems whose fields-of-view only partially overlap. In this paper, we propose a new yet simple method to calibrate a partially overlapping field-of-view heterogeneous stereo camera system using a specially designed embedded planar checkerboard pattern. The embedded pattern is a combination of two differently colored planar patterns with different checker sizes. The heterogeneous camera system comprises a lower focal length wide-angle camera and a higher focal length conventional narrow-angle camera. The relative pose between the cameras is calculated by multiplying transformation matrices. The proposed method is a practical alternative to many advanced feature-based techniques. We demonstrate its robustness through re-projection errors and by comparing Y-axis point differences in image rectification results.

1 INTRODUCTION AND PREVIOUS WORK

Camera calibration has been of interest in the computer vision field for many years. Many advanced methods have been introduced, not only for mono camera calibration but also for stereo camera setups.
A majority of these stereo calibration methods rely on the cameras having largely common or overlapping fields-of-view; conventional stereo cameras with similar specifications, such as identical focal lengths, are used to fulfill this criterion. Meanwhile, wide-angle lenses such as fish-eye cameras have grown in popularity because they offer wider fields-of-view than conventional cameras. Barreto and Daniilidis proposed a factorization approach without non-linear minimization to estimate the relative pose between multiple wide field-of-view cameras (Barreto and Daniilidis, 2004). As mentioned in (Micusik and Pajdla, 2006), a RANdom SAmple Consensus (RANSAC (Fischler and Bolles, 1981)) based polynomial eigenvalue method was introduced to estimate the relative pose of wide field-of-view cameras. Lhuillier presented a similar approach (Lhuillier, 2008) to the method in (Micusik and Pajdla, 2006), where a central model is first applied to estimate the camera geometry, followed by a non-central model and an orientation-translation decoupling to identify the transformation. The antipodal epipolar constraint was used in the method of Lim et al. (Lim et al., 2010) to estimate the pose of such wide field-of-view cameras. In addition, some optical flow estimation approaches have been introduced, as cited in (Scaramuzza et al., 2006). The drawback of these approaches is that they depend heavily on point features. Reliable detection of these sensitive features is hard because of the irregular resolutions and lens distortions typical of wide field-of-view cameras. Once such wide field-of-view cameras are combined with conventional narrow field-of-view cameras, estimating the relative pose from rich feature points is no longer feasible.
In this paper, we present a flexible stereo calibration method for a heterogeneous stereo camera system with different, partially overlapping fields-of-view. The system consists of a lower focal length wide-angle (fish-eye lens) camera and a higher focal length narrow-angle camera. As in conventional camera calibration methods (Zhang, 2000; Wei and Ma, 1993; Yu and Wang, 2006; Kwon et al., 2007), our proposed method requires the cameras to observe a planar checkerboard pattern in different orientations. Since the fields-of-view of the two cameras are different, the conventional black-white checkerboard pattern is not usable, so we use a specially designed embedded checkerboard pattern.

The structure of the paper is as follows. Section 2 covers the preliminaries of the proposed method, including the formation of the embedded planar checkerboard pattern. Section 3 presents the proposed camera calibration method. In Section 4, we describe validation experiments using real image sequences taken in different scenarios and compare rectification results. Conclusions and future work are drawn in Section 5.

[Rathnayaka P., Baek S. and Park S. Calibration of a Different Field-of-view Stereo Camera System using an Embedded Checkerboard Pattern. DOI: 10.5220/0006267802940300. In Proceedings of the 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017), pages 294-300. ISBN: 978-989-758-225-7. Copyright 2017 by SCITEPRESS - Science and Technology Publications, Lda.]

Figure 1: Example of how a conventional checkerboard pattern is seen by two cameras with different fields-of-view, shown on a real image sequence. The left camera is the wide field-of-view fish-eye lens camera; the right camera is the narrow field-of-view conventional camera. Since the fields-of-view differ, the visible areas of the checkerboard pattern also differ.

Figure 2: Specially designed embedded planar checkerboard pattern, a combination of two differently colored planar patterns. The outer pattern consists of red-blue checkers, while the inner pattern consists of red-yellow-blue-cyan checkers. The checker size ratio between the outer and inner patterns is 2:1.

Figure 3: Generation of the embedded planar checkerboard pattern. A 7x10 outer checkerboard pattern with red-blue checkers is combined with a 6x8 inner checkerboard pattern with black-green checkers; mixing the outer and inner checkerboards is what makes the different colors appear in the inner checkerboard pattern.

2 PRELIMINARIES

The conventional way of presenting a black-white planar checkerboard pattern is shown in Figure 1. Because the stereo cameras have different fields-of-view, finding common areas of the checkerboard pattern in the image sequences is difficult, and if the pattern images are captured very close to the cameras, many straightforward stereo calibration systems become nearly unusable. We therefore developed an embedded checkerboard pattern by combining two planar checkerboard patterns of different sizes (Figure 2). The embedded pattern consists of a 7x10 outer checkerboard pattern (colored in red-blue checkers).

VISAPP 2017 - International Conference on Computer Vision Theory and Applications

The inner pattern is a 6x8 checkerboard (colored in red-yellow-blue-cyan), and the ratio between checker sizes in the inner and outer patterns is 1:2. Figure 3 shows how this embedded checkerboard pattern is created. The main reason for using such an embedded pattern is that, even though the fields-of-view of the two cameras differ, independent but consistently complete image sequences (covering the full area of a checker pattern) can be captured from each camera. This is shown graphically in Figure 4.

Figure 4: Two cameras with different fields-of-view see the embedded color checkerboard pattern. Left: the wide field-of-view fish-eye lens camera; right: the narrow field-of-view conventional camera. Even though the view ranges differ, a full independent checkerboard pattern is visible to each camera in continuous image sequences. Although the right image covers only a part of the left image, the whole inner checkerboard pattern is still visible; this is the idea behind using two different checkerboard patterns.

3 THE STEREO CAMERA CALIBRATION METHOD

The new stereo camera calibration method is described in the following sections. We first perform individual mono calibrations of both cameras, calculating camera matrices and distortion coefficients. We then generate undistorted images using these parameters and provide the undistorted image sequences as input to our stereo calibration method.

3.1 Experiment Setup

The experiment setup is depicted in Figure 5. The heterogeneous cameras are mounted on either side of a panning bar. For outdoor experiments, we firmly attach this setup to the front mirror of a general-purpose vehicle. The left camera is the wide field-of-view fish-eye lens camera (focal length = 3.5 mm); the right camera is the conventional narrow field-of-view camera (focal length = 8 mm). Images are captured at a resolution of 1280x1440.

Figure 5: The heterogeneous camera system used in our proposed method. The left camera is the wide field-of-view fish-eye lens camera (focal length = 3.5 mm); the right camera is the narrow field-of-view conventional camera (focal length = 8 mm). Both cameras are mounted on a steel panning bar.

3.2 Individual Mono Camera Calibration

We start our proposed stereo camera calibration by performing individual mono camera calibrations. These are done separately to generate undistorted images from the distorted input sequences. As in conventional methods, we show the embedded checkerboard pattern to both cameras and capture images in different orientations separately; simultaneous acquisition of good calibration images at close range is infeasible because of the different view angles of the cameras (as described in Section 2). After capturing the image sequences, we split them into their respective R, G, and B channels. The fish-eye camera sees both the outer pattern and the inner pattern, while the narrow-angle camera sees the full inner pattern and a partial area of the outer pattern.
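The channel-splitting step can be sketched as follows. This is a minimal illustration, not the authors' code: the function name and toy arrays are ours, and BGR channel order is assumed, matching OpenCV's convention. Extracting the R channel leaves only the red-blue outer pattern as a high-contrast board; extracting and inverting the G channel leaves only the inner pattern.

```python
import numpy as np

def separate_patterns(left_bgr, right_bgr):
    """Separate outer/inner patterns by channel splitting (illustrative sketch).

    Assumes H x W x 3 uint8 arrays in B, G, R order (OpenCV convention).
    """
    # Left (wide-angle) image: the red channel renders the red-blue outer
    # pattern as a black-white checkerboard usable by a corner detector.
    left_input = left_bgr[:, :, 2]
    # Right (narrow-angle) image: the green channel is extracted and
    # inverted so that only the inner pattern remains.
    right_input = 255 - right_bgr[:, :, 1]
    return left_input, right_input

# Toy 2 x 2 x 3 images just to exercise the function.
left = np.zeros((2, 2, 3), dtype=np.uint8)
left[..., 2] = 200                      # strong red channel
right = np.zeros((2, 2, 3), dtype=np.uint8)
right[..., 1] = 55                      # weak green channel
l_in, r_in = separate_patterns(left, right)
```

In a real pipeline the two single-channel images would then be passed to a standard checkerboard corner detector.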

Figure 6: Input images are split into R, G, and B channels to distinguish the outer pattern from the inner pattern. The wide field-of-view camera image contains both the outer and inner patterns, but only the outer pattern is wanted, so only the R channel is extracted and given as the left input image. The narrow field-of-view camera image sees the full inner pattern and a part of the outer pattern; since this camera should see only the inner pattern, the inverted G channel is extracted and given as the right input image.

Figure 7: The stereo camera calibration pipeline: split the RGB channels (R from the left input image; G, inverted, from the right input image), detect checkerboard corners, perform individual calibrations, and compute the respective extrinsics (the inverse of the left extrinsic; the right extrinsic after shifting the world coordinate), then perform stereo calibration. The method first performs individual camera calibrations, and the estimated extrinsic parameters are then used to calculate the relative pose between the two cameras.

In our method, we want the fish-eye camera to see only the outer pattern and the narrow-angle camera to see only the inner pattern. By extracting the R channel from the left input images, we obtain images that contain only the outer pattern (Figure 6a). Similarly, we extract the G channel from the right input images and invert it to obtain images that contain only the inner pattern (Figure 6b). Once the images are extracted, we perform individual calibrations according to Zhang's well-known camera calibration method.

3.3 Stereo Camera Calibration

After the individual calibrations of the cameras, we follow the simple steps in Figure 7 to perform stereo camera calibration. Like the individual calibrations, the stereo calibration also relies on Zhang's method. The first corner position of the outer pattern is taken as the origin of the world coordinate system. The origin of the inner pattern would normally be its own first corner position, but since the viewpoints of the two cameras differ, this origin is shifted to the origin of the outer pattern. This is depicted in Figure 8.

Figure 8: Coordinate systems of the two patterns, with example corner coordinates such as (0,0,0), (2A,2A,0), (3A,2A,0), (7A,3A,0), (8A,2A,0), and (10A,5A,0), where A denotes the outer checker size. The starting point of the inner pattern is shifted toward the starting point of the outer pattern; examples for the wide and narrow camera images are shown in the bottom figures.

Intrinsic calibration gives the transformation of each camera with respect to the world coordinate system. In stereo calibration, we want the transformation between the two cameras (the relative pose). Keeping the left wide-angle camera as the reference, we simply multiply two transformation matrices to obtain the relative pose between the two cameras.
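The origin shift of Figure 8 amounts to expressing the inner-pattern corners in the outer pattern's world frame before calibration. A minimal numpy sketch, where the checker size A and the (2A, 2A, 0) offset are illustrative values taken from the example coordinates in Figure 8, not measurements from the paper:

```python
import numpy as np

A = 30.0  # outer checker size in mm (illustrative value, not from the paper)

# Inner-pattern corner grid (a 6 x 8 board has 5 x 7 interior corners),
# expressed in the inner pattern's own frame; checker size is A/2,
# reflecting the 1:2 inner-to-outer size ratio.
ny, nx = 5, 7
inner = np.array([[x * A / 2, y * A / 2, 0.0]
                  for y in range(ny) for x in range(nx)])

# Shift every inner-pattern point so its coordinates are expressed in the
# outer pattern's world frame. The (2A, 2A, 0) offset is one of the example
# placements in Figure 8; the true offset depends on where the inner board
# sits on the outer board.
shift = np.array([2 * A, 2 * A, 0.0])
inner_world = inner + shift
```

These shifted object points are what the right-camera calibration would consume, so both cameras' extrinsics refer to the same world origin.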

Equations 1 to 4 show the process of stereo calibration.

    [R t]_LR = ([R t]_WL)^(-1) [R t]_WR                                 (1)

Each [R t] denotes a 4x4 homogeneous transformation matrix:

    | r11 r12 r13 t1 |
    | r21 r22 r23 t2 |                                                  (2)
    | r31 r32 r33 t3 |
    |  0   0   0   1 |

After the world origin of the inner pattern is shifted by (s_x, s_y, s_z), the translation column of the right-camera extrinsic matrix becomes:

    | r11 r12 r13  r11 s_x + r12 s_y + r13 s_z + t1 |
    | r21 r22 r23  r21 s_x + r22 s_y + r23 s_z + t2 |                   (3)
    | r31 r32 r33  r31 s_x + r32 s_y + r33 s_z + t3 |
    |  0   0   0                  1                 |

that is, the shifted translation combines the columns of R:

    t' = s_x (r11, r21, r31)^T + s_y (r12, r22, r32)^T + s_z (r13, r23, r33)^T + (t1, t2, t3)^T    (4)

4 EXPERIMENT RESULTS

We performed two different experiments to evaluate the accuracy of the proposed calibration technique. Image sequences were captured in outdoor environments using the system setup shown in Figure 5. Additionally, we performed image rectifications (Papadimitriou and Dennis, 1996; Loop and Zhang, 1999; Fusiello et al., 2000) to evaluate the robustness of the method. Table 1 summarizes the individual calibration results of the cameras; the average pixel errors for both cameras indicate successful individual calibrations. Tables 2 and 3 summarize the stereo camera calibration results for the two experiment setups, together with the re-projection errors, which affirm the robustness of the proposed method. Figure 9a shows an example rectification result produced with our proposed method, while Figure 9b shows a rectification canvas image produced with calibration parameters obtained from a conventional black-white calibration pattern. The pixel-wise error comparison between the two results affirms that the proposed calibration method is a proper alternative to conventional calibration methods. Additional rectification results are shown in Figure 10; the green lines depict the epipolar lines. The average Y-value difference over 4 rectified images was about 1-2 pixels with our method, versus around 2.5-4 pixels with the conventional method.
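Equations 1 and 2 can be sketched with plain numpy. The helper names are ours, and the toy rotation and translation values are illustrative only; they are not the calibration results reported in the tables.

```python
import numpy as np

def to_hom(R, t):
    """Stack [R t] into a 4x4 homogeneous transform, as in Eq. 2."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(R_wl, t_wl, R_wr, t_wr):
    """Eq. 1: [R t]_LR = ([R t]_WL)^-1 [R t]_WR, left camera as reference."""
    return np.linalg.inv(to_hom(R_wl, t_wl)) @ to_hom(R_wr, t_wr)

# Toy check: two axis-aligned cameras, the right one translated 156 mm
# along -X (a magnitude comparable to T_x in Tables 2 and 3).
I = np.eye(3)
T_lr = relative_pose(I, np.zeros(3), I, np.array([-156.0, 0.0, 0.0]))
```

In practice the two [R t]_W* matrices would come from the individual calibrations against the shared world origin described in Section 3.3.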
These experiment results affirm that the proposed method is a good alternative to existing calibration methods when the fields-of-view are different. The rectification comparison results are summarized in Table 4.

Table 1: Intrinsic camera parameters.

                 Left camera    Right camera
    f_x          1502.4332      2299.2762
    f_y          1512.7735      2319.0263
    c_x          1306.2153      1289.8025
    c_y          707.3376       777.7190
    Average err  0.4268

Table 2: Extrinsic camera parameters for image set 1 (after optimization of intrinsic parameters).

                 Left camera    Right camera
    f_x          1495.7756      2299.7590
    f_y          1508.5335      2313.1445
    R_x          -0.02281
    R_y          -0.0709
    R_z          -0.0227
    T_x          -156.2325
    T_y          0.2552
    T_z          12.9685
    Reprojection err  0.5501

Table 3: Extrinsic camera parameters for image set 2 (after optimization of intrinsic parameters).

                 Left camera    Right camera
    f_x          1560.6922      2379.7323
    f_y          1562.5243      2386.7573
    R_x          0.0222
    R_y          0.0025
    R_z          -0.0219
    T_x          -158.6004
    T_y          -7.3310
    T_z          -4.4767
    Reprojection err  0.6141

Table 4: Rectification result comparisons (average Y difference in pixels) between the conventional and proposed methods.

                 Conventional   Proposed
    Set 1        2.55957        1.61847
    Set 2        2.66457        1.74390
    Set 3        3.25659        1.90601
    Set 4        3.44579        2.03619
    Average err  2.98161        1.82614
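The Y-axis comparison behind Table 4 reduces to averaging the scanline difference of matched corners in a rectified pair: after a perfect rectification, corresponding points share a scanline, so the value approaches zero. A small sketch with hypothetical point arrays (the coordinates are made up for illustration):

```python
import numpy as np

def mean_y_difference(pts_left, pts_right):
    """Average |y_L - y_R| over matched point pairs in a rectified stereo
    pair. Inputs are N x 2 arrays of (x, y) pixel coordinates with row i of
    each array holding a corresponding pair."""
    return float(np.mean(np.abs(pts_left[:, 1] - pts_right[:, 1])))

# Hypothetical matched corners in a rectified pair; y-values nearly agree.
left_pts = np.array([[10.0, 100.0], [52.0, 140.5]])
right_pts = np.array([[200.0, 101.5], [240.0, 141.0]])
err = mean_y_difference(left_pts, right_pts)  # (1.5 + 0.5) / 2 = 1.0 px
```

Applied to detected checkerboard corners in each rectified image set, this yields per-set averages like those reported in Table 4.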

Figure 9: Comparison of stereo rectification results. (a) Rectification performed using the stereo calibration results of our proposed method; epipolar lines are drawn to assess the accuracy of the rectification, and a magnified part of the rectification canvas shows that corner locations lie on the same epipolar line. (b) Rectification performed using stereo calibration results obtained with a conventional black-white checkerboard pattern; the magnified area shows a 2-3 pixel error in the rectified result.

Figure 10: Examples of rectification results created using the proposed stereo camera calibration parameters. Epipolar lines pass through corresponding point locations with considerably low error, affirming the robustness of the proposed stereo calibration method.

5 CONCLUSIONS

This paper presented an easy-to-use stereo camera calibration method for a heterogeneous camera system consisting of a wide-angle camera and a narrow-angle camera with different, partially overlapping fields-of-view. Because of the differences in the cameras' viewing angles, conventional stereo camera calibration methods could not be applied: some require the cameras to see a black-white checkerboard pattern, while others depend heavily on extracting very sensitive corresponding feature points, and stereo calibration based on these methods becomes infeasible for a heterogeneous system. To overcome the different field-of-view problem, we developed a special checkerboard pattern by combining two differently colored planar checkerboards. We defined the large pattern as the outer pattern and the small pattern overlaid on it as the inner pattern. The wide-angle camera sees both the inner and outer patterns, while the narrow-angle camera sees the full area of the inner pattern (with a partial area of the outer pattern). We used a basic channel-splitting method to distinguish the inner pattern area from the outer pattern area, extracting the R channel from the left image and the G channel from the right image. The relative pose between the two cameras is calculated from the transformation matrices of the left and right cameras with respect to the checkerboard pattern. We evaluated the accuracy of the proposed calibration method through stereo rectification experiments; the average re-projection error and rectification result comparisons affirmed the robustness of our method.
ACKNOWLEDGMENTS

This work was jointly supported by the Technology Innovation Program funded by the Ministry of Trade, Industry and Energy of the Republic of Korea (10052982), the Civil Military Technology Cooperation Center, and the Korea Atomic Energy Research Institute (KAERI).

REFERENCES

Barreto, J. and Daniilidis, K. (2004). Wide area multiple camera calibration and estimation of radial distortion. In Proceedings of the 5th Workshop on Omnidirectional Vision, Camera Networks and Non-Classical Cameras, Prague, Czech Republic.

Fischler, M. A. and Bolles, R. C. (1981). Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6):381-395.

Fusiello, A., Trucco, E., and Verri, A. (2000). A compact algorithm for rectification of stereo pairs. Machine Vision and Applications, 12(1):16-22.

Kwon, H., Park, J., and Kak, A. C. (2007). A new approach for active stereo camera calibration. In Proceedings 2007 IEEE International Conference on Robotics and Automation, pages 3180-3185. IEEE.

Lhuillier, M. (2008). Automatic scene structure and camera motion using a catadioptric system. Computer Vision and Image Understanding, 109(2):186-203.

Lim, J., Barnes, N., and Li, H. (2010). Estimating relative camera motion from the antipodal-epipolar constraint. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(10):1907-1914.

Loop, C. and Zhang, Z. (1999). Computing rectifying homographies for stereo vision. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition, volume 1. IEEE.

Micusik, B. and Pajdla, T. (2006). Structure from motion with wide circular field of view cameras. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(7):1135-1149.

Papadimitriou, D. V. and Dennis, T. J. (1996). Epipolar line estimation and rectification for stereo image pairs. IEEE Transactions on Image Processing, 5(4):672-676.

Scaramuzza, D., Martinelli, A., and Siegwart, R. (2006). A toolbox for easily calibrating omnidirectional cameras. In 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 5695-5701. IEEE.

Wei, G.-Q. and Ma, S. (1993). A complete two-plane camera calibration method and experimental comparisons. In Fourth International Conference on Computer Vision, pages 439-446. IEEE.

Yu, H. and Wang, Y. (2006). An improved self-calibration method for active stereo camera. In Proceedings of the World Congress on Intelligent Control and Automation (WCICA), volume 1, pages 5186-5190.

Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330-1334.