Geometric Modeling and Calibration of Planar Multi-Projector Displays Using Rational Bezier Patches


Ezekiel S. Bhasker
Department of Computer Science, University of California, Irvine
ebhasker@ics.uci.edu

Aditi Majumder
Department of Computer Science, University of California, Irvine
majumder@ics.uci.edu

Abstract

In order to achieve seamless imagery in a planar multi-projector display, geometric distortions and misalignment of images within and across projectors have to be removed. Camera-based calibration methods are popular for achieving this in an automated fashion. Previous methods for geometric calibration fall into two categories: (a) Methods that model the geometric function relating the projectors to cameras using simple linear models, like homographies, to calibrate the display. These models assume perfect linear devices and cannot address projector non-linearities, like lens distortions, which are common in most commodity projectors. (b) Methods that use piecewise linear approximations to model the relationship between projectors and cameras. These require a dense sampling of the function space to achieve good calibration. In this paper, we present a new closed-form model that relates projectors to cameras in planar multi-projector displays, using rational Bezier patches. This model overcomes the shortcomings of the previous methods by allowing for projectors with significant lens distortion. It can be further used to develop an efficient and accurate geometric calibration method with a sparse sampling of the function.

1. Introduction

Multi-projector displays are effective tools in several applications like scientific visualization, virtual reality and entertainment, offering life-size scale and fine detail simultaneously. Since different parts of the image in such a display come from different physical devices, they suffer from geometric and photometric mismatch.
Camera-based calibration techniques use a centralized camera to observe all these different devices and calibrate them geometrically and photometrically to achieve one seamless image [31, 5, 6, 10, 14, 15, 16, 18, 19, 20, 25, 24, 26, 30, 17, 23, 22]. Such camera-based approaches are very popular due to their automation and the support they provide for flexible/casual projector arrangement, resulting in scalable and reconfigurable displays. This paper focuses on the geometric calibration problem for planar displays in a centralized framework where a centralized server controls both the projectors and the camera.

1.1. Related Work

Geometric calibration of planar multi-projector displays entails reconstructing two functions: (a) the function that relates the individual projector coordinates to the camera coordinates; and (b) the function that relates the camera coordinates to the global screen coordinates. So far, geometric calibration techniques devised for such displays fall into the following categories.

Linear methods assume linear models for cameras and projectors. Hence, they relate the projector, camera, and screen coordinates by linear matrices called homographies [28, 5]. However, most multi-projector displays today use commodity projectors that exhibit significant geometric non-linearities. Lens distortion is common in projectors, especially ones with short throw distance. These are commonly used in rear-projection systems to reduce the space requirement behind the screen. Such distortions can be present even when projectors are set to their ideal zoom level. Linear methods do not account for these non-linearities, and they cannot achieve accurate alignment in the presence of lens distortions.

Piecewise linear methods address non-linearities by using a piecewise linear function, essentially a triangulation, to relate the projector, camera and screen coordinates.
Though this achieves reasonable geometric accuracy, a dense triangulation is required to sample the parameter space adequately, especially for displays whose projectors have high lens distortion [30]. The only work that uses a non-linear method is [10], where a closed-form cubic function handles some of the non-linearities. However, this work is not motivated by
1-4244-1180-7/07/$25.00 © 2007 IEEE

an accurate analysis and representation of the kind of non-linearities seen in projectors, and hence a very dense sampling of the function space is still required, even more than for piecewise linear methods. Further, since a cubic polynomial is not perspective-projection invariant, this method assumes a near-rectangular array where keystoning effects of the projectors are negligible. This occurs only in a rear-projection system where the projectors sit on a rack behind the screen, nearly perpendicular to it. Front-projection systems, however, show severe keystoning, since projectors are normally mounted on the ceiling and project obliquely onto the screen.

1.2. Main Contributions

In this paper, we present a new model for relating the projector to the camera and screen coordinates in a planar multi-projector display using a rational Bezier patch (Section 2). We show that the projector radial distortion can be modeled compactly with an affine-invariant non-rational Bezier patch. In a projector-camera setting on a planar display, this function is composed with a homography that relates the projector to the camera and the screen linearly. We show that the combination of these non-linear and linear functions can be modeled adequately by a single perspective-projection invariant rational Bezier patch. Thus, all the different distortions in a planar, projector-based display (like keystoning, radial distortion, tangential distortion) can be modeled in a unified manner using rational Bezier patches. Next, we demonstrate the use of this representation to calibrate a multi-projector display made of projectors with significant lens distortions. This is achieved by a sparse sampling of the rational Bezier function, resulting in a calibration technique that is efficient both in representation and performance (Section 3).
To the best of our knowledge, this is the first work that models both the linear and non-linear distortions faced in a projector-camera setting in a unified manner, resulting in an accurate and efficient geometric calibration technique.

Our work has far-reaching effects in other ways too. Projection technology is advancing at a tremendous pace. Pocket projectors that can fit in one's palm are no longer a myth, but a reality [1]. Industry initiatives are finding ways to build short-throw, unbreakable projectors that use LEDs as light sources so that they can be embedded in cell phones. It is evident that lens distortion will be significant in such low-cost projectors. Thus, it is important to develop a model that handles such distortions in the projector, as well as the relationship between the projector and other entities like the camera and the screen, in a unified manner. Our unified representation will help avoid pre-calibrating devices by representing distortions arising from completely different causes (like lens distortion and keystoning) elegantly and compactly via one closed-form function. The model should be amenable to (a) accurate representation; and (b) efficient and easy application to camera-based single or multiple projector geometric calibration, at least for the common case of planar displays. Currently, while designing a compact set-up for projection-based displays or changing an existing set-up to increase space utility, a short-throw lens is required to decrease the throw distance of the projectors. These lenses require several optical elements that can reverse the high lens distortions created by short focal lengths. As a result, they are cost-prohibitive (around $2000-$6000), at least 5-6 times the cost of commodity projectors (around $1000-$1500).
Since our method can handle severe lens distortions, it can make the use of inexpensive lenses on projectors possible, just as they are used on cameras, making multi-projector displays even more affordable.

2. Model for Single Projector-Camera Pair

In this section, we consider the simple case of a single projector projecting on a planar screen and being observed by a single camera. We assume a camera with no lens distortions. This can be achieved easily by a number of existing techniques, the most popular open-source method being the Caltech Camera Calibration Toolbox [3]. Geometric calibration can be achieved by estimating two functions. The first function maps the camera coordinates (u, v) to the normalized screen coordinates (p, q). The presence of a planar screen allows us to use a homography H_S for this mapping. The second function maps projector coordinates (x, y) to camera coordinates (u, v). Our goal is to design an accurate parametric function F that defines this relationship, i.e. (u, v) = F(x, y). The function F has two components. The first is the lens distortion function R relating undistorted projector coordinates (x, y) to distorted coordinates on the display screen (x_d, y_d). The second component is the well-known homography H that relates the distorted coordinates to the camera coordinates (u, v):

    (u, v) = H(x_d, y_d) = H(R(x, y)) = F(x, y).    (1)

We would like to model F by a single compact parametric function that can accurately capture the effect of the concatenation of the two functions, H and R, and is amenable to efficient data-fitting computation.

2.1. Modeling R with Non-Rational Bezier Patches

The classic lens distortion model, proposed by D. C. Brown in 1966 [4, 8], consists of two independent distortions: radial distortion (either barrel or pin-cushion) and tangential distortion.

Radial distortion is usually modeled by

    r_d = r (1 + k_1 r^2 + k_2 r^4 + k_3 r^6)    (2)

where r is the radial distance from the principal point in the undistorted image, r_d is the same distance in the distorted image, and k_i, 1 ≤ i ≤ 3, are the radial distortion parameters. Negative values of the k_i result in barrel distortion, while positive values result in pin-cushioning. The principal center is the point in the image that is unchanged by radial distortion. In general the principal center (x_c, y_c) need not be at the center of the image, but is usually close to it. Equation 2, when converted to Cartesian coordinates [3, 13], results in the following distortion equations:

    x_d = x + (x - x_c)(k_1 r^2 + k_2 r^4 + k_3 r^6) = x + ρ_x,    (3)
    y_d = y + (y - y_c)(k_1 r^2 + k_2 r^4 + k_3 r^6) = y + ρ_y,    (4)

where (x_d, y_d) and (x, y) are the distorted and undistorted image coordinates respectively, r = sqrt((x - x_c)^2 + (y - y_c)^2), and k_i, 1 ≤ i ≤ 3, are the radial distortion coefficients. The tangential distortion is modeled by

    x_d = x + 2 p_1 x y + p_2 (r^2 + 2 x^2) = x + τ_x,    (5)
    y_d = y + 2 p_2 x y + p_1 (r^2 + 2 y^2) = y + τ_y,    (6)

where r = sqrt((x - x_c)^2 + (y - y_c)^2) and p_1 and p_2 are the tangential distortion parameters. Radial and tangential distortion are independent of each other, and hence their effects can be combined into a comprehensive lens distortion equation:

    (x_d, y_d) = (x + ρ_x + τ_x, y + ρ_y + τ_y).    (7)

Estimating the unknown radial and tangential distortion coefficients involves solving complex non-linear optimizations. Hence, several simplifying assumptions are made which are often not true in a real system. The most common assumption is that the principal point coincides with the center of the image. Further, the higher-degree terms and the tangential distortion are often ignored [27, 13, 32, 12, 3, 29, 7]. We propose that the distorted coordinates can instead be represented by non-rational Bezier patches, and can be easily corrected too.
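As a concrete illustration of Equations 2-7, the Brown model can be sketched in a few lines of Python. The function name and the sample coefficient values below are our own illustrative choices, not values measured from any calibrated projector:

```python
def brown_distort(x, y, center=(0.5, 0.5),
                  k=(-0.35, 0.13, 0.0), p=(0.05, 0.05)):
    """Brown lens distortion (Eqs. 3-7): radial terms k1..k3 about the
    principal center (x_c, y_c), plus tangential terms p1, p2.
    Coefficient values here are illustrative only."""
    xc, yc = center
    dx, dy = x - xc, y - yc
    r2 = dx * dx + dy * dy                                # r^2
    radial = k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
    rho_x, rho_y = dx * radial, dy * radial               # Eqs. 3-4
    tau_x = 2 * p[0] * x * y + p[1] * (r2 + 2 * x * x)    # Eq. 5
    tau_y = 2 * p[1] * x * y + p[0] * (r2 + 2 * y * y)    # Eq. 6
    return x + rho_x + tau_x, y + rho_y + tau_y           # Eq. 7
```

With a negative k_1 and all other coefficients zero, points are pulled toward the principal center (barrel distortion); with a positive k_1 they are pushed outward (pin-cushioning), matching the sign convention stated above.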
A non-rational Bezier patch of degree k and dimension d is defined by (k+1)^d control points P_ij in a d-dimensional space. In our case, we consider a two-dimensional patch where d = 2. The non-rational Bezier patch is a tensor-product patch where the control points are weighted by blending functions that are products of independent one-dimensional Bernstein polynomials. The Bernstein polynomials of degree k consist of the k+1 polynomials b_{n,k}, 0 ≤ n ≤ k, given by

    b_{n,k}(t) = ( k! / (n! (k-n)!) ) t^n (1-t)^{k-n}    (8)

where 0 ≤ t ≤ 1. The non-rational Bezier patch can then be defined from these polynomials as

    NB(x, y) = Σ_{i=0}^{k} Σ_{j=0}^{k} P_ij b_{i,k}(x) b_{j,k}(y)    (9)
             = Σ_{i=0}^{k} Σ_{j=0}^{k} P_ij B_ij(x, y),    (10)

where B_ij is the tensor product of the Bernstein polynomials b_{i,k} and b_{j,k}, and 0 ≤ x, y ≤ 1. In the context of the single projector-camera pair for planar displays, the projector lens distortion function R(x, y) is represented by

    (x_d, y_d) = R(x, y) = NB(x, y).    (11)

Our choice of the non-rational Bezier patch is guided by the following rationale.

1. The non-rational Bezier patch provides a unified framework to represent any kind of lens distortion. More severe lens distortion can be captured simply by increasing the degree of the patch. For example, a bicubic patch may be sufficient for standard narrow-FOV lenses, but a higher-degree patch would be required for the greater distortions of a wide-FOV lens. Thus, a large range of distortions can be handled by higher-degree non-rational Bezier patches without affecting the underlying computational framework.

2. When the equation of a non-rational Bezier patch is expanded and compared with Equation 7, one finds that the non-rational Bezier patch has many more higher-order terms, making it a more flexible model. For example, it is possible for short-throw lens projectors to have a mix of both barrel and pin-cushion distortions. Such cases are better handled by a more flexible function like the non-rational Bezier patch.

3.
The principal center is encoded by the control points of the patch itself and need not be determined during parameter estimation. The principal point is a particular nuisance during estimation of lens distortion parameters, forcing most practical systems to assume it lies at the center.

4. Like lens distortion, non-rational Bezier patches are affine invariant.

5. In most practical applications, the radial distortion is followed by a perspective transformation from 3D to 2D space. The rational Bezier patch is an obvious extension that handles both of these distortions in a unified manner.

Table 1 and Figure 1 present results on the accuracy of this model. We consider a dense sampling of the (x, y) space

and distort it using various lens distortion parameters, and fit a non-rational Bezier patch to the distorted points. To evaluate the accuracy of the fitted patch, we find the average Euclidean distance error between the fitted and the distorted data as a percentage of the distance from the principal point. Note that even when higher-order distortions are present (e.g. x^7), lower-degree non-rational Bezier patches (e.g. bicubic) can provide a good fit. This is due to the presence of a large number of different combinations of lower-order cross-terms. However, note that the estimation of this non-rational Bezier function is not required for the purpose of calibration.

Case  Center       Radial               Tangential    % Error for Bezier of degree
      (x_c, y_c)   (k_1, k_2, k_3)      (p_1, p_2)    2      3      4      5       6        7
(a)   (0.5, 0.5)   (-0.35, 0, 0)        (0, 0)        5.78   ε
(b)   (0.5, 0.5)   (0, 0, 0)            (0.1, 0.1)    ε
(c)   (0.5, 0.5)   (-0.35, 0.35, 0)     (0, 0)        9.00   0.91   0.56   ε
(d)   (0.6, 0.55)  (-0.35, 0.13, 0)     (0.05, 0.05)  1.9    0.14   0.03   ε
(e)   (0.6, 0.55)  (-0.35, 0.13, 0.016) (0.05, 0.05)  2.0    0.17   0.038  0.0012  0.00022  ε

Table 1. Accuracy of our fit to lens distortion functions using non-rational Bezier patches of different degrees. Due to the large number of lower-degree terms, note that in case (e) a distortion of higher degree (degree 7) can be fitted to reasonable accuracy by a lower-degree Bezier (like degree 4). ε represents a very small number, strictly less than 10^-11.

Figure 1. Images showing the five cases of distortion listed in Table 1, panels (a)-(e).

2.2. Modeling HR with Rational Bezier Patches

Non-rational Bezier patches are affine invariant but not perspective-projection invariant. Relating the projector to the camera involves not only lens distortion but also a perspective projection. In the case of planar displays, this is a homography H, a simpler 3 × 3 transformation induced by the planar screen.
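The fitting experiment behind Table 1 can be sketched as follows. Since the patch of Equations 9-10 is linear in its control points, fitting reduces to ordinary linear least squares. This is a minimal sketch under our own naming, not the authors' code:

```python
import numpy as np
from math import comb

def bernstein(n, k, t):
    """b_{n,k}(t) = C(k,n) t^n (1-t)^(k-n)  (Eq. 8)."""
    return comb(k, n) * t**n * (1 - t) ** (k - n)

def design_matrix(xy, k):
    """Row l holds the (k+1)^2 tensor-product basis values B_ij(x_l, y_l)."""
    x, y = xy[:, 0], xy[:, 1]
    cols = [bernstein(i, k, x) * bernstein(j, k, y)
            for i in range(k + 1) for j in range(k + 1)]
    return np.stack(cols, axis=1)

def fit_patch(src, dst, k=3):
    """Fit control points P_ij of a degree-k non-rational patch (Eqs. 9-10)
    to samples src -> dst by linear least squares."""
    A = design_matrix(src, k)
    P, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return P  # shape ((k+1)^2, 2)

def eval_patch(P, xy, k=3):
    """Evaluate NB(x, y) at the given (x, y) samples."""
    return design_matrix(xy, k) @ P
```

Distorting a dense grid with the lens model of Section 2.1, fitting with fit_patch, and measuring the residual of eval_patch reproduces the kind of error measurement reported in Table 1.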
So we need a function that is perspective-projection invariant to model the combined effects of H and R in Equation 1. Rational Bezier patches, a generalized form of Bezier patches, offer us this advantage. Rational Bezier patches are additionally controlled by a scalar weight associated with each control point, and are defined by

    B(x, y) = ( Σ_{i=0}^{k} Σ_{j=0}^{k} P_ij w_ij B_ij(x, y) ) / ( Σ_{i=0}^{k} Σ_{j=0}^{k} w_ij B_ij(x, y) ).    (12)

The weights provide this function with the capability to sample the parameter space non-uniformly. Thus, in addition to satisfying all properties of a non-rational Bezier patch, a rational Bezier patch is also perspective-projection invariant. We propose the use of a rational Bezier patch to estimate the function F = HR in a unified manner.

3. Calibrating Multiple Projectors

We now apply this method to the calibration of multiple projectors using a camera, in the presence of radial distortion. As mentioned in the previous section, we use a homography H_S for the mapping from camera coordinates (u, v) to the normalized screen coordinates (p, q), and a rational Bezier patch B to model (u, v) = F(x, y), the function that maps projector coordinates (x, y) to camera coordinates (u, v).

3.1. Parameter Estimation

To estimate H_S, we require four correspondences between the camera and screen coordinates, (u, v) and (p, q). The four correspondences result in a system of linear equations that can be solved to recover the homography matrix. To find the correspondences, we identify four points in the camera image that correspond to the four corners of the screen. This can be done by attaching fiducials to the display. To do it automatically, we light up all the projectors with white light and use simple image processing to detect the extreme corners of the lighted region, yielding the camera coordinates corresponding to the four corners of the normalized screen coordinates. To estimate the parameters of the rational Bezier, B, we

find correspondences between projector pixels (x, y) and camera pixels (u, v). Let us assume N correspondences, i.e. (x_l, y_l) related to (u_l, v_l), 0 ≤ l ≤ N - 1. The unknowns are the control points in camera space, P_ij = (u_ij, v_ij), and their scalar weights w_ij. From the known correspondences, Equation 12 can be written for each correspondence as follows:

    (u_l, v_l) Σ_{i,j} w_ij B_ij(x_l, y_l) - Σ_{i,j} (u_ij, v_ij) w_ij B_ij(x_l, y_l) = 0.    (13)

Setting U_ij = u_ij w_ij and V_ij = v_ij w_ij, Equation 13 yields two equations for each correspondence:

    u_l Σ_{i,j} w_ij B_ij(x_l, y_l) - Σ_{i,j} U_ij B_ij(x_l, y_l) = 0,    (14)
    v_l Σ_{i,j} w_ij B_ij(x_l, y_l) - Σ_{i,j} V_ij B_ij(x_l, y_l) = 0.    (15)

The unknowns when estimating a two-dimensional rational Bezier patch of degree k are the (k+1)^2 control points and the (k+1)^2 weights, which results in a total of 3(k+1)^2 unknowns. The weights are dependent variables and depend on the control points in a non-linear fashion. As a result, we cannot use traditional singular value decomposition to solve this set of equations in a least-squares sense. Instead, we use the Levenberg-Marquardt algorithm to numerically solve for the control points and the weights [21]. As an initial guess, we fit a control polygon tightly around the set of correspondences (u_l, v_l) in the camera space. Further, we constrain the weights to be greater than 0. These two conditions assure that the method converges to the global minimum. The proof of this is beyond the scope of this paper.

Figure 2. Results shown on our display made of a 3 × 3 array of nine projectors. The calibration is achieved using bicubic rational Bezier patches with quite sparse correspondences (8 × 6 for a 1024 × 768 projector). (a, b, c) Before calibration; (d, e, f) After calibration. The grid image in the last row is made of lines one pixel wide. To retain the clarity of the grids, photometric calibration is not applied to the grid image in (c) when generating the calibrated image (f). Please zoom in to check the calibration of grid and text at the overlaps.
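The fit of Equations 12-15 can be prototyped with SciPy's least-squares optimizer. We use its bounded trust-region method rather than classic Levenberg-Marquardt, since bounds are the simplest way to keep the weights positive; the function names and the grid initialization (a control polygon spanning the bounding box of the observed correspondences, all weights 1) are our own sketch of the idea, not the authors' implementation:

```python
import numpy as np
from math import comb
from scipy.optimize import least_squares

def basis(xy, k):
    """Tensor-product Bernstein basis B_ij evaluated per sample (Eq. 8)."""
    x, y = xy[:, 0], xy[:, 1]
    b = lambda n, t: comb(k, n) * t**n * (1 - t) ** (k - n)
    return np.stack([b(i, x) * b(j, y)
                     for i in range(k + 1) for j in range(k + 1)], axis=1)

def eval_rbezier(params, xy, k):
    """Rational Bezier patch of Eq. 12; params packs control points, then weights."""
    m = (k + 1) ** 2
    P = params[: 2 * m].reshape(m, 2)
    w = params[2 * m :]
    B = basis(xy, k)                  # (N, m)
    num = B @ (P * w[:, None])        # sum_ij P_ij w_ij B_ij
    den = B @ w                       # sum_ij w_ij B_ij
    return num / den[:, None]

def fit_rbezier(src, dst, k=3):
    """Solve Eqs. 13-15 for control points and weights, weights kept > 0."""
    m = (k + 1) ** 2
    # Initial guess: control points on a grid over the bounding box of the
    # observed camera-space correspondences, all weights 1 (non-rational).
    lo, hi = dst.min(0), dst.max(0)
    g = np.linspace(0, 1, k + 1)
    P0 = np.array([[lo[0] + gx * (hi[0] - lo[0]),
                    lo[1] + gy * (hi[1] - lo[1])] for gx in g for gy in g])
    x0 = np.concatenate([P0.ravel(), np.ones(m)])
    residual = lambda p: (eval_rbezier(p, src, k) - dst).ravel()
    lb = np.concatenate([np.full(2 * m, -np.inf), np.full(m, 1e-6)])
    return least_squares(residual, x0, bounds=(lb, np.inf)).x
```

For each projector, src holds the N projector-space blob coordinates and dst the matching camera-space detections; the returned vector packs the fitted control points and weights used by eval_rbezier.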
To find the correspondences between each projector's coordinates and the camera coordinates, we display a rectangular grid of Gaussian blobs with known coordinates. Images from

non-overlapping projectors are captured simultaneously by the camera. We use a 2D stepping procedure that depends on initial user input identifying the top-left blob and its immediate right and bottom neighbors in the camera space. With this minimal three-point user input we can design a stepping procedure that (a) estimates the rough position of the next blob in scan-line order; and (b) looks for the correct blob position using a windowed center-of-mass technique [9]. If this is not possible due to more extreme projector distortions, one can binary-code the blobs and project them in a time-sequential manner to recover the exact IDs of the detected blobs [25, 30].

3.2. Image Correction

To generate the corrected image for each projector that will result in a geometrically undistorted and seamless display, we use the following technique. For each projector pixel (x, y), we first compute the corresponding camera coordinates (u, v) using the rational Bezier patch B. Next, we compute the corresponding normalized screen coordinates (p, q) using the homography H_S. This indexes into the image being rendered on the display. Bilinear interpolation is used to assign the final value of each projector pixel. So, the correction is achieved by the mapping (p, q) = H_S(u, v) = H_S(B(x, y)). To take advantage of the GPU on modern graphics cards, the non-linear warping correction was incorporated into Chromium, an open-source distributed rendering engine for graphics clusters [11]. The mapping is first precomputed and the resulting pixel coordinates are stored in a lookup table. A fragment program for the pixel shader then quickly maps pixels from the normal coordinate space to the new coordinate space during rendering.

3.3. Results

In this section, we present the results of our method and compare it with other existing methods. For photometric seamlessness, we use the method to achieve perceptual seamlessness presented in [20, 18].
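The precomputed lookup table and bilinear resampling described in Section 3.2 can be sketched as follows, here on the CPU with NumPy rather than in a fragment shader; bezier_map stands in for the fitted patch B and H_S for the camera-to-screen homography, both assumed given:

```python
import numpy as np

def build_lookup(bezier_map, H_S, proj_w, proj_h):
    """Precompute normalized screen coords (p, q) for every projector pixel.

    bezier_map: callable mapping an (N, 2) array of projector coords in
                [0, 1]^2 to camera coords (u, v) -- the fitted patch B.
    H_S:        3x3 homography from camera to normalized screen coords.
    """
    ys, xs = np.mgrid[0:proj_h, 0:proj_w]
    xy = np.stack([xs.ravel() / (proj_w - 1),
                   ys.ravel() / (proj_h - 1)], axis=1)
    uv = bezier_map(xy)                                   # (u, v) per pixel
    uvh = np.concatenate([uv, np.ones((len(uv), 1))], axis=1)
    pq = uvh @ H_S.T
    pq = pq[:, :2] / pq[:, 2:3]                           # dehomogenize
    return pq.reshape(proj_h, proj_w, 2)

def bilinear_sample(img, pq):
    """Sample img (H, W, C) at normalized (p, q) with bilinear interpolation."""
    h, w = img.shape[:2]
    x = np.clip(pq[..., 0] * (w - 1), 0, w - 1.001)
    y = np.clip(pq[..., 1] * (h - 1), 0, h - 1.001)
    x0, y0 = x.astype(int), y.astype(int)
    fx, fy = (x - x0)[..., None], (y - y0)[..., None]
    return ((1 - fx) * (1 - fy) * img[y0, x0] +
            fx * (1 - fy) * img[y0, x0 + 1] +
            (1 - fx) * fy * img[y0 + 1, x0] +
            fx * fy * img[y0 + 1, x0 + 1])
```

In a real deployment the table from build_lookup is computed once per projector and the per-frame resampling runs in a pixel shader, exactly as the Chromium integration above describes.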
Most of our results are shown for barrel distortion, but they work equally well for pin-cushioning. The fitting of rational Bezier patches on an Intel Pentium 4 3.00 GHz CPU for 50 samples takes a few seconds. For all the images shown here, please zoom in to check the quality of the calibration, especially for text and grids. The grids we use have lines of single-pixel width in the projector space. The accompanying video demonstrates the real-time implementation of the geometric calibration using the GPU.

We have tested our method on our 3 × 3 array of nine projectors. Our projectors have a relatively large throw ratio and hence do not show terrible lens distortion. Instead of the cost-prohibitive option of buying nine short-throw lenses, we simulate the distortion digitally by distorting the imagery going to the projectors. Figure 2 shows the result on our display. Note that we get sub-pixel accuracy even for grids and text. Since simulating more severe lens distortion would disturb our display set-up, we demonstrate the generality of our method using realistic computer simulations. Figure 3 shows some extreme lens distortions and the calibration achieved by our method using both a sparse and a dense set of correspondences. The former considers a sparse sampling of the projector space using an 8 × 6 grid of 48 correspondences per projector (1024 × 768 pixels). The latter uses a dense sampling of a 24 × 18 grid of 144 correspondences. In both cases we use a bicubic (degree 3) rational Bezier patch. The results from both are indistinguishable from each other. This proves the sufficiency of sparse sampling for our method. We also compare our result with the piecewise linear method using both sparse and dense sets of correspondences. The sparse sampling in this case results in errors of more than a few pixels. In fact, our method with a sparse set of correspondences yields better results than the piecewise linear method with a dense set of correspondences.
Conclusion In conclusion, we have presented a closed form solution to relate the projector coordinates to the camera and screen coordinates in the presence of considerable lens distortion. This maes it possible to use inexpensive lenses with no self-correction to reduce the throw distance of projector. This results in a compact, affordable, multi-projector setup without having any adverse effects on the display quality. To the best of our nowledge, this is the first wor that models the non-linear warp from the distorted projectors to the screen accurately using a closed form solution, resulting in a general, simple and effective geometric calibration technique. Our method is equally applicable to single projectorcamera systems thus paving the way to inexpensive imperfect small-scale projectors on hand-held devices. Further, our method can be easily extended to curved screens Piecewise linear methods require dense sampling of the projector to camera relationship to capture the screen curvature effectively [26, 25]. Our function, being a piecewise curve representation, is naturally tuned for better representation of this function on curved screens, even with a sparse sampling. In addition, extending piecewise linear methods to distributed settings is not possible due to lac of a closed-form representation. Thus, the distributed framewor of [2] is forced to adopt a distributed homography tree technique since homographies offer a closed form representation of the relationship between the projector, camera and screen coordinates. However, homographies, being linear, yield unac-

ceptable alignment in real non-linear devices. Our method offers a good tradeoff by providing a closed-form non-linear model, and it can be easily extended for adoption in the distributed framework. Finally, we want to investigate the implications of our model for achieving geometric calibration using an uncalibrated camera. Rational Bezier patches, being well-behaved entities, may make the separation of the distortion parameters of the projector and camera feasible through polynomial factorization.

Figure 3. Geometric calibration of projectors with severe barrel distortion: with the piecewise linear method using sparse (a) and dense (b) correspondences, and with rational bicubic Bezier patches using sparse (c) and dense (d) correspondences. Please zoom in to see the results. (e,f,g,h) show zoomed-in views of the overlap region across four projectors in (a,b,c,d) respectively. It is evident that the piecewise linear method with sparse correspondences leads to severe mismatches. The results using rational Bezier patches are similar irrespective of the sampling, proving the sufficiency of sparse sampling. Rational Bezier patches with sparse sampling even outperform the piecewise linear method with dense sampling, as is evident from the thicker grid lines in the overlap region in (b) due to slight errors.

References

[1] Mitsubishi PocketProjector LED DLP projector. Technical report, http://www.mitsubishipresentations.com/proj pocet.asp.
[2] E. S. Bhasker, P. Sinha, and A. Majumder. Asynchronous distributed calibration for scalable reconfigurable multi-

projector displays. IEEE Transactions on Visualization and Computer Graphics (TVCG), 12(5):1101-1108, 2006.
[3] J. Y. Bouguet. Camera calibration toolbox for Matlab. Technical report, http://www.vision.caltech.edu/bouguetj/calib doc/index.html.
[4] D. Brown. Decentering distortion of lenses. Photogrammetric Engineering, 32(3):444-462, 1966.
[5] H. Chen, R. Sukthankar, G. Wallace, and K. Li. Scalable alignment of large-format multi-projector displays using camera homography trees. Proceedings of IEEE Visualization, 2002.
[6] Y. Chen, D. W. Clark, A. Finkelstein, T. Housel, and K. Li. Automatic alignment of high-resolution multi-projector displays using an un-calibrated camera. Proceedings of IEEE Visualization, 2000.
[7] K. Cornelis, M. Pollefeys, and L. V. Gool. Lens distortion recovery for accurate sequential structure and motion recovery. European Conference on Computer Vision, pages 186-200, 2002.
[8] J. Fryer and D. Brown. Lens distortion for close-range photogrammetry. Photogrammetric Engineering and Remote Sensing, 52(1):51-58, 1986.
[9] R. C. Gonzalez and R. E. Woods. Digital Image Processing. Addison Wesley, 1992.
[10] M. Hereld, I. R. Judson, and R. Stevens. DottyToto: A measurement engine for aligning multi-projector display systems. Argonne National Laboratory preprint ANL/MCS-P958-0502, 2002.
[11] G. Humphreys, M. Houston, R. Ng, R. Frank, S. Ahern, P. Kirchner, and J. Klosowski. Chromium: A stream-processing framework for interactive rendering on clusters. ACM Transactions on Graphics, 2002.
[12] H. G. Jung, Y. H. Lee, P. J. Yoon, and J. Kim. Radial distortion refinement by inverse mapping-based extrapolation. Proceedings of the 18th International Conference on Pattern Recognition (ICPR), 2006.
[13] L. Ma, Y. Chen, and K. L. Moore. A new analytical radial distortion model for camera calibration. ArXiv Computer Science e-prints, July 2003.
[14] A. Majumder. Properties of color variation across multi-projector displays. Proceedings of SID Eurodisplay, 2002.
[15] A.
Majumder. Contrast enhancement of multi-displays using human contrast sensitivity. Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), 2005.
[16] A. Majumder, Z. He, H. Towles, and G. Welch. Achieving color uniformity across multi-projector displays. Proceedings of IEEE Visualization, 2000.
[17] A. Majumder, D. Jones, M. McCrory, M. E. Papka, and R. Stevens. Using a camera to capture and correct spatial photometric variation in multi-projector displays. IEEE International Workshop on Projector-Camera Systems, 2003.
[18] A. Majumder and R. Stevens. LAM: Luminance attenuation map for photometric uniformity in projection based displays. Proceedings of ACM Virtual Reality Software and Technology, 2002.
[19] A. Majumder and R. Stevens. Color nonuniformity in projection-based displays: Analysis and solutions. IEEE Transactions on Visualization and Computer Graphics, 10(2), March-April 2004.
[20] A. Majumder and R. Stevens. Perceptual photometric seamlessness in tiled projection-based displays. ACM Transactions on Graphics, 24(1), January 2005.
[21] D. W. Marquardt. An algorithm for least-squares estimation of nonlinear parameters. Journal of the Society for Industrial and Applied Mathematics, 11(2):431-441, 1963.
[22] S. K. Nayar, H. Peri, M. D. Grossberg, and P. N. Belhumeur. A projection system with radiometric compensation for screen imperfections. Proceedings of the IEEE International Workshop on Projector-Camera Systems, 2003.
[23] A. Raij, G. Gill, A. Majumder, H. Towles, and H. Fuchs. PixelFlex2: A comprehensive, automatic, casually-aligned multi-projector display. IEEE International Workshop on Projector-Camera Systems, 2003.
[24] R. Raskar. Immersive planar displays using roughly aligned projectors. Proceedings of IEEE Virtual Reality, 2000.
[25] R. Raskar, M. Brown, R. Yang, W. Chen, H. Towles, B. Seales, and H. Fuchs. Multi-projector displays using camera-based registration.
Proceedings of IEEE Visualization, 1999.
[26] R. Raskar, J. van Baar, P. Beardsley, T. Willwacher, S. Rao, and C. Forlines. iLamps: Geometrically aware and self-configuring projectors. ACM Transactions on Graphics, 22(3), 2003.
[27] J. Salvi, X. Armangué, and J. Batlle. A comparative review of camera calibrating methods with accuracy evaluation. Pattern Recognition, 35:1617-1635, 2002.
[28] R. Sukthankar, R. Stockton, and M. Mullin. Smarter presentations: Exploiting homography in camera-projector systems. Proceedings of the International Conference on Computer Vision (ICCV), 2001.
[29] R. Swaminathan and S. K. Nayar. Nonmetric calibration of wide-angle lenses and polycameras. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(10):1172-1178, 2000.
[30] R. Yang, D. Gotz, J. Hensley, H. Towles, and M. S. Brown. PixelFlex: A reconfigurable multi-projector display system. Proceedings of IEEE Visualization, 2001.
[31] R. Yang, A. Majumder, and M. Brown. Camera based calibration techniques for seamless multi-projector displays. IEEE Transactions on Visualization and Computer Graphics, 11(2), March-April 2005.
[32] Z. Zhang. Flexible camera calibration by viewing a plane from unknown orientations. IEEE International Conference on Computer Vision, pages 666-673, 1999.
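For readers unfamiliar with the representation used above, the following sketch (our own illustration, not the paper's implementation; all names are hypothetical) shows how a bicubic rational Bezier patch maps a normalized projector coordinate to a 2-D camera or screen coordinate. In the actual calibration the 4×4 grid of control points and weights would be recovered by fitting to the sparse correspondences discussed in the results.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    # Bernstein basis polynomial B_{i,n}(t)
    return comb(n, i) * (t ** i) * ((1.0 - t) ** (n - i))

def eval_rational_bezier_patch(ctrl, weights, u, v):
    """Map (u, v) in [0,1]^2 through a bicubic rational Bezier patch.

    ctrl    : (4, 4, 2) array of 2-D control points
    weights : (4, 4) array of positive weights
    """
    num = np.zeros(2)
    den = 0.0
    for i in range(4):
        bu = bernstein(3, i, u)
        for j in range(4):
            b = bu * bernstein(3, j, v) * weights[i, j]
            num += b * ctrl[i, j]
            den += b
    return num / den

# With unit weights and control points placed at (i/3, j/3), the patch
# has linear precision and reproduces the identity map.
ctrl = np.array([[[i / 3.0, j / 3.0] for j in range(4)] for i in range(4)])
weights = np.ones((4, 4))
print(eval_rational_bezier_patch(ctrl, weights, 0.3, 0.7))  # approx. [0.3 0.7]
```

Displacing the control points and varying the weights bends this map into the non-linear projector-to-screen warps the paper models; the weights are what give the rational form extra flexibility over an ordinary polynomial Bezier patch.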