Camera Parameters Estimation from Hand-labelled Sun Positions in Image Sequences


Camera Parameters Estimation from Hand-labelled Sun Positions in Image Sequences

Jean-François Lalonde, Srinivasa G. Narasimhan and Alexei A. Efros
{jlalonde,srinivas,efros}@cs.cmu.edu

CMU-RI-TR-08-32
July 2008

The Robotics Institute
Carnegie Mellon University
Pittsburgh, Pennsylvania 15213

© Carnegie Mellon University

Abstract

We present a novel technique to determine camera parameters when the sun is visible in an image sequence. The technique requires a user to label the position of the sun in a few images. We show that it can be used to successfully recover the camera focal length, as well as its azimuth and zenith angles. The method is thoroughly evaluated on synthetic data under a wide range of operating conditions, and representative examples are shown on real data. Ground-truth parameters can be recovered with less than 3% error in challenging cases.

Contents

1 Introduction
  1.1 Assumptions
  1.2 Related Work
2 Geometry of the Problem
3 Determining the Camera Parameters
4 Validation on Synthetic Data
  4.1 Focal Length
  4.2 Camera Zenith
  4.3 Camera Azimuth
5 Representative Results on Real Data
6 Summary

1 Introduction

In this document, we show that if the sun is visible in a few frames of an outdoor webcam sequence, we can use its position to recover three important camera parameters: the focal length, and the azimuth and zenith angles. This method was used in [5] to evaluate the quality of their sky-based calibration algorithm.

The rest of the document is organized as follows. First, we introduce the few assumptions that need to be made in order for this method to work, followed by a brief review of related work. We then illustrate the problem geometry in Sect. 2, and present a method that recovers the camera parameters in Sect. 3. Finally, we demonstrate the performance of our algorithm on synthetic (Sect. 4) and real (Sect. 5) data.

1.1 Assumptions

In order to simplify the problem, we start by making a few realistic assumptions about the camera. First, the camera GPS location is assumed to be known, and all the images must be time-tagged. We also assume that the camera has square pixels and no roll angle, so we need only estimate the yaw (azimuth) and pitch (zenith) angles. This corresponds to the typical situation of a camera that is mounted upright. Of course, the sun has to be visible in a few frames, and we assume its position has been hand-labeled by an expert.

1.2 Related Work

Camera calibration has been extensively studied in the computer vision community. Most available methods require a physical calibration target placed within the camera field of view [8, 2]. However, these methods cannot be used if the camera is not physically accessible. In the case of a static camera acquiring a time-tagged sequence of images, it is possible to use natural features such as the sun and the sky to estimate some camera parameters. Cozman and Krotkov [1] proposed to use the sun altitude to recover the camera geolocation (latitude and longitude). Their method is very accurate, but relies on precise measurements made with a high-quality camera and inclinometer. Trebi-Ollennu et al. [7] proposed a way to determine the camera heading (i.e. camera azimuth) by detecting the sun with a custom-designed sun sensor. In our case, we do not have access to any additional hardware to ensure precise sun detection in images, so we rely on a user to indicate the sun position in a few images. Of particular relevance to our work, Jacobs et al. [3] show how to reliably geolocate a camera by correlating the intensity variations in its images with satellite imagery. Since our approach requires a priori knowledge of the camera position, their technique could be used to initialize it, thus enabling the recovery of both position and orientation automatically.

2 Geometry of the Problem

[Figure 1: Geometry of the problem.]

Let us start by illustrating the geometry of the problem and introducing the corresponding symbols, which will be used throughout this paper. Consider Fig. 1, where a camera with reference frame (x_c, y_c, z_c) is looking at a scene, and the sun is projected at pixel coordinates (u_s, v_s) in the image. Our goal is to recover the camera parameters from multiple images in which the sun is visible at different (u_s, v_s) coordinates. If the center of projection of the camera coincides with the origin of the world reference frame (x_w, y_w, z_w) and its optical axis is aligned with the positive x_w-axis, we need only estimate the camera orientation, θ_c and φ_c, as well as its focal length f_c. The sun's angular position (θ_s, φ_s) can be computed for a given image from the camera's GPS location and time of capture using the method of Reda and Andreas [6]. To compute the projection of the sun in the image, we first compute the 3-D unit vector associated with the sun:

$$\mathbf{s} = \begin{bmatrix} x_s \\ y_s \\ z_s \end{bmatrix} = \begin{bmatrix} \sin\theta_s \cos\phi_s \\ \sin\theta_s \sin\phi_s \\ \cos\theta_s \end{bmatrix}. \tag{1}$$
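As a concrete illustration of Eq. (1), here is a minimal numpy sketch; the function name is ours, and the angles (θ_s, φ_s) are assumed to have been computed beforehand from the GPS location and time-tag, e.g. with an implementation of the solar position algorithm of Reda and Andreas [6] (third-party packages such as pvlib provide one, though note that their azimuth convention, East of North, differs from the West-of-North convention used here).

```python
import numpy as np

def sun_direction(theta_s, phi_s):
    """Unit vector s pointing towards the sun, Eq. (1).

    theta_s: sun zenith angle (radians).
    phi_s:   sun azimuth angle (radians), measured from North and
             increasing towards West, as in this report.
    """
    return np.array([np.sin(theta_s) * np.cos(phi_s),
                     np.sin(theta_s) * np.sin(phi_s),
                     np.cos(theta_s)])
```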

We can express the sun's 3-D coordinates in the camera's local coordinate system by rotating s according to (θ_c, φ_c):

$$\mathbf{s}' = R^{-1}\,\mathbf{s}, \tag{2}$$

where

$$R = R_1 R_2 = \begin{bmatrix} \cos\phi_c & -\sin\phi_c & 0 \\ \sin\phi_c & \cos\phi_c & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos(\pi/2 - \theta_c) & 0 & \sin(\pi/2 - \theta_c) \\ 0 & 1 & 0 \\ -\sin(\pi/2 - \theta_c) & 0 & \cos(\pi/2 - \theta_c) \end{bmatrix}. \tag{3}$$

By the properties of rotation matrices, we can express R^{-1} directly:

$$R^{-1} = (R_1 R_2)^{-1} = R_2^{-1} R_1^{-1} = R_2^T R_1^T = \begin{bmatrix} \sin\theta_c \cos\phi_c & \sin\theta_c \sin\phi_c & -\cos\theta_c \\ -\sin\phi_c & \cos\phi_c & 0 \\ \cos\theta_c \cos\phi_c & \cos\theta_c \sin\phi_c & \sin\theta_c \end{bmatrix}. \tag{4}$$

We then project the sun position into the image by re-arranging the axes such that the projection axes (u, v) point in the directions shown in Fig. 1:

$$u_s = \frac{f_c\, y'_s}{x'_s}, \qquad v_s = \frac{f_c\, z'_s}{x'_s}. \tag{5}$$

3 Determining the Camera Parameters

Given the geometry of the problem presented in the previous section, we now propose a way to use the sun positions to determine the focal length, zenith and azimuth angles of a camera. The idea is to minimize the reprojection error, that is, the distance between the sun labels and the sun positions projected onto the image plane. If we have several observations, we can find the solution that best fits the data in a least-squares sense:

$$\min_{f_c,\,\theta_c,\,\phi_c} \sum_{i=1}^{N} \left\| \mathbf{p}_i - P(R^{-1}\mathbf{s}_i) \right\|^2, \tag{6}$$

where p_i = [u_i  v_i]^T are the sun labels in the image plane, and P is the projection operator of (5). Our goal is to find the camera parameters (f_c, θ_c, φ_c) that best align the sun labels p_i with the rotated sun positions s'_i projected into the image.
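Before writing the objective out explicitly, here is a minimal sketch of the projection operator, translating Eqs. (2)-(5) directly (continuing the numpy snippet above; function names are ours):

```python
def project_sun(f_c, theta_c, phi_c, s):
    """Project a sun direction s into the image, Eqs. (2)-(5).

    Builds R^{-1} from Eq. (4), rotates s into the camera frame
    (Eq. (2)), and applies the projection operator P of Eq. (5).
    Returns the sun pixel coordinates (u_s, v_s).
    """
    st, ct = np.sin(theta_c), np.cos(theta_c)
    sp, cp = np.sin(phi_c), np.cos(phi_c)
    R_inv = np.array([[st * cp, st * sp, -ct],
                      [-sp,     cp,      0.0],
                      [ct * cp, ct * sp,  st]])
    x, y, z = R_inv @ s  # s' = R^{-1} s
    return f_c * y / x, f_c * z / x
```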

Written explicitly, we minimize

$$\min_{f_c,\,\theta_c,\,\phi_c} \sum_{i=1}^{N} \left( u_i - \frac{f_c\, y'_{s,i}}{x'_{s,i}} \right)^2 + \left( v_i - \frac{f_c\, z'_{s,i}}{x'_{s,i}} \right)^2, \tag{7}$$

where $(x'_{s,i}, y'_{s,i}, z'_{s,i})$ are defined by

$$\begin{aligned}
x'_{s,i} &= \sin\theta_c \cos\phi_c\, x_{s,i} + \sin\theta_c \sin\phi_c\, y_{s,i} - \cos\theta_c\, z_{s,i} \\
y'_{s,i} &= -\sin\phi_c\, x_{s,i} + \cos\phi_c\, y_{s,i} \\
z'_{s,i} &= \cos\theta_c \cos\phi_c\, x_{s,i} + \cos\theta_c \sin\phi_c\, y_{s,i} + \sin\theta_c\, z_{s,i}.
\end{aligned} \tag{8}$$

Equation (7) is a non-linear least-squares minimization in the parameters (f_c, θ_c, φ_c), which can be solved using Levenberg-Marquardt. The procedure is initialized by finding the minimum value of (7) over a discretized grid in the parameter space.
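To make this two-stage procedure concrete, the following sketch implements the coarse grid search followed by Levenberg-Marquardt refinement of Eq. (7) using scipy.optimize.least_squares; project_sun is the helper sketched above, and the grid bounds, resolutions and function names are our own illustrative choices, not those of the original implementation.

```python
from itertools import product

import numpy as np
from scipy.optimize import least_squares

def residuals(params, sun_dirs, labels):
    """Stacked reprojection residuals of Eq. (7)."""
    f_c, theta_c, phi_c = params
    res = []
    for s, (u, v) in zip(sun_dirs, labels):
        u_hat, v_hat = project_sun(f_c, theta_c, phi_c, s)
        res.extend([u - u_hat, v - v_hat])
    return np.asarray(res)

def estimate_camera(sun_dirs, labels):
    """Recover (f_c, theta_c, phi_c) from hand-labelled sun positions."""
    # Stage 1: coarse grid search over the parameter space.
    grid = product(
        np.linspace(200, 2000, 10),              # f_c (px)
        np.deg2rad(np.linspace(60, 120, 13)),    # theta_c (rad)
        np.deg2rad(np.linspace(-180, 180, 25)),  # phi_c (rad)
    )
    x0 = min(grid, key=lambda p: (residuals(p, sun_dirs, labels) ** 2).sum())
    # Stage 2: Levenberg-Marquardt refinement (needs 2N >= 3 residuals).
    sol = least_squares(residuals, x0, args=(sun_dirs, labels), method="lm")
    return sol.x
```

Here sun_dirs is a list of sun direction vectors from Eq. (1), one per labelled frame (computed from each frame's time-tag and the camera's GPS location), and labels holds the corresponding hand-labelled pixel coordinates (u_i, v_i).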

4 Validation on Synthetic Data

In order to thoroughly validate our approach, we test it under a very wide variety of conditions using synthetically-generated data. Data points are generated randomly from a uniform distribution within the visible sky area (above the horizon line). We evaluate the influence of four parameters:

- The camera focal length f_c.
- The number of available images N. In practice, the sun might not be visible in many images.
- The camera zenith angle θ_c. This angle indicates whether the camera is pointing up or down.
- The camera azimuth angle φ_c. This is the angle of the camera with respect to North (increasing towards West).

Gaussian noise in labeling precision is also added in order to evaluate its influence on the quality of the results. To account for the randomness in point selection, we perform each experiment 20 times and show the corresponding error bars. For each plot shown below, one parameter is varied at a time while the others are kept constant at their default values; Table 1 indicates these defaults. In addition, only results obtained by minimizing the 3-D distance are shown, because those obtained by minimizing the reprojection error are very similar.

Name               Symbol   Default value
Focal length       f_c      1000 px
Number of images   N        20
Camera zenith      θ_c      90°
Camera azimuth     φ_c      0° (North)

Table 1: Default values for each parameter used in the experiments on synthetic data, along with the symbol used for each parameter.

4.1 Focal Length

The plots illustrating the error in focal length estimation are shown in Fig. 2. In the noisy case, the estimation error is at most 1.5%, reached when the input focal length is very small (see Fig. 2-(a)). Estimation quality does not seem to be affected by the horizon line position or the camera azimuth, as shown in Fig. 2-(c) and (d).

[Figure 2: Error in focal length estimation (%) as a function of (a) the focal length f_c (px), (b) the number of images N, (c) the camera zenith angle θ_c (deg), and (d) the camera azimuth angle φ_c (deg). For each plot, one parameter is varied while the others are kept fixed at their default values (Table 1). Three Gaussian noise levels are shown: no noise (σ² = 0), low (σ² = 5) and high (σ² = 10).]

4.2 Camera Zenith

The plots illustrating the error in camera zenith estimation are shown in Fig. 3. The results are very similar to the ones obtained for the focal length. In the noisy case, the estimation error is at most 0.5°, reached when the input focal length is very small (see Fig. 3-(a)). Estimation quality does not seem to be affected by the horizon line position or the camera azimuth, as shown in Fig. 3-(c) and (d).

[Figure 3: Error in camera zenith estimation (deg) as a function of (a) the focal length f_c, (b) the number of images N, (c) the camera zenith angle θ_c, and (d) the camera azimuth angle φ_c. For each plot, one parameter is varied while the others are kept fixed at their default values (Table 1). Three Gaussian noise levels are shown: no noise (σ² = 0), low (σ² = 5) and high (σ² = 10).]

4.3 Camera Azimuth

The plots illustrating the error in camera azimuth estimation are also very similar to the previous ones, and are shown in Fig. 4. In the noisy case, the estimation error is at most 0.4°, reached when the input focal length is very small (see Fig. 4-(a)). Estimation quality does not seem to be affected by the horizon line position or the camera azimuth, as shown in Fig. 4-(c) and (d).

Using synthetic data, we were able to demonstrate the usefulness of our approach in a very wide variety of situations. However, we also need to test it on real data to show its usefulness in real applications.

[Figure 4: Error in camera azimuth estimation (deg) as a function of (a) the focal length f_c, (b) the number of images N, (c) the camera zenith angle θ_c, and (d) the camera azimuth angle φ_c. For each plot, one parameter is varied while the others are kept fixed at their default values (Table 1). Three Gaussian noise levels are shown: no noise (σ² = 0), low (σ² = 5) and high (σ² = 10).]

5 Representative Results on Real Data

When the sun is visible in an image sequence, it is possible to recover the camera parameters using the technique presented in this document. To validate the results, we can then predict the sun position in all the images, even those that have not been labeled, and inspect the results visually. Fig. 5 shows a few examples of predicted sun positions for a typical image sequence. The horizon line can also be predicted from the recovered camera parameters, as an additional way of evaluating the quality of the results:

$$v_h = f_c \tan(\pi/2 - \theta_c). \tag{9}$$

Table 2 shows the estimated parameters for the same sequence.
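Continuing the numpy sketch, Eq. (9) and the sun prediction step are immediate. The variables below are hypothetical placeholders, and the direction of v_h relative to the image origin depends on the (u, v) conventions of Fig. 1; as a rough sanity check, the parameters of Table 2 (below) give |v_h| = 651.57 · tan(90° − 85.94°) ≈ 46 px from the principal point.

```python
def horizon_row(f_c, theta_c):
    """Predicted image row v_h of the horizon line, Eq. (9)."""
    return f_c * np.tan(np.pi / 2 - theta_c)

# Usage sketch: predict the sun position and horizon line in an
# unlabelled, time-tagged frame using the recovered parameters.
f_c, theta_c, phi_c = estimate_camera(sun_dirs, labels)
s_new = sun_direction(theta_s_new, phi_s_new)  # sun angles for the new frame's time-tag
u_pred, v_pred = project_sun(f_c, theta_c, phi_c, s_new)
v_h = horizon_row(f_c, theta_c)
```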

f_c         φ_c           θ_c
651.57 px   93.39° West   85.94°

Table 2: Summary of the estimated values for the sequence shown in Fig. 5.

[Figure 5: Sun position and horizon prediction on four frames: (a) frame 353, (b) frame 894, (c) frame 12659, (d) frame 1578. The sun position (yellow star) and horizon line (red horizontal line) are predicted for unlabelled images from a typical sequence where the sun is visible. Note that the first and second rows of images are separated by 4 months, but the accuracy is not affected. The horizon line is never explicitly labelled, but it can still be successfully recovered.]

6 Summary

We presented a technique to determine camera parameters when the sun is visible in an image sequence. The technique requires a user to label the position of the sun in a few images. In particular, this technique was used in [5] to obtain high-precision estimates of the camera parameters, which then served as a basis for evaluating their calibration-from-sky algorithm.

References

[1] F. Cozman and E. Krotkov. Robot localization using a computer vision sextant. In IEEE International Conference on Robotics and Automation, 1995.

[2] J. Heikkilä. Geometric camera calibration using circular control points. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(10):1066–1077, October 2000.

[3] N. Jacobs, S. Satkin, N. Roman, R. Speyer, and R. Pless. Geolocating static cameras. In IEEE International Conference on Computer Vision, 2007.

[4] K. Kanatani. Analysis of 3-D rotation fitting. IEEE Transactions on Pattern Analysis and Machine Intelligence, 16(5):543–549, May 1994.

[5] J.-F. Lalonde, S. G. Narasimhan, and A. A. Efros. What does the sky tell us about the camera? In European Conference on Computer Vision, 2008.

[6] I. Reda and A. Andreas. Solar position algorithm for solar radiation applications. Technical Report NREL/TP-560-34302, National Renewable Energy Laboratory, November 2005.

[7] A. Trebi-Ollennu, T. Huntsberger, Y. Cheng, E. T. Baumgartner, B. Kennedy, and P. Schenker. Design and analysis of a sun sensor for planetary rover absolute heading detection. IEEE Transactions on Robotics and Automation, 17(6):939–947, December 2001.

[8] Z. Zhang. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330–1334, 2000.