Cameras and Radiometry. Last lecture in a nutshell. Conversion Euclidean -> Homogeneous -> Euclidean. Affine Camera Model. Simplified Camera Models

Cameras and Radiometry
CSE 252A Lecture 5

Last lecture in a nutshell

Conversion Euclidean -> Homogeneous -> Euclidean
In 2-D:
  Euclidean -> Homogeneous: (x, y) -> k(x, y, 1)
  Homogeneous -> Euclidean: (x, y, z) -> (x/z, y/z)
In 3-D:
  Euclidean -> Homogeneous: (x, y, z) -> k(x, y, z, 1)
  Homogeneous -> Euclidean: (x, y, z, w) -> (x/w, y/w, z/w)
[Figure: a 2-D point (x, y) and its homogeneous representation (x, y, 1) on the plane Z = 1.]

Affine Camera Model
Take the perspective projection equation and perform a Taylor series expansion about some point (x0, y0, z0). Drop the terms that are higher than linear order. The resulting expression is the affine camera model, which is appropriate in a neighborhood about (x0, y0, z0).

Simplified Camera Models
Perspective projection
Coordinate changes: rigid transformations
Affine camera model
Scaled orthographic projection
Orthographic projection
(each can be written in Euclidean or homogeneous coordinates)
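As a concrete illustration of these conversions, here is a minimal Python/NumPy sketch (not from the slides; the helper names are my own) that moves points between Euclidean and homogeneous coordinates in 2-D and 3-D.

import numpy as np

def to_homogeneous(p, k=1.0):
    # (x, y) -> k(x, y, 1) in 2-D; (x, y, z) -> k(x, y, z, 1) in 3-D
    return k * np.append(np.asarray(p, dtype=float), 1.0)

def to_euclidean(p):
    # (x, y, z) -> (x/z, y/z) in 2-D; (x, y, z, w) -> (x/w, y/w, z/w) in 3-D
    p = np.asarray(p, dtype=float)
    return p[:-1] / p[-1]

# Any nonzero scale k represents the same Euclidean point:
print(to_euclidean(to_homogeneous([2.0, 3.0], k=5.0)))  # [2. 3.]
print(to_euclidean(to_homogeneous([1.0, 2.0, 4.0])))     # [1. 2. 4.]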

Some points about SO(n)
SO(n) = { R in R^(n x n) : R^T R = I, det(R) = 1 }
SO(2): rotation matrices in the plane R^2; SO(3): rotation matrices in 3-space R^3.
Forms a group under the matrix product operation: identity, inverse, associativity, closure.
Closed (a finite intersection of closed sets) and bounded: R_ij in [-1, +1].
Does not form a vector space.
Manifold of dimension n(n-1)/2: dim(SO(2)) = 1, dim(SO(3)) = 3.

Parameterizations of SO(3)
SO(3) is a 3-D manifold, so between 3 parameters and 2n + 1 parameters (Whitney's embedding theorem):
Roll-Pitch-Yaw, Euler angles
Axis-angle (Rodrigues' formula)
Cayley's formula
Matrix exponential
Quaternions (four parameters + one constraint)

Camera parameters
Issue: world units (e.g., cm) vs. camera units (pixels).
The camera may not be at the origin looking down the z-axis: extrinsic parameters.
One unit in camera coordinates may not be the same as one unit in world coordinates: intrinsic parameters (focal length, principal point, aspect ratio, angle between axes, etc.).

Camera Calibration
\[
\begin{pmatrix} U \\ V \\ W \end{pmatrix}
=
\underbrace{K}_{3 \times 3,\ \text{intrinsic}}
\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix}
\underbrace{\begin{pmatrix} R & \mathbf{t} \\ \mathbf{0}^{\top} & 1 \end{pmatrix}}_{4 \times 4,\ \text{extrinsic}}
\begin{pmatrix} X \\ Y \\ Z \\ T \end{pmatrix}
\]
The 3 x 3 matrix K represents the intrinsic parameters; the 4 x 4 rigid transformation represents the extrinsic parameters. Calibration estimates the intrinsic and extrinsic camera parameters; see the textbook for how to do it.
Camera Calibration Toolbox for Matlab (Bouguet): http://www.vision.caltech.edu/bouguetj/calib_doc/
Qualcomm Augmented Reality Demo: http://www.youtube.com/watch?v=hxtq1qbmliw&feature=player_embedded

Projective Transformation (Homography)
A 3 x 3 linear transformation of homogeneous coordinates: points map to points, lines map to lines.
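A minimal NumPy sketch of this pipeline (my own illustrative code; the intrinsic matrix K and the pose R, t below are made-up example values): it checks that R is a valid element of SO(3) and projects a homogeneous world point through the resulting 3 x 4 camera matrix.

import numpy as np

def is_rotation(R, tol=1e-9):
    # Membership test for SO(3): R^T R = I and det(R) = +1
    return np.allclose(R.T @ R, np.eye(3), atol=tol) and np.isclose(np.linalg.det(R), 1.0)

# Example parameters (assumed, not from the slides)
K = np.array([[800.0,   0.0, 320.0],   # focal length in pixels, principal point
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # extrinsic rotation (identity here)
t = np.array([0.0, 0.0, 5.0])          # extrinsic translation: camera 5 units back

assert is_rotation(R)

P = K @ np.hstack([R, t[:, None]])     # 3 x 4 camera matrix K [R | t]
X = np.array([0.2, -0.1, 1.0, 1.0])    # homogeneous world point (X, Y, Z, T = 1)
u, v, w = P @ X
print(u / w, v / w)                    # pixel coordinates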

Image of a plane
The mapping between coordinates in a plane in 3-D and the image plane under perspective projection is a projective transformation: a square maps to an arbitrary quadrilateral. Under an affine or scaled orthographic camera model, a square maps to a parallelogram.

Beyond the pinhole camera
Getting more light: a bigger aperture.
Pinhole camera images with variable aperture; limits for pinhole cameras.
[Figure: pinhole images taken with aperture diameters of 2 mm, 1 mm, 0.6 mm, 0.35 mm, 0.15 mm, and 0.07 mm.]
The reason for lenses.

Thin Lens
Rotationally symmetric about the optical axis; spherical interfaces.
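To see the "square maps to an arbitrary quadrilateral" claim numerically, here is a small sketch (my own example; the homography entries are arbitrary) that pushes the corners of the unit square through a 3 x 3 projective transformation of homogeneous coordinates.

import numpy as np

H = np.array([[1.0, 0.2, 0.1],   # an arbitrary example homography
              [0.1, 0.9, 0.2],
              [0.3, 0.1, 1.0]])

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)

for x, y in square:
    u, v, w = H @ np.array([x, y, 1.0])   # map the homogeneous point
    print((x, y), "->", (u / w, v / w))    # corners land on a general quadrilateral

If the last row of H were (0, 0, 1) the map would be affine, and the square would map to a parallelogram instead.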

Thin Lens: Center
All rays that enter the lens along a line pointing at the center emerge in the same direction.

Thin Lens: Focus
Rays parallel to the optical axis pass through the focus.

Thin Lens: Image of a Point
All rays passing through the lens and starting at a point P converge upon its image P'.
[Figure: object point P at depth Z, lens with focal length f, image point P' at depth Z'.]

Thin Lens: Image Plane and Aperture
The price: whereas the image of P is in focus on the image plane, the image of another point Q isn't.
Smaller aperture -> less blur; pinhole -> no blur.
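The slide's diagram labels the distances Z, Z', and f, but the relation between them is lost in this transcription; the standard Gaussian thin-lens equation, 1/Z + 1/Z' = 1/f with object distance Z and image distance Z' both measured as positive distances from the lens, is assumed in the short sketch below.

def image_distance(Z, f):
    # Thin-lens relation 1/Z + 1/Z' = 1/f, solved for the in-focus image distance Z'
    return 1.0 / (1.0 / f - 1.0 / Z)

f = 0.05                      # a 50 mm lens, in meters
for Z in (0.5, 1.0, 10.0):    # object distances in meters
    print(Z, "->", round(image_distance(Z, f), 4))
# As Z grows, Z' approaches f: distant objects come into focus near the focal plane.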

Field of View
[Figure: image plane, focal length f, and the resulting field of view.]

Deviations from the lens model
Deviations from this ideal are aberrations. Two types:
1. geometrical: spherical aberration, astigmatism, distortion, coma
2. chromatic
Aberrations are reduced by combining lenses (compound lenses).

Spherical aberration
Rays parallel to the axis do not converge; the outer portions of the lens yield smaller focal lengths.

Astigmatism
An optical system with astigmatism is one where rays that propagate in two perpendicular planes have different foci. If an optical system with astigmatism is used to form an image of a cross, the vertical and horizontal lines will be in sharp focus at two different distances.

Distortion
The magnification/focal length is different for different angles of inclination: pincushion (tele-photo) and barrel (wide-angle). It can be corrected (if the parameters are known)!

Chromatic aberration
Great for prisms, bad for lenses.
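The slides note that distortion can be corrected when its parameters are known but do not give a model; a commonly used polynomial radial-distortion model (an assumption here, not from the slides) scales normalized image coordinates by a radius-dependent factor, with k1 < 0 giving a barrel-like effect and k1 > 0 a pincushion-like one.

import numpy as np

def radial_distort(x, y, k1, k2=0.0):
    # Polynomial radial distortion of normalized image coordinates:
    #   (x, y) -> (1 + k1*r^2 + k2*r^4) * (x, y)
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return s * x, s * y

# Points far from the image center move more than points near it:
print(radial_distort(0.1, 0.0, k1=-0.3))   # barely shifted near the center
print(radial_distort(0.8, 0.0, k1=-0.3))   # pulled noticeably toward the center at the edge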

Chromatic aberration
Rays of different wavelengths are focused in different planes. It cannot be removed completely; sometimes achromatization is achieved for more than 2 wavelengths.

Vignetting: Spatial Non-Uniformity
Only part of the light reaches the sensor, so the periphery of the image is dimmer.
[Figure: camera iris; Litvinov & Schechner, radiometric nonidealities.]

Appearance and surface reflectance
Read Chapter 4 of Ponce & Forsyth: solid angle, irradiance, radiance, BRDF, Lambertian/Phong BRDF.

Radiometry: a local coordinate system on a surface
Consider a point P on the surface. Light arrives at P from a hemisphere of directions defined by the surface normal N. We can define a local coordinate system whose origin is P and with one axis aligned with N. It is convenient to represent directions in spherical angles (θ, φ).
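A tiny sketch of that local frame (my own illustration, not from the slides): given a surface normal N, build two tangent axes and convert spherical angles (θ measured from N, φ around it) into a unit direction vector.

import numpy as np

def local_frame(n):
    # Orthonormal frame (t1, t2, n) with one axis aligned with the normal n
    n = n / np.linalg.norm(n)
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t1 = np.cross(n, a); t1 /= np.linalg.norm(t1)
    t2 = np.cross(n, t1)
    return t1, t2, n

def direction_from_angles(theta, phi, n):
    # Spherical angles in the local frame -> unit direction in world coordinates
    t1, t2, n = local_frame(n)
    return (np.sin(theta) * np.cos(phi) * t1 +
            np.sin(theta) * np.sin(phi) * t2 +
            np.cos(theta) * n)

print(direction_from_angles(0.0, 0.0, np.array([0.0, 0.0, 1.0])))  # along the normal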

Foreshortening
[Figure: a surface patch viewed obliquely fills less of the view than one seen head-on.]

Measuring Angle: Solid Angle
The solid angle subtended by an object from a point P is the area of the projection of the object onto the unit sphere centered at P. It is measured in steradians (sr). The definition is analogous to the projected angle in 2-D: if I am at P and I look out, the solid angle tells me how much of my view is filled with an object. By analogy with angle (in radians), the solid angle subtended by a region at a point is the area projected on a unit sphere centered at that point. The solid angle subtended by a small patch of area dA is
\[ d\omega = \frac{dA \cos\theta}{r^2} \]
where θ is the angle between the patch normal and the direction to P, and r is the distance from P to the patch.

Radiance
Power is energy per unit time. Radiance is the power traveling at some point in a specified direction, per unit area perpendicular to the direction of travel, per unit solid angle.
Symbol: L(x, θ, φ). Units: watts per square meter per steradian, W/(m^2 sr).
For power dΦ leaving a patch of area dA, in a direction making angle α with the patch normal, into solid angle dω,
\[ L = \frac{d\Phi}{(dA \cos\alpha)\, d\omega} \]
Power is emitted from the patch, but the radiance is taken in a direction different from the surface normal, hence the cos α foreshortening factor.
In free space, radiance is constant as it propagates along a ray. This is derived from conservation of flux and is fundamental in light transport.
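As a quick numeric check of these definitions (my own example values, treating the patch as small enough that the differential formulas apply directly):

import numpy as np

def solid_angle(dA, theta, r):
    # Solid angle of a small patch of area dA whose normal makes angle theta
    # with the line of sight, at distance r:  dω ≈ dA cos(theta) / r^2
    return dA * np.cos(theta) / r**2

dA, r = 1e-4, 2.0                          # a 1 cm^2 patch seen from 2 m away
print(solid_angle(dA, 0.0, r))             # ~2.5e-5 sr, seen head-on
print(solid_angle(dA, np.pi / 3, r))       # half of that: foreshortened by cos(60 deg)

# Radiance from emitted power: L = dPhi / (dA * cos(alpha) * dω)
dPhi, alpha, domega = 1e-6, 0.0, 2.5e-5    # watts, radians, steradians
print(dPhi / (dA * np.cos(alpha) * domega))  # W / (m^2 sr)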

Irradiance
How much light is arriving at a surface? Units of irradiance: W/m^2.
A surface experiencing radiance L(x, θ, φ) coming in from solid angle dω experiences irradiance
\[ dE = L(x, \theta, \phi) \cos\theta \, d\omega \]
This is a function of the incoming angle. Crucial property: the total irradiance arriving at the surface is given by adding the irradiance over all incoming angles,
\[ E(x) = \int_{\text{hemisphere}} L(x, \theta, \phi) \cos\theta \, d\omega \]
[Figure: surface point x, normal N, incoming direction at spherical angles (θ, φ).]

Image irradiance
What is the image irradiance E for a radiance L emitted from a scene point?
[Figure: a scene patch δA and its image patch δA'. Let δω be the solid angle subtended by δA (or δA') from the center of the lens, and let Ω be the solid angle subtended by the lens from the scene patch.]
The power δP emitted from the patch δA with radiance L and falling on the lens leads to
\[ E = \frac{\pi}{4} \left( \frac{d}{z'} \right)^{2} \cos^{4}\alpha \; L \]
where E is the image irradiance, L the emitted radiance, d the lens diameter, z' the depth of the image plane, and α the angle of the image patch from the optical axis.
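A small numeric illustration of this image-irradiance relation (my own example values): the cos^4 α factor alone makes the image dimmer away from the optical axis, independent of any additional lens vignetting.

import numpy as np

def image_irradiance(L, d, z_prime, alpha):
    # E = (pi/4) * (d / z')^2 * cos^4(alpha) * L
    return (np.pi / 4.0) * (d / z_prime)**2 * np.cos(alpha)**4 * L

L, d, z_prime = 100.0, 0.01, 0.05   # radiance W/(m^2 sr), lens diameter m, image depth m

for alpha_deg in (0, 20, 40):
    E = image_irradiance(L, d, z_prime, np.radians(alpha_deg))
    print(alpha_deg, "deg ->", round(E, 3), "W/m^2")
# Falls off as cos^4(alpha): about 0.78x at 20 degrees and 0.34x at 40 degrees.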