Understanding Variability


Understanding Variability Why so different?

Light and Optics: pinhole camera model; perspective projection; thin lens model; fundamental equation; distortion: spherical & chromatic aberration, radial distortion; reflection and illumination: color, Lambertian and specular surfaces, Phong, BRDF. Sensing Light: conversion to digital images; sampling theorem; other sensors: frequency, type, ...

Basic Radiometry: Radiometry is the part of image formation concerned with the relation among the amounts of light energy emitted from light sources, reflected from surfaces, and registered by sensors. [Figure: light source illuminating a surface patch P with normal n; radiance L(P, d) passes through the optics to the CCD array]

Lecture Assumptions: Typical imaging scenario: visible light, ideal lenses, a standard sensor (e.g. a TV camera), opaque objects. Goal: to create 'digital' images which can be processed to recover some of the characteristics of the 3D world which was imaged.

Steps: World → Optics → Sensor → Signal → Digitizer → Digital Representation. World: reality. Optics: focus {light} from the world onto the sensor. Sensor: converts {light} to {electrical energy}. Signal: representation of incident light as continuous electrical energy. Digitizer: converts the continuous signal to a discrete signal. Digital Representation: the final representation of reality in computer memory.

Factors in Image Formation: Geometry: concerned with the relationship between points in the three-dimensional world and their images. Radiometry: concerned with the relationship between the amount of light radiating from a surface and the amount incident at its image. Photometry: concerned with ways of measuring the intensity of light. Digitization: concerned with ways of converting continuous signals (in both space and time) to digital approximations.

Image Formation: Light (energy) source → surface → imaging plane (pinhole or lens) → sensor. Each sensor produces a different signal: B&W film → silver density; color film → silver density in three color layers; TV camera → electrical signal.

Geometry: Geometry describes the projection of the three-dimensional (3D) world onto the two-dimensional (2D) image plane. Typical assumptions: Light travels in a straight line. Optical axis: the perpendicular from the image plane through the pinhole (also called the central projection ray). Each point in the image corresponds to a particular direction defined by a ray from that point through the pinhole. Various kinds of projections: perspective, oblique, orthographic, isometric, spherical.

Basic Optics: Two models are commonly used: the pin-hole camera, and an optical system composed of lenses. The pin-hole model is the basis for most graphics and vision; it derives from the physical construction of early cameras, and its mathematics is very straightforward. The thin lens model is the first of the lens models: a mathematical model for a physical lens, which gathers light over an area and focuses it on the image plane.

Pinhole Camera Model: The world is projected to a 2D image; the image is inverted, reduced in size, and dim; there is no direct depth information. f is called the focal length of the lens. This is known as perspective projection. [Figure: image plane, optical axis, pinhole lens at distance f]

Pinhole camera image Amsterdam Photo by Robert Kosara, robert@kosara.net http://www.kosara.net/gallery/pinholeamsterdam/pic01.html

Equivalent Geometry: Consider the case with the object on the optical axis, at distance z from the pinhole, with the image plane a distance f behind it. It is more convenient to work with an upright image: put the projection plane at z = 0 and the center of projection at z = -f. The two setups are equivalent mathematically.

Thin Lens Model: Rays entering parallel to the optic axis on one side converge at the focal point. Rays diverging from the focal point emerge parallel. Thin lens law: 1/f = 1/i + 1/o, where f is the focal length, i the image distance, and o the object distance. [Figure: lens, optic axis, image plane]
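As a quick numerical check, the thin lens law 1/f = 1/i + 1/o can be solved for the image distance i given f and o. This is a minimal sketch; the function name is illustrative, not from the lecture.

```python
def image_distance(f, o):
    """Solve the thin lens law 1/f = 1/i + 1/o for the image distance i.

    f: focal length, o: object distance (same units); assumes o != f,
    i.e. the object is not at the focal point.
    """
    return 1.0 / (1.0 / f - 1.0 / o)

# Example: a 50 mm lens with an object 200 mm away focuses at
# i = 1 / (1/50 - 1/200) = 200/3 ≈ 66.7 mm behind the lens.
```

As the object recedes (o → ∞), the image distance approaches the focal length f, which is why distant scenes are focused at the focal plane.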

Coordinate System: Simplified case: the origins of the world and image coordinate systems coincide; the Y-axis is aligned with the y-axis, the X-axis with the x-axis, and the Z-axis lies along the central projection ray. Image coordinate system: p(x, y). World coordinate system: P(X, Y, Z).

Perspective Projection: Compute the image coordinates of p in terms of the world coordinates of P, with the projection plane at Z = 0 and the center of projection at Z = -f. Look at the projections in the x-z and y-z planes.

X-Z Projection: By similar triangles: x / f = X / (Z + f), so x = f X / (Z + f).

Y-Z Projection: By similar triangles: y / f = Y / (Z + f), so y = f Y / (Z + f).

Perspective Equations: Given point P(X, Y, Z) in the 3D world, the two equations x = f X / (Z + f) and y = f Y / (Z + f) transform world coordinates (X, Y, Z) into image coordinates (x, y).
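The perspective equations x = fX/(Z+f), y = fY/(Z+f) can be sketched directly in code (an illustrative helper, following the slides' convention of the projection plane at Z = 0 and the pinhole at Z = -f):

```python
def project(X, Y, Z, f):
    """Perspective projection of world point (X, Y, Z) to image point (x, y):
    x = f*X / (Z + f),  y = f*Y / (Z + f).

    Assumes the projection plane is at Z = 0 and the center of
    projection at Z = -f, so Z + f != 0.
    """
    return f * X / (Z + f), f * Y / (Z + f)

# A point at (2, 4, 6) seen with f = 2 projects to:
# x = 2*2 / (6+2) = 0.5,  y = 2*4 / (6+2) = 1.0
```

Note how both image coordinates shrink as Z grows: more distant objects project to smaller images, which is the hallmark of perspective projection.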

Reverse Projection: Given a center of projection and the image coordinates of a point, it is not possible to recover the 3D depth of the point from a single image: P(X, Y, Z) can be anywhere along the projection ray, and all points on that ray have the same image coordinates (x, y). In general, at least two images of the same point taken from two different locations are required to recover depth.

Stereo Geometry: Depth is obtained by triangulation from the two central projection rays, which meet at the object point P with some vergence angle. Correspondence problem: p_l and p_r must correspond to the left and right projections of P, respectively.
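Triangulation is easiest to see in the rectified case, a simplified setup not drawn on the slide: two identical cameras with focal length f whose optical axes are parallel and whose centers are separated by a baseline B. Depth then follows from the disparity x_l - x_r. All names below are illustrative.

```python
def depth_from_disparity(f, B, xl, xr):
    """Depth by triangulation for a rectified stereo pair:
    Z = f * B / (xl - xr).

    f: focal length, B: baseline between the camera centers,
    xl, xr: image x-coordinates of the corresponding points
    (the solution to the correspondence problem).
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return f * B / disparity

# Larger disparity means a closer point:
# f = 2, B = 1, disparity 0.5 gives Z = 4; disparity 0.8 gives Z = 2.5.
```

The division by disparity is why stereo depth estimates degrade for distant points: the disparity shrinks toward zero and small correspondence errors produce large depth errors.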

Variability in appearance: consequences of image-formation geometry for computer vision. What set of shapes can an object take on? Rigid vs. non-rigid; planar vs. non-planar. SIFT features; sensitivity to errors.

Radiometry: Brightness is an informal notion used to describe both scene and image brightness. Image brightness is related to the energy flux incident on the image plane: irradiance (photometric term: illuminance). Scene brightness is related to the energy flux emitted (radiated) from a surface: radiance (photometric term: luminance).

Light and Surfaces Reflection mirrors highlights specularities Scattering Lambertian matte diffuse

Light sources Point source Extended source Single wavelength Multi-wavelength Uniform Non-uniform

Linearity of Light Linearity definition For extended sources For multiple wavelengths Across time

Point Source: Φ watts are radiated into 4π steradians. Radiant intensity (of the source): R = dΦ/dω, in watts per unit solid angle (steradian).

Irradiance: Light falling on a surface, from all directions. How much? Irradiance is power per unit area falling on a surface: E = dΦ/dA, in watts/m².

Inverse Square Law: Relationship between point-source radiant intensity and irradiance. A patch dA at distance r subtends solid angle dω = dA / r² at the source, so dΦ = R dω = R dA / r², and E = dΦ/dA = R / r². (R: radiant intensity, in watts/steradian; E: irradiance, in watts/m².)
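The inverse square relationship E = R / r² is a one-liner in code (a minimal sketch; the function name is illustrative):

```python
def irradiance_from_point_source(R, r):
    """Irradiance E = R / r**2 from a point source of radiant
    intensity R (watts/steradian) at distance r; E is in watts/m**2."""
    return R / r ** 2

# Doubling the distance quarters the irradiance:
# irradiance_from_point_source(100.0, 1.0) -> 100.0
# irradiance_from_point_source(100.0, 2.0) -> 25.0
```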

Surface Radiance (extended source): The surface acts as a light source, radiating over a hemisphere. Surface radiance is power per unit foreshortened area emitted into a unit solid angle: L = d²Φ / (dA_f dω), in watts/(m² steradian), where dA_f is the foreshortened area. (R: radiant intensity; E: irradiance; L: surface radiance.)

Pseudo-Radiance: Consider two definitions. Radiance: power per unit foreshortened area emitted into a solid angle. Pseudo-radiance: power per unit area emitted into a solid angle. Why should we work with radiance rather than pseudo-radiance? Only reason: radiance is more closely related to our intuitive notion of brightness.

Lambertian Surfaces: A particular point P on a Lambertian (perfectly matte) surface appears to have the same brightness no matter what angle it is viewed from. Examples: a piece of paper, matte paint. This doesn't depend upon the incident light angle. What does this say about how they emit light?

Lambertian Surfaces Equal Amounts of Light Observed From Each Vantage Point Theta Area of black box = 1 Area of orange box = 1/cos(Theta) Foreshortening rule.

Lambertian Surfaces Relative magnitude of light scattered in each direction. Proportional to cos (Theta).

Lambertian Surfaces Equal Amounts of Light Observed From Each Vantage Point Theta Area of black box = 1 Area of orange box = 1/cos(Theta) Foreshortening rule. Radiance = 1/cos(Theta)*cos(Theta) = 1

The BRDF The bidirectional reflectance distribution function.

Geometry: Goal: relate the radiance of a surface to the irradiance in the image plane of a simple optical system. [Figure: lens of diameter d; dω: solid angle of patch; da_s: area on surface; da_i: area in image; e: emittance angle at the surface; α: off-axis angle at the lens]

Light at the Surface: E = flux incident on the surface (irradiance) = dΦ/dA. i = incident angle, e = emittance angle, g = phase angle, ρ = surface reflectance, N = surface normal. We need to determine dΦ and dA.

Reflections from a Surface I: The foreshortened area in the direction of the light source is da = da_s cos i (foreshortening effect). The patch subtends solid angle dω = da_s cos i / r² at the source, so the flux intercepted by the surface over area da is dΦ = R dω = R da_s cos i / r². Surface irradiance: E = dΦ / da_s = R cos i / r².

Reflections from a Surface II: Now treat the small surface area as an emitter, because it is bouncing light into the world. How much light gets reflected? E is the surface irradiance; L_s is the surface radiance (= luminance). They are related through the surface reflectance function: L_s / E = ρ(i, e, g, λ), which may also be a function of the wavelength λ of the light.

Power Concentrated in Lens: The luminance of the patch, L_s = d²Φ / (da_s dω), is known from the previous step. What is the power of the surface patch as a source in the direction of the lens? d²Φ = L_s da_s dω. [Figure: lens of diameter d at distance z from the patch da_s; image patch da_i at distance -f]

Through a Lens Darkly: In general, L_s is a function of the angles i and e. The lens can be quite large, so we must integrate over the lens solid angle to get dΦ: dΦ = da_s ∫ L_s dω.

Simplifying Assumption: The lens diameter is small relative to the distance from the patch, so L_s is a constant and can be removed from the integral: dΦ = da_s ∫ L_s dω = da_s L_s ∫ dω. The surface area of the patch in the direction of the lens is da_s cos e. The solid angle subtended by the lens in the direction of the patch is the area of the lens as seen from the patch divided by the squared distance from lens to patch: π (d/2)² cos α / (z / cos α)².

Putting it Together: dΦ = da_s cos e L_s π (d/2)² cos α / (z / cos α)². Power concentrated in the lens: dΦ = L_s da_s cos e (π / 4) (d² / z²) cos³ α. Assuming a lossless lens, this is also the power radiated by the lens as a source.

Through a Lens Darkly (continued): Image irradiance at da_i: E_i = dΦ / da_i. Substituting dΦ from the previous step: E_i = L_s (da_s / da_i) (π / 4) (d / z)² cos e cos³ α. The ratio of areas da_s / da_i remains to be found.

Patch Ratio: The solid angle subtended by the surface patch at the lens equals the solid angle subtended by its image: da_s cos e / (z / cos α)² = da_i cos α / (-f / cos α)², which gives da_s / da_i = (cos α / cos e) (z / -f)².

The Fundamental Result: Source radiance to image sensor irradiance. With da_s / da_i = (cos α / cos e) (z / -f)²: E_i = L_s (da_s / da_i) (π / 4) (d / z)² cos e cos³ α = L_s (cos α / cos e) (z / -f)² (π / 4) (d / z)² cos e cos³ α = L_s (π / 4) (d / f)² cos⁴ α.

Radiometry Final Result: E_i = L_s (π / 4) (d / f)² cos⁴ α. Image irradiance is a function of: scene radiance L_s; focal length of lens f; diameter of lens d (the ratio f/d is the f-number of the lens); off-axis angle α.
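The final relation E_i = L_s (π/4)(d/f)² cos⁴ α is easy to evaluate numerically. A minimal sketch (angle in radians; the function name is illustrative):

```python
import math

def image_irradiance(L_s, f, d, alpha):
    """Image irradiance E_i = L_s * (pi/4) * (d/f)**2 * cos(alpha)**4.

    L_s: scene radiance, f: focal length, d: lens diameter,
    alpha: off-axis angle in radians.
    """
    return L_s * (math.pi / 4.0) * (d / f) ** 2 * math.cos(alpha) ** 4

# On-axis (alpha = 0) the cos**4 term is 1; off-axis the image darkens,
# which is the cos**4 light falloff shown on the next slide.
```

Note that z has cancelled out: image irradiance does not depend on the distance to the surface, only on its radiance, the f-number, and the off-axis angle.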

cos⁴ α Light Falloff: [Figure: top view of image-plane irradiance, shaded by height, falling off away from the lens center in x and y]

Limitation of Radiometry Model: Surface reflection can be a function of viewing and/or illumination angle, measured relative to an arbitrary reference axis: ρ(i, e, g, λ_i, λ_e) = dL(e, λ_e) / dE(i, λ_i). It may also be a function of the wavelength of the light source. We also assumed a point source (the sky, for example, is not one).

Lambertian Surfaces: The BRDF of a Lambertian surface is a constant: ρ(i, e, g, λ) = k. The radiated power per unit area falls off as cos e due to the foreshortening effect, but the radiance does not. k is the 'albedo' of the surface. This is a good model for diffuse surfaces. Other models combine diffuse and specular components (Phong, Torrance-Sparrow, Oren-Nayar).
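A constant BRDF gives the familiar n·l shading rule. The sketch below is illustrative (the vector arguments and the clamping of back-facing light to zero are my additions, not from the slides):

```python
def lambertian_radiance(albedo, source_intensity, n, l):
    """Radiance of a Lambertian patch: proportional to albedo * cos(i),
    where cos(i) = n . l for unit surface normal n and unit light
    direction l. The viewing direction does not appear: the patch
    looks equally bright from every vantage point.
    """
    cos_i = sum(a * b for a, b in zip(n, l))
    return albedo * source_intensity * max(0.0, cos_i)  # no light from behind

# Head-on illumination (n parallel to l) gives the maximum brightness;
# grazing illumination (n perpendicular to l) gives zero.
```

This is exactly the statement of the last slide below: for a Lambertian surface the lighting direction matters (through cos i) but the viewer direction does not.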

BRDF: Ron Dror's thesis

Reflection parameters

Real World Light Variation

Fake Light and Real Light

Simple and Complex Light

BRDF again

Likelihood of material

Classifying Under Uncertainty

More Fake Light

Illumination Maps

Classification

Classification Continued

Lighting direction vs. Viewer direction For Lambertian surface: Viewer direction is irrelevant Lighting direction is very relevant