Lighting and Photometric Stereo. Computer Vision I. CSE252A Lecture 7

Lighting and Photometric Stereo
CSE252A Lecture 7
Announcement: Photometric Stereo HW will be on the web later today.

Last lecture in a nutshell: radiometry of thin lenses

A scene patch δA at depth z is imaged onto a patch δA' at distance z' behind the lens; both lie at angle α off the optical axis, and the scene patch's normal makes angle β with the viewing ray. Equating the solid angles that the two patches subtend at the center of the lens:

    δω = δA' cos α / (z'/cos α)² = δA cos β / (z/cos α)²
    ⇒  δA/δA' = (cos α / cos β) (z² / z'²)

The solid angle subtended by the lens (diameter d) as seen from the scene patch:

    Ω = (π d²/4) cos α / (z/cos α)² = (π d²/4) cos³ α / z²

The power from the patch that passes through the lens:

    δP = L Ω δA cos β = L δA (π d²/4) cos³ α cos β / z²

The image irradiance is therefore

    E = δP / δA' = L (π d²/4) (δA/δA') cos³ α cos β / z² = (π/4) (d/z')² cos⁴ α L

E: image irradiance; L: emitted (scene) radiance; d: lens diameter; z: depth of the scene patch; z': distance from the lens to the image plane; α: angle of the image patch from the optical axis.

BRDF

With the assumptions in the previous slide, the Bi-directional Reflectance Distribution Function ρ(θ_in, φ_in; θ_out, φ_out) is the ratio of emitted radiance to incident irradiance. It is a function of the incoming light direction (θ_in, φ_in) and the outgoing light direction (θ_out, φ_out):

    ρ(x; θ_in, φ_in; θ_out, φ_out) = L_o(x; θ_out, φ_out) / [ L_i(x; θ_in, φ_in) cos θ_in dω_in ]
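The final relation is compact enough to compute directly. Below is a minimal Python sketch of E = (π/4)(d/z')² cos⁴α L; the function and variable names (image_irradiance, lens_diameter, image_distance, off_axis_angle) are illustrative, not from the slides.

import numpy as np

def image_irradiance(L, lens_diameter, image_distance, off_axis_angle):
    """Image-plane irradiance E for scene radiance L, per the thin-lens relation."""
    return (np.pi / 4.0) * (lens_diameter / image_distance) ** 2 \
           * np.cos(off_axis_angle) ** 4 * L

# The cos^4 falloff: a patch imaged 30 degrees off-axis receives about 56%
# of the irradiance of an on-axis patch with the same radiance.
E_on = image_irradiance(1.0, 0.01, 0.05, 0.0)
E_off = image_irradiance(1.0, 0.01, 0.05, np.radians(30.0))
print(E_off / E_on)   # ~0.5625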

Light sources, shading, shadows

Lighting:
- Nearby sources: point sources, line sources, area sources (in general, a function on a 4-D space)
- Distant point sources: a function on the sphere; spherical harmonics
- Shadows

Standard nearby point source model

    ρ_d(x) [ N(x) · S(x) ] / r(x)²

N is the surface normal, ρ_d is the diffuse (Lambertian) albedo, and S is the source vector: a vector from x toward the source whose length is the intensity term. The model works because a dot product is basically a cosine (a small numeric sketch appears in code below).

Line sources and area sources

Examples of area sources: diffuser boxes, white walls. The radiosity at a point due to an area source is obtained by adding up the contributions over the section of the view hemisphere subtended by the source: change variables and integrate over the source (see Forsyth & Ponce or a graphics text for details). The radiosity due to a line source varies with inverse distance, if the source is long enough.

Standard distant point source model

Assume that all points in the scene are close to each other relative to the distance to the source. Then neither the source direction nor the distance varies much over the scene, and we can roll the constants together to get:

    ρ_d(x) ( N(x) · S )

Illumination in terms of spherical harmonics

Express the lighting L as a sum of spherical harmonics Y_{l,m}(θ, φ), grouped by band l:

    L(θ, φ) = Σ_{l=0..∞} Σ_{m=-l..l} L_{l,m} Y_{l,m}(θ, φ)

The low-order bands are: l = 0 (constant); l = 1 (proportional to x, y, z); l = 2 (proportional to xy, yz, 3z² - 1, zx, x² - y²), with m = -l, ..., l within each band.
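Returning to the nearby point source model above, here is a minimal Python sketch of shading a single Lambertian point, I = ρ_d · max(0, N·ŝ) · s₀ / r²; the function name and arguments are illustrative.

import numpy as np

def nearby_point_source_shading(normal, albedo, surface_point,
                                source_position, source_intensity):
    """Lambertian shading of one surface point under a nearby point source."""
    to_source = source_position - surface_point
    r = np.linalg.norm(to_source)                       # distance to the source
    s_hat = to_source / r                               # unit direction to the source
    cos_term = max(0.0, float(np.dot(normal, s_hat)))   # clamp: attached shadow
    return albedo * source_intensity * cos_term / r**2

# A horizontal patch lit from 2 units directly above:
I = nearby_point_source_shading(np.array([0.0, 0.0, 1.0]), 0.8,
                                np.array([0.0, 0.0, 0.0]),
                                np.array([0.0, 0.0, 2.0]), 10.0)
print(I)   # 0.8 * 10 / 4 = 2.0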

Shadows cast by a point source

A point that can't see the source is in shadow. For point sources, the geometry is simple: there are cast shadows and attached shadows.

Area source shadows

1. Fully illuminated
2. Penumbra
3. Umbra (full shadow)

[Figure: at the top, the geometry of a gutter with triangular cross-section; below, predicted radiosity solutions, scaled to lie on top of each other, for different albedos of the geometry. When the albedo is close to zero, shading follows a local model; when it is close to one, there are substantial reflexes. Also shown: the irradiance observed in an image of this geometry for a real white gutter. Figure from "Mutual Illumination", D.A. Forsyth and A.P. Zisserman, Proc. CVPR, 1989, copyright 1989 IEEE.]

Shading reveals 3-D surface geometry

Photometric Stereo

An example of photometric stereo

Shape-from-shading: use just one image to recover shape. It requires knowledge of the light source direction and the BRDF everywhere, which is too restrictive to be useful.

Photometric stereo: a single viewpoint, multiple images under different lighting.
1. Arbitrary known BRDF, known lighting
2. Lambertian BRDF, known lighting
3. Lambertian BRDF, unknown lighting

[Figure: input images]

Photometric Stereo: General BRDF and Reflectance Map

Coordinate system

Surface: s(x,y) = (x, y, f(x,y))

Tangent vectors:

    ∂s/∂x = (1, 0, ∂f/∂x),   ∂s/∂y = (0, 1, ∂f/∂y)

Normal vector:

    n = ∂s/∂x × ∂s/∂y = (-∂f/∂x, -∂f/∂y, 1)

Gradient space (p, q):

    p = ∂f/∂x,   q = ∂f/∂y

so that n = (-p, -q, 1) and the unit normal is

    n̂ = (-p, -q, 1)ᵀ / √(p² + q² + 1)
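As a small Python sketch of this parameterization (names are illustrative), the unit normal field can be formed directly from per-pixel gradients p and q:

import numpy as np

def normals_from_gradient(p, q):
    """p, q: HxW surface gradients df/dx, df/dy. Returns HxWx3 unit normals
    with the convention n ~ (-p, -q, 1)."""
    n = np.stack([-p, -q, np.ones_like(p)], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

# A tilted plane f(x, y) = 0.5 x has p = 0.5 and q = 0 everywhere:
p = np.full((4, 4), 0.5)
q = np.zeros((4, 4))
print(normals_from_gradient(p, q)[0, 0])   # ~[-0.447, 0., 0.894]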

Image Formation

For a given point A on the surface, the image irradiance E(x,y) is a function of:
1. The BRDF at A
2. The surface normal at A
3. The direction of the light source

Reflectance Map

Let the BRDF be the same at all points on the surface, and let the light direction s be constant.
1. Then the image irradiance is a function of only the direction of the surface normal.
2. In gradient space, this gives a reflectance map E(p,q).

Example reflectance map: Lambertian surface

E(p,q) for lighting from the front; the light source direction can itself be expressed as a point in gradient space.

Reflectance map of a Lambertian surface: two light sources

Two light sources give two reflectance maps and two measured irradiances, E_measured,1 and E_measured,2. What does the intensity (irradiance) of one pixel in one image tell us? It constrains the normal to a curve in gradient space: the normal lies on the iso-brightness curve of that reflectance map. A third image would disambiguate the match.
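Below is a minimal Python sketch of a Lambertian reflectance map, assuming the source direction is written in gradient space as (p_s, q_s), i.e. source direction proportional to (-p_s, -q_s, 1), matching the normal convention above; the names are illustrative. A single measured irradiance then picks out one level set of R, which is the curve the normal is constrained to.

import numpy as np

def lambertian_reflectance_map(p, q, p_s, q_s):
    """R(p,q) = cosine of the angle between normal and source, clamped at 0."""
    num = 1.0 + p * p_s + q * q_s
    den = np.sqrt(1.0 + p**2 + q**2) * np.sqrt(1.0 + p_s**2 + q_s**2)
    return np.maximum(0.0, num / den)

# Evaluate R over a grid in gradient space; R = 1 exactly at (p, q) = (p_s, q_s),
# where the normal points straight at the source.
p, q = np.meshgrid(np.linspace(-2, 2, 5), np.linspace(-2, 2, 5))
print(lambertian_reflectance_map(p, q, 0.5, 0.0).round(2))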

Reflectance map of a Lambertian + specular surface

Photometric stereo: Step 1

1. Acquire three images with known light source directions.
2. Using the known source direction and BRDF, construct a reflectance map for each direction.
3. For each pixel location (x,y), find (p,q) as the intersection of the three curves.
4. This is the surface normal at pixel (x,y); over the whole image, this is the normal field.

Normal field

[Figure: plastic baby doll, normal field]

Next step: go from the normal field to the surface.

Recovering the surface f(x,y)

Many methods exist; the simplest approach (sketched in code below):
1. From the estimated normal n = (n_x, n_y, n_z), compute p = -n_x/n_z and q = -n_y/n_z.
2. Integrate p = ∂f/∂x along the first row (x, 0) to get f(x, 0).
3. Then integrate q = ∂f/∂y along each column, starting from the value f(x, 0) in the first row.
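A minimal Python sketch of this simplest integration scheme (the helper name and the choice of x along columns, y along rows are assumptions of the sketch):

import numpy as np

def integrate_normals(normals):
    """normals: HxWx3 unit normal field with the convention n ~ (-p, -q, 1).
    Returns an HxW height map f, up to an additive constant."""
    nx, ny, nz = normals[..., 0], normals[..., 1], normals[..., 2]
    p = -nx / nz                          # p = df/dx  (x runs along columns)
    q = -ny / nz                          # q = df/dy  (y runs along rows)
    f = np.zeros(p.shape)
    f[0, :] = np.cumsum(p[0, :])                       # integrate p along the first row
    f[1:, :] = f[0, :] + np.cumsum(q[1:, :], axis=0)   # then q down each column
    return f

# The normals of the plane f = 0.5 x integrate back to a ramp along x:
n = np.tile(np.array([-0.5, 0.0, 1.0]) / np.sqrt(1.25), (4, 4, 1))
print(integrate_normals(n).round(2))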

What might go wrong? Integrability

If f(x,y) is the height function, we expect that

    ∂²f/∂x∂y = ∂²f/∂y∂x

The height z(x,y) is obtained by integration along a curve from (x₀, y₀):

    z(x,y) = z(x₀,y₀) + ∫ from (x₀,y₀) to (x,y) of (p dx + q dy)

If one integrates the derivative field along any closed curve, one expects to get back to the starting value. This might not happen because of noisy estimates of (p,q). In terms of the estimated gradient field, integrability means

    ∂p/∂y = ∂q/∂x

But since p and q were estimated independently at each point, as the intersection of curves on three reflectance maps, this equality will not hold exactly.

Horn's method [Robot Vision, B.K.P. Horn, 1986]

Formulate the estimation of the surface height z(x,y) from the gradient field as minimizing the cost functional

    ∫∫_Image (z_x - p)² + (z_y - q)² dx dy

where (p,q) are the estimated components of the gradient and z_x, z_y are the partial derivatives of the best-fit surface. It is solved using calculus of variations and iterative updating; z(x,y) can be discrete or represented in terms of basis functions, and integrability is naturally satisfied (a rough discrete sketch appears in code below).

We stopped here, but the rest of the slides are included for those looking to get a head start on the next assignment.

[Figure: Lambertian surface with unit normal n̂ and source direction ŝ]

II. Photometric Stereo: Lambertian Surface, Known Lighting

At image location (u,v), the intensity of a pixel x(u,v) is:

    e(u,v) = [a(u,v) n̂(u,v)] · [s₀ ŝ] = b(u,v) · s

where a(u,v) is the albedo of the surface projecting to (u,v), n̂(u,v) is the direction of the surface normal, s₀ is the light source intensity, and ŝ is the direction to the light source.
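Returning to Horn's method above: minimizing the functional leads, via the Euler-Lagrange equation, to the Poisson equation ∇²z = p_x + q_y. The Python sketch below relaxes that equation with plain Jacobi iterations; it is only a rough illustration of the iterative-updating idea, not Horn's exact scheme, and it treats the boundary periodically (np.roll) purely to stay short.

import numpy as np

def iterative_height_from_gradient(p, q, iters=2000):
    """p, q: HxW estimated gradients. Returns a height z approximately
    minimizing sum (z_x - p)^2 + (z_y - q)^2, up to an additive constant."""
    # Divergence p_x + q_y via central differences (x along columns, y along rows).
    div = (np.roll(p, -1, axis=1) - np.roll(p, 1, axis=1)) / 2.0 \
        + (np.roll(q, -1, axis=0) - np.roll(q, 1, axis=0)) / 2.0
    z = np.zeros_like(p)
    for _ in range(iters):
        neighbors = (np.roll(z, 1, 0) + np.roll(z, -1, 0)
                     + np.roll(z, 1, 1) + np.roll(z, -1, 1))
        z = (neighbors - div) / 4.0        # Jacobi update for the Poisson equation
    return z - z.mean()                    # fix the free additive constant

# Example: the (periodic) gradients of f = sin(x) + cos(y) on a 32x32 grid
# integrate back to approximately that surface, minus its mean.
y, x = np.mgrid[0:32, 0:32] * (2.0 * np.pi / 32)
step = 2.0 * np.pi / 32
z = iterative_height_from_gradient(np.cos(x) * step, -np.sin(y) * step)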

Lambertian Photometric Stereo

If the light sources s₁, s₂, and s₃ are known, then we can recover b from as few as three images (photometric stereo: Silver '80, Woodham '81):

    [e₁ e₂ e₃] = bᵀ [s₁ s₂ s₃]

i.e., we measure e₁, e₂, and e₃ and we know s₁, s₂, and s₃. We can then solve for b by solving a linear system:

    bᵀ = [e₁ e₂ e₃] [s₁ s₂ s₃]⁻¹

The normal is n̂ = b/|b| and the albedo is |b|.

What if we have more than 3 images? Linear least squares

    [e₁ e₂ e₃] = bᵀ [s₁ s₂ s₃]

Rewrite this, for n images, as e = S b, where e is n×1, b is 3×1, and S is n×3. Let the residual be r = e - Sb. Squaring it:

    ‖r‖² = rᵀr = (e - Sb)ᵀ(e - Sb) = eᵀe - 2bᵀSᵀe + bᵀSᵀSb

A zero derivative is a necessary condition for a minimum:

    ∂‖r‖²/∂b = 0  ⇒  -2Sᵀe + 2SᵀSb = 0

Solving for b gives

    b = (SᵀS)⁻¹ Sᵀe

A per-pixel version of this solve is sketched in code below.

[Figures: input images, recovered albedo, recovered normal field, surface recovered by integration]
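A minimal Python sketch of the least-squares solve above, applied to every pixel at once; the function name and argument layout are illustrative, and np.linalg.lstsq stands in for forming (SᵀS)⁻¹Sᵀe explicitly.

import numpy as np

def lambertian_photometric_stereo(images, S):
    """images: n_images x H x W stack, S: n_images x 3 matrix whose rows are
    the source directions scaled by intensity. Returns (albedo HxW, normals HxWx3)."""
    n_img, H, W = images.shape
    e = images.reshape(n_img, -1)                    # one column of intensities per pixel
    b, *_ = np.linalg.lstsq(S, e, rcond=None)        # 3 x (H*W): least-squares b per pixel
    albedo = np.linalg.norm(b, axis=0)               # |b| is the albedo
    normals = (b / np.maximum(albedo, 1e-8)).T       # b/|b| is the unit normal
    return albedo.reshape(H, W), normals.reshape(H, W, 3)

# The recovered normal field can then be integrated to a surface with either
# of the integration sketches earlier in these notes.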

Lambertian photometric stereo: results

[Figures: reconstruction with the albedo map; reconstruction without the albedo map; another person, without the albedo map]

III. Photometric Stereo with Unknown Lighting and Lambertian Surfaces

Covered in the illumination cone slides.