Image-based Lighting (Part 2), 10/19/17. Computational Photography, Derek Hoiem, University of Illinois. Many slides from Debevec, some from Efros and Kevin Karsch
Today Brief review of last class Show how to get an HDR image from several LDR images, and how to display HDR Show how to insert fake objects into real scenes using environment maps
How to render an object inserted into an image? Image-based lighting Capture incoming light with a light probe Model local scene Ray trace, but replace distant scene with info from light probe Debevec SIGGRAPH 1998
Key ideas for Image-based Lighting Environment maps: tell what light is entering at each angle within some shell +
Spherical Map Example
Key ideas for Image-based Lighting Light probes: a way of capturing environment maps in real scenes
Mirrored Sphere
1) Compute normal of sphere from pixel position 2) Compute reflected ray direction from sphere normal 3) Convert to spherical coordinates (theta, phi) 4) Create equirectangular image
Mirror ball -> equirectangular
Mirror ball -> equirectangular: [Figure: mirror ball image -> normals -> reflection vectors -> (phi, theta) of reflection vectors -> equirectangular image in the (phi, theta) domain]
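The four steps above can be sketched in Python. This is a minimal illustration, not production probe code: it assumes an orthographic camera on the -z axis, a y-up equirectangular convention, and nearest-neighbor sampling; the function name and conventions are our own.

```python
import numpy as np

def mirrorball_to_equirect(ball, out_h=256, out_w=512):
    """Resample a square mirror-ball image into an equirectangular map
    by inverse mapping: for each output direction, find the ball pixel
    whose reflection points that way."""
    h, w = ball.shape[:2]
    # Direction (unit vector) for every equirectangular pixel.
    theta = (np.arange(out_h) + 0.5) / out_h * np.pi            # polar, 0..pi
    phi = (np.arange(out_w) + 0.5) / out_w * 2 * np.pi - np.pi  # azimuth
    phi, theta = np.meshgrid(phi, theta)
    rx = np.sin(theta) * np.sin(phi)
    ry = np.cos(theta)                    # +y is up
    rz = -np.sin(theta) * np.cos(phi)
    # For view ray d = (0, 0, -1), the reflection r = d - 2 (d . n) n
    # implies the sphere normal n is proportional to r - d = r + (0, 0, 1).
    nx, ny, nz = rx, ry, rz + 1.0
    norm = np.sqrt(nx**2 + ny**2 + nz**2) + 1e-9
    nx, ny = nx / norm, ny / norm
    # Orthographic projection: (nx, ny) are the ball-image coordinates.
    u = ((nx + 1) / 2 * (w - 1)).round().astype(int)
    v = ((1 - ny) / 2 * (h - 1)).round().astype(int)
    return ball[v.clip(0, h - 1), u.clip(0, w - 1)]
```

A real implementation would also interpolate rather than nearest-neighbor sample, and mask out the degenerate direction pointing directly away from the camera (the rim of the ball).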
One small snag: how do we deal with light sources (sun, lamps, etc.)? They are much, much brighter than the rest of the environment: relative brightness values measured in a single scene include 1, 18, 46, 1907, and 15116. Use High Dynamic Range photography!
Key ideas for Image-based Lighting Capturing HDR images: needed so that light probes capture full range of radiance
High dynamic range: LDR -> HDR by merging exposures. Each exposure (Exposure 1 ... Exposure n) captures a 0-to-255 slice of the real world's radiance range, which spans roughly 10^-6 to 10^6.
Ways to vary exposure Shutter Speed (*) F/stop (aperture, iris) Neutral Density (ND) Filters
Recovering High Dynamic Range Radiance Maps from Photographs Paul Debevec Jitendra Malik Computer Science Division University of California at Berkeley August 1997
The Approach Get pixel values Z_ij for image with shutter time Δt_j (i-th pixel location, j-th image) Exposure is irradiance integrated over time: E_ij = R_i Δt_j Pixel values are a non-linear mapping of the E_ij's: Z_ij = f(E_ij) = f(R_i Δt_j) Rewrite to form a (not so obvious) linear system: ln f^-1(Z_ij) = ln(R_i) + ln(Δt_j) Define g = ln f^-1, so g(Z_ij) = ln(R_i) + ln(Δt_j)
The objective Solve for irradiances R_i and the mapping g at each of the 256 pixel values to minimize: O = sum_{i=1..N} sum_{j=1..P} { w(Z_ij) [ g(Z_ij) - ln R_i - ln Δt_j ] }^2 + λ sum_{z=Zmin+1}^{Zmax-1} [ w(z) g''(z) ]^2 where Δt_j is the known shutter time for image j; g is exposure as a function of pixel value, and the smoothness term makes it increase smoothly with pixel intensity; the irradiance R_i at a particular pixel site is the same in every image; and w(z) gives pixels near 0 or 255 less weight.
Matlab Code

function [g,lE] = gsolve(Z,B,l,w)
% Z(i,j): pixel value of location i in image j
% B(j):   log shutter time for image j
% l:      lambda, the smoothness weight
% w(z):   weighting function value for pixel value z
n = 256;
A = zeros(size(Z,1)*size(Z,2)+n+1, n+size(Z,1));
b = zeros(size(A,1),1);
k = 1;
%% Include the data-fitting equations
for i=1:size(Z,1)
  for j=1:size(Z,2)
    wij = w(Z(i,j)+1);
    A(k,Z(i,j)+1) = wij;
    A(k,n+i) = -wij;
    b(k,1) = wij * B(j);
    k = k+1;
  end
end
A(k,129) = 1;  %% Fix the curve by setting its middle value to 0
k = k+1;
%% Include the smoothness equations
for i=1:n-2
  A(k,i)   = l*w(i+1);
  A(k,i+1) = -2*l*w(i+1);
  A(k,i+2) = l*w(i+1);
  k = k+1;
end
x = A\b;  %% Solve the system using pseudoinverse
g = x(1:n);
lE = x(n+1:size(x,1));
Illustration: [Figure: image series of the same scene at Δt = 1/64, 1/16, 1/4, 1, and 4 sec, with three marked pixels traced on a plot of pixel value Z = f(exposure); exposure = radiance x Δt, so log exposure = log radiance + log Δt]
Results: Digital Camera (Kodak DCS460, exposures 1/30 to 30 sec). [Plot: recovered response curve, pixel value vs. log exposure]
Reconstructed radiance map
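Once g is recovered, the radiance map follows from a weighted average over the exposures: ln E_i = sum_j w(Z_ij) (g(Z_ij) - ln Δt_j) / sum_j w(Z_ij). A minimal Python sketch of this merge step (function names are illustrative, and the hat weighting follows the paper's "less weight near 0 or 255" rule):

```python
import numpy as np

def hat_weight(z, zmin=0, zmax=255):
    """Hat-shaped weighting: trust mid-range pixel values most."""
    z = np.asarray(z, dtype=np.float64)
    mid = (zmin + zmax) / 2
    return np.where(z <= mid, z - zmin, zmax - z)

def radiance_map(images, log_dt, g):
    """Merge LDR exposures into a log-radiance map.
    images: list of uint8 arrays (same shape)
    log_dt: ln(shutter time) for each image
    g:      length-256 array, the recovered response g = ln f^-1"""
    num = np.zeros(images[0].shape, dtype=np.float64)
    den = np.zeros_like(num)
    for Z, b in zip(images, log_dt):
        w = hat_weight(Z)
        num += w * (g[Z] - b)   # each exposure votes for ln(radiance)
        den += w
    return num / np.maximum(den, 1e-9)  # weighted average of ln E
```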
Results: Color Film Kodak Gold ASA 100, PhotoCD
Recovered Response Curves [Plots: Red, Green, and Blue channels, and combined RGB]
How to display HDR? Linearly scaled to display device
Global Operator (Reinhard et al.): L_display = L_world / (1 + L_world)
Global Operator Results
Reinhard Operator Darkest 0.1% scaled to display device
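A minimal Python sketch of the Reinhard global operator. The key value a = 0.18 and the log-average luminance normalization follow the paper's common formulation; the function name is illustrative.

```python
import numpy as np

def reinhard_global(lum, a=0.18, eps=1e-6):
    """Reinhard et al. global tone-mapping operator.
    lum: HDR luminance map (linear, positive values)."""
    # Scale by the key value 'a' relative to the log-average luminance.
    log_avg = np.exp(np.mean(np.log(lum + eps)))
    L = a * lum / log_avg
    # Compress: L / (1 + L) maps [0, inf) smoothly into [0, 1).
    return L / (1.0 + L)
```

Bright regions are compressed hard while mid-tones are nearly linear, which is why a single global curve already displays much of an HDR map's range.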
Local operator http://people.csail.mit.edu/fredo/publi/siggraph2002/durandbilateral.pdf
Acquiring the Light Probe
Assembling the Light Probe
Real-World HDR Lighting Environments Funston Beach Eucalyptus Grove Uffizi Gallery Grace Cathedral Lighting Environments from the Light Probe Image Gallery: http://www.debevec.org/probes/
Illumination Results
Comparison: Radiance map versus single image HDR LDR
CG Objects Illuminated by a Traditional CG Light Source
Illuminating Objects using Measurements of Real Light Object lit by environment assigned a glow material property in Greg Ward's RADIANCE system. http://radsite.lbl.gov/radiance/
Paul Debevec. A Tutorial on Image-Based Lighting. IEEE Computer Graphics and Applications, Jan/Feb 2002.
Rendering with Natural Light SIGGRAPH 98 Electronic Theater
Movie http://www.youtube.com/watch?v=ehbgkexh9lu
Illuminating a Small Scene
We can now illuminate synthetic objects with real light. - Environment map - Light probe - HDR - Ray tracing How do we add synthetic objects to a real scene?
Real Scene Example Goal: place synthetic objects on table
Modeling the Scene light-based model real scene
Light Probe / Calibration Grid
Modeling the Scene light-based model synthetic objects local scene real scene
Rendering into the Scene Background Image
Differential Rendering Local scene w/o objects, illuminated by model
Rendering into the Scene Objects and Local Scene matched to Scene
Differential Rendering Difference in local scene: (render with objects) - (render without objects)
Differential Rendering Final Result
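The differential-rendering composite can be written as final = background + (render with objects - render without objects) on local-scene pixels, with object pixels taken directly from the object render. A Python sketch, assuming images as float arrays in [0, 1] (names are illustrative):

```python
import numpy as np

def differential_render(background, with_objs, without_objs, obj_mask):
    """Differential-rendering composite.
    background:   the original photograph
    with_objs:    render of local scene + synthetic objects
    without_objs: render of local scene alone, same lighting
    obj_mask:     True where a synthetic object covers the pixel."""
    # Local-scene pixels: add only the *change* the objects cause
    # (shadows, interreflections), so local-scene modeling errors cancel.
    out = background + (with_objs - without_objs)
    # Object pixels: take the rendered objects directly.
    out[obj_mask] = with_objs[obj_mask]
    return np.clip(out, 0.0, 1.0)
```

Where the two renders agree, the photograph passes through untouched; only the shadows and reflections the objects introduce are added.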
IMAGE-BASED LIGHTING IN FIAT LUX Paul Debevec, Tim Hawkins, Westley Sarokin, H. P. Duiker, Christine Cheng, Tal Garfinkel, Jenny Huang SIGGRAPH 99 Electronic Theater
Fiat Lux http://ict.debevec.org/~debevec/fiatlux/movie/ http://ict.debevec.org/~debevec/fiatlux/technology/
HDR Image Series 2 sec 1/4 sec 1/30 sec 1/250 sec 1/2000 sec 1/8000 sec
Assembled Panorama
Light Probe Images
Capturing a Spatially-Varying Lighting Environment
What if we don't have a light probe? Zoom in on eye Environment map from eye Insert Relit Face http://www1.cs.columbia.edu/cave/projects/world_eye/ -- Nishino and Nayar 2004
Environment Map from an Eye
Can Tell What You are Looking At Eye Image: Computed Retinal Image:
Video
Summary Real scenes have complex geometries and materials that are difficult to model We can use an environment map, captured with a light probe, as a replacement for distant lighting We can get an HDR image by combining bracketed shots We can relight synthetic objects at that position using the environment map