Lecture 11: Monte Carlo Integration Computer Graphics and Imaging UC Berkeley CS184/284A, Spring 2016
Reminder: Quadrature-Based Numerical Integration
Estimate \int_a^b f(x) dx from values of f at sample points x_0 = a, x_1, x_2, x_3, x_4 = b.
E.g. trapezoidal rule: estimate the integral assuming the function is piecewise linear.
Multi-Dimensional Integrals (Rendering Examples)
2D Integral: Recall Antialiasing By Area Sampling Checkerboard sequence by Tom Duff Point Sampling Area Sampling Integrate over 2D area of pixel
3D Integral: Motion Blur Integrate over area of pixel and over exposure time. Cook et al. 1984
5D Integral: Real Camera Pixel Exposure Credit: lycheng99, http://flic.kr/p/x4dezh Integrate over 2D lens pupil, 2D pixel, and over exposure time
5D Integral: Real Camera Pixel Exposure
Lens aperture (area A_{lens}), sensor plane with pixel area A_{pixel}, lens-to-sensor distance d, light point p' on the lens, sensor point p.
Q_{pixel} = \frac{1}{d^2} \int_{t_0}^{t_1} \int_{A_{lens}} \int_{A_{pixel}} L(p' \to p, t) \cos^4\theta \, dp \, dp' \, dt
High-Dimensional Integration
Complete set of samples in d dimensions (curse of dimensionality): N = \underbrace{n \times n \times \cdots \times n}_{d} = n^d
Quadrature-based numerical integration error: Error \propto \frac{1}{n} = \frac{1}{N^{1/d}}
Random sampling error: Error = Variance^{1/2} \propto \frac{1}{\sqrt{N}}
In high dimensions, Monte Carlo integration requires fewer samples than quadrature-based numerical integration. In global illumination, we will encounter infinite-dimensional integrals.
Monte Carlo Integration
Monte Carlo Numerical Integration
Idea: estimate an integral based on evaluating the function at random sample points.
Advantages:
- General and relatively simple method
- Requires only function evaluation at any point
- Works for very general functions, including functions with discontinuities
- Efficient for high-dimensional integrals: avoids the curse of dimensionality
Disadvantages:
- Noise: the integral estimate is random, and only correct on average
- Can be slow to converge: needs a lot of samples
Example: Monte Carlo Lighting. Left: sample the center of the light. Right: sample a random point on the light. (1 sample per pixel.)
Review: Random Variables
X: a random variable, representing a distribution of potential values.
X \sim p(x): the probability density function (PDF) p(x) describes the relative probability of a random process choosing value x.
Uniform PDF: all values over a domain are equally likely. E.g., for an unbiased die, X takes on values 1, 2, 3, 4, 5, 6 with p(1) = p(2) = p(3) = p(4) = p(5) = p(6).
Probability Distribution Function (PDF)
A discrete random variable X takes n values x_i, each with probability p_i.
Requirements of a probability distribution: p_i \ge 0 and \sum_{i=1}^n p_i = 1
Six-sided die example: p_i = \frac{1}{6}
Think: p_i is the probability that a random measurement of X will yield the value x_i; X takes on the value x_i with probability p_i.
Expected Value of a Random Variable
The average value obtained by repeatedly drawing samples from the distribution. For X drawn from a distribution with n discrete values x_i with probabilities p_i:
E[X] = \sum_{i=1}^n x_i p_i
Die example: E[X] = \sum_{i=1}^6 \frac{i}{6} = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5
Continuous Probability Distribution Function
X \sim p(x): a random variable X that can take any of a continuous set of values, where the relative probability of a particular value is given by a continuous probability density function p(x).
Conditions on p(x): p(x) \ge 0 and \int p(x) \, dx = 1
Expected value of X: E[X] = \int x \, p(x) \, dx
Function of a Random Variable
A function Y = f(X) of a random variable X \sim p(x) is also a random variable.
Expected value of a function of a random variable: E[Y] = E[f(X)] = \int f(x) \, p(x) \, dx
Monte Carlo Integration
Simple idea: estimate the integral \int_a^b f(x) dx of a function by averaging random samples x_i \in [a, b] of the function's value.
Monte Carlo Integration
Let us define the Monte Carlo estimator for the definite integral of a given function f(x):
Definite integral: \int_a^b f(x) \, dx
Random variables: X_i \sim p(x)
Monte Carlo estimator: F_N = \frac{1}{N} \sum_{i=1}^N \frac{f(X_i)}{p(X_i)}
Expected Value of the Monte Carlo Estimator
Using the properties E[\sum_i Y_i] = \sum_i E[Y_i] and E[aY] = a E[Y]:
E[F_N] = E\left[ \frac{1}{N} \sum_{i=1}^N \frac{f(X_i)}{p(X_i)} \right]
       = \frac{1}{N} \sum_{i=1}^N E\left[ \frac{f(X_i)}{p(X_i)} \right]
       = \frac{1}{N} \sum_{i=1}^N \int_a^b \frac{f(x)}{p(x)} \, p(x) \, dx
       = \frac{1}{N} \sum_{i=1}^N \int_a^b f(x) \, dx
       = \int_a^b f(x) \, dx
The expected value of the Monte Carlo estimator is the desired integral.
Basic Monte Carlo Estimator
The basic Monte Carlo estimator is a simple special case where we use uniform random variables:
Uniform random variables: X_i \sim p(x) = \frac{1}{b - a}
Basic Monte Carlo estimator: F_N = \frac{b - a}{N} \sum_{i=1}^N f(X_i)
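As a sketch (not code from the slides), the basic estimator can be written in a few lines; the function and interval below are chosen only so the result can be checked against a known integral:

```python
import random

def basic_mc_estimate(f, a, b, n):
    """Basic Monte Carlo estimator: (b - a)/N * sum of f at uniform samples."""
    total = 0.0
    for _ in range(n):
        x = random.uniform(a, b)  # X_i ~ p(x) = 1/(b - a)
        total += f(x)
    return (b - a) * total / n

# Example: estimate the integral of x^2 over [0, 2] (exact value 8/3).
random.seed(0)
estimate = basic_mc_estimate(lambda x: x * x, 0.0, 2.0, 100_000)
```

The estimate is random; rerunning with a different seed gives a slightly different value, but its expected value is the exact integral.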
Unbiased Estimator Definition: A Monte Carlo integral estimator is unbiased if its expected value is the desired integral. The general and basic Monte Carlo estimators are unbiased. Why do we want unbiased estimators?
Definite Integral Can Be N-Dimensional
Example in 3D: \int_{x_0}^{x_1} \int_{y_0}^{y_1} \int_{z_0}^{z_1} f(x, y, z) \, dx \, dy \, dz
Uniform 3D random variables*: X_i \sim p(x, y, z) = \frac{1}{(x_1 - x_0)(y_1 - y_0)(z_1 - z_0)}
Basic 3D MC estimator*: F_N = \frac{(x_1 - x_0)(y_1 - y_0)(z_1 - z_0)}{N} \sum_{i=1}^N f(X_i)
* Generalizes to arbitrary N-dimensional PDFs
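A minimal sketch of the 3D case (the integrand and box below are illustrative, picked so the answer is known in closed form):

```python
import random

def mc_estimate_3d(f, x0, x1, y0, y1, z0, z1, n):
    """Basic 3D Monte Carlo estimator with a uniform PDF over the box."""
    volume = (x1 - x0) * (y1 - y0) * (z1 - z0)
    total = 0.0
    for _ in range(n):
        x = random.uniform(x0, x1)
        y = random.uniform(y0, y1)
        z = random.uniform(z0, z1)
        total += f(x, y, z)
    return volume * total / n

# Example: integral of (x + y + z) over the unit cube (exact value 3/2).
random.seed(1)
estimate = mc_estimate_3d(lambda x, y, z: x + y + z, 0, 1, 0, 1, 0, 1, 100_000)
```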
Direct Lighting (Irradiance) Estimate
E(p) = \int_{H^2} L(p, \omega) \cos\theta \, d\omega
Sample directions over the hemisphere uniformly in solid angle: \omega_i \sim p(\omega) = \frac{1}{2\pi}
Estimator: Y_i = L(p, \omega_i) \cos\theta_i, \quad F_N = \frac{2\pi}{N} \sum_{i=1}^N Y_i
Direct Lighting (Irradiance) Estimate
E(p) = \int_{H^2} L(p, \omega) \cos\theta \, d\omega
Given a surface point p, a ray tracer evaluates radiance along a ray. For each of N samples:
- Generate a random direction \omega_i
- Compute the incoming radiance L_i arriving at p from direction \omega_i
- Compute the incident irradiance due to the ray: dE_i = L_i \cos\theta_i
- Accumulate \frac{2\pi}{N} dE_i into the estimator
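A sketch of this loop (not a real ray tracer: the radiance function here is a hypothetical stand-in, a constant L = 1, so that the estimate can be checked against the exact answer \int \cos\theta \, d\omega = \pi):

```python
import math
import random

def uniform_hemisphere():
    """Uniformly sample a direction on the unit hemisphere (z >= 0)."""
    z = random.random()                    # cos(theta) is uniform on [0, 1)
    phi = 2.0 * math.pi * random.random()
    s = math.sqrt(max(0.0, 1.0 - z * z))   # sin(theta)
    return (s * math.cos(phi), s * math.sin(phi), z)

def irradiance_estimate(radiance, n):
    """F_N = (2*pi/N) * sum of L(omega_i) * cos(theta_i), with p(omega) = 1/(2*pi)."""
    total = 0.0
    for _ in range(n):
        w = uniform_hemisphere()
        total += radiance(w) * w[2]        # cos(theta) = dot(w, normal) = w_z
    return 2.0 * math.pi * total / n

# Sanity check with constant radiance L = 1: E = integral of cos over H^2 = pi.
random.seed(2)
estimate = irradiance_estimate(lambda w: 1.0, 200_000)
```

In a renderer, `radiance` would instead trace a ray from p in direction w and return the radiance it carries.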
Direct Lighting - Solid Angle Sampling Light Blocker 100 shadow rays per pixel
Observation: incoming radiance is zero for most directions in this scene Idea: integrate only over the area of the light (directions where incoming radiance is non-zero)
Direct Lighting: Area Integral
Change of variables to an integral over the area A' of the light*:
E(p) = \int_{A'} L_o(p', \omega') \, V(p, p') \, \frac{\cos\theta \cos\theta'}{|p - p'|^2} \, dA'
* using d\omega = \frac{dA' \cos\theta'}{|p' - p|^2}
\omega = (p' - p)/|p' - p|: direction from surface point p toward light point p'
\omega' = (p - p')/|p - p'|: direction from p' toward p
V(p, p'): binary visibility function, 1 if p' is visible from p, 0 otherwise (accounts for light occlusion)
L_o(p', \omega'): outgoing radiance from light point p' in direction \omega' toward p
Direct Lighting: Area Integral
E(p) = \int_{A'} L_o(p', \omega') \, V(p, p') \, \frac{\cos\theta \cos\theta'}{|p - p'|^2} \, dA'
Sample the light shape uniformly by area: p(p') = \frac{1}{A'}, so that \int_{A'} p(p') \, dA' = 1
Estimator: F_N = \frac{A'}{N} \sum_{i=1}^N Y_i, \quad Y_i = L_o(p'_i, \omega'_i) \, V(p, p'_i) \, \frac{\cos\theta_i \cos\theta'_i}{|p - p'_i|^2}
Direct Lighting - Area Sampling 100 shadow rays per pixel
Random Sampling Introduces Noise. Left: sample the center of the light. Right: sample a random point on the light. (1 sample per pixel.)
More Random Samples Reduce Noise. Random point-on-light sampling: 1 shadow ray vs. 16 shadow rays per pixel.
Why Is Area Sampling Better Than Solid Angle Sampling? Left: solid angle sampling. Right: area sampling.
How to Draw Samples From a Desired Probability Distribution?
Cumulative Distribution Function (CDF)
Discrete random variable with n possible values x_i, probabilities p_i.
Cumulative distribution: P_j = \sum_{i=1}^{j} p_i
where 0 \le P_j \le 1 and P_n = 1
Sampling from a Probability Distribution
Draw a uniform random variable \xi \in [0, 1). To randomly select a value, select x_i if P_{i-1} < \xi \le P_i.
How to compute i? Binary search over the CDF.
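A sketch of this discrete inversion method, using Python's `bisect` for the binary search (the die example below mirrors the earlier slides):

```python
import bisect
import random

def make_discrete_sampler(probs):
    """Build a sampler from probabilities p_i; selection uses binary search on the CDF."""
    cdf = []
    total = 0.0
    for p in probs:
        total += p
        cdf.append(total)                  # P_j = p_1 + ... + p_j
    def sample():
        xi = random.random()               # uniform in [0, 1)
        # Smallest i with P_{i-1} < xi <= P_i; clamp guards float round-off in the CDF.
        return min(bisect.bisect_right(cdf, xi), len(cdf) - 1)
    return sample

# Fair six-sided die: outcomes 0..5, each with probability 1/6.
random.seed(3)
die = make_discrete_sampler([1.0 / 6.0] * 6)
counts = [0] * 6
for _ in range(60_000):
    counts[die()] += 1
```

With 60,000 draws each count should land near 10,000.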
Continuous Probability Distribution
PDF: p(x) \ge 0 (e.g., the uniform distribution on the unit interval)
CDF: P(x) = \int_0^x p(x') \, dx' = Pr(X < x), with P(1) = 1
Pr(a \le X \le b) = \int_a^b p(x) \, dx = P(b) - P(a)
Sampling Continuous Probability Distributions
Called the inversion method. Given the cumulative distribution function P(x) = Pr(X < x):
Construction of samples: draw \xi uniform in [0, 1), then solve x = P^{-1}(\xi).
Must know a formula for:
1. The integral of p(x), i.e. the CDF P(x)
2. The inverse function P^{-1}
Example: Sampling a Continuous Distribution
Given f(x) = x^2 on x \in [0, 2]; we want to sample proportionally to f.
Compute the PDF p(x) = c f(x), ensuring it is normalized:
1 = \int_0^2 c f(x) \, dx = c \cdot \frac{1}{3} 2^3 = \frac{8c}{3} \implies c = \frac{3}{8}, \quad p(x) = \frac{3}{8} x^2
Example: Sampling a Continuous Distribution
Given f(x) = x^2 on x \in [0, 2], with p(x) = \frac{3}{8} x^2
Compute the CDF: P(x) = \int_0^x p(x') \, dx' = \frac{x^3}{8}
Example: Sampling a Continuous Distribution
Given f(x) = x^2 on x \in [0, 2], with p(x) = \frac{3}{8} x^2 and P(x) = \frac{x^3}{8}
Applying the inversion method: set \xi = P(x) = \frac{x^3}{8} and solve for x:
x = \sqrt[3]{8\xi} = 2\sqrt[3]{\xi}
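A sketch of this worked example in code, with an empirical check: for p(x) = (3/8) x^2 on [0, 2], the mean is E[X] = \int_0^2 x \cdot \frac{3}{8} x^2 \, dx = \frac{3}{2}.

```python
import random

def sample_p(xi):
    """Invert P(x) = x^3 / 8 on [0, 2]: x = (8 * xi)^(1/3)."""
    return (8.0 * xi) ** (1.0 / 3.0)

# Empirical check of the sampler against the analytic mean E[X] = 3/2.
random.seed(4)
n = 200_000
mean = sum(sample_p(random.random()) for _ in range(n)) / n
```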
How to Uniformly Sample The Unit Circle?
Sampling the Unit Circle: Simple but Wrong Method
\theta = uniform random angle between 0 and 2\pi
r = uniform random radius between 0 and 1
Return point: (r \cos\theta, r \sin\theta)
(Figure: result vs. uniform reference; the result clusters samples near the center.) Source: [PBRT]
Need to Sample Uniformly in Area
Incorrect (not equi-areal): \theta = 2\pi\xi_1, \; r = \xi_2
Correct (equi-areal): \theta = 2\pi\xi_1, \; r = \sqrt{\xi_2}
Sampling the Unit Circle (via Inversion in 2D)*
Area: A = \int_0^{2\pi} \int_0^1 r \, dr \, d\theta = \left[ \frac{r^2}{2} \right]_0^1 \cdot 2\pi = \pi
Uniform density over the disk: p(r, \theta) \, dr \, d\theta = \frac{1}{\pi} r \, dr \, d\theta \implies p(r, \theta) = \frac{r}{\pi}
r and \theta are independent: p(r, \theta) = p(r) \, p(\theta)
p(\theta) = \frac{1}{2\pi}, \; P(\theta) = \frac{\theta}{2\pi} \implies \theta = 2\pi\xi_1
p(r) = 2r, \; P(r) = r^2 \implies r = \sqrt{\xi_2}
* See Shirley et al. p. 331 for another explanation.
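A sketch of the correct mapping, with an empirical check: for a uniform disk the mean radius is E[r] = \int_0^1 r \cdot 2r \, dr = \frac{2}{3}.

```python
import math
import random

def sample_unit_disk():
    """Uniform area sampling of the unit disk: theta = 2*pi*xi1, r = sqrt(xi2)."""
    theta = 2.0 * math.pi * random.random()
    r = math.sqrt(random.random())
    return (r * math.cos(theta), r * math.sin(theta))

# Empirical check of the mapping against the analytic mean radius 2/3.
random.seed(5)
n = 200_000
mean_r = sum(math.hypot(*sample_unit_disk()) for _ in range(n)) / n
```

Dropping the `sqrt` (the wrong method from the previous slides) would give a mean radius of 1/2 instead.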
Shirley's Mapping
Distinct cases for eight octants; e.g., in the first octant: r = \xi_1, \; \theta = \frac{\pi}{4} \frac{\xi_2}{\xi_1}
Rejection Sampling the Circle's Area

do {
    x = 1 - 2 * rand01();
    y = 1 - 2 * rand01();
} while (x*x + y*y > 1.);

Efficiency of the technique: area of circle / area of square = \pi / 4
Rejection Sampling 2D Directions

do {
    x = 1 - 2 * rand01();
    y = 1 - 2 * rand01();
} while (x*x + y*y > 1.);
r = sqrt(x*x + y*y);
x_dir = x/r;
y_dir = y/r;

Generates 2D directions with uniform probability.
Approximate the Area of a Circle

inside = 0;
for (i = 0; i < N; ++i) {
    x = 1 - 2 * rand01();
    y = 1 - 2 * rand01();
    if (x*x + y*y < 1.)
        ++inside;
}
A = 4. * inside / N;   // floating-point division, to avoid integer truncation
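The same idea as a runnable sketch: the fraction of points landing inside the circle, times the area 4 of the enclosing square, estimates the circle's area \pi.

```python
import random

def estimate_circle_area(n):
    """Rejection-style area estimate: (points inside unit circle / n) * square area 4."""
    inside = 0
    for _ in range(n):
        x = 1.0 - 2.0 * random.random()
        y = 1.0 - 2.0 * random.random()
        if x * x + y * y < 1.0:
            inside += 1
    return 4.0 * inside / n

random.seed(6)
area = estimate_circle_area(100_000)   # exact area is pi
```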
Uniform Sampling of the Hemisphere
Generate a random direction on the hemisphere (all directions equally likely): p(\omega) = \frac{1}{2\pi}
Direction computed from a uniformly distributed point (\xi_1, \xi_2) on the 2D unit square:
(\xi_1, \xi_2) \to \left( \sqrt{1 - \xi_1^2} \cos(2\pi\xi_2), \; \sqrt{1 - \xi_1^2} \sin(2\pi\xi_2), \; \xi_1 \right)
Exercise to students: derive this from the inversion method.
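A sketch of this mapping, with two checks: every output is a unit vector, and since z = cos\theta = \xi_1 is uniform on [0, 1) for a solid-angle-uniform hemisphere distribution, its sample mean should be close to 1/2.

```python
import math
import random

def uniform_hemisphere(xi1, xi2):
    """Map (xi1, xi2) in [0,1)^2 to a direction uniform in solid angle on the hemisphere."""
    s = math.sqrt(max(0.0, 1.0 - xi1 * xi1))   # sin(theta)
    return (s * math.cos(2.0 * math.pi * xi2),
            s * math.sin(2.0 * math.pi * xi2),
            xi1)                               # cos(theta)

random.seed(7)
dirs = [uniform_hemisphere(random.random(), random.random()) for _ in range(100_000)]
mean_z = sum(d[2] for d in dirs) / len(dirs)
```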
Efficiency
Variance of a Random Variable
Definition: V[Y] = E[(Y - E[Y])^2] = E[Y^2] - E[Y]^2
Properties (for independent Y_i): V\left[\sum_{i=1}^N Y_i\right] = \sum_{i=1}^N V[Y_i], \quad V[aY] = a^2 V[Y]
Variance decreases linearly with the number of samples:
V\left[\frac{1}{N} \sum_{i=1}^N Y_i\right] = \frac{1}{N^2} \sum_{i=1}^N V[Y_i] = \frac{1}{N^2} N V[Y] = \frac{1}{N} V[Y]
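The 1/N behavior can be checked empirically; the sketch below (illustrative, not from the slides) measures the variance of the sample mean of uniform variates at two sample counts, expecting roughly a 4x drop when N grows 4x:

```python
import random
import statistics

def mean_of_n(n):
    """One draw of the estimator: the mean of n uniform [0,1) samples."""
    return sum(random.random() for _ in range(n)) / n

# Averaging 4x as many samples should cut the estimator's variance by about 4x.
random.seed(8)
trials = 2_000
var_small = statistics.pvariance([mean_of_n(16) for _ in range(trials)])
var_large = statistics.pvariance([mean_of_n(64) for _ in range(trials)])
ratio = var_small / var_large   # expected to be near 4
```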
More Random Samples Reduces Variance Light Center Random 1 shadow ray 16 shadow rays
Comparing Different Techniques
Variance in an estimator manifests as noise in rendered images.
Estimator efficiency measure: Efficiency \propto \frac{1}{\text{Variance} \times \text{Cost}}
If one integration technique has twice the variance of another, it takes twice as many samples to achieve the same variance.
If one technique has twice the cost of another with the same variance, it takes twice as much time to achieve the same variance.
Recall: the MC Estimator Can Use a Non-Uniform PDF
The Monte Carlo estimator for the definite integral of a given function f(x):
Definite integral: \int_a^b f(x) \, dx
Random variables: X_i \sim p(x)
Monte Carlo estimator: F_N = \frac{1}{N} \sum_{i=1}^N \frac{f(X_i)}{p(X_i)}
Importance Sampling
Idea: concentrate selection of random samples in parts of the domain where the function is large ("high value" samples). Sample according to f(x): p(x) = c f(x).
Recall the per-sample estimator \hat{f} = \frac{f(x)}{p(x)} and its variance V[\hat{f}] = E[\hat{f}^2] - E^2[\hat{f}].
Writing E[f] = \int f(x) \, dx, the normalized PDF is p(x) = f(x)/E[f], and:
E[\hat{f}^2] = \int \left( \frac{f(x)}{p(x)} \right)^2 p(x) \, dx = \int \left( \frac{f(x)}{f(x)/E[f]} \right)^2 \frac{f(x)}{E[f]} \, dx = E[f] \int f(x) \, dx = E^2[f]
If the PDF is proportional to f then the variance is 0! V[\hat{f}] = 0?!?
(The catch: normalizing p requires knowing \int f(x) \, dx, the very integral we want.)
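The zero-variance claim can be demonstrated with the earlier worked example (this sketch reuses f(x) = x^2 on [0, 2] with its exact sampler; it is illustrative, not code from the slides). Every importance-sampled term f(x)/p(x) equals the constant 8/3, so the estimate is exact regardless of N:

```python
import random

def f(x):
    return x * x

def uniform_estimate(n):
    """Uniform sampling on [0, 2]: p(x) = 1/2, estimator (2/N) * sum f(x_i)."""
    return 2.0 * sum(f(random.uniform(0.0, 2.0)) for _ in range(n)) / n

def importance_estimate(n):
    """Sample p(x) = (3/8) x^2 via inversion; each term f(x)/p(x) is exactly 8/3."""
    total = 0.0
    for _ in range(n):
        x = (8.0 * random.random()) ** (1.0 / 3.0)
        total += f(x) / ((3.0 / 8.0) * x * x)
    return total / n

random.seed(9)
u_est = uniform_estimate(1_000)      # noisy around 8/3
is_est = importance_estimate(1_000)  # 8/3 up to floating-point round-off
```

In rendering we can never match the full integrand exactly, but choosing p close to f (e.g., area sampling of lights) buys most of this variance reduction.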
Importance Sampling: Area Sampling. Left: solid angle sampling. Right: area sampling.
Hemispherical Solid Angle Sampling, 100 shadow rays (random directions drawn uniformly from hemisphere)
The MC estimator uses different random directions at each pixel. Only some directions point towards the light. (Estimator is a random variable) Why does this image look noisy?
Light Source Area Sampling, 100 shadow rays. If no occlusion is present, all directions chosen in computing the estimate hit the light source. (The choice of direction only matters if a portion of the light is occluded from surface point p.)
Random Sampling Introduces Noise. Left: sample the center of the light. Right: sample a random point on the light. (1 sample per pixel.)
More Random Samples Reduce Noise. Random point-on-light sampling: 1 shadow ray vs. 16 shadow rays per pixel.
Importance Sampling Reduces Noise. Left: solid angle sampling. Right: area sampling.
Effect of Sampling Distribution Fit
Consider two candidate PDFs p_1(x) and p_2(x) for sampling f(x).
What is the behavior of f(x)/p_1(x)? Of f(x)/p_2(x)? How does this impact the variance of the estimator?
Things to Remember
Monte Carlo integration:
- Unbiased estimators
- Good for high-dimensional integrals
- Estimates are visually noisy and need many samples
- Importance sampling can reduce variance (noise) if the probability distribution fits the underlying function
Sampling random variables:
- Inversion method, rejection sampling
- Sampling in 1D, 2D, disks, hemispheres
Acknowledgments Many thanks to Kayvon Fatahalian, Matt Pharr, and Pat Hanrahan, who created the majority of these slides.
Extra
Pseudo-Random Number Generation
Example: cellular automaton rule 30 — http://mathworld.wolfram.com/rule30.html
Center-line values form a random bit sequence, used for the random number generator in Mathematica.
Pseudo-Random Number Generation Credit: Richard Ling