
Lecture 12: Numerical Integration (with a focus on Monte Carlo integration) Computer Graphics CMU 15-462/15-662, Fall 2015

Review: fundamental theorem of calculus
$\int_a^b f(x)\,dx = F(b) - F(a)$, where $f(x) = \frac{d}{dx}F(x)$; equivalently, $\int_a^x f(t)\,dt = F(x) - F(a)$.

Definite integral as area under the curve: $\int_a^b f(x)\,dx$ is the area under f(x) between x = a and x = b.

Simple case: constant function. $\int_a^b C\,dx = (b-a)\,C$

Affine function: $f(x) = cx + d$. $\int_a^b f(x)\,dx = \frac{1}{2}(f(a) + f(b))(b-a)$

Piecewise affine function: sum of the integrals of the individual affine components.
$\int_a^b f(x)\,dx = \frac{1}{2}\sum_{i=0}^{n-1}(x_{i+1} - x_i)(f(x_i) + f(x_{i+1}))$, with $x_0 = a$ and $x_n = b$.

Piecewise affine function: if the n segments are of equal length $h = \frac{b-a}{n}$:
$\int_a^b f(x)\,dx = \frac{h}{2}\sum_{i=0}^{n-1}\left(f(x_i) + f(x_{i+1})\right) = h\left(\sum_{i=1}^{n-1} f(x_i) + \frac{1}{2}(f(x_0) + f(x_n))\right) = \sum_{i=0}^{n} A_i f(x_i)$
A weighted combination of measurements.

Polynomials? (figure: a polynomial f(x) on [a, b])

Aside: interpolating polynomials. Consider n+1 measurements of a function f(x): $f(x_0), f(x_1), f(x_2), \ldots, f(x_n)$. There is a unique degree-n polynomial that interpolates these points:
$p(x) = \sum_{i=0}^{n} f(x_i) \prod_{j=0,\, j \neq i}^{n} \frac{x - x_j}{x_i - x_j} = \sum_{i=0}^{n} f(x_i)\, l_i(x)$
Note: the basis polynomial $l_i(x)$ is 1 at $x_i$ and 0 at all other measurement points.
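As a small illustration of this construction (a minimal sketch, not from the slides; the function name is ours), the interpolant can be evaluated directly from the Lagrange basis:

#include <vector>

// Evaluate the degree-n Lagrange interpolating polynomial p(x) defined by
// the measurement points (xs[i], fs[i]) at a query point x.
double lagrange_interpolate(const std::vector<double>& xs,
                            const std::vector<double>& fs, double x) {
    double p = 0.0;
    for (size_t i = 0; i < xs.size(); ++i) {
        double li = 1.0;                              // basis polynomial l_i(x)
        for (size_t j = 0; j < xs.size(); ++j)
            if (j != i) li *= (x - xs[j]) / (xs[i] - xs[j]);
        p += fs[i] * li;                              // p(x) = sum_i f(x_i) l_i(x)
    }
    return p;
}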

Gaussian quadrature theorem: if f(x) is a polynomial of degree up to 2n+1, then its integral over [a, b] is computed exactly by a weighted combination of n+1 measurements in this range:
$\int_a^b f(x)\,dx = \sum_{i=0}^{n} A_i f(x_i)$, with weights $A_i = \int_a^b l_i(x)\,dx$.
Where are these points? They are the roots of a degree-(n+1) polynomial q(x) satisfying $\int_a^b x^k q(x)\,dx = 0$ for $0 \le k \le n$.

Arbitrary function f(x)? (figure: samples $x_0 = a, x_1, \ldots, x_n = b$ of a general function)

Trapezoidal rule: approximate the integral of f(x) by assuming the function is piecewise linear. For n equal-length segments of width $h = \frac{b-a}{n}$:
$\int_a^b f(x)\,dx \approx h\left(\sum_{i=1}^{n-1} f(x_i) + \frac{1}{2}(f(x_0) + f(x_n))\right)$, with $x_0 = a$ and $x_n = b$.
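A minimal sketch of this rule (the function name and signature are ours, not the course's):

#include <functional>

// Trapezoidal rule on [a, b] with n equal-length segments:
// integral ~= h * ( sum_{i=1}^{n-1} f(x_i) + 0.5 * (f(x_0) + f(x_n)) )
double trapezoid(const std::function<double(double)>& f,
                 double a, double b, int n) {
    double h = (b - a) / n;
    double sum = 0.5 * (f(a) + f(b));
    for (int i = 1; i < n; ++i)
        sum += f(a + i * h);
    return h * sum;
}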

Trapezoidal rule: consider the cost and accuracy of the estimate as $n \to \infty$ (or $h \to 0$).
Work: O(n). Error can be shown to be $O(h^2) = O(1/n^2)$ (for f(x) with a continuous second derivative).

Integration in 2D: consider integrating f(x, y) using the trapezoidal rule (apply the rule twice: when integrating in x and in y).
$\int_{a_y}^{b_y}\int_{a_x}^{b_x} f(x,y)\,dx\,dy = \int_{a_y}^{b_y}\left(O(h^2) + \sum_{i=0}^{n} A_i f(x_i, y)\right) dy$  (first application of the rule)
$= O(h^2) + \sum_{i=0}^{n} A_i \int_{a_y}^{b_y} f(x_i, y)\,dy = O(h^2) + \sum_{i=0}^{n} A_i \left(O(h^2) + \sum_{j=0}^{n} A_j f(x_i, y_j)\right)$  (second application)
$= O(h^2) + \sum_{i=0}^{n}\sum_{j=0}^{n} A_i A_j f(x_i, y_j)$
Errors add, so the error is still $O(h^2)$, but the work is now $O(n^2)$ (an n x n set of measurements). Must perform much more work in 2D to get the same error bound on the integral! In k dimensions, let $N = n^k$; the error goes as $O(N^{-2/k})$.
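The nested application might look like the following sketch (it reuses the trapezoid() helper from the previous sketch and performs on the order of n x n evaluations of f):

#include <functional>

// 2D trapezoidal rule: apply the 1D rule in x for each y, then in y.
// Cost is O(n^2) evaluations of f for the same O(h^2) error bound.
double trapezoid2d(const std::function<double(double, double)>& f,
                   double ax, double bx, double ay, double by, int n) {
    auto inner = [&](double y) {            // integrate over x at fixed y
        return trapezoid([&](double x) { return f(x, y); }, ax, bx, n);
    };
    return trapezoid(inner, ay, by, n);     // integrate the result over y
}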

Recall: the camera measurement equation from last time (a 5D integral!):
$Q = \frac{1}{d^2}\int_{t_0}^{t_1}\int_{A_{\mathrm{lens}}}\int_{A_{\mathrm{film}}} L(p' \to p, t)\,\cos^4\theta\;dp\,dp'\,dt$
(Rendering requires computation of infinite-dimensional integrals. Coming soon in class!)

Monte Carlo Integration Slides credit: a majority of these slides were created by Matt Pharr and Pat Hanrahan

Monte Carlo numerical integration
So far we've discussed techniques that use a fixed set of sample points (e.g., uniformly spaced, or obtained by finding the roots of a polynomial, as in Gaussian quadrature). Monte Carlo integration instead estimates the value of the integral using random sampling of the function:
- The value of the estimate depends on the random samples used, but the algorithm gives the correct value of the integral on average.
- It only requires the function to be evaluated at random points on its domain, so it is applicable to functions with discontinuities and to functions that are impossible to integrate directly.
- The error of the estimate is independent of the dimensionality of the integrand; it depends only on the number of random samples used: $O(n^{-1/2})$. Recall the previous trapezoidal rule example, whose error goes as $O(n^{-2/k})$ in k dimensions.

Review: random variables
X: a random variable; represents a distribution of potential values.
$X \sim p(x)$: p(x) is the probability density function (PDF); it describes the relative probability of a random process choosing value x.
Uniform PDF: all values over the domain are equally likely. E.g., for an unbiased die, X takes on the values 1, 2, 3, 4, 5, 6 and p(1) = p(2) = p(3) = p(4) = p(5) = p(6).

Discrete probability distributions
n discrete values $x_i$, each with probability $p_i$.
Requirements of a probability distribution: $p_i \ge 0$ and $\sum_{i=1}^{n} p_i = 1$.
Six-sided die example: $p_i = \frac{1}{6}$.
Think: $p_i$ is the probability that a random measurement of X will yield the value $x_i$; X takes on the value $x_i$ with probability $p_i$.

Cumulative distribution function (CDF), for a discrete probability distribution:
$P_j = \sum_{i=1}^{j} p_i$, where $0 \le P_j \le 1$ and $P_n = 1$.

How do we generate samples of a discrete random variable (with a known PDF)?

Sampling from discrete probability distributions: to randomly select an event, draw a uniform random variable $\xi \in [0, 1)$ and select $x_i$ if $P_{i-1} < \xi \le P_i$.
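A minimal sketch of this selection rule (the names are ours): accumulate the CDF and return the first index whose cumulative probability reaches the uniform sample ξ.

#include <random>
#include <vector>

// Select index i with probability p[i], using the rule P_{i-1} < xi <= P_i.
int sample_discrete(const std::vector<double>& p, std::mt19937& rng) {
    std::uniform_real_distribution<double> uniform(0.0, 1.0);
    double xi = uniform(rng);
    double cdf = 0.0;                        // running P_i
    for (size_t i = 0; i < p.size(); ++i) {
        cdf += p[i];
        if (xi <= cdf) return (int)i;
    }
    return (int)p.size() - 1;                // guard against round-off
}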

Continuous probability distributions
PDF: $p(x) \ge 0$. CDF: $P(x) = \int_0^x p(t)\,dt$, so $P(x) = \Pr(X < x)$ and $P(1) = 1$ (shown here for a uniform distribution, i.e., a random variable X defined on the [0, 1] domain).
$\Pr(a \le X \le b) = \int_a^b p(x)\,dx = P(b) - P(a)$

Sampling continuous random variables using the inversion method
Cumulative distribution function: $P(x) = \Pr(X < x)$.
Construction of samples: draw $\xi$ uniformly from [0, 1), then solve for $x = P^{-1}(\xi)$.
Must know the formula for: 1. the integral of p(x) (the CDF), and 2. the inverse function $P^{-1}$.

Example: applying the inversion method
Given: the relative probability density of a random variable taking on value x over the [0, 2] domain: $f(x) = x^2$, $x \in [0, 2]$.
Compute the PDF (normalize so it integrates to 1):
$1 = \int_0^2 c\,f(x)\,dx = c\,(F(2) - F(0)) = c\,\frac{1}{3}2^3 = \frac{8c}{3} \;\Rightarrow\; c = \frac{3}{8}, \quad p(x) = \frac{3}{8}x^2$

Example: applying the inversion method (continued)
Given: $f(x) = x^2$, $x \in [0, 2]$, $p(x) = \frac{3}{8}x^2$.
Compute the CDF: $P(x) = \int_0^x p(t)\,dt = \frac{x^3}{8}$

Example: applying the inversion method (continued)
Given: $f(x) = x^2$, $x \in [0, 2]$, $p(x) = \frac{3}{8}x^2$, $P(x) = \frac{x^3}{8}$.
Sample from p(x): set $\xi = P(x) = \frac{x^3}{8}$, so $x = \sqrt[3]{8\xi}$.
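A minimal sketch of this example's sampler (the name is ours): draw ξ uniformly in [0, 1) and return x = (8ξ)^(1/3), which is distributed with density p(x) = 3x²/8 on [0, 2].

#include <cmath>
#include <random>

// Inversion-method sample for p(x) = (3/8) x^2 on [0, 2]:  x = cbrt(8 * xi)
double sample_x_squared(std::mt19937& rng) {
    std::uniform_real_distribution<double> uniform(0.0, 1.0);
    double xi = uniform(rng);
    return std::cbrt(8.0 * xi);
}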

How do we uniformly sample the unit circle? (Choose any point P=(px, py) in circle with equal probability)

Uniformly sampling the unit circle: first try
$\theta$ = uniform random angle between 0 and $2\pi$; $r$ = uniform random radius between 0 and 1. Return the point $(r\cos\theta,\; r\sin\theta)$.
This algorithm does not produce the desired uniform sampling of the area of the circle. Why?

Because the sampling is not uniform in area! Points farther from the center of the circle are less likely to be chosen (per unit area): the area element is $r\,dr\,d\theta$, but sampling $\theta = 2\pi\xi_1$ and $r = \xi_2$ gives a density $p(r, \theta)\,dr\,d\theta$ that does not grow with r.

Sampling a circle (via inversion in 2D)
Area: $A = \int_0^{2\pi}\int_0^1 r\,dr\,d\theta = \int_0^1 r\,dr\,\int_0^{2\pi} d\theta = \frac{r^2}{2}\Big|_0^1 \cdot 2\pi = \pi$
Uniform sampling in area requires $p(r, \theta)\,dr\,d\theta = \frac{1}{\pi}\,r\,dr\,d\theta$, so $p(r, \theta) = \frac{r}{\pi}$.
Since r and $\theta$ are independent, $p(r, \theta) = p(r)\,p(\theta)$:
$p(\theta) = \frac{1}{2\pi}$, $P(\theta) = \frac{\theta}{2\pi}$, so $\theta = 2\pi\xi_1$
$p(r) = 2r$, $P(r) = r^2$, so $r = \sqrt{\xi_2}$

Uniform area sampling of a circle
WRONG (not equi-areal): $\theta = 2\pi\xi_1$, $r = \xi_2$
RIGHT (equi-areal): $\theta = 2\pi\xi_1$, $r = \sqrt{\xi_2}$
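A minimal sketch of the correct (equi-areal) mapping (the function name is ours):

#include <cmath>
#include <random>

// Uniformly sample a point on the unit disk: theta = 2*pi*xi1, r = sqrt(xi2).
void sample_disk(std::mt19937& rng, double& px, double& py) {
    const double kPi = 3.14159265358979323846;
    std::uniform_real_distribution<double> uniform(0.0, 1.0);
    double theta = 2.0 * kPi * uniform(rng);
    double r = std::sqrt(uniform(rng));
    px = r * std::cos(theta);
    py = r * std::sin(theta);
}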

Shirley's mapping: distinct cases for the eight octants. For the first octant: $r = \xi_1$, $\theta = \frac{\pi}{4}\frac{\xi_2}{\xi_1}$.
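One common formulation of Shirley's concentric mapping collapses the eight octants into two cases by comparing |x| and |y| after offsetting the unit square to [-1, 1]²; the sketch below follows that formulation (an assumption: the slide's exact case breakdown may differ).

#include <cmath>

// Shirley's concentric map: square [0,1]^2 -> unit disk, area-preserving.
void concentric_sample_disk(double xi1, double xi2, double& px, double& py) {
    const double kPi = 3.14159265358979323846;
    double ox = 2.0 * xi1 - 1.0, oy = 2.0 * xi2 - 1.0;   // offset to [-1,1]^2
    if (ox == 0.0 && oy == 0.0) { px = py = 0.0; return; }
    double r, theta;
    if (std::fabs(ox) > std::fabs(oy)) {
        r = ox; theta = (kPi / 4.0) * (oy / ox);
    } else {
        r = oy; theta = (kPi / 2.0) - (kPi / 4.0) * (ox / oy);
    }
    px = r * std::cos(theta);
    py = r * std::sin(theta);
}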

Uniform sampling via rejection sampling
Generate a random point within the unit circle:

do {
    x = 1 - 2 * rand01();
    y = 1 - 2 * rand01();
} while (x*x + y*y > 1.);

Efficiency of the technique: area of circle / area of square = $\pi/4 \approx 78.5\%$.

Aside: approximating the area of the unit circle

inside = 0;
for (i = 0; i < N; ++i) {
    x = 1 - 2 * rand01();
    y = 1 - 2 * rand01();
    if (x*x + y*y < 1.) ++inside;
}
A = 4.0 * inside / N;   // use 4.0 so the division is not integer division

Rejection sampling to generate 2D directions
Goal: generate random directions in 2D with uniform probability.

x = 1 - 2 * rand01();
y = 1 - 2 * rand01();
r = sqrt(x*x + y*y);
x_dir = x/r;
y_dir = y/r;

This algorithm is not correct! What is wrong? (Normalizing points drawn uniformly from the square biases directions toward the corners of the square.)

Rejection sampling to generate 2D directions
Goal: generate random directions in 2D with uniform probability.

do {
    x = 1 - 2 * rand01();
    y = 1 - 2 * rand01();
} while (x*x + y*y > 1.);
r = sqrt(x*x + y*y);
x_dir = x/r;
y_dir = y/r;

Monte Carlo integration
Definite integral (what we seek to estimate): $\int_a^b f(x)\,dx$
Random variables: $X_i \sim p(x)$ is the value of a random sample drawn from the distribution p(x); $Y_i = f(X_i)$ is also a random variable.
Expectation of f: $E[Y_i] = E[f(X_i)] = \int_a^b f(x)\,p(x)\,dx$
Estimator (a Monte Carlo estimate of $\int_a^b f(x)\,dx$, assuming samples $X_i$ drawn from a uniform PDF; an estimator for arbitrary PDFs appears later in the lecture):
$F_N = \frac{b-a}{N}\sum_{i=1}^{N} Y_i$

Basic unbiased Monte Carlo estimator
Unbiased estimator: the expected value of the estimator is the integral we wish to evaluate. Assume a uniform probability density for now: $X_i \sim U(a, b)$, $p(x) = \frac{1}{b-a}$.
$E[F_N] = E\left[\frac{b-a}{N}\sum_{i=1}^{N} Y_i\right] = \frac{b-a}{N}\sum_{i=1}^{N} E[Y_i] = \frac{b-a}{N}\sum_{i=1}^{N} E[f(X_i)] = \frac{b-a}{N}\sum_{i=1}^{N}\int_a^b f(x)\,p(x)\,dx = \frac{1}{N}\sum_{i=1}^{N}\int_a^b f(x)\,dx = \int_a^b f(x)\,dx$
Properties of expectation used: $E\left[\sum_i Y_i\right] = \sum_i E[Y_i]$ and $E[aY] = a\,E[Y]$.
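A minimal sketch of this basic estimator for uniformly distributed samples (the names are ours):

#include <functional>
#include <random>

// Basic Monte Carlo estimate of the integral of f over [a, b] using N
// uniform samples: F_N = (b - a)/N * sum_i f(X_i),  X_i ~ U(a, b).
double mc_integrate_uniform(const std::function<double(double)>& f,
                            double a, double b, int N, std::mt19937& rng) {
    std::uniform_real_distribution<double> uniform(a, b);
    double sum = 0.0;
    for (int i = 0; i < N; ++i)
        sum += f(uniform(rng));
    return (b - a) * sum / N;
}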

Direct lighting estimate
Uniformly sample the hemisphere of directions with respect to solid angle: $p(\omega) = \frac{1}{2\pi}$
Irradiance: $E(p) = \int_{H^2} L(p, \omega)\cos\theta\,d\omega$
Estimator: $\omega_i \sim p(\omega)$, $Y_i = L(p, \omega_i)\cos\theta_i$, $F_N = \frac{2\pi}{N}\sum_{i=1}^{N} Y_i$

Direct lighting estimate
$E(p) = \int_{H^2} L(p, \omega)\cos\theta\,d\omega$
Given a surface point p, for each of N samples:
- Generate a random direction $\omega_i$
- Compute the incoming radiance $L_i$ arriving at p from direction $\omega_i$ (a ray tracer evaluates radiance along a ray; see Raytracer::trace_ray() in raytracer.cpp)
- Compute the incident irradiance due to the ray: $dE_i = L_i \cos\theta_i$
- Accumulate $\frac{2\pi}{N} dE_i$ into the estimator

Uniform hemisphere sampling
Generate a random direction on the hemisphere (all directions equally likely): $p(\omega) = \frac{1}{2\pi}$
Direction computed from a uniformly distributed point $(\xi_1, \xi_2)$ on the 2D unit square:
$\omega(\xi_1, \xi_2) = \left(\sqrt{1-\xi_1^2}\cos(2\pi\xi_2),\; \sqrt{1-\xi_1^2}\sin(2\pi\xi_2),\; \xi_1\right)$
Exercise for students: derive this from the inversion method.
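A minimal sketch of the formula above (ξ1, ξ2 uniform in [0, 1); returns a unit vector in the hemisphere about +z; the function name is ours):

#include <algorithm>
#include <cmath>

// Uniform hemisphere sampling about the +z axis, p(w) = 1/(2*pi):
// w = ( sqrt(1 - xi1^2) cos(2*pi*xi2), sqrt(1 - xi1^2) sin(2*pi*xi2), xi1 )
void uniform_sample_hemisphere(double xi1, double xi2,
                               double& wx, double& wy, double& wz) {
    const double kPi = 3.14159265358979323846;
    double sin_theta = std::sqrt(std::max(0.0, 1.0 - xi1 * xi1));
    double phi = 2.0 * kPi * xi2;
    wx = sin_theta * std::cos(phi);
    wy = sin_theta * std::sin(phi);
    wz = xi1;
}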

Hemispherical solid angle sampling, 100 sample rays (random directions drawn uniformly from the hemisphere). Occluder (blocks light).

Why is the image in the previous slide noisy?

Incident lighting estimator uses different random directions in each pixel. Some of those directions point towards the light, others do not. (Estimator is a random variable)

Idea: we don't need to integrate over the entire hemisphere of directions (incoming radiance is 0 from most directions). Only integrate over the area of the light (the directions where incoming radiance is non-zero).

Direct lighting: area integral
Integral over directions: $E(p) = \int_{H^2} L(p, \omega)\cos\theta\,d\omega$
Change of variables to an integral over the area of the light $A'$:
$E(p) = \int_{A'} L_o(p', \omega')\,V(p, p')\,\frac{\cos\theta\cos\theta'}{\lVert p - p' \rVert^2}\,dA'$
where $d\omega = \frac{\cos\theta'}{\lVert p - p' \rVert^2}\,dA'$, $\omega = \widehat{p' - p}$, and $\omega' = \widehat{p - p'}$.
$L_o(p', \omega')$: outgoing radiance from light point p' in direction $\omega'$ toward p.
$V(p, p')$: binary visibility function, 1 if p' is visible from p and 0 otherwise (accounts for occlusion of the light).

Direct lighting: area integral
$E(p) = \int_{A'} L_o(p', \omega')\,V(p, p')\,\frac{\cos\theta\cos\theta'}{\lVert p - p' \rVert^2}\,dA'$
Sample the light's shape uniformly by area: $\int_{A'} p(p')\,dA' = 1 \;\Rightarrow\; p(p') = \frac{1}{A'}$

Direct lighting: area integral
$E(p) = \int_{A'} L_o(p', \omega')\,V(p, p')\,\frac{\cos\theta\cos\theta'}{\lVert p - p' \rVert^2}\,dA'$
Probability: $p(p') = \frac{1}{A'}$
Estimator: $Y_i = L_o(p'_i, \omega'_i)\,V(p, p'_i)\,\frac{\cos\theta_i\cos\theta'_i}{\lVert p - p'_i \rVert^2}$, $\quad F_N = \frac{A'}{N}\sum_{i=1}^{N} Y_i$
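A hedged sketch of this estimator. The Vec3, Light, Scene, and Rng types and their methods below are hypothetical placeholders, not the course ray tracer's API; the sketch only mirrors the estimator F_N above.

#include <algorithm>
#include <cmath>

// Hypothetical interfaces: Vec3 with dot(a, b), Light with sample_point(),
// area(), and radiance(), Scene with visible(). Estimates the irradiance E(p)
// due to one area light using N points sampled uniformly by area.
double estimate_direct(const Scene& scene, const Light& light,
                       const Vec3& p, const Vec3& n, int N, Rng& rng) {
    double sum = 0.0;
    for (int i = 0; i < N; ++i) {
        Vec3 p_light, n_light;
        light.sample_point(rng, p_light, n_light);      // uniform by area, pdf = 1/A'
        Vec3 d = p_light - p;
        double dist2 = dot(d, d);
        Vec3 w = d / std::sqrt(dist2);                  // omega = direction from p to p'
        double cos_p = std::max(0.0, dot(n, w));        // cos(theta) at the receiver
        double cos_l = std::max(0.0, dot(n_light, -w)); // cos(theta') at the light
        if (cos_p > 0.0 && cos_l > 0.0 && scene.visible(p, p_light))      // V(p, p')
            sum += light.radiance(p_light, -w) * cos_p * cos_l / dist2;   // Y_i
    }
    return light.area() * sum / N;                      // F_N = (A'/N) * sum Y_i
}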

Light source area sampling, 100 sample rays. If no occlusion is present, all directions chosen in computing the estimate hit the light source. (The choice of direction only matters if a portion of the light is occluded from surface point p.)

Variance
Definition: $V[Y] = E[(Y - E[Y])^2] = E[Y^2] - E[Y]^2$
Variance decreases linearly with the number of samples:
$V\left[\frac{1}{N}\sum_{i=1}^{N} Y_i\right] = \frac{1}{N^2}\sum_{i=1}^{N} V[Y_i] = \frac{1}{N^2}\,N\,V[Y] = \frac{1}{N}\,V[Y]$
Properties of variance (for independent $Y_i$): $V\left[\sum_{i=1}^{N} Y_i\right] = \sum_{i=1}^{N} V[Y_i]$ and $V[aY] = a^2\,V[Y]$

1 area light sample (high variance in irradiance estimate)

16 area light samples (lower variance in irradiance estimate)

Comparing different techniques
Variance in an estimator manifests as noise in rendered images.
Estimator efficiency measure: $\mathrm{Efficiency} \propto \frac{1}{\mathrm{Variance} \times \mathrm{Cost}}$
If one integration technique has twice the variance of another, it takes twice as many samples to achieve the same variance. If one technique has twice the cost of another technique with the same variance, it takes twice as much time to achieve the same variance.

Biasing
We previously used a uniform probability distribution to generate samples for our estimator.
Idea: change the distribution, i.e., bias the selection of samples: $X_i \sim p(x)$.
However, for the estimator to remain unbiased, we must change the estimator to: $Y_i = \frac{f(X_i)}{p(X_i)}$
Note: biasing the selection of random samples is different from creating a biased estimator.
- Biased estimator: the expected value of the estimator does not equal the integral it is designed to estimate (not good!)

General unbiased Monte Carlo estimator
$\int_a^b f(x)\,dx \approx \frac{1}{N}\sum_{i=1}^{N}\frac{f(X_i)}{p(X_i)}, \quad X_i \sim p(x)$
Special case where the $X_i$ are drawn from the uniform distribution: $F_N = \frac{b-a}{N}\sum_{i=1}^{N} f(X_i), \quad X_i \sim U(a, b),\; p(x) = \frac{1}{b-a}$

Biased sample selection, but unbiased estimator
Probability: $X_i \sim p(x)$. Estimator: $Y_i = \frac{f(X_i)}{p(X_i)}$
$E[Y_i] = E\left[\frac{f(X_i)}{p(X_i)}\right] = \int \frac{f(x)}{p(x)}\,p(x)\,dx = \int f(x)\,dx$
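A minimal sketch of the general estimator, where the caller supplies both a sampler for p and the corresponding PDF (the names are ours):

#include <functional>
#include <random>

// General Monte Carlo estimator: F_N = (1/N) * sum_i f(X_i) / p(X_i),  X_i ~ p.
double mc_integrate(const std::function<double(double)>& f,
                    const std::function<double(std::mt19937&)>& sample_p,
                    const std::function<double(double)>& pdf_p,
                    int N, std::mt19937& rng) {
    double sum = 0.0;
    for (int i = 0; i < N; ++i) {
        double x = sample_p(rng);        // X_i ~ p(x)
        sum += f(x) / pdf_p(x);          // Y_i = f(X_i) / p(X_i)
    }
    return sum / N;
}

For instance, reusing sample_x_squared from the earlier inversion example with pdf_p(x) = 3x²/8 gives f(x)/p(x) = 8/3 for every sample when f(x) = x², so the estimate of $\int_0^2 x^2\,dx = 8/3$ has zero variance; this previews the importance sampling discussion below.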

Importance sampling
Idea: bias the selection of samples towards the parts of the domain where the function we are integrating is large (the "most useful samples"). Sample according to f(x): choose $p(x) = c\,f(x)$.
Write the per-sample estimate as $\tilde{f}(x) = \frac{f(x)}{p(x)}$ and recall the definition of variance: $V[\tilde{f}] = E[\tilde{f}^2] - E^2[\tilde{f}]$.
With $p(x) = \frac{f(x)}{E[f]}$ (normalization $c = 1/E[f]$, where $E[f] = \int f(x)\,dx$):
$E[\tilde{f}^2] = \int \left(\frac{f(x)}{p(x)}\right)^2 p(x)\,dx = \int \left(\frac{f(x)}{f(x)/E[f]}\right)^2 \frac{f(x)}{E[f]}\,dx = E[f]\int f(x)\,dx = E^2[f]$
Since $E[\tilde{f}] = \int f(x)\,dx = E[f]$, this gives $V[\tilde{f}] = E^2[f] - E^2[f] = 0$. If the PDF is proportional to f, then the variance is 0?!?

Importance sampling example
Consider uniform hemisphere sampling in the irradiance estimate: $f(\omega) = L_i(\omega)\cos\theta$, $p(\omega) = \frac{1}{2\pi}$
$\omega(\xi_1, \xi_2) = \left(\sqrt{1-\xi_1^2}\cos(2\pi\xi_2),\; \sqrt{1-\xi_1^2}\sin(2\pi\xi_2),\; \xi_1\right)$
$\int f(\omega)\,d\omega \approx \frac{1}{N}\sum_i \frac{f(\omega_i)}{p(\omega_i)} = \frac{1}{N}\sum_i \frac{L_i(\omega_i)\cos\theta_i}{1/2\pi} = \frac{2\pi}{N}\sum_i L_i(\omega_i)\cos\theta_i$

Importance sampling example
Cosine-weighted hemisphere sampling in the irradiance estimate: $f(\omega) = L_i(\omega)\cos\theta$, $p(\omega) = \frac{\cos\theta}{\pi}$
$\int f(\omega)\,d\omega \approx \frac{1}{N}\sum_i \frac{f(\omega_i)}{p(\omega_i)} = \frac{1}{N}\sum_i \frac{L_i(\omega_i)\cos\theta_i}{\cos\theta_i/\pi} = \frac{\pi}{N}\sum_i L_i(\omega_i)$
Idea: bias samples toward directions where $\cos\theta$ is large (if L is constant, these are the directions that contribute most).
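One common way to generate cosine-weighted directions (an assumption; the slides do not give a construction here) is Malley's method: sample the unit disk uniformly and project up to the hemisphere.

#include <algorithm>
#include <cmath>

// Cosine-weighted hemisphere sampling about +z via Malley's method:
// sample the unit disk uniformly (r = sqrt(xi1), phi = 2*pi*xi2), then
// set z = sqrt(1 - r^2). The resulting pdf is p(w) = cos(theta) / pi.
void cosine_sample_hemisphere(double xi1, double xi2,
                              double& wx, double& wy, double& wz) {
    const double kPi = 3.14159265358979323846;
    double r = std::sqrt(xi1);
    double phi = 2.0 * kPi * xi2;
    wx = r * std::cos(phi);
    wy = r * std::sin(phi);
    wz = std::sqrt(std::max(0.0, 1.0 - xi1));
}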

Effect of sampling distribution
(Figure: a function f(x) and two candidate sampling PDFs, $p_1(x)$ and $p_2(x)$, one a closer fit to f than the other.)
What is the behavior of $f(x)/p_1(x)$? Of $f(x)/p_2(x)$? How does this impact the variance of the estimator?

Summary: Monte Carlo integration
Monte Carlo estimator
- Estimate an integral by evaluating the function at random sample points in the domain: $F_N = \frac{1}{N}\sum_{i=1}^{N}\frac{f(X_i)}{p(X_i)} \approx \int_a^b f(x)\,dx$
- Useful in rendering because it can estimate high-dimensional integrals: faster convergence than non-randomized quadrature methods when the integrand is high dimensional
- Suffers from noise due to variance in the estimate
Importance sampling
- Reduce variance by biasing the choice of samples toward the regions of the domain where the value of the function is large
- Intuition: pick the samples that will contribute most to the estimate