Lecture 19: Monte Carlo Methods
STAT 598z: Introduction to Computing for Statistics
Vinayak Rao, Department of Statistics, Purdue University
April 3, 2019
Monte Carlo integration

We want to calculate integrals/summations, often expectations with respect to some probability distribution p(x):

    µ := E_p[f] = ∫_X f(x) p(x) dx

Examples:

- What is the probability that a game of patience (solitaire) is solvable? Averaging over all shuffles Π,

      P(Solvable) = (1/|Π|) Σ_Π 1(Π is solvable)

- If we drop 3 points on the plane, each Gaussian distributed, what is the average area of the resulting triangle?

      E[A] = ∫ A(x₁, x₂, x₃) p(x₁) p(x₂) p(x₃) dx₁ dx₂ dx₃

- For a dataset (X, y), what is the average loss if you randomly choose a weight-vector according to some distribution (e.g. rnorm)?

      E_w[L(X, y)] = ∫ (y − wᵀX)² p(w) dw
Monte Carlo integration

We want to calculate µ := E_p[f] = ∫_X f(x) p(x) dx.

Sampling approximation: rather than visit all points in X, calculate a summation over a finite set of points.

Monte Carlo approximation: obtain those points by sampling from p(x):

    x_i ~ p,    µ̂ = (1/N) Σ_{i=1}^{N} f(x_i)
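As a minimal sketch in R (the choice of f and p is an assumption, not from the slides): approximating E_p[f] for f(x) = x² under p = Unif(0, 1), whose true value is 1/3.

```r
set.seed(1)
N <- 1e5
x <- runif(N)        # draws x_i ~ Unif(0, 1)
mc_est <- mean(x^2)  # (1/N) * sum of f(x_i), with f(x) = x^2
mc_est               # close to 1/3
```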
Monte Carlo integration: is this a good idea?

Very simple. Unbiased: if x_i ~ p,

    E_p[µ̂] = E_p[(1/N) Σ_{i=1}^{N} f(x_i)] = (1/N) Σ_{i=1}^{N} E_p[f] = µ

so µ̂ is an unbiased estimate. Moreover,

    Var_p[µ̂] = (1/N) Var_p[f],    Error ∝ StdDev_p[f] / N^{1/2}
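A quick sanity check of the N^{−1/2} error scaling (a sketch, using x ~ Unif(0, 1) and f(x) = x² as an assumed example): quadrupling N should roughly halve the standard deviation of the estimator.

```r
set.seed(2)
est_sd <- function(N, reps = 2000) {
  # standard deviation of the MC estimate of E[x^2], x ~ Unif(0, 1),
  # measured over many independent repetitions
  sd(replicate(reps, mean(runif(N)^2)))
}
ratio <- est_sd(100) / est_sd(400)  # 4x the samples: ratio should be near 2
```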
Monte Carlo integration: is this a good idea?

- In low dimensions, use numerical methods like quadrature.
- In high dimensions, numerical methods become infeasible. E.g. Simpson's rule in d dimensions, with N grid points:

      error ∝ N^{−4/d}

- Monte Carlo integration:

      error ∝ N^{−1/2}

  Independent of dimensionality!
Generating random variables

The simplest useful probability distribution is Unif(0, 1). In theory, it can be used to generate any other RV.

Is it easy to generate uniform RVs on a deterministic computer? No! Instead: pseudorandom numbers, which map a seed to a random-looking sequence.

- Downside: the sequence is deterministic, not truly random.
- Upside: can use seeds for reproducibility or debugging: set.seed(1)
- Careful with batch/parallel processing.
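For instance, fixing the seed makes a stream of pseudorandom draws exactly reproducible:

```r
set.seed(1)
a <- runif(3)
set.seed(1)     # resetting the seed replays the same sequence
b <- runif(3)
identical(a, b) # TRUE
```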
Generating random variables

In R we generate uniform random variables via runif. Additional functions include rnorm(), rgamma(), rexp(), sample(), etc.

In theory, we can generate any other random variable by transforming a uniform (or any other) random variable:

    u ~ Unif(0, 1),    x = f(u)

In practice, finding this f is too hard. Need other approaches. There is a whole subfield of statistics addressing this.
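One case where the transformation f is explicit is inverse-transform sampling (a sketch, not from the slides): if F is the CDF of the target distribution, then x = F⁻¹(u) has that distribution. For the Exponential(1) distribution, F(x) = 1 − exp(−x), so F⁻¹(u) = −log(1 − u):

```r
set.seed(3)
u <- runif(1e5)    # u_i ~ Unif(0, 1)
x <- -log(1 - u)   # inverse CDF of Exponential(1) applied to uniforms
mean(x)            # close to 1, the Exponential(1) mean
```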
Examples of Monte Carlo sampling

Let x, y be a pair of dice. What is p(x, y)? What is

    E[min(x, y)] = Σ_{i=1}^{6} Σ_{j=1}^{6} min(i, j) p(i, j)

We don't really need Monte Carlo here (but what if we had 100 dice?), but let's try it anyway. How?

Roll a pair of dice N times. Call the ith outcome (x_i, y_i). Then

    E[min(x, y)] ≈ (1/N) Σ_{i=1}^{N} min(x_i, y_i)
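A sketch of this in R, simulating the rolls with sample():

```r
set.seed(4)
N <- 1e5
x <- sample(1:6, N, replace = TRUE)  # N rolls of the first die
y <- sample(1:6, N, replace = TRUE)  # N rolls of the second die
mc_est <- mean(pmin(x, y))           # Monte Carlo estimate of E[min(x, y)]
```

The exact value is 91/36 ≈ 2.53, so the estimate should land close to that.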
A (bad) way of estimating the area of a circle

Let C be the unit disc, i.e. all points (x, y) with x² + y² ≤ 1. Its area is

    A(C) = ∫∫_{(x,y)∈C} dx dy = ∫_{−∞}^{∞} ∫_{−∞}^{∞} δ_C((x, y)) dx dy

Here δ_C((x, y)) = 1 if x² + y² ≤ 1, else it equals 0.

How might we try to approximate this using Monte Carlo? What are the f and p?

One way: choose some probability distribution p(x, y) (e.g. both x and y Gaussian distributed). Then:

    A(C) = ∫∫ δ_C((x, y)) dx dy
         = ∫∫ [δ_C((x, y)) / p(x, y)] p(x, y) dx dy
         ≈ (1/N) Σ_{i=1}^{N} δ_C((x_i, y_i)) / p(x_i, y_i)    (Monte Carlo, with (x_i, y_i) ~ p)

In words: sample N points (x_i, y_i) from some distribution p, and plug them into the last equation above.
A (bad) way of estimating the area of a circle

    N <- 1e5                       # number of Monte Carlo samples
    x <- rnorm(N); y <- rnorm(N)   # sample N Gaussian pairs (x, y)
    px <- dnorm(x); py <- dnorm(y)
    pp <- px * py                  # calculate their density p(x, y)
    dd <- sqrt(x^2 + y^2)
    mc_est <- (1/N) * sum(1/pp[dd < 1])  # Monte Carlo estimate of the area
Rare event simulation

Let X = (x₁, ..., x₁₀₀) be a hundred dice. What is p(Sum(X) ≥ 450), where Sum(X) = Σ_{d=1}^{100} x_d?

    p(Sum(X) ≥ 450) = Σ_X δ(Sum(X) ≥ 450) p(X) = E_p[δ(Sum(X) ≥ 450)]

δ(·) is the indicator function: δ(condition) = 1 if the condition is true, else 0.
Naive Monte Carlo sampling

- Propose X_i from p(X).
- Calculate (1/N) Σ_{i=1}^{N} δ(Sum(X_i) ≥ 450).

Most δ(Sum(X_i) ≥ 450) terms will be 0. High variance.
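A sketch of why this fails: for any feasible N, samples from fair dice essentially never produce a sum ≥ 450, so the naive estimate is almost surely exactly 0.

```r
set.seed(5)
N <- 1e4
rolls <- matrix(sample(1:6, 100 * N, replace = TRUE), nrow = 100)  # each column: 100 fair dice
naive_est <- mean(colSums(rolls) >= 450)  # fraction of rolls with sum >= 450; almost surely 0
```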
Importance sampling

Importance sampling assigns importance weights to samples. Scheme:

- Draw a proposal x_i from q(·).
- Assign it a weight w_i = p(x_i)/q(x_i).

Then

    µ = ∫_X f(x) p(x) dx = ∫_X f(x) [p(x)/q(x)] q(x) dx ≈ (1/N) Σ_{i=1}^{N} w_i f(x_i) := µ̂_imp
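A sketch of the weighted estimator on an assumed example (not from the slides): estimating E_p[x²] = 1 for p = N(0, 1), while proposing from the wider q = N(0, 2²).

```r
set.seed(6)
N <- 1e5
x <- rnorm(N, mean = 0, sd = 2)             # proposals x_i ~ q
w <- dnorm(x) / dnorm(x, mean = 0, sd = 2)  # weights w_i = p(x_i)/q(x_i)
imp_est <- mean(w * x^2)                    # importance-sampling estimate of E_p[x^2]
```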
Importance sampling (contd.)

When does this make sense?

- Sometimes it's easier to simulate from q(x) than p(x).
- Sometimes it's better to simulate from q(x) than p(x), to reduce variance. E.g. rare event simulation.
Importance sampling

For 100 dice, what is p(Sum(X) ≥ 450)? A better choice might be to bias the dice towards high values, e.g.

    q(x_d = v) ∝ v    (for v ∈ {1, ..., 6})

- Propose X_i from q(X).
- Calculate weights w(X_i) = p(X_i)/q(X_i).
- Calculate (1/N) Σ_{i=1}^{N} w(X_i) δ(Sum(X_i) ≥ 450).

This gives a better estimate of p(Sum(X) ≥ 450) = Σ_X δ(Sum(X) ≥ 450) p(X).
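Putting it together, a sketch in R (the per-die weight (1/6)/q(v) follows from p(v) = 1/6 and q(v) = v/21):

```r
set.seed(7)
N <- 2e4
q_prob <- (1:6) / 21   # biased die: q(v) proportional to v
X <- matrix(sample(1:6, 100 * N, replace = TRUE, prob = q_prob), nrow = 100)
w <- apply(X, 2, function(x) prod((1/6) / q_prob[x]))  # w(X_i) = p(X_i)/q(X_i)
is_est <- mean(w * (colSums(X) >= 450))  # importance-sampling estimate of p(Sum >= 450)
```

Unlike the naive estimator, many proposals now land in the rare region, and the small weights correct for the sampling bias.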
Machine Learning A 708.064 WS15/16 1sst KU Version: January 11, 2016 Exercises Problems marked with * are optional. 1 Conditional Independence I [3 P] a) [1 P] For the probability distribution P (A, B,
More informationWorksheet 3.1: Introduction to Double Integrals
Boise State Math 75 (Ultman) Worksheet.: Introduction to ouble Integrals Prerequisites In order to learn the new skills and ideas presented in this worksheet, you must: Be able to integrate functions of
More informationChapter Four: Loops II
Chapter Four: Loops II Slides by Evan Gallagher & Nikolay Kirov Chapter Goals To understand nested loops To implement programs that read and process data sets To use a computer for simulations Processing
More informationComputer Vision Group Prof. Daniel Cremers. 11. Sampling Methods
Prof. Daniel Cremers 11. Sampling Methods Sampling Methods Sampling Methods are widely used in Computer Science as an approximation of a deterministic algorithm to represent uncertainty without a parametric
More informationPrecomposing Equations
Precomposing Equations Let s precompose the function f(x) = x 3 2x + 9 with the function g(x) = 4 x. (Precompose f with g means that we ll look at f g. We would call g f postcomposing f with g.) f g(x)
More informationPROGRAM DESIGN TOOLS. Algorithms, Flow Charts, Pseudo codes and Decision Tables. Designed by Parul Khurana, LIECA.
PROGRAM DESIGN TOOLS Algorithms, Flow Charts, Pseudo codes and Decision Tables Introduction The various tools collectively referred to as program design tools, that helps in planning the program are:-
More information(Section 6.2: Volumes of Solids of Revolution: Disk / Washer Methods)
(Section 6.: Volumes of Solids of Revolution: Disk / Washer Methods) 6.. PART E: DISK METHOD vs. WASHER METHOD When using the Disk or Washer Method, we need to use toothpicks that are perpendicular to
More informationComputational Statistics General Background Information
Computational Statistics General Background Information Brief introduction to R. R functions. Monte Carlo methods in statistics. Uniform random number generation. Random number generation in R. 1 A Little
More informationEE 511 Linear Regression
EE 511 Linear Regression Instructor: Hanna Hajishirzi hannaneh@washington.edu Slides adapted from Ali Farhadi, Mari Ostendorf, Pedro Domingos, Carlos Guestrin, and Luke Zettelmoyer, Announcements Hw1 due
More informationThree Types of Probability
CHAPTER Three Types of Probability This article is not so much about particular problems or problem solving tactics as it is about labels. If you think about it, labels are a big key to the way we organize
More informationImplicit generative models: dual vs. primal approaches
Implicit generative models: dual vs. primal approaches Ilya Tolstikhin MPI for Intelligent Systems ilya@tue.mpg.de Machine Learning Summer School 2017 Tübingen, Germany Contents 1. Unsupervised generative
More informationProbabilistic Graphical Models
Probabilistic Graphical Models Lecture 17 EM CS/CNS/EE 155 Andreas Krause Announcements Project poster session on Thursday Dec 3, 4-6pm in Annenberg 2 nd floor atrium! Easels, poster boards and cookies
More informationGame Engineering: 2D
Game Engineering: 2D CS420-2010F-07 Objects in 2D David Galles Department of Computer Science University of San Francisco 07-0: Representing Polygons We want to represent a simple polygon Triangle, rectangle,
More informationCPSC 340: Machine Learning and Data Mining. Probabilistic Classification Fall 2017
CPSC 340: Machine Learning and Data Mining Probabilistic Classification Fall 2017 Admin Assignment 0 is due tonight: you should be almost done. 1 late day to hand it in Monday, 2 late days for Wednesday.
More informationRaytracing & Epsilon. Today. Last Time? Forward Ray Tracing. Does Ray Tracing Simulate Physics? Local Illumination
Raytracing & Epsilon intersects light @ t = 25.2 intersects sphere1 @ t = -0.01 & Monte Carlo Ray Tracing intersects sphere1 @ t = 10.6 Solution: advance the ray start position epsilon distance along the
More informationMassively Parallel GPU-friendly Algorithms for PET. Szirmay-Kalos László, Budapest, University of Technology and Economics
Massively Parallel GPU-friendly Algorithms for PET Szirmay-Kalos László, http://cg.iit.bme.hu, Budapest, University of Technology and Economics (GP)GPU: CUDA (OpenCL) Multiprocessor N Multiprocessor 2
More informationEstimating the Information Rate of Noisy Two-Dimensional Constrained Channels
Estimating the Information Rate of Noisy Two-Dimensional Constrained Channels Mehdi Molkaraie and Hans-Andrea Loeliger Dept. of Information Technology and Electrical Engineering ETH Zurich, Switzerland
More informationRandom Numbers and Monte Carlo Methods
Random Numbers and Monte Carlo Methods Methods which make use of random numbers are often called Monte Carlo Methods after the Monte Carlo Casino in Monaco which has long been famous for games of chance.
More informationProbability and Statistics for Final Year Engineering Students
Probability and Statistics for Final Year Engineering Students By Yoni Nazarathy, Last Updated: April 11, 2011. Lecture 1: Introduction and Basic Terms Welcome to the course, time table, assessment, etc..
More information