Physics 736. Experimental Methods in Nuclear-, Particle-, and Astrophysics. - Statistical Methods -

1 Physics 736 Experimental Methods in Nuclear-, Particle-, and Astrophysics - Statistical Methods - Karsten Heeger heeger@wisc.edu

2 Course Schedule and Reading
course website
This week:
- final homework #8 assigned this week, due April 13, 2011
- define and outline course projects this week: title, abstract, plans for course project work (what do you plan to calculate?)

3 Statistics & Numerical Methods Topics:
- Monte Carlo techniques
- random numbers review
- maximum likelihood
- least-squares method
- hypothesis testing
- goodness of fit

4 Course Projects
Define your course project this week: are you interested in detectors? data analysis and statistical techniques? numerical methods?
Goals for the course project:
- address a physics problem and provide physics background (e.g. Higgs search, DAMA)
- apply one or more of the topics from this course (detectors, background, statistics)
Timeline:
- choose topic - by April 4, 2011
- LaTeX outline - by April 8, 2011: title + abstract, bullet points of what you are going to do, some literature references
See instructions on the course website. Please contact me with any questions!

5 Review of Homework

6 Statistical Distributions: binomial, Poisson, Gaussian, chi-square distribution

7 Statistical Distributions
Gaussian: P(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-(x-\mu)^2 / 2\sigma^2}
Binomial: P(r) = \frac{N!}{r!(N-r)!} \, p^r (1-p)^{N-r}
Poisson: P(r) = \frac{\mu^r e^{-\mu}}{r!}
Chi-square: P(u)\,du = \frac{(u/2)^{\nu/2 - 1} e^{-u/2}}{2\,\Gamma(\nu/2)} \, du
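As a quick numerical aside (not from the slides), the four distributions above can be evaluated with scipy.stats; the parameter values below are arbitrary assumptions.

```python
# Illustrative sketch: evaluate the four distributions above with scipy.stats.
# All parameter values are arbitrary assumptions.
from scipy import stats

x, mu, sigma = 1.2, 0.0, 1.0
print(stats.norm.pdf(x, loc=mu, scale=sigma))   # Gaussian P(x)

r, N, p = 3, 10, 0.4
print(stats.binom.pmf(r, N, p))                 # binomial P(r)

r, mu_pois = 2, 3.5
print(stats.poisson.pmf(r, mu_pois))            # Poisson P(r)

u, nu = 4.0, 5
print(stats.chi2.pdf(u, df=nu))                 # chi-square density at u
```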

8 Probability and Statistics Poisson distribution cumulative distribution

9 Error Propagation for linearly independent variables

10 Error Propagation

11 Standard Deviation of the Mean

12 Central Limit Theorem
If x_i are a set of n independent variables of mean \mu and variance \sigma^2, then for large n, y = \frac{1}{n}\sum_i x_i will tend to a Gaussian with mean \mu and variance \sigma^2/n.
This is true even if the x_i come from distributions with different means \mu_i and variances \sigma_i^2. In this case: mean = \frac{1}{n}\sum_i \mu_i, variance = \frac{1}{n^2}\sum_i \sigma_i^2.
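A small numerical check of this statement can be sketched as follows; the choice of a uniform parent distribution and the sample sizes are assumptions made only for illustration.

```python
# Sketch: averages of n uniform variates approach a Gaussian with variance sigma^2/n.
import numpy as np

rng = np.random.default_rng(42)
n, trials = 50, 100_000
x = rng.uniform(0.0, 1.0, size=(trials, n))   # uniform: mu = 0.5, sigma^2 = 1/12
y = x.mean(axis=1)                            # y = (1/n) * sum(x_i)

print(y.mean(), 0.5)                # sample mean of y vs mu
print(y.var(), (1.0 / 12.0) / n)    # sample variance of y vs sigma^2 / n
```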

13 Random Numbers & Monte Carlo Techniques

14 Random Numbers & Monte Carlo Techniques Who has used a Monte Carlo before? Who has written a Monte Carlo before? What are the elements of a Monte Carlo?

15 Random Numbers & Monte Carlo Techniques
Monte Carlo (MC) refers to any procedure that makes use of random numbers. MC methods are used in:
- simulation of natural phenomena
- simulation of experimental apparatus
- numerical analysis (e.g. integration over many variables)

16 Random Numbers & Monte Carlo Techniques Simulating Data or Experiment

17 Random Numbers & Monte Carlo Techniques Simulating Radioactive Decay

18 Random Numbers & Monte Carlo Techniques Estimating the Area of a Circle
Figure: hits from 100 pairs of random numbers uniformly distributed between -1 and +1; the number of hits inside the circle gives an area estimate. Circle-area estimates obtained from 100 MC runs, each with 100 pairs of random numbers, with a Gaussian curve based on the mean and standard deviation of the 100 estimated areas.
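The procedure described above can be sketched in a few lines; the random seed and the helper name circle_area_estimate are assumptions for illustration.

```python
# Hit-or-miss estimate of the circle area: 100 MC runs of 100 (x, y) pairs in [-1, +1].
import numpy as np

rng = np.random.default_rng(1)

def circle_area_estimate(n_pairs=100):
    x = rng.uniform(-1.0, 1.0, n_pairs)
    y = rng.uniform(-1.0, 1.0, n_pairs)
    hits = np.count_nonzero(x**2 + y**2 < 1.0)
    return 4.0 * hits / n_pairs          # box area (4) times hit fraction

estimates = [circle_area_estimate() for _ in range(100)]
print(np.mean(estimates), np.std(estimates))   # compare to pi ~ 3.14159
```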

19 Random Numbers & Monte Carlo Techniques Random Numbers What is a random number? Is 3 a random number?

20 Random Numbers & Monte Carlo Techniques Random Numbers
What is a random number? Is 3 a random number?
There is no such thing as a single random number. One speaks of a sequence of random numbers: a set of numbers in which each number has nothing to do with the other numbers in the sequence. In a uniform distribution of random numbers in the range [0,1], every number has the same chance of turning up: any value is as likely as 0.5.

21 Random Numbers & Monte Carlo Techniques How to Generate Random Numbers
- chaotic system (e.g. lottery)
- random process: radioactive decay, thermal noise, cosmic ray arrival
- random number tables
- computer code

22 Random Numbers & Monte Carlo Techniques Random Number Tables

23 Random Numbers & Monte Carlo Techniques How to Generate Random Numbers
All algorithms produce a periodic sequence of numbers, a sequence in a uniform distribution in the range [0,1]. Algorithms generate integers I_n between 0 and M and return a real value x_n = I_n / M. To obtain effectively random values, use a small subset of a single period (e.g. the Mersenne twister algorithm has a long period).

24 Random Numbers & Monte Carlo Techniques How to Generate Random Numbers
Middle Square (von Neumann, 1946): generate a sequence of 10-digit integers; start with one, square it, and then take the middle 10 digits from the answer as the next number in the sequence. The sequence is not random, since each number is completely determined by the previous one, but it appears random. A more complex algorithm does not lead to a better random sequence; it is better to use an algorithm that is well understood.
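A short sketch of the middle-square recipe described above; the seed value is an arbitrary assumption, and this is for illustration only, not a usable generator.

```python
# Von Neumann's middle-square method: square a 10-digit integer, keep the middle 10 digits.
def middle_square(seed, n):
    values, x = [], seed
    for _ in range(n):
        square = str(x * x).zfill(20)   # pad so the square always has 20 digits
        x = int(square[5:15])           # middle 10 digits become the next number
        values.append(x)
    return values

print(middle_square(5772156649, 5))     # seed is an arbitrary 10-digit number
```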

25 Random Numbers & Monte Carlo Techniques
RANDU, from IBM in the 1960s: I_{n+1} = (65539 \, I_n) \bmod 2^{31}
(figure: RANDU output plotted in 2D and in 3D)
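A sketch of the quoted RANDU recurrence, mapped to [0, 1); the seed is an arbitrary assumption. Plotting consecutive triples in 3D exposes the well-known correlations of this generator.

```python
# RANDU recurrence from the slide: I_{n+1} = (65539 * I_n) mod 2^31.
def randu(seed, n):
    m = 2**31
    values, i = [], seed
    for _ in range(n):
        i = (65539 * i) % m
        values.append(i / m)            # map to a float in [0, 1)
    return values

u = randu(1, 9)                         # arbitrary seed, 9 values
triples = list(zip(u[0::3], u[1::3], u[2::3]))   # consecutive (x, y, z) triples
print(triples)
```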

26 Random Numbers & Monte Carlo Techniques How to Generate Random Numbers
Not all random number generators are good!
For example, in ROOT: TRandom3 is recommended by ROOT; TRandom has too short a period.
For example, in Numerical Recipes: the authors have admitted that RAN1 and RAN2 in the first edition are mediocre generators; ran0, ran1, and ran2 in the second edition are much better.

27 Random Numbers & Monte Carlo Techniques How to Improve Generators
Improve behavior and increase the period by modifying algorithms: I_n = (a \, I_{n-1} + b \, I_{n-2}) \bmod m. This has 2 initial seeds and can have a period greater than m. The RANMAR generator in CERNLIB requires 103 seeds ("the ultimate random number generator").
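A sketch of the two-seed recurrence above; the constants a, b, m and the seeds are arbitrary assumptions chosen only to show the structure, not recommended values.

```python
# Two-seed recurrence: I_n = (a*I_{n-1} + b*I_{n-2}) mod m, mapped to [0, 1).
def two_seed_generator(seed1, seed2, n, a=1071, b=1663, m=2**31 - 1):
    i_prev2, i_prev1 = seed1, seed2
    values = []
    for _ in range(n):
        i_new = (a * i_prev1 + b * i_prev2) % m
        values.append(i_new / m)
        i_prev2, i_prev1 = i_prev1, i_new
    return values

print(two_seed_generator(12345, 67890, 5))
```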

28 Random Numbers & Monte Carlo Techniques Simulating Distributions
So far we have only considered random numbers in [0,1]. More complicated problems generally require random numbers generated according to specific distributions; we can generate random numbers according to certain distributions (e.g. Poisson for radioactive decay). Goal: obtain a random deviate x from any probability density function f(x). One can use special-purpose algorithms, or numerical libraries and routines. We will discuss 2 techniques here...

29 Acceptance/Rejection Method (von Neumann)
Problem: generate a series of random numbers, x_i, which follow a distribution f(x).
Method: choose a trial value x_{trial} with a random number \lambda_1, x_{trial} = x_{min} + (x_{max} - x_{min})\,\lambda_1, and accept it with probability proportional to f(x_{trial}). Random points are chosen inside the box and rejected if the ordinate exceeds f(x).

30 Acceptance/Rejection Method (von Neumann)
Random points are chosen inside the box and rejected if the ordinate exceeds f(x). A bounding region is a way to increase efficiency: the efficiency of the method is the ratio of the areas, so keep the bounding function C h(x) as close as possible to f(x).
The method is applicable if f(x) is too complex for other techniques but f(x) can be computed. Beware of normalization.
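A minimal acceptance/rejection sketch with a constant bounding function; the example density f(x) = (3/8)(1 + x^2) on [-1, 1] and its maximum f_max = 3/4 are assumptions chosen for illustration.

```python
# Rejection sampling with a constant bound: accept x_trial with probability f(x_trial)/f_max.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return 0.375 * (1.0 + x**2)         # normalized on [-1, 1]

def rejection_sample(n, x_min=-1.0, x_max=1.0, f_max=0.75):
    samples = []
    while len(samples) < n:
        x_trial = x_min + (x_max - x_min) * rng.random()   # trial value as on the slide
        if rng.random() * f_max < f(x_trial):              # accept with prob f/f_max
            samples.append(x_trial)
    return np.array(samples)

x = rejection_sample(10_000)
print(x.mean(), (x**2).mean())          # analytic values: 0 and 0.4
```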

31 Acceptance/Rejection Method (von Neumann)
The rejection algorithm is not efficient if the distribution has one or more large peaks (or poles); in this case trial events are seldom accepted. The algorithm does not work when the range of x is [-∞, +∞].

32 Inverse Transform Method
Applicable for simple distribution functions.
Method: the probability density function is f(x) on [-∞, +∞]; the integrated probability up to point a, i.e. for x ≤ a, is F(a). F(a) is itself a random variable which will occur with uniform probability density on [0,1]. We can find a unique x for a given u if u = F(x), provided we can find the inverse x = F^{-1}(u).

33 Inverse Transform Method Use of a random number u chosen from a uniform distribution [0,1] to find a random number x from a distribution with cumulative distribution function F(x) PDG

34 Inverse Transform Method Practical Method
1. Normalize the distribution function so that it becomes a probability distribution function (PDF).
2. Integrate the PDF from x_{min} to an arbitrary x. This is the probability of choosing a value less than x.
3. Equate this to a uniform random number and solve for x. The resulting x will be distributed according to the PDF.
In other words, solve the following equation for x given a uniform random number \lambda:
\frac{\int_{x_{min}}^{x} f(x')\,dx'}{\int_{x_{min}}^{x_{max}} f(x')\,dx'} = \lambda

35 Inverse Transform Method
Convenient when you can calculate the inverse function, e.g. for \exp(x), (1-x)^n, 1/(1+x^2). There are some packages that do this for you, e.g. UNU.RAN in ROOT.
Examples: generate x between 0 and 4 according to f(x) = x^{0.5}; generate x between 0 and ∞ according to f(x) = e^{-x}.
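A sketch of the inverse transform for the exponential example above: F(x) = 1 - e^{-x}, so x = -ln(1 - u) for u uniform on [0, 1).

```python
# Inverse-transform sampling of f(x) = exp(-x) on [0, inf).
import numpy as np

rng = np.random.default_rng(2)
u = rng.random(100_000)
x = -np.log(1.0 - u)          # inverse of the cumulative distribution F(x) = 1 - exp(-x)

print(x.mean(), x.var())      # both should be close to 1 for this distribution
```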

36 Random Numbers & Monte Carlo Techniques
What if the rejection technique is impractical and you cannot invert the integral of the distribution function? Replace the distribution function f(x) by an approximate form f^*(x) for which the inversion technique can be applied. Generate trial values for x with the inversion technique according to f^*(x), and accept the trial value with probability proportional to the weight w = f(x)/f^*(x). The rejection technique is the special case where f^*(x) is constant.

37 Random Numbers & Monte Carlo Techniques Multidimensional Simulation (simulating a distribution in more than one dimension)
If the distribution is separable, f(x,y) = g(x)h(y), then the variables are uncorrelated and each can be generated as before: generate x according to g(x) and y according to h(y).
Otherwise, the distribution along each dimension needs to be calculated, e.g. D_x(x) = \int_{y_{min}}^{y_{max}} f(x,y)\,dy. Find an approximate distribution f^*(x,y) so that the integrals \int f^*(x,y)\,dy and \int f^*(x,y)\,dx are invertible; the weights for trial events are given by w = \frac{f(x,y)}{f^*(x,y)}.

38 Monte Carlo Numbering Scheme To facilitate interfacing between event generators, detector simulators, and analysis packages used in particle physics

39 Maximum Likelihood Method of Least Squares

40 Maximum Likelihood Estimation
A general method of parameter estimation when the functional form of the parent distribution is known. For large samples the ML estimators are normally distributed, and hence the variances are easy to determine; for small samples, ML estimators possess most of the desirable properties.
For n measurements x_i of a quantity with probability density function f(x|\theta): L(x_1, x_2, \ldots | \theta) = \prod_i f(x_i, \theta), and the estimate \hat{\theta} is the value which maximizes L.

41 Maximum Likelihood Estimation
L(x_1, x_2, \ldots | \theta) = \prod_i f(x_i, \theta)
Since L and \ln L attain their maximum values at the same point, one usually uses \ln L, since sums are easier to work with than products: \ln L = \sum_i \ln f(x_i|\theta). Normally the point of maximum likelihood is found numerically.
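A hedged sketch of such a numerical ML fit (not from the slides): the exponential lifetime model, the toy data, and the function names are assumptions made for illustration.

```python
# Numerical maximum-likelihood estimate of an exponential lifetime tau:
# minimize -ln L = sum(x)/tau + n*ln(tau) for f(x|tau) = (1/tau) exp(-x/tau).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
data = rng.exponential(scale=2.0, size=1000)   # toy data with true tau = 2

def neg_log_likelihood(tau):
    return np.sum(data) / tau + len(data) * np.log(tau)

result = minimize_scalar(neg_log_likelihood, bounds=(0.1, 10.0), method="bounded")
tau_hat = result.x
print(tau_hat)   # for this model the analytic ML estimate is the sample mean
```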

42 Maximum Likelihood Estimation Properties of ML estimators
- invariant under parameter transformation
- consistent: estimators converge on the true parameter
- unbiased: sometimes biased for finite samples; \hat{\theta} may be unbiased but \hat{u}(\theta) may be biased
- efficient
- if a sufficient estimator exists, the ML method will produce it

43 Examples of Likelihood Distributions
Central values and 1σ intervals: the uncertainty is deduced from the position where \ln L is reduced by 1/2, i.e. \ln L(\hat{\theta} \pm \sigma(\hat{\theta})) = \ln L_{max} - 0.5. This even applies for a non-Gaussian likelihood.
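A sketch of reading off a 1σ interval by scanning ln L and keeping the region within 0.5 of its maximum; the exponential toy model and the scan range are assumptions, not from the slides.

```python
# Scan ln L(tau) and read off the interval where ln L >= ln L_max - 0.5.
import numpy as np

rng = np.random.default_rng(3)
data = rng.exponential(scale=2.0, size=1000)   # toy exponential sample, true tau = 2

def log_likelihood(tau):
    return -np.sum(data) / tau - len(data) * np.log(tau)

taus = np.linspace(1.5, 2.5, 2001)
lnL = np.array([log_likelihood(t) for t in taus])
tau_hat = taus[np.argmax(lnL)]

inside = taus[lnL >= lnL.max() - 0.5]          # ln L within 0.5 of the maximum
print(inside.min(), tau_hat, inside.max())     # lower edge, estimate, upper edge
```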

44 Examples of Likelihood Distributions central values and 1σ intervals: asymmetric errors

45 Likelihood for Two Parameters
Given L(x|\theta_1, \theta_2), plot contours of constant likelihood in the (\theta_1, \theta_2) plane. To find the uncertainty, plot the contour with \ln L = \ln L_{max} - 0.5 and look at the projection of the contour onto the two axes. Using the correct method, the uncertainties do not depend on the correlation of the variables.

46 Likelihood Function and Binned Data Application of the ML Method to Binned Data
If the sample is very large and f(x|\theta) is complex, the computation can be reduced by grouping the sample into bins and writing L as the product of the probabilities of finding n_i entries in each bin: L(n_1, n_2, \ldots | \theta) = n! \prod_i (n_i!)^{-1} p_i^{n_i}, where p_i is the probability for bin i, so that \ln L = \sum_i n_i \ln p_i(\theta).
There will be some loss of information by binning the data, but as long as the variation of f across each bin is small there should be no great loss in the precision of \hat{\theta}.

47 Method of Least Squares
Relate data and model: a frequently used method for parameter estimation, but with no general optimal properties to recommend it. If the parameter dependence is linear, the method of least squares (LS) produces unbiased estimators of minimum variance. Minimize S = \sum_i \left( \frac{y_i - f(x_i|\theta_j)}{\sigma_i} \right)^2. If the data are Gaussian distributed then LS is equivalent to the ML method; if, in addition, the observables are linear functions of the parameters, then S will follow a \chi^2 distribution.
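A hedged sketch of minimizing S for an assumed straight-line model with scipy.optimize.curve_fit, which performs this weighted least-squares minimization; the data and parameter values are made up.

```python
# Weighted least squares: minimize S = sum((y_i - f(x_i;theta))^2 / sigma_i^2).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
x = np.linspace(0.0, 10.0, 20)
sigma = np.full_like(x, 0.5)
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma)          # toy data: intercept 1, slope 2

def model(x, a, b):
    return a + b * x

popt, pcov = curve_fit(model, x, y, sigma=sigma, absolute_sigma=True)
S = np.sum(((y - model(x, *popt)) / sigma) ** 2)    # value of S at the minimum
print(popt, np.sqrt(np.diag(pcov)), S)              # parameters, errors, S_min
```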

48 Method of Least Squares Degrees of Freedom
If the data are Gaussian distributed then S follows a \chi^2 distribution with N degrees of freedom: N data points give in general N degrees of freedom; N data points with m parameters of a linear model give N-m degrees of freedom.

49 Data with Error Bars For ±1σ error bars, about 1/3 of the data points should lie outside the fit.

50 Method of Least Squares Degrees of Freedom

51 Method of Least Squares Application of the LS Method to Binned Data
If the data are split into N bins, with n_i entries in bin i, and p_i(\theta) is the probability of an event populating bin i, then the expected number of events in each bin is given by f_i = n\,p_i, where n = \sum_{i=1}^{N} n_i. If the number of bins is large enough, the error matrix is diagonal and the LS method reduces to minimising X^2 = \sum_{i=1}^{N} (n_i - f_i)^2 / \sigma_i^2 = \sum_{i=1}^{N} (n_i - f_i)^2 / f_i (taking \sigma_i^2 = f_i), which can be done numerically.

52 Method of Least Squares Application of the LS Method to Binned Data
X^2 = \sum_{i=1}^{N} (n_i - f_i)^2 / \sigma_i^2 = \sum_{i=1}^{N} (n_i - f_i)^2 / f_i
Sometimes \sigma_i^2 is approximated by n_i, but the estimates found this way are more sensitive to statistical fluctuations. (For large sample sizes the two choices give the same result.) Since 1 degree of freedom has been lost due to the normalisation condition \sum_i n_i = n, X^2_{min} would follow f(\chi^2; N-1-L) if the model consisted of L independent parameters.
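A sketch of the binned X^2 bookkeeping above with f_i = n p_i and \sigma_i^2 = f_i; the bin contents and model probabilities are made-up numbers used only to show the calculation.

```python
# Binned chi^2: X^2 = sum_i (n_i - f_i)^2 / f_i with f_i = n * p_i.
import numpy as np

n_i = np.array([12, 25, 31, 22, 10])            # observed entries per bin
p_i = np.array([0.10, 0.25, 0.30, 0.25, 0.10])  # model probability per bin
f_i = n_i.sum() * p_i                           # expected entries: f_i = n * p_i

X2 = np.sum((n_i - f_i) ** 2 / f_i)
ndof = len(n_i) - 1                             # one d.o.f. lost to sum(n_i) = n
print(X2, ndof)
```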

53 Method of Least Squares Binned Data
Two common choices: equal width or equal probability. You must not choose the binning to make S as small as possible; in that case it would no longer follow a \chi^2 distribution. It is necessary to have several entries in each bin to approximate Gaussian statistics (e.g. more than 5 entries).

54 Method of Least Squares Goodness of Fit Least squares is a measure of the agreement between the fitted quantities and the measurements.

55 Method of Least Squares \chi^2 Distribution: probability density function and cumulative distribution function

56 Method of Least Squares Example of a \chi^2 Test
A simulated data sample is shown with a distribution function that was not used to generate the data. There are 20 bins. The distribution function was normalized to match the number of events in the data sample. Does the model fit the data? The value of \chi^2 for this distribution is 25.2 for 19 d.o.f., resulting in P(\chi^2) = 0.16.
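The quoted p-value can be checked with the chi-square survival function; this scipy call is an illustration, not necessarily the method used on the slide.

```python
# p-value of chi^2 = 25.2 with 19 degrees of freedom.
from scipy import stats

p_value = stats.chi2.sf(25.2, df=19)   # P(chi^2 >= 25.2) for 19 d.o.f.
print(p_value)                         # approximately 0.15-0.16
```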

57 Goodness of Fit Tests Example of a \chi^2 Test
A simulated data sample is shown with a distribution function that was not used to generate the data. There are 20 bins. The distribution function was normalized to match the number of events in the data sample. Does the model fit the data? The value of \chi^2 for this distribution is 25.2 for 19 d.o.f., resulting in P(\chi^2) = 0.16. BUT... does this look right to you? We will get back to this question.
