Approximate Bayesian Computation using Auxiliary Models


1 Approximate Bayesian Computation using Auxiliary Models
Tony Pettitt (co-authors: Chris Drovandi, Malcolm Faddy), Queensland University of Technology, Brisbane. MCQMC, February 2012. (32 slides.)

2 Outline
1. Motivating Problem: application to modelling macroparasite immunity
2. Approximate Bayesian Computation: introduction to ABC; three ABC algorithms; sequential Monte Carlo ABC
3. Macroparasite Population Evolution: summary statistics; auxiliary models
4. Results: posterior results; ABC fits to data
5. Conclusions: macroparasite model; ABC

3 Motivating Problem

4 Application to Modelling Macroparasite Immunity
Estimate the parameters of a Markov process model explaining macroparasite population development with host immunity.
212 hosts (cats), i = 1, ..., 212. Each cat is injected with l_i juvenile Brugia pahangi larvae (approximately 100 or 200).
At time t_i the host is sacrificed and the number of mature worms is recorded.
The host is assumed to develop an immunity.
Three discrete variables: M(t) matures, L(t) juveniles, I(t) immunity. Only L(0) and M at the sacrifice time are observed for each host.

5 Macroparasite immunity data. [Figure: proportion of mature worms vs sacrifice time.]

6 Trivariate Markov Process of Riley et al (2003)
Transitions and rates:
- Maturation (L to M): γL(t)
- Natural death of a juvenile: µ_L L(t)
- Death of a juvenile due to immunity: βI(t)L(t)
- Natural death of a mature: µ_M M(t)
- Gain of immunity: νL(t)
- Loss of immunity: µ_I I(t)
All three processes are unobserved ("invisible") between injection and sacrifice.

7 The Model and Intractable Likelihood
L, M, I are discrete counts; I is a hypothesised variable.
Deterministic form of the model:
dL/dt = −µ_L L − βIL − γL,
dM/dt = γL − µ_M M,
dI/dt = νL − µ_I I,
with µ_M and γ fixed; ν, µ_L, µ_I, β require estimation.
The likelihood based on the Markov process is intractable.
The process (L, M, I) is simulated using Gillespie's algorithm (Gillespie, 1977).
A common mathematical model: epidemics, chemical kinetics, systems biology, ...
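The rates of the deterministic system double as the event rates of the stochastic process, so a Gillespie (1977) direct-method simulator is short to write. A minimal sketch, assuming the rate parameters are supplied by the caller (the function name and argument order are illustrative, not from the talk):

```python
import random

def gillespie_lmi(l0, t_end, gamma, mu_L, mu_M, nu, mu_I, beta, rng=None):
    """Gillespie direct-method simulation of (L, M, I) up to time t_end."""
    rng = rng or random.Random(0)
    t, L, M, I = 0.0, l0, 0, 0
    while True:
        rates = [gamma * L,      # maturation: L -> M
                 mu_L * L,       # natural death of a juvenile
                 beta * I * L,   # juvenile killed by immunity
                 mu_M * M,       # natural death of a mature
                 nu * L,         # gain of one unit of immunity
                 mu_I * I]       # loss of one unit of immunity
        total = sum(rates)
        if total == 0.0:         # absorbing state: no event can occur
            return L, M, I
        t += rng.expovariate(total)
        if t > t_end:            # next event falls beyond the horizon
            return L, M, I
        u = rng.random() * total # choose an event with prob. rate/total
        k = 0
        while k < 5 and u >= rates[k]:
            u -= rates[k]
            k += 1
        if k == 0:   L, M = L - 1, M + 1
        elif k == 1: L -= 1
        elif k == 2: L -= 1
        elif k == 3: M -= 1
        elif k == 4: I += 1
        else:        I -= 1
```

For ABC, one such run per host (with its own l_i and t_i) produces a simulated data set x.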

8 Approximate Bayesian Computation

9 ABC and the Approximate Posterior I
Bayesian statistics involves inference based on the posterior distribution p(θ | y) ∝ p(y | θ) p(θ).
What if the likelihood cannot be evaluated, but it is easy to simulate data x from it?
Applications: genetics, biology, finance, ...
ABC involves a joint posterior distribution for θ and the simulated data x,
p(θ, x | y) ∝ g(y | x) p(x | θ) p(θ),
where g(y | x) measures the closeness of the observed data y to the simulated data x (Reeves and Pettitt, 2005).

10 ABC and the Approximate Posterior II
Compare the simulated values x and the observed data y through summary statistics S(·) = (S_1(·), ..., S_p(·)).
One choice: set ρ(y, x) = ||S(y) − S(x)|| and g(y | x) = 1(ρ(y, x) ≤ ε).
The choice of ε trades off accuracy against computational time.

11 Rejection Sampling ABC (RS-ABC) (Beaumont et al, 2002)
- Sample θ* ∼ p(θ)
- Simulate x ∼ p(· | θ*)
- Accept θ* if ρ(y, x) ≤ ε
Repeat until we have N values θ_1, ..., θ_N.
Advantages: simplicity, independent values, parallelisable. Disadvantage: inefficient.
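The three steps above can be sketched generically; the helper names (prior_sample, simulate, summarise, distance) are placeholders, not from the talk:

```python
import random
import statistics

def rejection_abc(s_obs, prior_sample, simulate, summarise, distance, eps, n_accept):
    """ABC rejection sampler: keep draws whose simulated summaries land
    within eps of the observed summaries s_obs."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()                    # theta* ~ p(theta)
        x = simulate(theta)                       # x ~ p(. | theta*)
        if distance(s_obs, summarise(x)) <= eps:  # rho(y, x) <= eps
            accepted.append(theta)
    return accepted

# Toy check: infer the mean of a Normal(theta, 1) with a U(-1, 1) prior.
rng = random.Random(42)
y = [rng.gauss(0.0, 1.0) for _ in range(50)]
post = rejection_abc(
    statistics.fmean(y),
    prior_sample=lambda: rng.uniform(-1.0, 1.0),
    simulate=lambda th: [rng.gauss(th, 1.0) for _ in range(50)],
    summarise=statistics.fmean,
    distance=lambda a, b: abs(a - b),
    eps=0.1, n_accept=100)
```

The inefficiency is visible here: most prior draws are rejected, and the cost grows quickly as eps shrinks.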

12 MCMC ABC (Marjoram et al, 2003)
Advantages: theoretical understanding, use in SMC.
Disadvantages: dependent samples, Markov chain convergence, the sampler can get stuck, inefficient, needs tuning.

13 Sequential Monte Carlo for ABC
Chopin (2002), Del Moral et al (2006), Sisson et al (2007, 2009), Beaumont et al (2009)
Approximate the posterior by a weighted sample {θ^i, W^i}, i = 1, ..., N (N particles).
Define a sequence of joint targets
p_t(θ, x | y) ∝ p(x | θ) p(θ) 1(ρ(x, y) ≤ ε_t), for t = 1, ..., T,
with a sequence of decreasing tolerances ε_1 ≥ ε_2 ≥ ... ≥ ε_T.
At t = 1, draw particles from the prior and set ε_1 = ∞ (every draw is accepted).

14 A Fully Adaptive SMC ABC Algorithm I
Drovandi and Pettitt (2011), Del Moral et al (2011)
Reweight the particles; each weight is either zero or proportional to 1:
W_t^i ∝ 1(ρ(x_t^i, y) ≤ ε_t) / 1(ρ(x_t^i, y) ≤ ε_{t−1}).
Choose ε_t so that M particles have zero weight and N − M have non-zero weight.
Replenish the population by resampling M particles from the N − M alive particles.
Diversify the particles with an MCMC kernel q_t(·, ·) that is stationary for the current target; q_t is determined adaptively from the alive set.
Apply the MCMC kernel R_t times so that the overall acceptance probability is close to 1; R_t is learnt adaptively.
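A compressed sketch of the reweight / replenish / diversify loop for a scalar parameter with a uniform prior, so that the MH ratio for a symmetric proposal reduces to the ABC indicator. The names, the defaults, and the fixed number of MCMC repeats are illustrative simplifications of the adaptive scheme, not the algorithm of the paper:

```python
import random
import statistics

def smc_abc_uniform(s_obs, simulate, distance, lo, hi,
                    n=200, keep_frac=0.5, t_steps=5, moves=5, rng=None):
    """Simplified SMC ABC for one parameter with a U(lo, hi) prior."""
    rng = rng or random.Random(1)
    theta = [rng.uniform(lo, hi) for _ in range(n)]         # t = 1: prior draws
    d = [distance(s_obs, simulate(th, rng)) for th in theta]
    eps = max(d)
    for _ in range(t_steps):
        # Choose eps_t adaptively: the best keep_frac of the particles stay alive.
        order = sorted(range(n), key=lambda i: d[i])
        alive = [(theta[i], d[i]) for i in order[: int(keep_frac * n)]]
        eps = max(di for _, di in alive)
        sigma = max(statistics.pstdev([th for th, _ in alive]), 1e-6)
        # Replenish by resampling the alive set, then diversify by ABC-MCMC.
        new = []
        for _ in range(n):
            th, di = alive[rng.randrange(len(alive))]
            for _ in range(moves):
                prop = th + rng.gauss(0.0, sigma)
                if lo <= prop <= hi:                  # inside the uniform prior
                    dp = distance(s_obs, simulate(prop, rng))
                    if dp <= eps:                     # inside the tolerance
                        th, di = prop, dp
            new.append((th, di))
        theta = [th for th, _ in new]
        d = [di for _, di in new]
    return theta, eps
```

Each pass tightens eps and recentres the particle cloud, which is why the sequence of targets can reach tolerances that rejection ABC could not afford.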

15 Macroparasite Population Evolution

16 Macroparasite Population Evolution. [Figure: proportion of mature worms vs sacrifice time.]

17 Developing Summary Statistics
Nunes and Balding (2009): for all sets and subsets of summary statistics, carry out rejection ABC at a fixed acceptance rate, then compare the ABC approximations in terms of the concentration of the posterior distribution, using a non-parametric measure of entropy.
Fearnhead and Prangle (2012): an alternative approach, suited to iid cases.

18 Developing Summary Statistics using Indirect Inference
We want summary statistics that efficiently summarise the data, across different numbers of initial juveniles l_i and different sacrifice times.
An approach based on indirect inference (Gourieroux, Monfort and Renault, 1993):
- Propose an auxiliary model p_a(y | θ_a) whose parameter θ_a is easily estimated (e.g. an easy MLE).
- The auxiliary model should be flexible enough to provide a good description of the data.
- Simulate x_θ from the intractable target likelihood p(· | θ) and find the auxiliary MLE θ̂_a(x_θ).
- Estimate θ by the value whose θ̂_a(x_θ) is closest to θ̂_a(y).
As an alternative within ABC: the estimates of the auxiliary-model parameters fitted to the data become the summary statistics.
Here, models based on the Beta-Binomial are used to capture the variability.
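The mapping "data, auxiliary-model MLE, summary vector" is the core of the construction. A toy illustration with a Normal auxiliary model standing in for the talk's Beta-Binomial, chosen because its MLEs have closed form; all names are illustrative:

```python
import statistics

def aux_summaries(data):
    """MLEs of a Normal(mu, sigma^2) auxiliary model, used as ABC summary
    statistics (a closed-form stand-in for the Beta-Binomial auxiliary model)."""
    mu_hat = statistics.fmean(data)
    var_hat = statistics.pstdev(data) ** 2   # the MLE uses the 1/n variance
    return (mu_hat, var_hat)
```

In the real application the MLE has no closed form, so each simulated data set requires a (cheap) numerical fit of the auxiliary model.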

19 ABC Summary Statistics from Auxiliary Models
How to choose between different auxiliary models?
Either use data-analytic tools on the original data set (e.g. AIC), or use the Nunes and Balding approach based on the ABC approximation for each model / summary-statistic choice.
The former is far less computer-intensive but does not consider the ABC approximation.
Compare and contrast different Beta-Binomial models: the models are fitted by MLE and AIC is used to rank them. The models range over about 33 units of AIC.
How are the different fits (AIC) reflected in the ABC posteriors?

20 Auxiliary Beta-Binomial Model
The data show too much variation for a Binomial model; a Beta-Binomial model has an extra parameter to capture the dispersion:
p(m_i | α_i, β_i) = C(l_i, m_i) B(m_i + α_i, l_i − m_i + β_i) / B(α_i, β_i),
where C(l_i, m_i) is the binomial coefficient and B(·, ·) the Beta function.
Use the reparameterisation p_i = α_i/(α_i + β_i) and θ_i = 1/(α_i + β_i).
Relate the proportion p_i to time t_i, and the overdispersion parameter θ_i to the initial larvae l_i.
Compare various models: the best (AIC = 1897) against others with AIC = 1911 and AIC = 1930.
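The Beta-Binomial pmf above is evaluated stably on the log scale via log-gamma functions; a minimal sketch (the function name is mine):

```python
from math import lgamma, exp

def log_betabinom_pmf(m, l, alpha, beta):
    """Log pmf of the Beta-Binomial auxiliary model:
    p(m | alpha, beta) = C(l, m) B(m + alpha, l - m + beta) / B(alpha, beta)."""
    def log_beta(a, b):
        # log B(a, b) = log Gamma(a) + log Gamma(b) - log Gamma(a + b)
        return lgamma(a) + lgamma(b) - lgamma(a + b)
    log_choose = lgamma(l + 1) - lgamma(m + 1) - lgamma(l - m + 1)
    return log_choose + log_beta(m + alpha, l - m + beta) - log_beta(alpha, beta)
```

Summing this log-likelihood over the hosts (with α_i, β_i linked to t_i and l_i) gives the objective whose maximiser supplies the ABC summary statistics.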


22 Summary Statistics
For each model, compare the auxiliary estimates for the simulated data x, θ̂_a^x, and for the observations y, θ̂_a^y, with the Mahalanobis distance
ρ(y, x) = ρ(θ̂_a^y, θ̂_a^x) = sqrt( (θ̂_a^y − θ̂_a^x)^T S^{−1} (θ̂_a^y − θ̂_a^x) ),
where S is the covariance matrix of the MLE.
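With p auxiliary parameters the distance is a quadratic form in the MLE covariance. A dependency-free sketch that takes the precomputed inverse S⁻¹ (name and interface are mine):

```python
def mahalanobis(u, v, s_inv):
    """Mahalanobis distance sqrt((u - v)^T S^{-1} (u - v)) between two
    summary vectors, given the inverse covariance matrix s_inv."""
    d = [a - b for a, b in zip(u, v)]
    # quadratic form d^T S^{-1} d, expanded elementwise
    q = sum(d[i] * s_inv[i][j] * d[j]
            for i in range(len(d)) for j in range(len(d)))
    return q ** 0.5
```

With s_inv the identity this reduces to Euclidean distance; the covariance weighting downweights directions in which the auxiliary MLE is noisy.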

23 Results

24 SMC Algorithm Settings and Posterior Results
Algorithm settings: take N = 1000 particles; discard half at each iteration; finish with 3% acceptance for the MCMC kernel. Repeat the MCMC kernel to get about 99% overall acceptance at each iteration.
Fixed values: γ = 0.04, and µ_M also fixed.
Prior choices: ν ∼ U(0,1), µ_L ∼ U(0,1), µ_I ∼ U(0,2), β ∼ U(0,2).


27 Posterior Density for ν. [Figure: ABC posterior densities of ν (scale ×10⁻³) under the three auxiliary models, AIC 1897, 1911 and 1930.]

28 Posterior Density for µ_L. [Figure: ABC posterior densities of µ_L under the three auxiliary models.]

29 Posterior Density Differences
The ABC posterior is more concentrated for the worse (bigger) AIC, and the modes are very different.
The Nunes-Balding criterion for summary statistics would therefore prefer the bigger-AIC models.
The real test: the predictions from the different ABC-fitted models.

30 ABC Fit to Data
95 percent prediction intervals from the stochastic model using ABC modal estimates. [Figure, two panels of mature count vs autopsy time: left, 95% prediction intervals based on the best-AIC auxiliary Beta-Binomial model; right, based on the worst-AIC auxiliary Binomial mixture model.]
The ABC model fit based on the best-AIC auxiliary model captures the variability of the data.

31 Conclusions

32 Conclusions: Macroparasite Stochastic Model
µ_L and ν are more associated with the variance of the process; there are ABC differences between auxiliary models over the range of AIC.
The most concentrated ABC posteriors are not the best for model predictions.
The stochastic model using the ABC fit with the best-AIC auxiliary model for the summary statistics gives a good fit to the data.

33 Conclusions: ABC and Summary Statistics
Indirect inference through an easy-to-estimate (MLE + AIC) auxiliary model can be used to obtain ABC summary statistics.
Careful data analysis is involved in getting a good auxiliary model for the observed data: a statistician's strength!
The Nunes and Balding approach can be flawed where the ABC posteriors have different modes.
The Fearnhead and Prangle approach seems limited to iid cases; otherwise ad hoc choices of summaries are made.
An adaptive, self-tuning ABC SMC algorithm was presented.


39 Acknowledgements: Australian Research Council, Chris Drovandi, Malcolm Faddy.

40 Foresight
"It will be maintained that the end of an era has now been reached, as regards both statistical methods and computational techniques, and an outline of the way in which biometric techniques in genetical demography may be expected to develop will be given. Particular emphasis will be placed on the need to formulate sound methods of estimation by simulation on complex models." (Edwards AWF, 1967, Biometrics, 23, 176)



More information

Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization. Wolfram Burgard

Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization. Wolfram Burgard Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization Wolfram Burgard 1 Motivation Recall: Discrete filter Discretize the continuous state space High memory complexity

More information

Computer Vision Group Prof. Daniel Cremers. 11. Sampling Methods

Computer Vision Group Prof. Daniel Cremers. 11. Sampling Methods Prof. Daniel Cremers 11. Sampling Methods Sampling Methods Sampling Methods are widely used in Computer Science as an approximation of a deterministic algorithm to represent uncertainty without a parametric

More information

CS839: Probabilistic Graphical Models. Lecture 10: Learning with Partially Observed Data. Theo Rekatsinas

CS839: Probabilistic Graphical Models. Lecture 10: Learning with Partially Observed Data. Theo Rekatsinas CS839: Probabilistic Graphical Models Lecture 10: Learning with Partially Observed Data Theo Rekatsinas 1 Partially Observed GMs Speech recognition 2 Partially Observed GMs Evolution 3 Partially Observed

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Overview of Part Two Probabilistic Graphical Models Part Two: Inference and Learning Christopher M. Bishop Exact inference and the junction tree MCMC Variational methods and EM Example General variational

More information

MRF-based Algorithms for Segmentation of SAR Images

MRF-based Algorithms for Segmentation of SAR Images This paper originally appeared in the Proceedings of the 998 International Conference on Image Processing, v. 3, pp. 770-774, IEEE, Chicago, (998) MRF-based Algorithms for Segmentation of SAR Images Robert

More information

Computer Vision Group Prof. Daniel Cremers. 4. Probabilistic Graphical Models Directed Models

Computer Vision Group Prof. Daniel Cremers. 4. Probabilistic Graphical Models Directed Models Prof. Daniel Cremers 4. Probabilistic Graphical Models Directed Models The Bayes Filter (Rep.) (Bayes) (Markov) (Tot. prob.) (Markov) (Markov) 2 Graphical Representation (Rep.) We can describe the overall

More information

Computer Vision 2 Lecture 8

Computer Vision 2 Lecture 8 Computer Vision 2 Lecture 8 Multi-Object Tracking (30.05.2016) leibe@vision.rwth-aachen.de, stueckler@vision.rwth-aachen.de RWTH Aachen University, Computer Vision Group http://www.vision.rwth-aachen.de

More information

ECE521: Week 11, Lecture March 2017: HMM learning/inference. With thanks to Russ Salakhutdinov

ECE521: Week 11, Lecture March 2017: HMM learning/inference. With thanks to Russ Salakhutdinov ECE521: Week 11, Lecture 20 27 March 2017: HMM learning/inference With thanks to Russ Salakhutdinov Examples of other perspectives Murphy 17.4 End of Russell & Norvig 15.2 (Artificial Intelligence: A Modern

More information

University of Cambridge Engineering Part IIB Paper 4F10: Statistical Pattern Processing Handout 11: Non-Parametric Techniques.

University of Cambridge Engineering Part IIB Paper 4F10: Statistical Pattern Processing Handout 11: Non-Parametric Techniques. . Non-Parameteric Techniques University of Cambridge Engineering Part IIB Paper 4F: Statistical Pattern Processing Handout : Non-Parametric Techniques Mark Gales mjfg@eng.cam.ac.uk Michaelmas 23 Introduction

More information

Discussion on Bayesian Model Selection and Parameter Estimation in Extragalactic Astronomy by Martin Weinberg

Discussion on Bayesian Model Selection and Parameter Estimation in Extragalactic Astronomy by Martin Weinberg Discussion on Bayesian Model Selection and Parameter Estimation in Extragalactic Astronomy by Martin Weinberg Phil Gregory Physics and Astronomy Univ. of British Columbia Introduction Martin Weinberg reported

More information

Linear Modeling with Bayesian Statistics

Linear Modeling with Bayesian Statistics Linear Modeling with Bayesian Statistics Bayesian Approach I I I I I Estimate probability of a parameter State degree of believe in specific parameter values Evaluate probability of hypothesis given the

More information

Bootstrapping Methods

Bootstrapping Methods Bootstrapping Methods example of a Monte Carlo method these are one Monte Carlo statistical method some Bayesian statistical methods are Monte Carlo we can also simulate models using Monte Carlo methods

More information

Project Paper Introduction

Project Paper Introduction Project Paper Introduction Tracey Marsh Group Health Research Institute University of Washington, Department of Biostatistics Paper Estimating and Projecting Trends in HIV/AIDS Generalized Epidemics Using

More information

Estimation of Item Response Models

Estimation of Item Response Models Estimation of Item Response Models Lecture #5 ICPSR Item Response Theory Workshop Lecture #5: 1of 39 The Big Picture of Estimation ESTIMATOR = Maximum Likelihood; Mplus Any questions? answers Lecture #5:

More information

Non-parametric Approximate Bayesian Computation for Expensive Simulators

Non-parametric Approximate Bayesian Computation for Expensive Simulators Non-parametric Approximate Bayesian Computation for Expensive Simulators Steven Laan 6036031 Master s thesis 42 EC Master s programme Artificial Intelligence University of Amsterdam Supervisor Ted Meeds

More information

GAMs semi-parametric GLMs. Simon Wood Mathematical Sciences, University of Bath, U.K.

GAMs semi-parametric GLMs. Simon Wood Mathematical Sciences, University of Bath, U.K. GAMs semi-parametric GLMs Simon Wood Mathematical Sciences, University of Bath, U.K. Generalized linear models, GLM 1. A GLM models a univariate response, y i as g{e(y i )} = X i β where y i Exponential

More information

Estimation of Bilateral Connections in a Network: Copula vs. Maximum Entropy

Estimation of Bilateral Connections in a Network: Copula vs. Maximum Entropy Estimation of Bilateral Connections in a Network: Copula vs. Maximum Entropy Pallavi Baral and Jose Pedro Fique Department of Economics Indiana University at Bloomington 1st Annual CIRANO Workshop on Networks

More information

University of Cambridge Engineering Part IIB Paper 4F10: Statistical Pattern Processing Handout 11: Non-Parametric Techniques

University of Cambridge Engineering Part IIB Paper 4F10: Statistical Pattern Processing Handout 11: Non-Parametric Techniques University of Cambridge Engineering Part IIB Paper 4F10: Statistical Pattern Processing Handout 11: Non-Parametric Techniques Mark Gales mjfg@eng.cam.ac.uk Michaelmas 2015 11. Non-Parameteric Techniques

More information

Heterotachy models in BayesPhylogenies

Heterotachy models in BayesPhylogenies Heterotachy models in is a general software package for inferring phylogenetic trees using Bayesian Markov Chain Monte Carlo (MCMC) methods. The program allows a range of models of gene sequence evolution,

More information

Maximum Likelihood Network Topology Identification from Edge-based Unicast Measurements. About the Authors. Motivation and Main Point.

Maximum Likelihood Network Topology Identification from Edge-based Unicast Measurements. About the Authors. Motivation and Main Point. Maximum Likelihood Network Topology Identification from Edge-based Unicast Measurements Mark Coates McGill University Presented by Chunling Hu Rui Castro, Robert Nowak Rice University About the Authors

More information

MSA101/MVE Lecture 5

MSA101/MVE Lecture 5 MSA101/MVE187 2017 Lecture 5 Petter Mostad Chalmers University September 12, 2017 1 / 15 Importance sampling MC integration computes h(x)f (x) dx where f (x) is a probability density function, by simulating

More information

Bayesian model selection and diagnostics

Bayesian model selection and diagnostics Bayesian model selection and diagnostics A typical Bayesian analysis compares a handful of models. Example 1: Consider the spline model for the motorcycle data, how many basis functions? Example 2: Consider

More information

Optimal designs for comparing curves

Optimal designs for comparing curves Optimal designs for comparing curves Holger Dette, Ruhr-Universität Bochum Maria Konstantinou, Ruhr-Universität Bochum Kirsten Schorning, Ruhr-Universität Bochum FP7 HEALTH 2013-602552 Outline 1 Motivation

More information

Modelling and Quantitative Methods in Fisheries

Modelling and Quantitative Methods in Fisheries SUB Hamburg A/553843 Modelling and Quantitative Methods in Fisheries Second Edition Malcolm Haddon ( r oc) CRC Press \ y* J Taylor & Francis Croup Boca Raton London New York CRC Press is an imprint of

More information

MCMC for doubly-intractable distributions

MCMC for doubly-intractable distributions MCMC for doubly-intractable distributions Iain Murray Gatsby Computational Neuroscience Unit University College London i.murray@gatsby.ucl.ac.uk Zoubin Ghahramani Department of Engineering University of

More information

Fitting Social Network Models Using the Varying Truncation S. Truncation Stochastic Approximation MCMC Algorithm

Fitting Social Network Models Using the Varying Truncation S. Truncation Stochastic Approximation MCMC Algorithm Fitting Social Network Models Using the Varying Truncation Stochastic Approximation MCMC Algorithm May. 17, 2012 1 This talk is based on a joint work with Dr. Ick Hoon Jin Abstract The exponential random

More information

Missing Data and Imputation

Missing Data and Imputation Missing Data and Imputation NINA ORWITZ OCTOBER 30 TH, 2017 Outline Types of missing data Simple methods for dealing with missing data Single and multiple imputation R example Missing data is a complex

More information

GT "Calcul Ensembliste"

GT Calcul Ensembliste GT "Calcul Ensembliste" Beyond the bounded error framework for non linear state estimation Fahed Abdallah Université de Technologie de Compiègne 9 Décembre 2010 Fahed Abdallah GT "Calcul Ensembliste" 9

More information

Pattern Recognition. Kjell Elenius. Speech, Music and Hearing KTH. March 29, 2007 Speech recognition

Pattern Recognition. Kjell Elenius. Speech, Music and Hearing KTH. March 29, 2007 Speech recognition Pattern Recognition Kjell Elenius Speech, Music and Hearing KTH March 29, 2007 Speech recognition 2007 1 Ch 4. Pattern Recognition 1(3) Bayes Decision Theory Minimum-Error-Rate Decision Rules Discriminant

More information

Inference and Representation

Inference and Representation Inference and Representation Rachel Hodos New York University Lecture 5, October 6, 2015 Rachel Hodos Lecture 5: Inference and Representation Today: Learning with hidden variables Outline: Unsupervised

More information

Approximate Bayesian Computation methods and their applications for hierarchical statistical models. University College London, 2015

Approximate Bayesian Computation methods and their applications for hierarchical statistical models. University College London, 2015 Approximate Bayesian Computation methods and their applications for hierarchical statistical models University College London, 2015 Contents 1. Introduction 2. ABC methods 3. Hierarchical models 4. Application

More information

A Short History of Markov Chain Monte Carlo

A Short History of Markov Chain Monte Carlo A Short History of Markov Chain Monte Carlo Christian Robert and George Casella 2010 Introduction Lack of computing machinery, or background on Markov chains, or hesitation to trust in the practicality

More information

Hierarchical Bayesian Modeling with Ensemble MCMC. Eric B. Ford (Penn State) Bayesian Computing for Astronomical Data Analysis June 12, 2014

Hierarchical Bayesian Modeling with Ensemble MCMC. Eric B. Ford (Penn State) Bayesian Computing for Astronomical Data Analysis June 12, 2014 Hierarchical Bayesian Modeling with Ensemble MCMC Eric B. Ford (Penn State) Bayesian Computing for Astronomical Data Analysis June 12, 2014 Simple Markov Chain Monte Carlo Initialise chain with θ 0 (initial

More information

Stochastic Simulation: Algorithms and Analysis

Stochastic Simulation: Algorithms and Analysis Soren Asmussen Peter W. Glynn Stochastic Simulation: Algorithms and Analysis et Springer Contents Preface Notation v xii I What This Book Is About 1 1 An Illustrative Example: The Single-Server Queue 1

More information

Mesh segmentation. Florent Lafarge Inria Sophia Antipolis - Mediterranee

Mesh segmentation. Florent Lafarge Inria Sophia Antipolis - Mediterranee Mesh segmentation Florent Lafarge Inria Sophia Antipolis - Mediterranee Outline What is mesh segmentation? M = {V,E,F} is a mesh S is either V, E or F (usually F) A Segmentation is a set of sub-meshes

More information

Bayesian Annealed Sequential Importance Sampling (BASIS): an unbiased version of Transitional Markov Chain Monte Carlo

Bayesian Annealed Sequential Importance Sampling (BASIS): an unbiased version of Transitional Markov Chain Monte Carlo Bayesian Annealed Sequential Importance Sampling (BASIS): an unbiased version of Transitional Markov Chain Monte Carlo Stephen Wu Postdoctoral CSELab ETH-Zurich CH -892 Switzerland Email: stewu@ism.ac.jp

More information

10703 Deep Reinforcement Learning and Control

10703 Deep Reinforcement Learning and Control 10703 Deep Reinforcement Learning and Control Russ Salakhutdinov Machine Learning Department rsalakhu@cs.cmu.edu Policy Gradient I Used Materials Disclaimer: Much of the material and slides for this lecture

More information

Monte Carlo Optimization Introduction

Monte Carlo Optimization Introduction Monte Carlo Methods with R: Monte Carlo Optimization [80] Monte Carlo Optimization Introduction Optimization problems can mostly be seen as one of two kinds: Find the extrema of a function h(θ) over a

More information

Part I. Hierarchical clustering. Hierarchical Clustering. Hierarchical clustering. Produces a set of nested clusters organized as a

Part I. Hierarchical clustering. Hierarchical Clustering. Hierarchical clustering. Produces a set of nested clusters organized as a Week 9 Based in part on slides from textbook, slides of Susan Holmes Part I December 2, 2012 Hierarchical Clustering 1 / 1 Produces a set of nested clusters organized as a Hierarchical hierarchical clustering

More information

Sensor Tasking and Control

Sensor Tasking and Control Sensor Tasking and Control Outline Task-Driven Sensing Roles of Sensor Nodes and Utilities Information-Based Sensor Tasking Joint Routing and Information Aggregation Summary Introduction To efficiently

More information

Bayesian Segmentation and Motion Estimation in Video Sequences using a Markov-Potts Model

Bayesian Segmentation and Motion Estimation in Video Sequences using a Markov-Potts Model Bayesian Segmentation and Motion Estimation in Video Sequences using a Markov-Potts Model Patrice BRAULT and Ali MOHAMMAD-DJAFARI LSS, Laboratoire des Signaux et Systemes, CNRS UMR 8506, Supelec, Plateau

More information

2. A Bernoulli distribution has the following likelihood function for a data set D: N 1 N 1 + N 0

2. A Bernoulli distribution has the following likelihood function for a data set D: N 1 N 1 + N 0 Machine Learning Fall 2015 Homework 1 Homework must be submitted electronically following the instructions on the course homepage. Make sure to explain you reasoning or show your derivations. Except for

More information

A One-Pass Sequential Monte Carlo Method for Bayesian Analysis of Massive Datasets

A One-Pass Sequential Monte Carlo Method for Bayesian Analysis of Massive Datasets A One-Pass Sequential Monte Carlo Method for Bayesian Analysis of Massive Datasets Suhrid Balakrishnan and David Madigan Department of Computer Science, Rutgers University, Piscataway, NJ 08854, USA Department

More information

Samuel Coolidge, Dan Simon, Dennis Shasha, Technical Report NYU/CIMS/TR

Samuel Coolidge, Dan Simon, Dennis Shasha, Technical Report NYU/CIMS/TR Detecting Missing and Spurious Edges in Large, Dense Networks Using Parallel Computing Samuel Coolidge, sam.r.coolidge@gmail.com Dan Simon, des480@nyu.edu Dennis Shasha, shasha@cims.nyu.edu Technical Report

More information

Global modelling of air pollution using multiple data sources

Global modelling of air pollution using multiple data sources Global modelling of air pollution using multiple data sources Matthew Thomas SAMBa, University of Bath Email: M.L.Thomas@bath.ac.uk November 11, 015 1/ 3 OUTLINE Motivation Data Sources Existing Approaches

More information

Robotics. Lecture 5: Monte Carlo Localisation. See course website for up to date information.

Robotics. Lecture 5: Monte Carlo Localisation. See course website  for up to date information. Robotics Lecture 5: Monte Carlo Localisation See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Andrew Davison Department of Computing Imperial College London Review:

More information

CHAPTER 1 INTRODUCTION

CHAPTER 1 INTRODUCTION Introduction CHAPTER 1 INTRODUCTION Mplus is a statistical modeling program that provides researchers with a flexible tool to analyze their data. Mplus offers researchers a wide choice of models, estimators,

More information

L10. PARTICLE FILTERING CONTINUED. NA568 Mobile Robotics: Methods & Algorithms

L10. PARTICLE FILTERING CONTINUED. NA568 Mobile Robotics: Methods & Algorithms L10. PARTICLE FILTERING CONTINUED NA568 Mobile Robotics: Methods & Algorithms Gaussian Filters The Kalman filter and its variants can only model (unimodal) Gaussian distributions Courtesy: K. Arras Motivation

More information

Time Series Analysis by State Space Methods

Time Series Analysis by State Space Methods Time Series Analysis by State Space Methods Second Edition J. Durbin London School of Economics and Political Science and University College London S. J. Koopman Vrije Universiteit Amsterdam OXFORD UNIVERSITY

More information

A Bayesian approach to artificial neural network model selection

A Bayesian approach to artificial neural network model selection A Bayesian approach to artificial neural network model selection Kingston, G. B., H. R. Maier and M. F. Lambert Centre for Applied Modelling in Water Engineering, School of Civil and Environmental Engineering,

More information

Modeling of Complex Social. MATH 800 Fall 2011

Modeling of Complex Social. MATH 800 Fall 2011 Modeling of Complex Social Systems MATH 800 Fall 2011 Complex SocialSystems A systemis a set of elements and relationships A complex system is a system whose behavior cannot be easily or intuitively predicted

More information

x n+1 = x n f(x n) f (x n ), (1)

x n+1 = x n f(x n) f (x n ), (1) 1 Optimization The field of optimization is large and vastly important, with a deep history in computer science (among other places). Generally, an optimization problem is defined by having a score function

More information

Warped Mixture Models

Warped Mixture Models Warped Mixture Models Tomoharu Iwata, David Duvenaud, Zoubin Ghahramani Cambridge University Computational and Biological Learning Lab March 11, 2013 OUTLINE Motivation Gaussian Process Latent Variable

More information

STATISTICS (STAT) Statistics (STAT) 1

STATISTICS (STAT) Statistics (STAT) 1 Statistics (STAT) 1 STATISTICS (STAT) STAT 2013 Elementary Statistics (A) Prerequisites: MATH 1483 or MATH 1513, each with a grade of "C" or better; or an acceptable placement score (see placement.okstate.edu).

More information

Chapter 3. Bootstrap. 3.1 Introduction. 3.2 The general idea

Chapter 3. Bootstrap. 3.1 Introduction. 3.2 The general idea Chapter 3 Bootstrap 3.1 Introduction The estimation of parameters in probability distributions is a basic problem in statistics that one tends to encounter already during the very first course on the subject.

More information