Package gibbs.met documentation

Version 1.1-2
Date March 26, 2008
Title Naive Gibbs Sampling with Metropolis Steps
Author Longhai Li <longhai@math.usask.ca>
Maintainer Longhai Li <longhai@math.usask.ca>
Depends R (>= 2.5.1)
Description This package provides two generic functions for performing Markov chain sampling in a naive way for a user-defined target distribution involving only continuous variables. The function gibbs_met performs Gibbs sampling, with each 1-dimensional conditional distribution sampled by a Metropolis update using a Gaussian proposal distribution centered at the previous state. The function met_gaussian updates the whole state at once with a Metropolis step, using an independent Gaussian proposal distribution centered at the previous state. The sampling is carried out without any special tricks for improving efficiency; the package is aimed only at routine applications in moderate-dimensional problems.
License GPL (>= 2)
URL http://www.r-project.org, http://math.usask.ca/~longhai

R topics documented:
gibbs-metropolis
Index

gibbs-metropolis    Gibbs sampling with Metropolis steps and multivariate Metropolis sampling

Description

The function gibbs_met performs Gibbs sampling, with each 1-dimensional conditional distribution sampled by a Metropolis update using a Gaussian proposal distribution centered at the previous state. The function met_gaussian updates the whole state at once with a Metropolis step, using an independent Gaussian proposal distribution centered at the previous state. The sampling is carried out without any special tricks for improving efficiency; the functions are written for routine applications in moderate-dimensional problems.

Usage

gibbs_met(log_f, p, x0, iters_mc, iters_met, stepsizes_met,
          iters_per.iter = 1, ...)
met_gaussian(log_f, iters, p, x0, stepsizes, iters_per.iter = 1, ...)

Arguments

log_f           the log of the density function from which one wants to sample.
p               the number of variables to be sampled.
x0              the initial value.
iters_mc, iters the number of iterations of Gibbs sampling or Metropolis updating.
iters_per.iter  for each transition counted by iters or iters_mc, the Markov chain
                sampling is run iters_per.iter times. This argument is used to
                avoid saving the whole Markov chain.
iters_met       the number of Metropolis iterations used for each 1-dimensional
                sampling.
stepsizes_met   a vector of length p, with stepsizes_met[i] being the standard
                deviation of the Gaussian proposal for updating the i-th variable.
stepsizes       the same as stepsizes_met, for the function met_gaussian.
...             extra arguments needed to compute log_f.

Value

A matrix of dimension (iters_mc + 1) * p (for gibbs_met) or (iters + 1) * p (for met_gaussian) is returned, with each row holding the state after one iteration.
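The single-coordinate Metropolis update described above can be sketched in a few lines of base R. This is an illustrative sketch only (the helper met_step_1d below is not part of the package), assuming a symmetric Gaussian random-walk proposal accepted with the usual Metropolis probability:

```r
## One Metropolis update of a scalar x with a Gaussian proposal
## centered at the previous state (illustration, not package code)
met_step_1d <- function(log_f, x, stepsize, ...) {
  x_prop <- x + rnorm(1, 0, stepsize)            # propose from N(x, stepsize^2)
  log_ratio <- log_f(x_prop, ...) - log_f(x, ...) # symmetric proposal cancels
  if (log(runif(1)) < log_ratio) x_prop else x    # accept or stay put
}

## example: sampling from a standard normal via its log density -z^2/2
set.seed(1)
x <- 0
draws <- numeric(2000)
for (i in 1:2000) {
  x <- met_step_1d(function(z) -z^2 / 2, x, stepsize = 1)
  draws[i] <- x
}
```

gibbs_met applies such an update iters_met times to each of the p coordinates in turn, holding the other coordinates fixed, whereas met_gaussian proposes a move of the whole p-dimensional state at once.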

Examples

## demonstration of sampling from a bivariate normal distribution

## the function computing the log density of a multivariate normal
## x  --- a vector; the log p.d.f. at x will be computed
## mu --- the mean vector of the multivariate normal distribution
## A  --- the inverse covariance matrix of the multivariate normal distribution
log_pdf_mnormal <- function(x, mu, A) {
  0.5 * (-length(mu) * log(2 * pi) + sum(log(svd(A)$d)) -
         sum((A %*% (x - mu)) * (x - mu)))
}

## sampling from a bivariate normal distribution with correlation 0.1,
## both marginal standard deviations 1, and mean vector (0, 5)
A <- solve(matrix(c(1, 0.1, 0.1, 1), 2, 2))
mc_mvn <- gibbs_met(log_f = log_pdf_mnormal, p = 2, x0 = c(0, 0),
                    iters_mc = 1000, iters_met = 5,
                    stepsizes_met = c(0.5, 0.5), mu = c(0, 5), A = A)

postscript("mc_mvn_lowcor.eps", width = 7, height = 8, horizontal = FALSE)
par(mfrow = c(2, 2), oma = c(0, 0, 1, 0))

## looking at the trace of the Markov chain in the first 100 iterations
plot(mc_mvn[1:100, 1], mc_mvn[1:100, 2], type = "b", pch = 20,
     main = "Markov chain trace of both variables")
plot(mc_mvn[, 1], type = "b", pch = 20,
     main = "Markov chain trace of the 1st variable")

## looking at the QQ plot of the samples of the 1st variable
qqnorm(mc_mvn[-(1:50), 1], main = "Normal QQ plot of the 1st variable")

## looking at the ACF of the samples of the 1st variable
acf(mc_mvn[, 1], main = "ACF plot of the 1st variable")

title(main = "Gibbs sampling for a bivariate normal with correlation 0.1",
      outer = TRUE)
dev.off()

## checking the correlation of the samples
cat("The sample correlation is",
    cor(mc_mvn[-(1:50), 1], mc_mvn[-(1:50), 2]), "\n")

## demonstration of sampling from a mixture of two bivariate normal distributions

## the function computing the log density of a mixture of two multivariate
## normals with mixture proportion 0.5
## x        --- a vector; the log p.d.f. at x will be computed
## mu1, mu2 --- the mean vectors of the component normal distributions
## A1, A2   --- the inverse covariance matrices of the component distributions
log_pdf_twonormal <- function(x, mu1, A1, mu2, A2) {
  log_sum_exp(c(log_pdf_mnormal(x, mu1, A1), log_pdf_mnormal(x, mu2, A2)))
}

log_sum_exp <- function(lx) {
  ml <- max(lx)
  ml + log(sum(exp(lx - ml)))
}

## set the parameters defining the mixture bivariate distribution
A1 <- solve(matrix(c(1, 0.1, 0.1, 1), 2, 2))
A2 <- solve(matrix(c(1, 0.1, 0.1, 1), 2, 2))
mu1 <- c(0, 0)
mu2 <- c(4, 4)

## performing Gibbs sampling
mc_mvn <- gibbs_met(log_f = log_pdf_twonormal, p = 2, x0 = c(0, 0),
                    iters_mc = 4000, iters_met = 5,
                    stepsizes_met = c(0.5, 0.5),
                    mu1 = mu1, mu2 = mu2, A1 = A1, A2 = A2)

postscript("mc_mvn_closemix.eps", width = 7, height = 8, horizontal = FALSE)
par(mfrow = c(2, 2), oma = c(0, 0, 2, 0))

## looking at the trace of the Markov chain
plot(mc_mvn[, 1], mc_mvn[, 2], type = "b", pch = 20,
     main = "Markov chain trace of both variables")
plot(mc_mvn[, 1], type = "b", pch = 20,
     main = "Markov chain trace of the 1st variable")
plot(mc_mvn[, 2], type = "b", pch = 20,
     main = "Markov chain trace of the 2nd variable")

## looking at the ACF of the samples of the 1st variable
acf(mc_mvn[, 1], main = "ACF plot of the 1st variable")

title(main = "Gibbs sampling for a mixture of two bivariate normals with locations (0,0) and (4,4)",
      outer = TRUE)
dev.off()

## checking the correlation of the samples
cat("The sample correlation is",
    cor(mc_mvn[-(1:50), 1], mc_mvn[-(1:50), 2]), "\n")

###############################################################################
## Sampling from a mixture bivariate normal distribution with Metropolis method
###############################################################################

## set the parameters defining the mixture bivariate distribution

A1 <- solve(matrix(c(1, 0.1, 0.1, 1), 2, 2))
A2 <- solve(matrix(c(1, 0.1, 0.1, 1), 2, 2))
mu1 <- c(0, 0)
mu2 <- c(6, 6)

## performing Metropolis sampling
mc_mvn <- met_gaussian(log_f = log_pdf_twonormal, p = 2, x0 = c(0, 0),
                       iters = 4000, iters_per.iter = 10,
                       stepsizes = c(1, 1),
                       mu1 = mu1, mu2 = mu2, A1 = A1, A2 = A2)

postscript("mc_mvn_farmix_met.eps", width = 7, height = 8, horizontal = FALSE)
par(mfrow = c(2, 2), oma = c(0, 0, 2, 0))

## looking at the trace of the Markov chain
plot(mc_mvn[, 1], mc_mvn[, 2], type = "b", pch = 20,
     main = "Markov chain trace of both variables")
plot(mc_mvn[, 1], type = "b", pch = 20,
     main = "Markov chain trace of the 1st variable")
plot(mc_mvn[, 2], type = "b", pch = 20,
     main = "Markov chain trace of the 2nd variable")

## looking at the ACF of the samples of the 1st variable
acf(mc_mvn[, 1], main = "ACF plot of the 1st variable")

title(main = "Sampling with Metropolis method for a mixture of two bivariate normals with locations (0,0) and (6,6)",
      outer = TRUE)
dev.off()

## checking the correlation of the samples
cat("The sample correlation is",
    cor(mc_mvn[-(1:50), 1], mc_mvn[-(1:50), 2]), "\n")

Index

*Topic multivariate
    gibbs-metropolis

begin.gibbs.met (gibbs-metropolis)
gibbs-metropolis
gibbs_met (gibbs-metropolis)
met_gaussian (gibbs-metropolis)