
Everything you ever wanted to know about BUGS, R2winBUGS, and Adaptive Rejection Sampling

A Presentation by Keith Betts

About BUGS

Bayesian inference Using Gibbs Sampling
Developed by the UK Medical Research Council and the Imperial College of Science, Technology and Medicine, London.

http://www.mrc-bsu.cam.ac.uk/bugs

Why Use BUGS

Sophisticated implementation of Markov chain Monte Carlo for any given model
No derivation required
Useful for double-checking hand-coded results
Great for problems with no exact analytic solution

What does every BUGS file need?

Model: specify the likelihood and prior distributions
Data: external data, either as a rectangular array or as an R data type
Initial Values: starting values for the MCMC parameters

Model

A syntactical representation of the model, in which the distributional forms of the data and parameters are specified.

~ assigns distributions
<- assigns deterministic relations
for loops assign i.i.d. data
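As a minimal sketch of these three constructs, here is a toy BUGS model held in an R character string (the model itself is made up for illustration: an i.i.d. normal sample with unknown mean mu; the file name is also illustrative):

toy_model <- "
model {
  for (i in 1:N) {          # for loop: the y[i] are i.i.d. given mu
    y[i] ~ dnorm(mu, 1)     # '~' assigns a distribution (second argument is a precision)
  }
  mu ~ dnorm(0, 0.001)      # vague prior on the mean
  mu.sq <- mu * mu          # '<-' assigns a deterministic relation
}"
writeLines(toy_model, "toy_model.txt")   # save for use as a BUGS model file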

Distributions

Syntax can be found in the User Manual, under Distributions
Parameterization may differ from R: in dnorm(μ, τ), τ is the precision (not the standard deviation)
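For example, a quick check of the correspondence in R (the numbers are arbitrary):

sigma <- 2
tau   <- 1 / sigma^2             # the precision that BUGS's dnorm(mu, tau) expects
dnorm(1, mean = 0, sd = sigma)   # the density R computes with sd = 2
# The same distribution in a BUGS model would be written y ~ dnorm(0, 0.25).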

Data

Place the data and global variables in one list
Vectors are written just as in R, with c(., .)
Matrices require the structure() command (see the sketch below)
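A sketch of a WinBUGS data list (the variable names and numbers here are purely illustrative). One caution: WinBUGS reads .Data row by row, whereas R fills arrays column by column, so the orientation should be double-checked when preparing matrices by hand.

list(
  x = c(1, 2, 3, 4),                        # a vector, written just as in R
  Y = structure(.Data = c(1, 2, 3,
                          4, 5, 6),
                .Dim  = c(2, 3)),           # a 2 x 3 matrix
  N = 2
)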

Initial Values

User-specified initial values can be supplied for multiple chains

Put starting values for all variables in one list statement

BUGS can generate its own starting values
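For example, starting values for the two parameters of the pump-failure model used later in the talk, one list per chain (the second chain's values are arbitrary illustrations):

list(alpha = 1, beta = 1)        # chain 1
list(alpha = 0.5, beta = 2)      # chain 2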

How to run

Place your Model, Data, and Initial Values in one file.

Open “Specification” from the Model menu
Highlight the “model” statement
Click “Check Model”
Highlight “list” in the Data section
Click “Load Data”

How to run continued

Still in Model Specification:
Click “Compile”
Choose how many chains to run
Highlight “list” in the Initial Values section
Click “load inits”, or alternatively click “gen inits”

How to run (Part 3)

Open “Samples” from the “Inference” menu
Enter each variable of interest (one at a time) into the node box, and click “set”
Open “Update” from the “Model” menu
Enter how many iterations to run

Inference

Open “Samples” from the “Inference” menu
Enter * in the node box
Click “History” to view trace plots
Click “Density” for density plots
Click “Stats” for summary statistics
Change the value of “beg” to discard the burn-in

Example 1

Poisson-Gamma Model
Data: number of failures in power plant pumps
Model:
The number of failures xi follows Poisson(θi ti)
The failure rate θi follows Gamma(α, β)
Prior for α is Exp(1)
Prior for β is Gamma(0.1, 1)

Example 1 Continued:

Computational Issues:
The Gamma is the natural conjugate prior for the Poisson distribution
The conditional posterior for β follows a Gamma distribution
The conditional posterior for α is non-standard (a sketch of the algebra follows)
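To make the conjugacy concrete, here is a sketch of the full conditional for β under the priors above (with N = 10 pumps); the Gamma(0.1, 1) prior combines with the Gamma(α, β) terms for the θi:

\[
p(\beta \mid \alpha, \theta) \;\propto\; \beta^{0.1-1} e^{-\beta} \prod_{i=1}^{N} \beta^{\alpha} e^{-\beta \theta_i}
\;\propto\; \beta^{\,0.1 + N\alpha - 1}\, e^{-\beta \left(1 + \sum_{i} \theta_i\right)},
\]

that is, β | α, θ follows Gamma(0.1 + Nα, 1 + Σi θi). The conditional for α, by contrast, involves Γ(α) and has no standard form, which is why a general method such as adaptive rejection sampling (discussed later in the talk) is needed.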

Model Step

model {
  for (i in 1 : N) {
    theta[i] ~ dgamma(alpha, beta)   # failure rate for pump i
    lambda[i] <- theta[i] * t[i]     # expected number of failures
    x[i] ~ dpois(lambda[i])          # observed failure count
  }
  alpha ~ dexp(1)
  beta ~ dgamma(0.1, 1.0)
}

Data Step and Initial Values

Data:
list(t = c(94.3, 15.7, 62.9, 126, 5.24, 31.4, 1.05, 1.05, 2.1, 10.5),
     x = c(5, 1, 5, 14, 3, 19, 1, 1, 4, 22),
     N = 10)

Initial Values:
list(alpha = 1, beta = 1)

Example 2

Data: 30 young rats whose weights were measured weekly for five weeks.

Yij is the weight of the ith rat measured at age xj.

Assume a random effects linear growth curve model
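Written out, the assumed model is the usual random-effects growth curve (shown here on the variance scale; note that the BUGS code on the next slide passes precisions as the second argument of dnorm, and writes the ages xj as t[j]):

\[
Y_{ij} \sim N\!\left(\alpha_i + \beta_i\,(x_j - \bar{x}),\; \sigma_Y^2\right),
\qquad
\alpha_i \sim N(\mu_\alpha, \sigma_\alpha^2),
\qquad
\beta_i \sim N(\mu_\beta, \sigma_\beta^2).
\]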

Example 2: Model

model {
  for (i in 1 : N) {
    for (j in 1 : J) {
      mu[i, j] <- alpha[i] + beta[i] * (t[j] - tbar)
      Y[i, j] ~ dnorm(mu[i, j], sigma.y)
    }
    alpha[i] ~ dnorm(mu.alpha, sigma.alpha)
    beta[i] ~ dnorm(mu.beta, sigma.beta)
  }
  # Note: dnorm's second argument is a precision, so the sigma.* nodes here are
  # precisions despite the name; dgamma(0.001, 0.001) is the usual vague prior
  # on a precision.
  sigma.y ~ dgamma(0.001, 0.001)
  mu.alpha ~ dunif(-1.0E9, 1.0E9)
  sigma.alpha ~ dgamma(0.001, 0.001)
  mu.beta ~ dunif(-1.0E9, 1.0E9)
  sigma.beta ~ dgamma(0.001, 0.001)
}

R2winBUGS

An R package that runs BUGS from within R
Offers the computational advantages of BUGS combined with the statistical and graphical capabilities of R

Make Model file

A model file is required
It must contain BUGS syntax
It can either be written in advance or generated by R itself through the write.model() function (see the sketch below)
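A sketch of generating the model file from R with write.model(); the body is the pump-failure model from Example 1, and "pumps.txt" is just an illustrative file name:

library(R2WinBUGS)

pump.model <- function() {
  for (i in 1:N) {
    theta[i] ~ dgamma(alpha, beta)   # per-pump failure rate
    lambda[i] <- theta[i] * t[i]     # expected number of failures
    x[i] ~ dpois(lambda[i])          # observed failure count
  }
  alpha ~ dexp(1)
  beta ~ dgamma(0.1, 1.0)
}
write.model(pump.model, "pumps.txt")   # deparses the function body into BUGS syntax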

Initialize

Both the data and the initial values are stored as lists
Create a parameter vector with the names of the parameters to be tracked
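For instance, the data, initial values, and tracked parameters for the pump-failure example, in the form R2WinBUGS expects (the choice of monitored nodes is illustrative):

pump.data   <- list(t = c(94.3, 15.7, 62.9, 126, 5.24, 31.4, 1.05, 1.05, 2.1, 10.5),
                    x = c(5, 1, 5, 14, 3, 19, 1, 1, 4, 22),
                    N = 10)
pump.inits  <- list(list(alpha = 1, beta = 1))   # one list per chain
pump.params <- c("alpha", "beta", "theta")       # nodes to monitor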

Run

bugs(datafile, initial.vals, parameters, modelfile,
     n.chains = 1, n.iter = 5000, n.burnin = 2000, n.thin = 1,
     bugs.directory = "c:/Program Files/WinBUGS14/",
     working.directory = NULL)

Extract results from $sims.matrix
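Continuing with the pump-failure objects sketched above, one way to run the example and inspect the draws (the WinBUGS path is the one from the slide and may need adjusting for a given installation):

library(R2WinBUGS)
fit <- bugs(pump.data, pump.inits, pump.params, "pumps.txt",
            n.chains = 1, n.iter = 5000, n.burnin = 2000, n.thin = 1,
            bugs.directory = "c:/Program Files/WinBUGS14/")

post <- fit$sims.matrix                        # one row per saved draw, one column per monitored node
colMeans(post[, c("alpha", "beta")])           # posterior means
quantile(post[, "alpha"], c(0.025, 0.975))     # 95% credible interval for alpha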

How BUGS works

BUGS works out each complete conditional and chooses a sampler for it depending on its form:
Closed-form distributions
Adaptive rejection sampling
Slice sampling
Metropolis algorithm

Recall Rejection Sampling
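As a refresher: to draw from a target density f, propose x from a density g satisfying f(x) ≤ M g(x) for all x, and accept x with probability f(x) / (M g(x)). A minimal sketch, with the standard normal as target and a Cauchy proposal (these choices are purely for illustration):

# Plain rejection sampling: target = standard normal, proposal = Cauchy.
rejection_sample <- function(n) {
  out <- numeric(n)
  got <- 0
  M <- sqrt(2 * pi / exp(1))    # sup over x of dnorm(x) / dcauchy(x), attained at x = +/- 1
  while (got < n) {
    x <- rcauchy(1)                                # propose from the envelope density
    if (runif(1) <= dnorm(x) / (M * dcauchy(x))) { # accept with probability f(x) / (M g(x))
      got <- got + 1
      out[got] <- x
    }
  }
  out
}

set.seed(1)
summary(rejection_sample(1000))   # should resemble a standard normal sample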

Adaptive Rejection Sampling (ARS)

ARS is a method for efficiently sampling from any univariate probability density function which is log-concave.

Useful in applications of Gibbs sampling, where full-conditional distributions are algebraically messy yet often log-concave

Idea

ARS works by constructing an envelope function of the log of the target density
The envelope is used for rejection sampling
Whenever a point is rejected by ARS, the envelope is updated to correspond more closely to the true log density, which reduces the chance of rejecting subsequent points

Results

As the envelope function adapts to the shape of the target density, sampling becomes progressively more efficient as more points are drawn.

The accepted points are independent draws from the exact target density.
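To make the envelope-refinement idea concrete, here is a simplified sketch in R of the tangent-based construction of Gilks and Wild (1992), illustrated on the standard normal. It omits the lower squeezing function and any numerical safeguards, so it illustrates the idea rather than reproducing BUGS's implementation:

ars_sketch <- function(n, h, hprime, x0, lb = -Inf, ub = Inf) {
  # h, hprime: vectorised log density (up to a constant) and its derivative.
  # x0: initial abscissae; with unbounded support they must bracket the mode,
  #     so the leftmost tangent slopes up and the rightmost slopes down.
  xs <- sort(x0)

  # Area under exp(icept + slope * x) between l and r (handles infinite ends).
  seg_mass <- function(icept, slope, l, r) {
    if (slope == 0) return(exp(icept) * (r - l))
    exp(icept) * (exp(slope * r) - exp(slope * l)) / slope
  }
  # Inverse-CDF draw from the density proportional to exp(slope * x) on (l, r).
  seg_draw <- function(slope, l, r) {
    u <- runif(1)
    if (slope == 0) return(l + u * (r - l))
    log((1 - u) * exp(slope * l) + u * exp(slope * r)) / slope
  }

  draws <- numeric(n)
  got <- 0
  while (got < n) {
    hx <- h(xs); dhx <- hprime(xs); k <- length(xs)
    # Intersections of adjacent tangents are the breakpoints of the upper hull.
    z <- (hx[-1] - hx[-k] - xs[-1] * dhx[-1] + xs[-k] * dhx[-k]) / (dhx[-k] - dhx[-1])
    z <- c(lb, z, ub)
    icept <- hx - dhx * xs                   # tangent j is icept[j] + dhx[j] * x
    mass  <- mapply(seg_mass, icept, dhx, z[-(k + 1)], z[-1])

    j <- sample.int(k, 1, prob = mass)       # pick a piece of the envelope ...
    x <- seg_draw(dhx[j], z[j], z[j + 1])    # ... and propose from that piece
    if (log(runif(1)) <= h(x) - (icept[j] + dhx[j] * x)) {
      got <- got + 1; draws[got] <- x        # accept against the envelope
    } else {
      xs <- sort(c(xs, x))                   # reject: add x as a new tangent point
    }
  }
  draws
}

set.seed(1)
y <- ars_sketch(1000, h = function(x) -x^2 / 2, hprime = function(x) -x,
                x0 = c(-1.5, 0.5, 2))
summary(y)   # should resemble a standard normal sample

The adaptive step is the final else branch: every rejected point becomes a new tangent point, so the piecewise-exponential envelope hugs the target more tightly as sampling proceeds, while each accepted value remains an exact, independent draw from the target.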

References

W. R. Gilks, P. Wild (1992), "Adaptive Rejection Sampling for Gibbs Sampling," Applied Statistics, Vol. 41, Issue 2, 337-348.
