Introduction of Markov Chain Monte Carlo
Jeongkyun Lee

Posted on 16-Dec-2015

TRANSCRIPT

Page 1: Introduction of Markov Chain Monte Carlo Jeongkyun Lee

Introduction of Markov Chain Monte Carlo

Jeongkyun Lee

Page 2:

Contents

Usage
Why MCMC is called MCMC
MCMC methods
Appendix
Reference

Page 3:

Usage

Goal: 1) estimate an unknown target distribution (or posterior) of a complex function, or 2) draw samples from that distribution.

1. Simulation: draw samples from a probability distribution governed by a system.

2. Integration / computing: integrate or compute a high-dimensional function.

3. Optimization / Bayesian inference: e.g. simulated annealing, the MCMC-based particle filter.

4. Learning: MLE learning, unsupervised learning.

Page 4:

Why MCMC is called MCMC

1. Markov Chain (Markov process)

For a random variable x_t at time t, the transition probabilities between different values depend only on the random variable's current state:

P(x_{t+1} | x_t, x_{t-1}, …, x_0) = P(x_{t+1} | x_t)
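The Markov property above can be sketched numerically: simulate a small chain and observe its long-run state frequencies. This is a minimal illustration; the 3-state transition matrix is invented for the example.

```python
import numpy as np

# A hypothetical 3-state chain (the matrix is invented for illustration).
# Row i holds the transition probabilities out of state i: the next state
# depends only on the current one, which is the Markov property.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

rng = np.random.default_rng(0)
state, counts = 0, np.zeros(3)
for _ in range(100_000):
    state = rng.choice(3, p=P[state])  # transition uses only the current state
    counts[state] += 1

print(counts / counts.sum())           # empirical long-run state frequencies
```

The empirical frequencies converge to the chain's stationary distribution, which is the property MCMC methods exploit.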

Page 5:

Why MCMC is called MCMC

2. Monte Carlo integration

To compute a complex integral, use random number generation to approximate it.

Ex. Computing pi by random sampling.
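The pi example can be sketched as follows: the fraction of uniform points in the unit square that land inside the quarter circle estimates pi/4. The sample size is an arbitrary choice for the sketch.

```python
import numpy as np

# Monte Carlo estimate of pi: points (x, y) are uniform on [0, 1)^2, and
# the event x^2 + y^2 <= 1 has probability pi/4 (area of the quarter circle).
rng = np.random.default_rng(0)
n = 1_000_000
x, y = rng.random(n), rng.random(n)
pi_hat = 4.0 * np.mean(x**2 + y**2 <= 1.0)
print(pi_hat)   # approaches pi as n grows
```

The error shrinks like 1/sqrt(n), independent of dimension, which is why Monte Carlo integration scales to high-dimensional functions.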

Page 6:

Why MCMC is called MCMC

3. Markov Chain Monte Carlo

Construct a Markov chain whose stationary distribution is the target distribution. http://www.kev-smith.com/tutorial/flash/markov_chain.swf

[Figure: a Markov chain of states x_{t_0} → x_{t_1} → x_{t_2} → … → x_{t_n}]

Page 7:

MCMC Methods

1. Metropolis / Metropolis-Hastings algorithms

Draw samples from a distribution p(θ) = f(θ)/K, where K is a normalizing constant. http://www.kev-smith.com/tutorial/flash/MH.swf

1) Choose an initial value θ_0 satisfying f(θ_0) > 0.
2) Sample a candidate value θ* from a proposal distribution q(θ* | θ_t).
3) Given the candidate θ*, calculate the acceptance probability α (the probability of a move).
4) With the probability α, accept the candidate (θ_{t+1} = θ*) or reject it (θ_{t+1} = θ_t).
5) Repeat steps 2)-4) N times.

Metropolis: the proposal is symmetric, q(θ_1, θ_2) = q(θ_2, θ_1), so
α = min(1, f(θ*) / f(θ_t)).

Metropolis-Hastings: the proposal q need not be symmetric;
α = min(1, [f(θ*) q(θ_t | θ*)] / [f(θ_t) q(θ* | θ_t)]).
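The Metropolis steps above can be sketched as a short sampler. This is a minimal random-walk version; the target f (an unnormalized standard normal), the step size, and the iteration count are illustrative choices. Because a Gaussian random-walk proposal is symmetric, the Hastings correction cancels to 1.

```python
import numpy as np

def metropolis(f, theta0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis targeting the unnormalized density f.
    The normalizing constant K is never needed: it cancels in the ratio."""
    rng = np.random.default_rng(seed)
    theta, samples = theta0, []
    for _ in range(n_steps):
        cand = theta + step * rng.normal()      # symmetric proposal draw
        alpha = min(1.0, f(cand) / f(theta))    # acceptance probability
        if rng.random() < alpha:                # accept or reject the move
            theta = cand
        samples.append(theta)
    return np.array(samples)

f = lambda t: np.exp(-0.5 * t**2)               # unnormalized N(0, 1)
draws = metropolis(f, theta0=0.0, n_steps=50_000)
print(draws.mean(), draws.var())                # near 0 and 1 for N(0, 1)
```

Only the ratio f(cand)/f(theta) is evaluated, which is exactly why MCMC works when the normalizing constant K is intractable.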

Page 8:

MCMC Methods

1. Metropolis / Metropolis-Hastings algorithms (cont.)

The chain is iterated N times. The burn-in period is the period during which the chain approaches its stationary distribution. Use only the samples drawn after the burn-in period, avoiding an approximation biased by the starting position. http://www.kev-smith.com/tutorial/flash/burnin.swf

Page 9:

MCMC Methods

2. Gibbs Sampling

A special case of the MH algorithm in which every candidate is accepted (acceptance probability α = 1). Draw samples for the random variables sequentially from univariate conditional distributions, i.e. the value of the i-th variable is drawn from the distribution p(x_i | x_{-i}), where x_{-i} represents the values of all the variables except the i-th one.
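A minimal Gibbs sketch for a bivariate normal with correlation ρ, where both univariate conditionals are known in closed form: x1 | x2 ~ N(ρ·x2, 1 − ρ²) and symmetrically for x2. The value ρ = 0.8 and the sample count are illustrative choices.

```python
import numpy as np

# Gibbs sampling for a standard bivariate normal with correlation rho:
# each variable is drawn in turn from its conditional given the other.
rho = 0.8
rng = np.random.default_rng(0)
x1, x2, samples = 0.0, 0.0, []
for _ in range(50_000):
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho**2))  # draw x1 | x2
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho**2))  # draw x2 | x1
    samples.append((x1, x2))
samples = np.array(samples)

print(np.corrcoef(samples.T)[0, 1])   # recovers rho = 0.8
```

No accept/reject step appears: every conditional draw is kept, which is the α = 1 property that makes Gibbs a special case of MH.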

Page 10:

MCMC Methods

3. Reversible Jump (or trans-dimensional) MCMC

When the dimension of the state can change, a move type is additionally considered at each step.

Page 11:

Appendix

1. Markov Chain properties

Stationary distribution: π P = π, which is implied by detailed balance, π_i P(i, j) = π_j P(j, i).
Irreducible: every state can be reached from every other state (all π_i > 0).
Aperiodic: the chain does not cycle deterministically between states.
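These properties can be checked numerically on a small example. The tridiagonal matrix below is an invented reversible chain; the check confirms the stationary distribution and detailed balance.

```python
import numpy as np

# An irreducible, aperiodic 3-state chain (matrix invented for illustration).
P = np.array([[0.5,  0.5,  0.0],
              [0.25, 0.5,  0.25],
              [0.0,  0.5,  0.5]])

# Stationary distribution: the left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

print(pi)                       # stationary distribution (0.25, 0.5, 0.25)
print(np.allclose(pi @ P, pi))  # pi P = pi holds
D = pi[:, None] * P             # D[i, j] = pi_i P(i, j)
print(np.allclose(D, D.T))      # detailed balance: pi_i P(i,j) = pi_j P(j,i)
```

Detailed balance is stronger than stationarity: any π satisfying it is automatically stationary, which is the route MH uses in the next slide.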

Page 12:

Appendix

2. MH sampling as a Markov Chain

The transition probability kernel in the MH algorithm is
K(θ_t, θ*) = q(θ* | θ_t) α(θ_t, θ*).

Thus, if the MH kernel satisfies the detailed balance condition
p(θ_t) K(θ_t, θ*) = p(θ*) K(θ*, θ_t),

then the stationary distribution of this kernel corresponds to draws from the target distribution p(θ).
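On a discrete state space the MH kernel can be built explicitly and the detailed balance condition verified numerically. The 4-point target and the uniform proposal below are illustrative assumptions.

```python
import numpy as np

# Explicit Metropolis kernel K on a 4-state space with a symmetric
# (uniform) proposal q, verifying p_i K(i,j) = p_j K(j,i).
p = np.array([0.1, 0.2, 0.3, 0.4])   # target distribution
n = len(p)
q = np.full((n, n), 1.0 / n)         # symmetric uniform proposal

K = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            # probability of proposing j from i, then accepting the move
            K[i, j] = q[i, j] * min(1.0, p[j] / p[i])
    K[i, i] = 1.0 - K[i].sum()       # rejected mass stays at state i

D = p[:, None] * K
print(np.allclose(D, D.T))           # detailed balance holds
print(np.allclose(p @ K, p))         # hence p is the stationary distribution
```

Since p_i K(i, j) = q(i, j)·min(p_i, p_j) is symmetric in i and j, detailed balance holds by construction, and stationarity of p follows immediately.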

Page 13:

Appendix

2. MH sampling as a Markov Chain (cont.)

Page 14:

Reference

http://vcla.stat.ucla.edu/old/MCMC/MCMC_tutorial.htm
http://www.kev-smith.com/tutorial/rjmcmc.php
http://www.cs.bris.ac.uk/~damen/MCMCTutorial.htm
B. Walsh, "Markov Chain Monte Carlo and Gibbs Sampling", Lecture Notes, MIT, 2004

Page 15:

Thank you!