Sequential Monte Carlo Methods
Post on 05-Apr-2018
7/31/2019 Sequential Monte Carlo Methods
Sequential Monte Carlo Methods
Shashidhar
School of Marine Science and Technology, NWPU
Overview
We are faced with many problems involving large, sequentially evolving datasets: tracking, computer vision, speech and audio, robotics, ...
We wish to form models and algorithms for Bayesian sequential updating of probability distributions as data evolve.
Here we consider the Sequential Monte Carlo (SMC), or 'particle filtering', methodology.
In many applications it is required to estimate the 'state' of a system from noisy, convolved, or non-linearly distorted observations. Since data also arrive sequentially in many applications, it is desirable to estimate the state on-line, in order to avoid storing huge datasets and to make inferences and decisions in real time. Some typical applications from the engineering perspective include:
Tracking for radar and sonar applications
Real-time enhancement of speech and audio signals
Sequence and channel estimation in digital communications channels
Medical monitoring of patient EEG/ECG signals
Image sequence tracking
Contents
Bayes' Theorem
Monte Carlo Methods
Sampling Techniques
Markov Chain Monte Carlo
Importance Sampling
State-Space Systems
Sequential Importance Sampling (SIS)
Sequential Importance Resampling (SIR)
Bayesian Inference
Belief Before + Data → Belief After
Prior Belief
The hunter sees a cat from afar.
The hunter goes near and learns more.
The hunter decides it is a tiger.
Bayesian Signal Processing (BSP): estimation of the probability distribution of a random signal in order to perform statistical inferences.
Observation: Y    Quantity of interest: X
Pr(X|Y) = Pr(Y|X) Pr(X) / Pr(Y)
Posterior distribution = Likelihood × Prior distribution / Evidence (normalizing factor)
Posterior distribution ∝ Likelihood × Prior distribution
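The update above can be illustrated numerically with the hunter's cat-or-tiger problem; all numbers below are hypothetical, chosen only to show the mechanics:

```python
# Numerical illustration of Bayes' theorem (all numbers are hypothetical).
# Hypotheses: the animal is a tiger (X = tiger) or a cat (X = cat).
prior = {"tiger": 0.1, "cat": 0.9}          # Pr(X): belief before looking closely
likelihood = {"tiger": 0.8, "cat": 0.05}    # Pr(Y|X): chance of the observed markings

# Evidence Pr(Y) = sum over x of Pr(Y|x) Pr(x), the normalizing factor
evidence = sum(likelihood[x] * prior[x] for x in prior)

# Posterior Pr(X|Y) = Pr(Y|X) Pr(X) / Pr(Y): belief after the observation
posterior = {x: likelihood[x] * prior[x] / evidence for x in prior}
print(posterior)  # {'tiger': 0.64, 'cat': 0.36}
```

Even a small prior probability of "tiger" grows into the dominant belief once an observation far more likely under "tiger" arrives.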
Model
Belief Before = Pr(X), the prior distribution of X
Data = Y, entering through the likelihood of X given Y
Belief After = Pr(X|Y), the posterior distribution of X given Y
Prior: Pr(X)
Likelihood: Pr(Y|X)
Posterior: Pr(X|Y)
[Figure: estimated distributions Prob(X) versus the random parameter X, showing the prior Pr(X) and the posterior Pr(X|Y)]
Why Monte Carlo?
The Monte Carlo method is efficient at drawing random samples from regions of high concentration (probability).
Grid-based methods, by contrast, become computationally complex as the number of grid points increases.
In signal processing we are often interested in statistical measures of a random signal or its parameters, expressed in terms of moments.
Instead of using direct numerical integration, we use Monte Carlo integration as an alternative.
MC integration draws random samples from the prior distribution and forms sample averages to approximate the posterior distribution.
Empirical distribution: Pr(X) ≈ (1/N) Σ_{i=1}^{N} δ(X − X^(i)), a probability mass distribution with weights 1/N at the random samples X^(i).
Substituting the empirical distribution into the integral gives
E[f(X)] ≈ (1/N) Σ_{i=1}^{N} f(X^(i))
This sum is said to be the Monte Carlo estimate of the integral.
Example: take Pr(X) = Gamma(4,1), generate some random samples, and plot a histogram as a basic approximation to the PDF.
[Figure: histogram approximations to the Gamma(4,1) PDF for N = 200 and N = 500 samples]
[Figure: histogram approximations to the Gamma(4,1) PDF for N = 1000, N = 5000, and N = 10000 samples; the approximation improves as N grows]
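The Gamma(4,1) experiment above can be sketched in a few lines of standard-library Python, using the fact that a Gamma(4,1) draw is the sum of four independent Exp(1) draws:

```python
import random

# Monte Carlo estimate of E[X] for X ~ Gamma(4, 1), whose true mean is 4.
# A Gamma(4,1) draw is the sum of 4 independent Exp(1) draws.
random.seed(0)
N = 100_000
samples = [sum(random.expovariate(1.0) for _ in range(4)) for _ in range(N)]

# The empirical distribution puts weight 1/N on each sample, so the
# integral E[X] collapses to a simple sample average.
mc_estimate = sum(samples) / N
print(mc_estimate)  # close to 4 for large N
```

Plotting a histogram of `samples` reproduces the figures above: the approximation sharpens as N grows.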
Integrals in Probabilistic Inference
Normalization: Pr(X|Y) = Pr(Y|X) Pr(X) / ∫ Pr(Y|X) Pr(X) dX
Marginalization: Pr(X|Y) = ∫ Pr(X, Z | Y) dZ
Expectation: E[f(X) | Y] = ∫ f(X) Pr(X|Y) dX
Nasty integrals!
Monte Carlo Integration
Suppose we want to compute:
I = ∫ f(X) Pr(X) dX
1) Simulate samples {X^(i)}, i = 1, …, N, from Pr(X)
2) Replace the nasty integral with a simple sum:
Î = (1/N) Σ_{i=1}^{N} f(X^(i))
Approximation of Pr(X):
Pr(X) ≈ (1/N) Σ_{i=1}^{N} δ(X − X^(i))
Problem: often we cannot directly sample from Pr(X).
Monte Carlo Integration, formally
The idea of Monte Carlo simulation is to draw an i.i.d. set of samples {X^(i)}, i = 1, …, N, from a target density Pr(X) defined on a high-dimensional space. These N samples can be used to approximate the target distribution with the following empirical point-mass function (think of it as a histogram):
Pr(X) ≈ (1/N) Σ_{i=1}^{N} δ_{X^(i)}(X), where δ_{X^(i)}(X) denotes the delta-Dirac mass located at X^(i).
Summary on MC
The MC method is a powerful means of generating random samples used in estimating conditional and marginal probability distributions.
The efficiency of the MC method, relative to grid-based alternatives, increases as the problem dimensionality increases.
Sampling Techniques
Uniform Sampling
Rejection Sampling
Metropolis Sampling
Metropolis-Hastings Sampling
Random-walk Metropolis-Hastings Sampling
Importance Sampling
Gibbs Sampling
Slice Sampling
Rejection Sampling
for i = 1, …, N:
  Generate a trial sample: x ~ q(x)
  Generate a uniform sample: u ~ U(0,1)
  ACCEPT the sample, X^(i) = x, if u ≤ Pr(x) / (c q(x));
  otherwise, REJECT the sample and generate the next trial sample.
end
Sampling PDF: c q(x); target PDF: Pr(x). Trial points falling under the target curve are ACCEPTED; those between the target and the envelope c q(x) are REJECTED.
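A minimal sketch of this loop, with an illustrative Beta(2,2) target and uniform proposal (both chosen here for the example, not taken from the slides):

```python
import random

# Rejection sampling sketch: target Beta(2,2), p(x) = 6x(1-x) on [0,1],
# proposal q(x) = Uniform(0,1), envelope constant c = 1.5,
# so that p(x) <= c * q(x) for all x in [0, 1].
random.seed(0)

def target(x):
    return 6.0 * x * (1.0 - x)

c = 1.5
samples = []
while len(samples) < 50_000:
    x = random.random()             # trial sample from proposal q
    u = random.random()             # uniform for the accept test
    if u <= target(x) / (c * 1.0):  # accept if u <= p(x) / (c q(x))
        samples.append(x)           # ACCEPT
    # otherwise REJECT and draw the next trial sample

mean = sum(samples) / len(samples)
print(mean)  # Beta(2,2) has mean 0.5
```

The acceptance rate is 1/c, so a tight envelope constant matters: a loose c wastes most of the trial samples.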
Markov Chain Monte Carlo (MCMC)
MCMC is basically Monte Carlo integration where the random samples are produced using a Markov chain.
A Markov chain is a discrete random process with the property that the conditional distribution of the present sample, given all of the past samples, depends only on the previous sample, i.e.
Pr(X^(t) | X^(t-1), …, X^(1)) = Pr(X^(t) | X^(t-1))
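A two-state chain illustrates the property: the next state is drawn from a distribution that depends only on the current state, and the long-run state frequencies converge to the chain's stationary distribution. The transition matrix below is a hypothetical example:

```python
import random

# A two-state Markov chain: the next state depends only on the current one.
# Transition matrix P[i][j] = Pr(next = j | current = i), hypothetical values.
P = [[0.9, 0.1],
     [0.5, 0.5]]

random.seed(1)
state, visits = 0, [0, 0]
for _ in range(200_000):
    visits[state] += 1
    # draw the next state using only the current state's row of P
    state = 0 if random.random() < P[state][0] else 1

freq = [v / sum(visits) for v in visits]
print(freq)  # long-run frequencies approach the stationary distribution (5/6, 1/6)
```

This convergence to a stationary distribution is exactly what MCMC exploits: the chain is designed so that its stationary distribution is the target posterior.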
The most powerful and efficient MCMC methods are Metropolis-Hastings sampling and Gibbs sampling.
Markov chain simulation is essentially a general technique based on generating samples from a proposal distribution and then correcting (ACCEPTING or REJECTING) those samples to approximate a target posterior distribution.
Metropolis Sampling
Initialize x^(0) with p(x^(0)) > 0.
Generate a candidate sample x* from the (symmetric) proposal.
Calculate the acceptance probability:
α(x^(t-1), x*) = min{ p(x*) / p(x^(t-1)), 1 }
ACCEPT the candidate sample with probability α(x^(t-1), x*); in particular, if Prob{NEW_STATE} > Prob{OLD_STATE}, always ACCEPT.
Disadvantage: the proposal distribution must be symmetric.
Metropolis-Hastings Sampling
The Metropolis-Hastings (M-H) technique defines a Markov chain such that a new sample, x^(t), is generated from the previous sample, x^(t-1), by first drawing a candidate sample, x*, from a proposal distribution q(x* | x^(t-1)), and then deciding whether this candidate should be accepted and retained or rejected and discarded, with the previous sample reused as the new one.
If accepted, x* replaces x^(t-1); otherwise the old sample is saved, x^(t) = x^(t-1).
M-H can take care of asymmetric proposal distributions.
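A minimal random-walk M-H sketch targeting a standard normal; the symmetric Gaussian proposal makes the Hastings correction cancel, so the accept ratio reduces to p(candidate)/p(current), as in plain Metropolis:

```python
import math
import random

# Random-walk Metropolis-Hastings targeting a standard normal density.
# The Gaussian random-walk proposal is symmetric, so the Hastings ratio
# reduces to p(candidate)/p(current).
random.seed(0)

def log_target(x):
    """Log of the unnormalized N(0,1) density."""
    return -0.5 * x * x

x, chain = 0.0, []
for _ in range(100_000):
    cand = x + random.gauss(0.0, 1.0)     # candidate from the proposal
    # ACCEPT with probability min(1, p(cand)/p(x)), done in log space
    if math.log(random.random()) < log_target(cand) - log_target(x):
        x = cand                          # ACCEPT: move to the candidate
    chain.append(x)                       # a REJECT keeps the old sample

burned = chain[5_000:]                    # discard burn-in
mean = sum(burned) / len(burned)
var = sum((v - mean) ** 2 for v in burned) / len(burned)
print(mean, var)  # should approach 0 and 1
```

Working in log space avoids floating-point underflow when densities are tiny, which matters in higher dimensions.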
Metropolis-Hastings Sampling Algorithm
[Figure: flowchart of the M-H algorithm, comparing prob(NEW_STATE) with prob(OLD_STATE)]
Importance Sampling
One way to mitigate the inability to sample directly from the target (posterior) distribution is the concept of importance sampling.
Importance sampling: a method to compute expectations with respect to one distribution using random samples drawn from another.
Proposal distribution q(X) → draw random samples using MC → reweight toward the target distribution Pr(X).
I = ∫ f(X) Pr(X) dX = ∫ f(X) [Pr(X) / q(X)] q(X) dX
q(X) is the importance sampling distribution. The integral shown above can be estimated by:
Drawing N samples from q: X^(i) ~ q(X), and
Computing the sample mean:
I ≈ (1/N) Σ_{i=1}^{N} f(X^(i)) w(X^(i)), where w(X^(i)) = Pr(X^(i)) / q(X^(i)) is the importance weight.
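A sketch of this estimator, with an illustrative choice of target N(0,1), wider proposal N(0,2), and f(x) = x² (all three chosen here for the example):

```python
import math
import random

# Importance sampling sketch: estimate E[X^2] under a N(0,1) target using
# samples from a wider N(0, sd=2) proposal, reweighting by w = p(x)/q(x).
random.seed(0)

def p(x):
    """Target density: N(0,1)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def q(x):
    """Proposal density: N(0, sd=2)."""
    return math.exp(-x * x / 8.0) / (2.0 * math.sqrt(2 * math.pi))

N = 100_000
total = 0.0
for _ in range(N):
    x = random.gauss(0.0, 2.0)   # draw from the proposal q
    w = p(x) / q(x)              # importance weight
    total += w * x * x           # weighted contribution f(x) w(x)

estimate = total / N
print(estimate)  # E[X^2] = 1 under the N(0,1) target
```

A proposal with heavier tails than the target, as here, keeps the weights bounded; a too-narrow proposal makes the weight variance blow up.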
Sample-based PDF Representation
Regions of high density carry many particles and large particle weights.
Discrete approximation of the posterior distribution using importance sampling:
Pr(X) ≈ Σ_{i=1}^{N} w^(i) δ(X − X^(i))
Sequential Importance Sampling
[Figure: the SIS weight update — each new weight is the old weight times likelihood × prior divided by the proposal distribution, evaluated at the drawn sample: w_k^(i) ∝ w_{k-1}^(i) Pr(y_k | x_k^(i)) Pr(x_k^(i) | x_{k-1}^(i)) / q(x_k^(i) | x_{k-1}^(i), y_k)]
The State-Space System
State transition equation:
x_k = f(x_{k-1}, u_k, v_k)
x_k, x_{k-1}: current and previous state (e.g. velocity, acceleration, altitude)
f(·, ·, ·): known evolution function (possibly non-linear)
v_k: state noise (usually non-Gaussian)
u_k: known input
Measurement equation:
y_k = h(x_k, u_k, w_k)
y_k: current measurement
x_k: current state
h(·, ·, ·): known measurement function (possibly non-linear)
u_k: known input
w_k: measurement noise (usually non-Gaussian)
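As an illustration, the sketch below simulates a non-linear model of exactly this form. The particular functions f and h are a common benchmark choice from the particle-filtering literature, not taken from these slides:

```python
import math
import random

# Simulation of a standard nonlinear benchmark state-space model:
#   state:       x_k = 0.5 x_{k-1} + 25 x_{k-1}/(1 + x_{k-1}^2) + 8 cos(1.2 k) + v_k
#   measurement: y_k = x_k^2 / 20 + w_k
random.seed(0)

def step(x_prev, k):
    """State transition f(x_{k-1}, u_k) plus Gaussian state noise v_k."""
    return (0.5 * x_prev + 25.0 * x_prev / (1.0 + x_prev ** 2)
            + 8.0 * math.cos(1.2 * k) + random.gauss(0.0, math.sqrt(10.0)))

def measure(x):
    """Measurement h(x_k) plus Gaussian measurement noise w_k."""
    return x ** 2 / 20.0 + random.gauss(0.0, 1.0)

x, xs, ys = 0.0, [], []
for k in range(1, 101):
    x = step(x, k)       # propagate the hidden state
    xs.append(x)
    ys.append(measure(x))  # observe it through the nonlinear sensor

print(len(xs), len(ys))
```

The squared measurement makes the posterior bimodal (x and −x explain y equally well), which is precisely the situation where the Gaussian filters of the next slide struggle.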
Need for Particle Filters
Kalman filters (KF), Extended Kalman filters (EKF), and Unscented Kalman filters (UKF) can only deal with (approximately) linear, unimodal distributions.
KF, EKF, and UKF use the conditional mean and covariance to characterize a Gaussian posterior; the EKF and UKF try to linearize the non-linearity to a certain degree.
Particle filters can characterize multimodal distributions and handle non-linear state estimation.
Particle filters are a sequential MC methodology.
Particle Filters
Particle filtering is a sequential Monte Carlo method employing the sequential estimation of relevant probability distributions using importance sampling.
Particles ↔ point masses; weights ↔ probability masses.
Sequential Importance Sampling
Visualization of SIS
[Figure: particles x^(i) with weights evolving from the uniform {x^(i), 1/N} to the weighted {x^(i), w^(i)} over time]
Degeneracy Problem
One of the major problems with importance sampling is the degeneracy of particles.
After a few iterations, the variance of the importance weights increases, making weight degradation impossible to avoid.
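A common diagnostic for this degeneracy is the effective sample size (ESS) of the normalized weights, ESS = 1 / Σ (w^(i))²; a minimal sketch:

```python
# Degeneracy diagnostic: effective sample size (ESS) of normalized
# importance weights, ESS = 1 / sum(w_i^2). It equals N for perfectly
# uniform weights and collapses toward 1 as weight mass concentrates
# on a single particle.
def effective_sample_size(weights):
    total = sum(weights)
    norm = [w / total for w in weights]   # normalize the weights
    return 1.0 / sum(w * w for w in norm)

print(effective_sample_size([0.25, 0.25, 0.25, 0.25]))  # 4.0 (no degeneracy)
print(effective_sample_size([0.97, 0.01, 0.01, 0.01]))  # close to 1 (degenerate)
```

A standard practice is to resample whenever the ESS drops below some threshold such as N/2, which leads directly to the resampling step on the next slide.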
Resampling
Eliminate particles with small importance weights.
Concentrate on particles with large weights.
[Figure: one cycle of sampling importance resampling — start from an unweighted measure {x^(i), 1/N}, compute importance weights to get {x^(i), w^(i)}, resample back to an unweighted measure, move particles, and predict]
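One standard scheme implementing this idea is systematic resampling, which walks a single evenly spaced grid of points through the weight CDF; a sketch:

```python
import random

# Systematic resampling sketch: particles with large weights are
# duplicated, small-weight particles die, and every surviving particle
# comes out with equal weight 1/N.
def systematic_resample(particles, weights):
    n = len(particles)
    total = sum(weights)
    cdf, acc = [], 0.0
    for w in weights:                 # cumulative distribution of weights
        acc += w / total
        cdf.append(acc)
    u0 = random.random() / n          # one random offset for the whole grid
    out, j = [], 0
    for i in range(n):
        u = u0 + i / n                # evenly spaced points in [0, 1)
        while cdf[j] < u:             # advance to the particle owning u
            j += 1
        out.append(particles[j])
    return out

random.seed(0)
parts = [0.0, 1.0, 2.0, 3.0]
wts = [0.01, 0.01, 0.97, 0.01]        # one dominant particle
new = systematic_resample(parts, wts)
print(new)                            # the heavy particle 2.0 is replicated
```

Compared with drawing N independent multinomial samples, the single shared offset reduces the extra Monte Carlo variance introduced by the resampling step itself.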
Sampling Importance Resampling
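Putting the pieces together, the following is a minimal bootstrap particle filter, one instance of SIR, run on a toy 1-D model; the random-walk model, noise levels, and particle count are illustrative choices made for this sketch:

```python
import math
import random

# Minimal bootstrap particle filter (one form of SIR) on a toy 1-D model:
#   state:       x_k = x_{k-1} + v_k,   v_k ~ N(0, SV^2)
#   measurement: y_k = x_k + w_k,       w_k ~ N(0, SW^2)
random.seed(0)

N = 1000            # number of particles
SV, SW = 1.0, 1.0   # state / measurement noise standard deviations

def likelihood(y, x):
    """Pr(y | x): Gaussian measurement density (unnormalized is fine)."""
    return math.exp(-0.5 * ((y - x) / SW) ** 2)

def resample(parts, wts):
    """Multinomial resampling: duplicate heavy particles, drop light ones."""
    return random.choices(parts, weights=wts, k=len(parts))

# Simulate a ground-truth trajectory and its noisy observations
truth, obs, x = [], [], 0.0
for _ in range(50):
    x += random.gauss(0.0, SV)
    truth.append(x)
    obs.append(x + random.gauss(0.0, SW))

# Bootstrap filter: predict with the prior, weight by the likelihood, resample
particles = [0.0] * N
estimates = []
for y in obs:
    particles = [p + random.gauss(0.0, SV) for p in particles]  # predict
    weights = [likelihood(y, p) for p in particles]             # weight
    total = sum(weights)
    estimates.append(sum(w * p for w, p in zip(weights, particles)) / total)
    particles = resample(particles, weights)                    # resample

rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truth)) / len(truth))
print(rmse)  # should beat the raw-observation error (std 1.0)
```

Because the proposal is the state-transition prior, the weight update collapses to the likelihood alone, which is what makes the bootstrap filter so simple to state and implement.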
Comparison of KF, EKF, UKF and PF
[Figure: two charts of accuracy versus complexity for KF, EKF, UKF, and PF — one for a linear, Gaussian system and one for a non-linear or non-Gaussian system]
Thank You
"One must learn by doing the thing; for though you think you know it, you have no certainty until you try."
— Sophocles, Trachiniae