TRANSCRIPT
1
Particle Filters
Dalian University of Technology · Jin Naigao (金乃高) · 2009-01-03
2
Introduction
Sequential Monte Carlo Methods in Practice, Springer-Verlag, 2001.
IEEE Transactions on Signal Processing, Special Issue on Monte Carlo Methods for Statistical Signal Processing, 2002, 50(2).
Proceedings of the IEEE, Special Issue on Sequential State Estimation, 2004, 92(3).
Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
3
Monte Carlo Method
Ulam, von Neumann, Metropolis, Fermi
4
Buffon's needle experiment
A needle of length L is dropped N times onto a floor ruled with parallel lines a distance D apart; M throws cross a line. Then

π ≈ 2LN / (DM)

N: number of throws; M: number of crossings with the parallel lines; D: line spacing; L: needle length.
(A related area-ratio estimate uses π ≈ 4·S_circle / S_square.)
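The needle experiment above is easy to simulate. The sketch below is a minimal illustration (parameter values are chosen for the demo, not taken from the slides):

```python
import random, math

def buffon_pi(n_drops, needle_len=1.0, spacing=2.0):
    """Estimate pi via Buffon's needle: pi ~ 2*L*N / (D*M)."""
    hits = 0
    for _ in range(n_drops):
        # distance from needle centre to the nearest line, and needle angle
        x = random.uniform(0.0, spacing / 2.0)
        theta = random.uniform(0.0, math.pi / 2.0)
        # the needle crosses a line when its half-projection reaches the line
        if x <= (needle_len / 2.0) * math.sin(theta):
            hits += 1
    return 2.0 * needle_len * n_drops / (spacing * hits)

random.seed(0)
est = buffon_pi(200_000)
```

With 200,000 throws the estimate typically lands within a few hundredths of π.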
5
Monte Carlo Methods
Importance sampling, rejection sampling, Metropolis-Hastings, Gibbs sampling
6
Sequential Monte Carlo
Bootstrap filtering (Gordon 1993)
Condensation algorithm (Isard and Blake 1996)
Particle filtering (Doucet 2001)
7
Sequential Monte Carlo
As early as the 1950s, Hammersley used Monte Carlo methods based on sequential importance sampling (SIS) to solve problems in statistics.
In the late 1960s, Handschin and Mayne applied sequential Monte Carlo methods to problems in automatic control.
In the 1970s: Handschin, Akashi, Zaritskii.
8
Sequential Monte Carlo
Tanizaki, Geweke, and others successfully solved a series of high-dimensional integration problems using importance-sampling-based Monte Carlo methods.
The sampling-resampling idea of Smith and Gelfand provided an easy-to-implement computational strategy for Bayesian inference.
In the early 1990s, Gordon, working with Smith and others, introduced a resampling step into particle filtering, yielding the bootstrap filter algorithm.
Nodestar, part of the U.S. Navy's integrated undersea surveillance system, is one example of particle filtering in practice.
9
Applications of Particle Filters
Navigation, Positioning, Tracking
Channel equalization
10
Fundamental Concepts
Bayesian inference
Monte Carlo simulation
Sequential importance sampling
Resampling
11
Bayesian Inference
X is unknown: a random variable or a set (vector) of random variables.
Y is observed: also a set of random variables.
We wish to infer X by observing Y.
The probability distribution p(x) models our prior knowledge of X.
The conditional probability distribution p(y|x) models the relationship between Y and X.
12
Bayesian Filtering
General problem statement
13
State Space Formulation
14
Bayes Theorem
The conditional distribution p(x|y) represents posterior information about x given y:

p(x|y) = p(y|x) p(x) / p(y)
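The theorem can be checked on a toy numeric example (the numbers below are illustrative assumptions, not from the slides):

```python
# Bayes' theorem on a made-up diagnostic-test example:
#   p(x|y) = p(y|x) * p(x) / p(y)
p_x = 0.01                    # prior p(x): 1% of cases have the condition
p_y_given_x = 0.95            # likelihood p(y|x): test detects it 95% of the time
p_y_given_not_x = 0.05        # false-positive rate

# evidence p(y) via the law of total probability
p_y = p_y_given_x * p_x + p_y_given_not_x * (1 - p_x)
# posterior p(x|y)
p_x_given_y = p_y_given_x * p_x / p_y
```

Even with a 95% sensitive test, the posterior here is only about 16%, because the prior p(x) is small.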
15
Recursive Bayesian Estimation
16
Recursive Bayesian Estimation
17
Monte Carlo Sampling
State space model.
Problem: estimate the posterior distribution; it is difficult to draw samples from it, and the integrals involved are not tractable.
Solution: Monte Carlo sampling, in particular importance sampling.
18
Monte Carlo Simulation
The posterior distribution p(x|y) may be difficult or impossible to compute in closed form.
An alternative is to represent p(x|y) using Monte Carlo samples (particles): each particle has a value and a weight.
19
Monte Carlo Simulation
20
Importance Sampling
Ideally, the particles would represent samples drawn from the distribution p(x|y). In practice, we usually cannot get p(x|y) in closed form; in any case, it would usually be difficult to draw samples from p(x|y).
Importance sampling introduces a known, easy-to-sample proposal distribution; the weights describe the discrepancy between the proposal and the actual posterior distribution. Importance sampling is a variance-reduction strategy in Monte Carlo integration; in Bayesian filtering, the importance function can be viewed as a weighted approximation to the posterior probability density.
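As a minimal sketch of the idea: estimate an expectation under a target density p by sampling from an easier proposal q and weighting by p/q (the densities below are chosen for illustration):

```python
import random, math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

random.seed(1)
n = 100_000
# target p = N(0, 1); proposal q = N(0, 2); estimate E_p[x^2] = 1
xs = [random.gauss(0.0, 2.0) for _ in range(n)]          # draw from q
ws = [normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 0.0, 2.0)  # weight = p/q
      for x in xs]
# self-normalized importance-sampling estimate of E_p[x^2]
est = sum(w * x * x for w, x in zip(ws, xs)) / sum(ws)
```

The weights correct for sampling from the wrong distribution; the estimate converges to the true value 1 as n grows.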
21
Importance Sampling
22
Importance Sampling
23
Importance Sampling
24
Importance Sampling
25
Sequential Importance Sampling
The particle weights can be written recursively as

w_k^{(i)} ∝ p(x_{0:k}^{(i)} | Y_k) / q(x_{0:k}^{(i)} | Y_k)

Using

p(x_{0:k}^{(i)} | Y_k) ∝ p(y_k | x_k^{(i)}) p(x_k^{(i)} | x_{k-1}^{(i)}) p(x_{0:k-1}^{(i)} | Y_{k-1})
q(x_{0:k}^{(i)} | Y_k) = q(x_k^{(i)} | x_{0:k-1}^{(i)}, Y_k) q(x_{0:k-1}^{(i)} | Y_{k-1})

gives

w_k^{(i)} ∝ w_{k-1}^{(i)} · p(y_k | x_k^{(i)}) p(x_k^{(i)} | x_{k-1}^{(i)}) / q(x_k^{(i)} | x_{0:k-1}^{(i)}, Y_k)
26
Resampling
Ideally, after a number of iterations the variance of the weights would tend to zero, giving a correct estimate. In the SIS algorithm, however, the variance grows over time, producing the weight-degeneracy phenomenon.
In 1993 Gordon proposed the idea of resampling to overcome this problem, widening the range of applications of particle filtering.
The basic idea of resampling is to discard particles with small weights, which are almost certainly of no interest, and replace them with copies of particles with large weights.
27
Resampling
In inference problems, most weights tend to zero except a few (from particles that closely match observations), which become large.
We resample to concentrate particles in regions where p(x|y) is larger.
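A minimal multinomial resampler illustrates this (the particle values and weights below are made up to show the effect):

```python
import random

def resample(particles, weights):
    """Multinomial resampling: draw len(particles) new particles with
    probability proportional to weight; new weights are uniform 1/n."""
    n = len(particles)
    new_particles = random.choices(particles, weights=weights, k=n)
    return new_particles, [1.0 / n] * n

random.seed(3)
# 3000 particles at three values; almost all weight sits on x = 0.0
particles = [0.0] * 1000 + [1.0] * 1000 + [2.0] * 1000
weights = [0.98 / 1000] * 1000 + [0.01 / 1000] * 1000 + [0.01 / 1000] * 1000
new_particles, new_weights = resample(particles, weights)
frac_zero = new_particles.count(0.0) / len(new_particles)
```

After resampling, about 98% of the particles sit at x = 0.0, concentrating the population where the posterior mass is.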
28
Resampling
Drawbacks: resampling breaks the parallelism of the algorithm and causes a loss of particle diversity.
Remedies: increase the number of particles; add random noise after resampling; apply a Markov chain Monte Carlo move step; kernel smoothing (replace the Dirac delta with a kernel function).
29
Schematic of the particle filter
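The full propagate-weight-resample cycle can be written out as a minimal bootstrap filter. The 1-D random-walk model, noise levels, and synthetic data below are assumptions for illustration only:

```python
import random, math

random.seed(4)

def bootstrap_filter(ys, n=1000, q_std=1.0, r_std=0.5):
    """Minimal bootstrap particle filter for the assumed model
       x_k = x_{k-1} + N(0, q_std^2),  y_k = x_k + N(0, r_std^2)."""
    parts = [random.gauss(0.0, 1.0) for _ in range(n)]
    means = []
    for y in ys:
        # propagate through the state equation (importance distribution)
        parts = [x + random.gauss(0.0, q_std) for x in parts]
        # weight by the observation likelihood
        ws = [math.exp(-(y - x) ** 2 / (2 * r_std ** 2)) for x in parts]
        total = sum(ws)
        ws = [w / total for w in ws]
        means.append(sum(w * x for w, x in zip(ws, parts)))
        # multinomial resampling back to uniform weights
        parts = random.choices(parts, weights=ws, k=n)
    return means

# track a simulated trajectory (synthetic data, for illustration)
truth = [0.0]
for _ in range(30):
    truth.append(truth[-1] + random.gauss(0.0, 1.0))
obs = [x + random.gauss(0.0, 0.5) for x in truth[1:]]
est = bootstrap_filter(obs)
```

The filtered means track the hidden trajectory to roughly the observation-noise level.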
30
Variations
Use a different importance distribution.
Use a different resampling technique:
– Resampling adds variance to the estimate; several resampling techniques are available that minimize this added variance.
– Our simple resampling leaves several particles with the same value; methods for spreading them are available.
31
Variations
Reduce the resampling frequency:
– Our implementation resamples after every observation, which may add unneeded variance to the estimate.
– Alternatively, one can resample only when the particle weights warrant it. This can be determined by the effective sample size:

N̂_eff = 1 / Σ_{i=1}^{N} (w_k^{(i)})^2
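The effective sample size is a one-liner; the example weights below are made up to show the two extremes:

```python
def effective_sample_size(weights):
    """N_eff ~ 1 / sum(w_i^2) for normalized weights: equals N when the
    weights are uniform, approaches 1 when one particle dominates."""
    return 1.0 / sum(w * w for w in weights)

uniform = [0.25] * 4                      # healthy: N_eff = 4
degenerate = [0.97, 0.01, 0.01, 0.01]     # degenerate: N_eff close to 1
# A common rule is to resample only when effective_sample_size(w) < N / 2.
```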
32
Rao-Blackwellization
Rao-Blackwellization:
– Some components of the model may have linear dynamics and can be well estimated using a conventional Kalman filter.
– The Kalman filter / extended Kalman filter / unscented Kalman filter / Gauss-Hermite filter is combined with a particle filter to reduce the number of particles needed to obtain a given level of performance.
33
Advantages of Particle Filters
Under general conditions, the particle filter estimate becomes asymptotically optimal as the number of particles goes to infinity.
Non-linear, non-Gaussian state update and observation equations can be used.
Multi-modal distributions are not a problem.
34
Disadvantages of Particle Filters
Naïve formulations of problems usually result in significant computation times.
The number of particles required can be large.
The best importance distribution and/or resampling method may be very problem-specific.
35
Conclusions
Particle filtering makes previously difficult or intractable estimation problems tractable.
36
Survey article
M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, 2002, 50(2): 174-188.
37
Related websites
Google: "Sequential Monte Carlo"
http://www-sigproc.eng.cam.ac.uk/smc/papers.html
38
Thank you!