
Carnegie Mellon

Kalman and Kalman Bucy @ 50: Distributed and Intermittency

José M. F. Moura, joint work with Soummya Kar

Advanced Network Colloquium, University of Maryland, College Park, MD

November 04, 2011

Acknowledgements: NSF under grants CCF-1011903 and CCF-1018509, and AFOSR grant FA95501010291


Outline
- Brief Historical Comments: From Kolmogorov to Kalman-Bucy
- Filtering Then … Filtering Today
- Consensus: Distributed Averaging in Random Environments
- Distributed Filtering: Consensus + Innovations
  - Random field (parameter) estimation: large scale
  - Intermittency: infrastructure failures, sensor failures
  - Random protocols: gossip
  - Limited resources: quantization
- Linear Parameter Estimator: mixed time scale
- Linear Filtering: Intermittency – Random Riccati Equation
  - Stochastic boundedness
  - Invariant distribution
  - Moderate deviations
- Conclusion



In the 40’s

Wiener Model

Wiener filter

Wiener-Hopf equation (1931; 1942)

1939-41: A. N. Kolmogorov, "Interpolation und Extrapolation von stationären zufälligen Folgen," Bull. Acad. Sci. USSR, 1941

Dec 1940: anti-aircraft control problem – extract signal from noise: N. Wiener, "Extrapolation, Interpolation, and Smoothing of Stationary Time Series with Engineering Applications," 1942; declassified and published by Wiley, NY, 1949.

Norbert Wiener, "The Extrapolation, Interpolation and Smoothing of Stationary Time Series with Engineering Applications." [Washington, D.C.: National Defense Research Council,] 1942.


Kalman Filter @ 51

Trans. of the ASME-J. of Basic Eng., 82 (Series D): 35-45, March 1960


Kalman-Bucy Filter @ 50

Transactions of the ASME-Journal of Basic Eng., 83 (Series D): 95-108, March 1961



Filtering Then …

Centralized

- Measurements always available (not lost)
- Optimality: structural conditions – observability/controllability
- Applications: guidance, chemical plants, noisy images, …

[Kalman filter equations, with the "Kalman Gain", "Innovations", and "Prediction" terms labeled.]
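The labeled quantities map directly onto the standard predict/update recursion. A minimal sketch of the classical centralized Kalman filter, with generic matrices A, C, Q, R (not from the slides):

```python
import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One cycle of the classical (centralized) Kalman filter."""
    # "Prediction"
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # "Innovations": what the measurement adds beyond the prediction
    innov = y - C @ x_pred
    S = C @ P_pred @ C.T + R
    # "Kalman Gain"
    K = P_pred @ C.T @ np.linalg.inv(S)
    # Update
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```

With measurements always available, iterating this step drives P to the fixed point of the deterministic Riccati equation.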


Filtering Today: Distributed Solution
- Local communications: agents communicate with neighbors; no central collection of data
- Cooperative solution: in isolation, myopic view and knowledge; with cooperation, better understanding / global knowledge
- Iterative solution

Realistic problem: intermittency
- Sensors fail
- Local communication channels fail

Limited resources:
- Noisy sensors
- Noisy communications
- Limited bandwidth (quantized communications)

Optimality: asymptotically; convergence rate

Structural random failures


Outline
- Brief Historical Comments: From Kolmogorov to Kalman-Bucy
- Filtering Then … Filtering Today
- Consensus: Distributed Averaging
  - Standard consensus
  - Consensus in random environments
- Distributed Filtering: Consensus + Innovations
  - Random field (parameter) estimation
  - Realistic large scale problem: intermittency (infrastructure failures, sensor failures), random protocols (gossip), limited resources (quantization)
- Two Linear Estimators:
  - LU: stochastic approximation
  - GLU: mixed time scale estimator
- Performance Analysis: Asymptotics
- Conclusion


Consensus: Distributed Averaging

Network of (cooperating) agents updating their beliefs:

(Distributed) Consensus: asymptotic agreement when λ₂(L) > 0

DeGroot, JASA 1974; Tsitsiklis 1984; Tsitsiklis, Bertsekas, Athans, IEEE T-AC 1986; Jadbabaie, Lin, Morse, IEEE T-AC 2003
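The standard consensus iteration can be sketched as x(k+1) = x(k) − ε L x(k); the path graph, step size, and initial beliefs below are illustrative choices (λ₂(L) > 0 holds because the graph is connected):

```python
import numpy as np

def laplacian_path(n):
    """Graph Laplacian L of a path on n nodes (connected, so lambda_2(L) > 0)."""
    L = np.zeros((n, n))
    for i in range(n - 1):
        L[i, i] += 1.0; L[i + 1, i + 1] += 1.0
        L[i, i + 1] -= 1.0; L[i + 1, i] -= 1.0
    return L

def consensus(x0, L, eps, steps):
    """Distributed averaging: x(k+1) = x(k) - eps * L x(k)."""
    x = x0.copy()
    for _ in range(steps):
        x = x - eps * (L @ x)
    return x
```

For stability the step size must satisfy eps < 2/λ_max(L); the symmetric update preserves the network average, so the agents agree asymptotically at the average of the initial beliefs.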


Consensus: random links, communication or quantization noise

Consensus (reinterpreted): a.s. convergence to an unbiased random variable θ:

Consensus in Random Environments

Xiao, Boyd, Syst. Contr. Lett., 2004; Olfati-Saber, ACC 2005; Kar, Moura, Allerton 2006 and T-SP 2010; Jakovetic, Xavier, Moura, T-SP 2010; Boyd, Ghosh, Prabhakar, Shah, T-IT 2006

Var θ ≤ 2Mσ²(1−p)/N², with weights satisfying Σ_{i≥0} α(i)² < ∞
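Consensus with decaying weights under random link failures can be sketched as follows; the path graph, failure probability, and the weight schedule α(i) (positive, divergent sum, square-summable) are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_consensus(x0, edges, p_fail, steps):
    """Averaging with Bernoulli link failures and decaying weights alpha(i)
    (sum alpha = inf, sum alpha^2 < inf); symmetric updates preserve the
    network average on every sample path."""
    x = x0.copy()
    for i in range(steps):
        alpha = min(0.25, 5.0 / (i + 1))  # divergent sum, square-summable tail
        dx = np.zeros_like(x)
        for (a, b) in edges:
            if rng.random() > p_fail:  # link alive this round
                dx[a] += x[b] - x[a]
                dx[b] += x[a] - x[b]
        x = x + alpha * dx
    return x
```

Because each realized update is symmetric, the average is preserved pathwise; communication noise or asymmetric gossip is what makes the consensus limit a nondegenerate (but unbiased) random variable.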


In/Out Network Time Scale Interactions

Consensus: in-network dominated interactions; fast communication (cooperation) vs. slow sensing (exogenous, local): ζcomm « ζsensing

Consensus + innovations: in- and out-of-network balanced interactions; communication and sensing at every time step: ζcomm ~ ζsensing

Distributed filtering: Consensus + Innovations


Filtering: Random Field

- Random field: network of agents; each agent observes a local measurement of the field
- Intermittency: sensors fail at random times
- Structural failures (random links) / random protocol (gossip)
- Quantization / communication noise: spatially correlated, temporally i.i.d.


Consensus+Innovations: Generalized Linear Unbiased

Distributed inference: generalized linear unbiased (GLU) estimator

[GLU update: a consensus term (local averaging) with consensus weights, an "innovations" term scaled by an innovations gain (the "Kalman gain"), and a "prediction".]

Weight sequences: β(i) > 0, Σ_{i≥0} β(i) = ∞, Σ_{i≥0} β²(i) < ∞
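A minimal single-time-scale sketch of the consensus + innovations form (closer to the LU stochastic-approximation variant than to the mixed-time-scale GLU); the scalar parameter, sensing pattern h, path-graph Laplacian, and common weight schedule are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Only the end agents sense theta (h = [1,0,0,0,1]); no single agent is
# observable in isolation, but sum(h^2) > 0 plus a connected graph gives
# distributed observability.
theta_true = 4.0
h = np.array([1.0, 0.0, 0.0, 0.0, 1.0])
L = np.array([[ 1., -1.,  0.,  0.,  0.],
              [-1.,  2., -1.,  0.,  0.],
              [ 0., -1.,  2., -1.,  0.],
              [ 0.,  0., -1.,  2., -1.],
              [ 0.,  0.,  0., -1.,  1.]])

x = np.zeros(5)
for i in range(20000):
    w = min(0.25, 50.0 / (i + 1))         # positive, divergent sum, square-summable
    y = h * theta_true + 0.5 * rng.standard_normal(5)   # noisy local observations
    # consensus term (local averaging) plus innovations term
    x = x - w * (L @ x) + w * h * (y - h * x)
```

Every agent's estimate approaches the true parameter: the consensus term spreads the end agents' information through the network while the innovations term injects new measurements.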


Consensus+Innovations: Asymptotic Properties

Properties: asymptotic unbiasedness, consistency, mean-square convergence, asymptotic normality; compare distributed to centralized performance.

Structural conditions:
- Distributed observability: the matrix G is full rank
- Distributed connectivity: network connected in the mean


Consensus+Innovations: GLU

Observation: the observation noise {ξ_n(i)} is i.i.d. in time and spatially correlated, with E_θ ||ξ(i)||^{2+ε₁} < ∞.

Assumptions: noise i.i.d., spatially correlated; Laplacians L(i) i.i.d. and independent of the noise; distributed observable + connected on average.

A6 assumption: the weight sequences satisfy positivity, divergent sum, and square summability.

Soummya Kar and José M. F. Moura, IEEE J. Selected Topics in Signal Processing, Aug. 2011.


Consensus+Innovations: GLU Properties

Assume A1-A6 hold, with 0 ≤ γ₀ < 0.5 and a generic noise distribution (finite second moment).

Consistency: sensor n is consistent: P_{θ*}( lim_{i→∞} x_n(i) = θ* ) = 1, for all n.

Asymptotic normality: the asymptotic variance matches that of the centralized estimator.

Efficiency: further, if the noise is Gaussian, the GLU estimator is asymptotically efficient.


Consensus+Innovations: Remarks on Proofs

- Define the error process and find a dynamic equation for it.
- Show that a suitable functional of the error is a nonnegative supermartingale; it converges a.s., hence is pathwise bounded (this shows consistency).
- Strong convergence rates: study the sample paths more critically.
- Characterize the information flow (consensus): study convergence to the averaged estimate.
- Study the limiting properties of the averaged estimate: the rate at which it converges to the centralized estimate; properties of the centralized estimator are then used to conclude.


Outline
- Intermittency: networked systems, packet loss
- Random Riccati Equation: stochastic boundedness
- Random Riccati Equation: invariant distribution
- Random Riccati Equation: moderate deviation principle
  - Rate of decay of the probability of rare events
  - Scalar numerical example
- Conclusions


Kalman Filtering with Intermittent Observations

Model:

Intermittent observations:

Optimal Linear Filter (conditioned on path of observations) – Kalman filter with Random Riccati Equation

x_{t+1} = A x_t + w_t

y_t = C x_t + v_t

P_t = E[ (x_t − x̂_{t|t−1})(x_t − x̂_{t|t−1})^T | {ỹ(s)}_{0≤s<t} ]

P_{t+1} = A P_t A^T + Q − γ_t A P_t C^T (C P_t C^T + R)^{−1} C P_t A^T
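The random Riccati recursion can be simulated directly; the unstable scalar system below and the arrival probability are illustrative (for a scalar system the critical arrival probability is 1 − 1/A² ≈ 0.31 here, so 0.9 is comfortably above it and the sequence stays stochastically bounded):

```python
import numpy as np

rng = np.random.default_rng(2)

def rre_step(P, gamma, A, C, Q, R):
    """One step of the random Riccati equation:
    P <- A P A' + Q - gamma * A P C' (C P C' + R)^{-1} C P A',
    with gamma in {0, 1} marking whether the packet arrived."""
    S = C @ P @ C.T + R
    return A @ P @ A.T + Q - gamma * (A @ P @ C.T @ np.linalg.inv(S) @ C @ P @ A.T)

A = np.array([[1.2]]); C = np.array([[1.0]])   # unstable scalar system
Q = np.array([[1.0]]); R = np.array([[1.0]])

P = np.array([[1.0]])
traj = []
for t in range(1000):
    g = float(rng.random() < 0.9)   # Bernoulli packet arrivals
    P = rre_step(P, g, A, C, Q, R)
    traj.append(P[0, 0])
```

The trajectory hovers near the deterministic Riccati fixed point, with excursions during runs of dropped packets; lowering the arrival probability toward the critical value makes the excursions unbounded in expectation.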



Random Riccati Equation (RRE)

The sequence {P_t}_{t∈T₊} is random. Define operators f₀(X) (measurement lost) and f₁(X) (measurement received) and re-express P_t as their composition along the packet-arrival sequence.

[2] S. Kar, Bruno Sinopoli, and J. M. F. Moura, "Kalman filtering with intermittent observations: weak convergence to a stationary distribution," IEEE Trans. Automatic Control, Jan. 2012.



Random Riccati Equation: Invariant Distribution

Stochastic boundedness: γ_sb = inf{ γ ∈ [0,1] : {P_t}_{t∈Z₊} is stochastically bounded, for all P₀ ∈ S₊^N }

Support of the invariant measure: supp(μ^γ) = cl(S)

The invariant measure sits above the deterministic fixed point: μ^γ({ Y ∈ S₊^N : Y ⪰ P* }) = 1


Moderate Deviation Principle (MDP)

Interested in the probability of rare events. As γ → 1, the rare event is that the steady-state covariance stays away from P* (the deterministic Riccati fixed point). The RRE satisfies an MDP at a given scale: Pr(rare event) decays exponentially fast, with a good rate function.

Strings: counting the numbers of f₀'s and f₁'s. The string (f₀, f₁, f₁, f₁, f₀, f₀, P₀) is written concisely as (f₀, f₁³, f₀², P₀).

Soummya Kar and José M. F. Moura, "Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations," IEEE Trans. Automatic Control.


MDP for Random Riccati Equation

[Figure: MDP for the random Riccati equation; P* marks the deterministic Riccati fixed point.]

Soummya Kar and José M. F. Moura, "Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations," IEEE Trans. Automatic Control.



Support of the Measure

Example: scalar. Lyapunov/Riccati operators: f₀ (Lyapunov, measurement lost) and f₁ (Riccati, measurement received). The support is independent of γ for 0 < γ < 1.


Self-Similarity of Support of Invariant Measure ‘Fractal like’:
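The self-similar support can be visualized by enumerating the covariances reachable from the deterministic fixed point P* under strings of the two operators; the scalar values A² = 1.44, Q = R = 1 are illustrative assumptions:

```python
import numpy as np

a2, q, r = 1.44, 1.0, 1.0   # illustrative scalar A^2, Q, R

f0 = lambda p: a2 * p + q                  # Lyapunov step (measurement lost)
f1 = lambda p: a2 * p * r / (p + r) + q    # Riccati step (measurement received)

# Deterministic Riccati fixed point P*, the positive root of f1(P) = P,
# i.e. of P^2 + (r - a2*r - q) P - q*r = 0.
b = r - a2 * r - q
pstar = (-b + np.sqrt(b * b + 4.0 * q * r)) / 2.0

# Values reachable from P* by length-12 strings of f0/f1; the closure of
# the reachable set approximates the support of the invariant measure.
reachable = {pstar}
for _ in range(12):
    reachable = {f0(p) for p in reachable} | {f1(p) for p in reachable}
```

Sorting the reachable values shows gaps at every scale, the 'fractal like' structure, and every value dominates P*, consistent with the invariant measure being supported above the deterministic fixed point.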


Class A Systems: MDP

Scalar system example.


MDP: Scalar Example

Soummya Kar and José M. F. Moura, "Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations," IEEE Trans. Automatic Control.



Conclusion

Filtering 50 years after Kalman and Kalman-Bucy: consensus + innovations for large-scale distributed networked agents.
- Intermittency: sensors fail; communication links fail
- Gossip: random protocol
- Limited power: quantization
- Observation noise

Linear estimators interleave consensus and innovations:
- Single scale: stochastic approximation
- Mixed scale: can optimize the rate of convergence and the limiting covariance

Structural conditions: distributed observability + mean connectivity.

Asymptotic properties: distributed as good as centralized; unbiased, consistent, asymptotically normal, and the mixed-scale estimator converges to the optimal centralized one.


Conclusion

Intermittency (packet loss):
- Stochastically bounded as long as the rate of measurements is strictly positive
- Random Riccati equation: the probability measure of the random covariance is invariant to the initial condition
- The support of the invariant measure is 'fractal like'
- Moderate deviation principle: rate of decay of the probability of 'bad' (rare) events as the rate of measurements grows to 1
- All is computable


Thanks

Questions?
