
Page 1

The Relative Entropy Rate of Two Hidden Markov Processes

Or Zuk
Dept. of Physics of Complex Systems
Weizmann Institute of Science
Rehovot, Israel

Page 2: Overview

o Introduction
o Distance Measures and Relative Entropy Rate
o Results: Generalization from Entropy Rate
o Future Directions

Page 3: Introduction

Hidden Markov Processes are relevant in many fields:

o Error correction (Markovian source + noise)
o Signal processing, speech recognition
o Experimental physics: telegraph noise, TLS + noise, quantum jumps
o Bioinformatics: biological sequences, gene expression

[Figures: schematic of a Markov chain transmitted through a 10%-noise channel to give an HMP; quantum-jump counts vs. time (ms); resistance R (MOhm) traces of mesoscopic wires.]

Page 4: HMP - Definitions

Markov Process:
X – Markov process
Mλ – transition matrix
mλ(i,j) = Pr(Xn+1 = j | Xn = i)

Hidden Markov Process:
Y – noisy observation of X
Rλ – noise/emission matrix
rλ(i,j) = Pr(Yn = j | Xn = i)

[Diagram: Xn → Xn+1 via Mλ; Xn → Yn and Xn+1 → Yn+1 via Rλ.]

Models are denoted by λ and µ.
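To make these definitions concrete, here is a minimal Python sketch (mine, not part of the talk) that samples an HMP from a transition matrix M and an emission matrix R; starting the hidden chain from its stationary distribution is an assumption I have added.

```python
import numpy as np

def stationary(M):
    """Stationary distribution pi of a transition matrix M (pi M = pi)."""
    evals, evecs = np.linalg.eig(M.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    return pi / pi.sum()

def sample_hmp(M, R, n, seed=None):
    """Sample n steps of an HMP: hidden chain X via M, observations Y via R.

    M[i, j] = Pr(X_{t+1} = j | X_t = i),  R[i, j] = Pr(Y_t = j | X_t = i).
    """
    rng = np.random.default_rng(seed)
    k, s = R.shape
    X = np.empty(n, dtype=int)
    Y = np.empty(n, dtype=int)
    X[0] = rng.choice(k, p=stationary(M))   # start at stationarity
    Y[0] = rng.choice(s, p=R[X[0]])
    for t in range(1, n):
        X[t] = rng.choice(k, p=M[X[t - 1]])  # hidden Markov step
        Y[t] = rng.choice(s, p=R[X[t]])      # noisy observation
    return X, Y
```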

Page 5: Example: Binary HMP

[Diagram: a two-state chain, 0 ⇄ 1, with transition probabilities p(j|i) and emission probabilities q(j|i).]

Transition matrix:

M = [ p(0|0)  p(1|0)
      p(0|1)  p(1|1) ]

Emission matrix:

R = [ q(0|0)  q(1|0)
      q(0|1)  q(1|1) ]

Page 6: Example: Binary HMP (Cont.)

A simple, symmetric binary HMP:

M = [ 1-p   p
      p    1-p ]

R = [ 1-ε   ε
      ε    1-ε ]

All properties of the process depend on two parameters, p and ε. Assume w.l.o.g. p, ε < ½.
Page 7: Overview

o Introduction
o Distance Measures and Relative Entropy Rate
o Results: Generalization from Entropy Rate
o Future Directions

Page 8: Distance Measures for Two HMPs

Why is this important?

o Often one learns an HMP from data, and it is important to know how different the learned model is from the true model.
o Sometimes many HMPs represent different sources (e.g. different authors, different protein families), and we wish to know which sources are similar.

Which distance measure should we use? Look at the joint distributions Pλ(N) and Pµ(N) of N consecutive Y symbols.

Page 9: Relative Entropy (RE) Rate

Notation: Y[i,j] ≡ (Yi, Yi+1, …, Yj).

Relative entropy for finite (N-symbol) distributions:

D(Pλ(N) || Pµ(N)) = Σ_y[1,N] Pλ(N)(y[1,N]) log [ Pλ(N)(y[1,N]) / Pµ(N)(y[1,N]) ]

Take the limit to get the RE rate:

D(λ || µ) = lim_N→∞ (1/N) D(Pλ(N) || Pµ(N))

Alternative definition, using the conditional relative entropy:

D(λ || µ) = lim_N→∞ Eλ [ log ( Pλ(YN | Y[1,N-1]) / Pµ(YN | Y[1,N-1]) ) ]
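As an illustration, a brute-force Python sketch (my own, exponential in N and therefore only usable for small N) of the finite-N relative entropy, with sequence probabilities computed by the forward recursion:

```python
import itertools
import numpy as np

def stationary(M):
    evals, evecs = np.linalg.eig(M.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    return pi / pi.sum()

def seq_prob(y, M, R, pi):
    """P(Y_[1,N] = y) via the forward recursion."""
    alpha = pi * R[:, y[0]]
    for sym in y[1:]:
        alpha = (alpha @ M) * R[:, sym]
    return alpha.sum()

def finite_re(N, Ml, Rl, Mm, Rm):
    """D(P_lambda^(N) || P_mu^(N)) by exhaustive enumeration over all
    s^N observation sequences (exponential in N; small N only)."""
    pil, pim = stationary(Ml), stationary(Mm)
    d = 0.0
    for y in itertools.product(range(Rl.shape[1]), repeat=N):
        pl = seq_prob(y, Ml, Rl, pil)
        pm = seq_prob(y, Mm, Rm, pim)
        d += pl * np.log(pl / pm)
    return d
```

D(Pλ(N) || Pµ(N)) / N then approximates the RE rate as N grows.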

Page 10: Relative Entropy (RE) Rate

First proposed for HMPs by [Juang&Rabiner 85]. It is not a metric (not symmetric, no triangle inequality), but it still has several natural interpretations:

o If one generates data from λ and gives a likelihood score to µ, then D(λ || µ) is the average likelihood loss per symbol (compared to the optimal model λ).
o If one compresses data generated by λ while erroneously assuming it was generated by µ, one 'loses' on average D(λ || µ) per symbol.

For Markov chains, D(λ || µ) is easily given in closed form:

D(λ || µ) = Σ_i πλ(i) Σ_j mλ(i,j) log [ mλ(i,j) / mµ(i,j) ]

where πλ is the stationary distribution of Mλ.
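A direct transcription of this closed form (assuming strictly positive transition probabilities, so every log is finite):

```python
import numpy as np

def stationary(M):
    evals, evecs = np.linalg.eig(M.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    return pi / pi.sum()

def markov_re_rate(Ml, Mm):
    """D(lambda || mu) = sum_i pi_l(i) sum_j Ml[i,j] log(Ml[i,j]/Mm[i,j])."""
    pi = stationary(Ml)
    return float(np.sum(pi[:, None] * Ml * np.log(Ml / Mm)))
```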

Page 11: Relative Entropy (RE) Rate

For HMPs, D(λ || µ) is difficult to compute; so far, only bounds [Silva&Narayanan] and approximation algorithms [Li et al. 05, Do 03, Mohammad&Tranter 05] are known.

D(λ || µ) generalizes the Shannon entropy rate, via:

H(λ) = log s – D(λ || u)

where u is the uniform model and s is the alphabet size of Y.

The entropy rate H of an HMP is a Lyapunov exponent, which is hard to compute in general [Jacquet et al. 04]. What is known for H? A Lyapunov-exponent representation, analyticity, and asymptotic expansions in different regimes. Our aim is to generalize these results and techniques to the RE rate.
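A quick numeric check of this relation for the symmetric binary HMP, reusing binary_symmetric_hmp (Page 6 sketch) and finite_re (Page 9 sketch); here the uniform model u is an i.i.d. fair coin, an assumption consistent with s = 2:

```python
import numpy as np
# Reuses binary_symmetric_hmp and finite_re from the earlier sketches.

M, R = binary_symmetric_hmp(0.2, 0.1)
s = 2                              # alphabet size of Y
Mu = np.full((2, 2), 0.5)          # uniform model u: i.i.d. fair bits
Ru = np.full((2, 2), 0.5)
for N in (2, 4, 6, 8, 10):
    H_N = np.log(s) - finite_re(N, M, R, Mu, Ru) / N
    print(N, H_N)                  # tends to the entropy rate H(lambda)
```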

Page 12: Why is calculating D(λ || µ) difficult?

Markov chains:
o All sequences with the same number of flips have the same probability, so there is only a polynomial number of types (probabilities).

[Diagram: the 2^N hidden sequences {X} are mapped onto the 2^N observed sequences {Y}.]

HMPs:
o Many hidden chains {X} contribute to the same Y, and different {Y}s have different probabilities, so there is an exponential number of types (probabilities). The method of types does not work here.

Page 13: Overview

o Introduction
o Distance Measures and Relative Entropy Rate
o Results: Generalization from Entropy Rate
o Future Directions

Page 14: RE-Rate and Lyapunov Exponents

What is a Lyapunov exponent? It arises in dynamical systems, control theory, statistical physics, etc., and measures the stability of a system.

Take two (square) matrices A, B. At each step choose at random A (with prob. p) or B (w.p. 1-p), and look at the norm:

(1/N) log ||ABBBAABAB…BA||

The limit as N → ∞:
o Exists a.s. [Furstenberg&Kesten 60]
o Is called the top Lyapunov exponent.
o Is independent of the matrix norm chosen.

The HMP entropy rate is given as a Lyapunov exponent [Jacquet et al. 04]
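A minimal Monte Carlo sketch (mine, not from the talk): multiply a fixed vector by the random matrices, renormalizing at each step; for generic starting vectors the accumulated log-norms converge a.s. to the top Lyapunov exponent under the Furstenberg–Kesten conditions.

```python
import numpy as np

def top_lyapunov(A, B, p, n_steps=200_000, seed=0):
    """Monte Carlo estimate of the top Lyapunov exponent of random
    products of A (chosen w.p. p) and B (w.p. 1-p).

    Renormalizing the vector each step avoids under/overflow; the
    average accumulated log-norm estimates lim (1/N) log ||product||.
    """
    rng = np.random.default_rng(seed)
    v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])
    log_sum = 0.0
    for _ in range(n_steps):
        v = (A if rng.random() < p else B) @ v
        norm = np.linalg.norm(v)
        log_sum += np.log(norm)
        v /= norm
    return log_sum / n_steps
```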

Page 15: RE-Rate and Lyapunov Exponents

What about the RE rate? It is given as the difference of two Lyapunov exponents:

o The G's are random matrices, obtained simply from M and R via the forward equations.
o Different matrices appear in the two Lyapunov exponents, but the probabilities selecting the matrices are the same.

Page 16: Analyticity of the RE-Rate

Is the RE rate continuous, 'smooth', or even analytic in the parameters governing the HMPs?

For Lyapunov exponents, analyticity is known in the matrix entries [Ruelle 79] and in their probabilities [Peres 90, 91], separately.

For the HMP entropy rate, analyticity was recently shown by [Han&Marcus 05].

Page 17: Analyticity of the RE-Rate

Using both results, we are able to show:

Thm: The RE rate is analytic in the HMPs' parameters.

Analyticity is shown only in the interior of the parameter domain (i.e. strictly positive probabilities).

Behavior on the boundaries is more complicated: sometimes analyticity extends to the boundaries (and beyond), and sometimes we encounter singularities. A full characterization is still lacking [Marcus&Han 05].

Page 18: RE-Rate Taylor Series Expansion

While in general the RE rate is not known, there are specific parameter values for which it is easily given in closed form (e.g. for Markov chains). Perhaps we can 'expand' around these values and get asymptotic results near them.

A similar approach was used for Lyapunov exponents [Derrida] and for the HMP entropy rate [Jacquet et al. 04, Ordentlich&Weissman 04, Zuk et al. 05], giving first-order asymptotics in various regimes.

Page 19: Different Regimes – Binary Case

Four limiting regimes:

o p → 0, p → ½ (ε fixed)
o ε → 0, ε → ½ (p fixed)

We concentrate on the 'high-SNR' regime ε → 0 and the 'almost-memoryless' regime p → ½.

[Diagram: the (p, ε) parameter square, 0 ≤ p, ε ≤ ½, with these regimes marked.]

For high SNR (εη → 0 for η = λ, µ), the solution can be given as a power series in the noise parameters εη.
Page 20: RE-Rate Taylor Series Expansion

In [Zuk, Domany, Kanter & Aizenman 06] we give a procedure for calculating the full Taylor series expansion of the HMP entropy rate in the 'high-SNR' and 'almost-memoryless' regimes.

Main observation: finite systems give the correct RE rate up to a given order; the order-k coefficient in ε settles once N ≥ k + 2.

This was discovered through computer experiments (symbolic computation in Maple).

A stronger result holds for the entropy rate (orders settle already for N ≥ (k+3)/2).

This does not hold in every regime: in some regimes (e.g. p → 0), even the first order never settles.
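A rough numeric probe of this settling claim (my own construction, reusing binary_symmetric_hmp and finite_re from the earlier sketches): estimate the first-order (k = 1) coefficient of the conditional RE increment by a finite difference in ε, and watch it stop changing in N; the p values and step size are arbitrary choices of mine.

```python
import numpy as np
# Reuses binary_symmetric_hmp (Page 6 sketch) and finite_re (Page 9 sketch).

def D(N, e, p_lam=0.2, p_mu=0.3):
    """Finite-N RE between two symmetric binary HMPs with shared noise e."""
    Ml, Rl = binary_symmetric_hmp(p_lam, e)
    Mm, Rm = binary_symmetric_hmp(p_mu, e)
    return finite_re(N, Ml, Rl, Mm, Rm)

eps = 1e-5
for N in range(2, 7):
    # First-order coefficient (in eps) of the conditional RE increment
    # D_N - D_{N-1}, estimated by a forward finite difference at eps = 0.
    c1 = ((D(N, eps) - D(N - 1, eps)) - (D(N, 0.0) - D(N - 1, 0.0))) / eps
    print(N, c1)
```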

Page 21: Proof Outline (with M. Aizenman)

Two main ideas:

A. Distinguish between the noise at different sites: ε1, ε2, …, εj, …

B. When εm = 0, the observation Ym = Xm, so conditioning back to the past is 'blocked' at site m.

[Diagram: the hidden chain X with observations Y; a window of (k+3)/2 sites determines H(p, ε) up to O(ε^k) for H(λ), while k+2 sites are needed for D(λ || µ).]

Page 22: Overview

o Introduction
o Distance Measures and Relative Entropy Rate
o Results: Generalization from Entropy Rate
o Future Directions

Page 23: RE-Rate Taylor Series Expansion

The first-order term is given in closed form; higher orders were computed for the binary symmetric case.

Similar results hold for the 'almost-memoryless' regime. The radius of convergence seems larger for the latter expansion, albeit no rigorous results are known.

Page 24: Future Directions

o Study other regimes (e.g. two 'close' models).
o Behavior of the EM algorithm.
o Generalizations (e.g. different alphabet sizes, the continuous case).
o Physical realization of HMPs (mesoscopic systems, quantum jumps).
o Domain of analyticity; radius of convergence.

Page 25: Thanks

o Eytan Domany (Weizmann Inst.)
o Ido Kanter (Bar-Ilan Univ.)
o Michael Aizenman (Princeton Univ.)
o Libi Hertzberg (Weizmann Inst.)