Parametric Density Estimation using Gaussian Mixture Models


DESCRIPTION

Based on tutorials by Jeff A. Bilmes and by Ludwig Schwardt. Slides by Pardis Noorzad.

TRANSCRIPT

Parametric Density Estimation using Gaussian Mixture Models

An Application of the EM Algorithm

Based on tutorials by Jeff A. Bilmes and by Ludwig Schwardt

Amirkabir University of Technology, Bahman 12, 1389

Outline

1 Introduction
2 Density Estimation
3 Gaussian Mixture Model
4 Some Results
5 Other Applications of EM


Introduction

What does EM do?

iterative maximization of the likelihood function
  when the data has missing values, or
  when maximization of the likelihood is difficult

data augmentation:
  $\mathcal{X}$ is the incomplete data
  $Z = (\mathcal{X}, \mathcal{Y})$ is the complete data set


EM and MLE

maximum likelihood estimation estimates the density parameters
⇒ EM application: parametric density estimation, when the density is a mixture of Gaussians


Density Estimation

The density estimation problem

Definition
Our parametric density estimation problem:

$\mathcal{X} = \{x_i\}_{i=1}^{N}$, $x_i \in \mathbb{R}^D$
$f(x_i \mid \Theta)$, with an i.i.d. assumption
$f(x_i \mid \Theta) = \sum_{k=1}^{K} \alpha_k\, p(x_i \mid \theta_k)$
$K$ components (given)
$\Theta = (\alpha_1, \dots, \alpha_K, \theta_1, \dots, \theta_K)$
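To make the mixture-density definition concrete, here is a minimal sketch (hypothetical Python/NumPy code, not part of the original slides) that evaluates $f(x \mid \Theta) = \sum_k \alpha_k\, p(x \mid \theta_k)$ at a point, assuming Gaussian component densities as in the GMM section later; the parameter values are illustrative only.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mixture_density(x, alphas, means, covs):
    """Evaluate f(x | Theta) = sum_k alpha_k * p(x | theta_k) at a single point x."""
    return sum(a * multivariate_normal.pdf(x, mean=m, cov=c)
               for a, m, c in zip(alphas, means, covs))

# Toy parameters: K = 2 Gaussian components in D = 2 dimensions (illustrative values).
alphas = [0.6, 0.4]
means  = [np.zeros(2), np.array([3.0, 3.0])]
covs   = [np.eye(2), 2.0 * np.eye(2)]
print(mixture_density(np.array([1.0, 1.0]), alphas, means, covs))
```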


Constraint on mixing probabilities $\alpha_k$: sum to unity

Since $\int_{\mathbb{R}^D} p(x_i \mid \theta_k)\, dx_i = 1$,

$1 = \int_{\mathbb{R}^D} f(x_i \mid \Theta)\, dx_i = \int_{\mathbb{R}^D} \sum_{k=1}^{K} \alpha_k\, p(x_i \mid \theta_k)\, dx_i = \sum_{k=1}^{K} \alpha_k.$


The maximum likelihood problem

Definition
Our MLE problem:

$p(\mathcal{X} \mid \Theta) = \prod_{i=1}^{N} p(x_i \mid \Theta) = \mathcal{L}(\Theta \mid \mathcal{X})$
$\Theta^{*} = \arg\max_{\Theta} \mathcal{L}(\Theta \mid \mathcal{X})$

We work with the log-likelihood $\log \mathcal{L}(\Theta \mid \mathcal{X})$:

$\log \mathcal{L}(\Theta \mid \mathcal{X}) = \sum_{i=1}^{N} \log\!\left( \sum_{k=1}^{K} \alpha_k\, p(x_i \mid \theta_k) \right)$

This is difficult to optimize directly because it contains the log of a sum.
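A small sketch of how this log-likelihood might be computed in practice (hypothetical Python/NumPy, assuming Gaussian components); the log-sum-exp trick avoids numerical underflow but does not remove the optimization difficulty noted above.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.special import logsumexp

def gmm_log_likelihood(X, alphas, means, covs):
    """log L(Theta | X) = sum_i log( sum_k alpha_k * p(x_i | theta_k) ),
    evaluated with logsumexp for numerical stability; X is an (N, D) array."""
    # log_joint[i, k] = log alpha_k + log p(x_i | theta_k)
    log_joint = np.stack([np.log(a) + multivariate_normal.logpdf(X, mean=m, cov=c)
                          for a, m, c in zip(alphas, means, covs)], axis=1)
    return logsumexp(log_joint, axis=1).sum()
```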


The EM formulation

Definition
Our EM problem:

$Z = (\mathcal{X}, \mathcal{Y})$
$\mathcal{L}(\Theta \mid Z) = \mathcal{L}(\Theta \mid \mathcal{X}, \mathcal{Y}) = p(\mathcal{X}, \mathcal{Y} \mid \Theta)$
$y_i \in \{1, \dots, K\}$
$y_i = k$ if the $i$th sample was generated by the $k$th component
$\mathcal{Y}$ is a random vector

$\log \mathcal{L}(\Theta \mid \mathcal{X}, \mathcal{Y}) = \sum_{i=1}^{N} \log\!\left( \alpha_{y_i}\, p(x_i \mid \theta_{y_i}) \right)$
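The generative reading of the complete data $Z = (\mathcal{X}, \mathcal{Y})$ can be illustrated with a short sampling sketch (hypothetical Python/NumPy, not from the slides): each hidden label $y_i$ is drawn first, then $x_i$ is drawn from component $y_i$. In EM only $\mathcal{X}$ would be observed.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gmm(n, alphas, means, covs):
    """Draw complete data (X, y): first the hidden label y_i, then x_i given y_i."""
    y = rng.choice(len(alphas), size=n, p=alphas)
    X = np.array([rng.multivariate_normal(means[k], covs[k]) for k in y])
    return X, y

# Toy 2-component example (illustrative values only).
alphas = [0.6, 0.4]
means  = [np.zeros(2), np.array([3.0, 3.0])]
covs   = [np.eye(2), 2.0 * np.eye(2)]
X, y = sample_gmm(500, alphas, means, covs)   # EM would see X but not y
```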


The EM formulation, continued

Definition
The E- and M-steps with the auxiliary function $Q$:

E-step: $Q(\Theta, \Theta^{(i-1)}) = E\!\left[ \log p(\mathcal{X}, \mathcal{Y} \mid \Theta) \mid \mathcal{X}, \Theta^{(i-1)} \right]$
M-step: $\Theta^{(i)} = \arg\max_{\Theta} Q(\Theta, \Theta^{(i-1)})$


The EM formulation, the Q function

$Q(\Theta, \Theta^{(t)}) = E\!\left[ \log p(\mathcal{X}, \mathcal{Y} \mid \Theta) \mid \mathcal{X}, \Theta^{(t)} \right]$
$= \sum_{k=1}^{K} \sum_{i=1}^{N} \log\!\left( \alpha_k\, p(x_i \mid \theta_k) \right) p(k \mid x_i, \Theta^{(t)})$
$= \sum_{k=1}^{K} \sum_{i=1}^{N} \log(\alpha_k)\, p(k \mid x_i, \Theta^{(t)}) + \sum_{k=1}^{K} \sum_{i=1}^{N} \log\!\left( p(x_i \mid \theta_k) \right) p(k \mid x_i, \Theta^{(t)})$
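The posterior weights $p(k \mid x_i, \Theta^{(t)})$ appearing in $Q$ are the E-step "responsibilities". A minimal sketch of their computation (hypothetical Python/NumPy, assuming Gaussian components as in the GMM section below):

```python
import numpy as np
from scipy.stats import multivariate_normal

def responsibilities(X, alphas, means, covs):
    """p(k | x_i, Theta) = alpha_k * p(x_i | theta_k) / sum_j alpha_j * p(x_i | theta_j).
    Returns an (N, K) matrix of posterior component probabilities."""
    weighted = np.stack([a * multivariate_normal.pdf(X, mean=m, cov=c)
                         for a, m, c in zip(alphas, means, covs)], axis=1)
    return weighted / weighted.sum(axis=1, keepdims=True)
```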


The EM formulation, solving for $\alpha_k$

We use a Lagrange multiplier to enforce $\sum_k \alpha_k = 1$:

$\frac{\partial}{\partial \alpha_k}\left[ \sum_k \sum_i \log(\alpha_k)\, p(k \mid x_i, \Theta^{(t)}) + \lambda\left( \sum_k \alpha_k - 1 \right) \right] = 0$

$\sum_i \frac{1}{\alpha_k}\, p(k \mid x_i, \Theta^{(t)}) + \lambda = 0$

Multiplying by $\alpha_k$ and summing both sides over $k$, we get $\lambda = -N$, hence

$\alpha_k = \frac{1}{N} \sum_i p(k \mid x_i, \Theta^{(t)}).$
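In code, the resulting M-step update for the mixing weights is a one-liner over the responsibility matrix (a sketch under the same assumptions as above; `responsibilities` refers to the hypothetical helper sketched earlier):

```python
import numpy as np

def update_mixing_weights(resp):
    """M-step: alpha_k = (1/N) * sum_i p(k | x_i, Theta^(t)),
    where resp is the (N, K) responsibility matrix from the E-step."""
    return resp.mean(axis=0)

# e.g. alphas_new = update_mixing_weights(responsibilities(X, alphas, means, covs))
```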


Gaussian Mixture Model

GMM: solving for $m_k$ and $\Sigma_k$

Gaussian components:

$p(x_i \mid m_k, \Sigma_k) = (2\pi)^{-D/2}\, |\Sigma_k|^{-1/2} \exp\!\left( -\tfrac{1}{2} (x_i - m_k)^T \Sigma_k^{-1} (x_i - m_k) \right)$

$m_k = \dfrac{\sum_i x_i\, p(k \mid x_i, \Theta^{(t)})}{\sum_i p(k \mid x_i, \Theta^{(t)})}$

$\Sigma_k = \dfrac{\sum_i p(k \mid x_i, \Theta^{(t)})\, (x_i - m_k)(x_i - m_k)^T}{\sum_i p(k \mid x_i, \Theta^{(t)})}$
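A sketch of the corresponding M-step updates for the means and covariances (hypothetical Python/NumPy; `resp` is the (N, K) responsibility matrix from the E-step as in the earlier sketches):

```python
import numpy as np

def update_gaussian_parameters(X, resp):
    """M-step for a GMM:
    m_k     = sum_i r_ik * x_i / sum_i r_ik
    Sigma_k = sum_i r_ik * (x_i - m_k)(x_i - m_k)^T / sum_i r_ik
    where r_ik = p(k | x_i, Theta^(t))."""
    Nk = resp.sum(axis=0)                      # effective number of points per component
    means = (resp.T @ X) / Nk[:, None]         # (K, D)
    covs = []
    for k in range(resp.shape[1]):
        diff = X - means[k]                    # (N, D)
        covs.append((resp[:, k, None] * diff).T @ diff / Nk[k])
    return means, covs
```

Alternating the responsibility computation (E-step) with `update_mixing_weights` and `update_gaussian_parameters` (M-step) until the log-likelihood stops improving gives the usual EM fit of a GMM.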


Choice of $\Sigma_k$: full covariance

Fits the data best, but costly in high-dimensional spaces.

[Figure: full covariance]


Choice of $\Sigma_k$: diagonal covariance

A tradeoff between cost and quality.

[Figure: diagonal covariance]


Choice of $\Sigma_k$: spherical covariance, $\Sigma_k = \sigma_k^2 I$

Needs many components to cover the data.

[Figure: spherical covariance]

The decision between covariance structures should be based on the size of the training data set, so that all parameters can actually be tuned.
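To make the cost side of this tradeoff concrete, here is a small hypothetical helper (not from the slides) that counts the free parameters of a K-component, D-dimensional GMM under each covariance choice; the example numbers are purely illustrative.

```python
def gmm_parameter_count(K, D, cov_type="full"):
    """Free parameters: (K - 1) mixing weights, K*D means,
    plus per-component covariance parameters depending on the structure."""
    cov_params = {"full": D * (D + 1) // 2, "diagonal": D, "spherical": 1}[cov_type]
    return (K - 1) + K * D + K * cov_params

for ct in ("full", "diagonal", "spherical"):
    print(ct, gmm_parameter_count(K=16, D=39, cov_type=ct))
```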


Some Results

Data
[Figure: data generated from a mixture of three bivariate Gaussians]

PDFs
[Figure: estimated pdf contours (after 43 iterations)]

Clusters
[Figure: each $x_i$ assigned to the cluster $k$ with the highest $p(k \mid x_i, \Theta)$]
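The hard cluster assignment used in the last figure is just an argmax over the responsibilities (a sketch under the same assumptions as the earlier helpers):

```python
import numpy as np

def assign_clusters(resp):
    """Assign each x_i to the component k with the highest p(k | x_i, Theta)."""
    return np.argmax(resp, axis=1)
```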


Other Applications of EM

A problem with missing data: life-testing experiment

The lifetime of a light bulb follows an exponential distribution with mean $\theta$.

Two separate experiments were run:
  N bulbs were tested until failure, and the failure times were recorded as $x_1, \dots, x_N$.
  M bulbs were tested, but the failure times were not recorded; only the number of bulbs, $r$, that had failed by time $t$ is known.
  The missing data are the failure times $u_1, \dots, u_M$.


EM formulation: life-testing experiment

Complete-data log-likelihood:

$\log \mathcal{L}(\theta \mid x, u) = -N(\log\theta + \bar{x}/\theta) - \sum_{i=1}^{M} (\log\theta + u_i/\theta)$

Expected lifetime of a bulb still burning at time $t$: $t + \theta^{(k)}$.
Expected lifetime of one that burned out by time $t$: $\theta^{(k)} - \dfrac{t\, e^{-t/\theta^{(k)}}}{1 - e^{-t/\theta^{(k)}}} = \theta^{(k)} - t\, h^{(k)}$.

E-step: $Q(\theta, \theta^{(k)}) = -(N+M)\log\theta - \dfrac{1}{\theta}\left( N\bar{x} + (M-r)(t + \theta^{(k)}) + r(\theta^{(k)} - t\, h^{(k)}) \right)$

M-step: $\theta^{(k+1)} = \dfrac{1}{N+M}\left( N\bar{x} + (M-r)(t + \theta^{(k)}) + r(\theta^{(k)} - t\, h^{(k)}) \right)$
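A compact sketch of this EM iteration for the light-bulb problem (hypothetical Python/NumPy; the function name and the example numbers are illustrative, not from the slides):

```python
import numpy as np

def em_bulb_mean(x, M, r, t, theta0=1.0, n_iter=100):
    """EM for exponential lifetimes with mean theta.
    x      : recorded failure times from the first experiment (length N)
    M, r, t: the second experiment tested M bulbs, of which only the number r
             that had failed by time t was recorded (the times u_i are missing)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    theta = theta0
    for _ in range(n_iter):
        h = np.exp(-t / theta) / (1.0 - np.exp(-t / theta))
        # E-step: expected total lifetime contributed by the M censored bulbs
        expected_missing = (M - r) * (t + theta) + r * (theta - t * h)
        # M-step: updated mean lifetime
        theta = (x.sum() + expected_missing) / (N + M)
    return theta

# Illustrative call with made-up numbers:
# print(em_bulb_mean(x=[2.0, 3.5, 1.2, 4.8], M=10, r=6, t=2.5))
```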


Thank you for your attention. Any questions?
