
EE4-62 MLCV

Lecture 13-14: Face Recognition – Subspace/Manifold Learning

Tae-Kyun Kim

Face Image Tagging and Retrieval

• Face tagging at commercial weblogs

• Key issues:
– User interaction for face tags
– Representation of long-time accumulated data
– Online and efficient learning

• An active research area in the Face Recognition Test and MPEG-7, for face image retrieval and automatic passport control

• Our proposal was promoted to the MPEG-7 ISO/IEC standard


Principal Component Analysis (PCA)
- Maximum variance formulation of PCA
- Minimum-error formulation of PCA
- Probabilistic PCA


Maximum Variance Formulation of PCA
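The derivation for this section survives only as images in this transcript. As a sketch of the standard maximum-variance argument that the heading names (not necessarily the slides' exact steps):

$$\max_{\mathbf{u}_1}\; \mathbf{u}_1^{\mathsf{T}} \mathbf{S} \mathbf{u}_1 \quad \text{subject to} \quad \mathbf{u}_1^{\mathsf{T}}\mathbf{u}_1 = 1.$$

Introducing a Lagrange multiplier $\lambda_1$ and setting the derivative of $\mathbf{u}_1^{\mathsf{T}}\mathbf{S}\mathbf{u}_1 + \lambda_1(1 - \mathbf{u}_1^{\mathsf{T}}\mathbf{u}_1)$ to zero gives $\mathbf{S}\mathbf{u}_1 = \lambda_1\mathbf{u}_1$: the maximizer is an eigenvector of the covariance matrix $\mathbf{S}$, and the projected variance $\mathbf{u}_1^{\mathsf{T}}\mathbf{S}\mathbf{u}_1 = \lambda_1$ is largest for the eigenvector with the largest eigenvalue.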


Minimum-error formulation of PCA
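This section's slides are likewise image-only here. A sketch of the standard minimum-error formulation the heading refers to: approximating each point by its projection onto an $M$-dimensional subspace with orthonormal basis $\{\mathbf{u}_i\}$, the mean reconstruction error is

$$J = \frac{1}{N}\sum_{n=1}^{N} \|\mathbf{x}_n - \tilde{\mathbf{x}}_n\|^2 = \sum_{i=M+1}^{D} \lambda_i,$$

so the error is minimized by discarding the directions with the smallest eigenvalues, i.e. keeping the $M$ leading eigenvectors, which is the same solution as the maximum-variance formulation.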


Applications of PCA to Face Recognition


(Recap) Geometrical interpretation of PCA
• Principal components are the vectors in the directions of maximum variance of the projected samples.
• Each two-dimensional data point is transformed to a single variable z1, representing the projection of the data point onto the eigenvector u1.
• The data points projected onto u1 have the maximum variance.
• Infer the inherent structure of high-dimensional data.
• The intrinsic dimensionality of the data is much smaller.
• For given 2D data points, u1 and u2 are found as the principal components (see the sketch below).
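To illustrate these points, here is a minimal NumPy sketch (not the lecture's Matlab demo; the data are synthetic) that finds u1 for 2D points and projects onto it:

```python
import numpy as np

# Synthetic 2D data with a dominant direction of variance.
rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 0.6]], size=200)  # N x 2

x_bar = X.mean(axis=0)
S = np.cov(X, rowvar=False)              # 2 x 2 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)     # eigenvalues in ascending order
u1 = eigvecs[:, -1]                      # direction of maximum variance

z1 = (X - x_bar) @ u1                    # each 2D point -> one variable z1
print("variance along u1:", z1.var(ddof=1), "= largest eigenvalue:", eigvals[-1])
```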


Eigenfaces
• Collect a set of face images
• Normalize for scale and orientation (using eye locations)
• Construct the covariance matrix and obtain its eigenvectors

For a $w \times h$ face image vectorized into $\mathbf{x} \in \mathbb{R}^D$, $D = wh$:

$$\mathbf{X} = [\mathbf{x}_1, \ldots, \mathbf{x}_N] \in \mathbb{R}^{D \times N}$$

$$\mathbf{S} = \frac{1}{N} \sum_{n=1}^{N} (\mathbf{x}_n - \bar{\mathbf{x}})(\mathbf{x}_n - \bar{\mathbf{x}})^{\mathsf{T}}$$

$$\mathbf{S}\mathbf{U} = \mathbf{U}\boldsymbol{\Lambda}, \quad \mathbf{U} \in \mathbb{R}^{D \times M}$$
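The lecture's demo code is in Matlab and is not reproduced in this transcript. The following NumPy sketch implements the same construction; it uses the classic trick of eigendecomposing the small N x N Gram matrix when D >> N (whether the lecture uses this trick is an assumption):

```python
import numpy as np

def eigenfaces(X, M):
    """Compute the M leading eigenfaces.

    X : (D, N) matrix of N vectorized w*h face images (D = w*h).
    Returns the mean face (D,) and U (D, M) with orthonormal columns.
    """
    x_bar = X.mean(axis=1, keepdims=True)
    Xc = X - x_bar                          # mean-centre the data
    # For D >> N, eigendecompose the small N x N Gram matrix instead of
    # the D x D covariance S = (1/N) Xc Xc^T: if G v = lam v, then
    # S (Xc v) = lam (Xc v), so Xc v is an eigenvector of S.
    G = Xc.T @ Xc / X.shape[1]              # N x N Gram matrix
    vals, V = np.linalg.eigh(G)             # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:M]        # indices of the M largest
    U = Xc @ V[:, idx]                      # map back to D-dim space
    U /= np.linalg.norm(U, axis=0)          # orthonormal basis
    return x_bar.ravel(), U
```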


Eigenfaces
• Project the (mean-centred) data onto the subspace: $\mathbf{Z} = \mathbf{U}^{\mathsf{T}}\mathbf{X}$, $\mathbf{Z} \in \mathbb{R}^{M \times N}$, $M \ll D$
• Reconstruction is obtained as $\tilde{\mathbf{x}} = \sum_{i=1}^{M} z_i \mathbf{u}_i$, i.e. $\tilde{\mathbf{X}} = \mathbf{U}\mathbf{Z}$
• Use the distance to the subspace, $\|\mathbf{x} - \tilde{\mathbf{x}}\|$, for face recognition
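A matching NumPy sketch of projection, reconstruction, and distance-to-subspace recognition, using the eigenfaces() helper above (the mean face is subtracted before projection and added back on reconstruction, which the slide leaves implicit):

```python
import numpy as np

def project(U, x_bar, X):
    """Z = U^T (X - x_bar): project faces onto the eigenface subspace."""
    return U.T @ (X - x_bar[:, None])

def reconstruct(U, x_bar, Z):
    """X_tilde = x_bar + U Z: map projection coefficients back to images."""
    return x_bar[:, None] + U @ Z

def distance_to_subspace(U, x_bar, x):
    """||x - x_tilde||: small for images that lie near the face subspace."""
    z = U.T @ (x - x_bar)
    x_tilde = x_bar + U @ z
    return np.linalg.norm(x - x_tilde)
```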


Matlab Demos – Face Recognition by PCA


• Face images
• Eigenvectors and eigenvalue plot
• Face image reconstruction
• Projection coefficients (visualisation of high-dimensional data)
• Face recognition


Probabilistic PCA

• A subspace is spanned by an orthonormal basis (the eigenvectors computed from the covariance matrix).

• Each observation can be interpreted with a generative model.

• The probability of generating each observation is estimated (approximately) with a Gaussian distribution.

• PCA assumes a uniform prior on the subspace; PPCA assumes a Gaussian distribution (see the model below).
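The derivation slides are image-only in this transcript; the standard PPCA generative model (Tipping & Bishop), which these bullets describe, is:

$$\mathbf{x} = \mathbf{W}\mathbf{z} + \boldsymbol{\mu} + \boldsymbol{\epsilon}, \qquad \mathbf{z} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}), \qquad \boldsymbol{\epsilon} \sim \mathcal{N}(\mathbf{0}, \sigma^2\mathbf{I}),$$

so that marginally $\mathbf{x} \sim \mathcal{N}(\boldsymbol{\mu},\, \mathbf{W}\mathbf{W}^{\mathsf{T}} + \sigma^2\mathbf{I})$.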


Continuous Latent Variables


Probabilistic PCA


Maximum likelihood PCA
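The slides for this section are image-only; the standard maximum-likelihood solution (Tipping & Bishop, 1999) that the heading refers to is:

$$\mathbf{W}_{\mathrm{ML}} = \mathbf{U}_M(\boldsymbol{\Lambda}_M - \sigma^2\mathbf{I})^{1/2}\mathbf{R}, \qquad \sigma^2_{\mathrm{ML}} = \frac{1}{D - M}\sum_{i=M+1}^{D}\lambda_i,$$

where $\mathbf{U}_M$ holds the $M$ leading eigenvectors of $\mathbf{S}$, $\boldsymbol{\Lambda}_M$ the corresponding eigenvalues, and $\mathbf{R}$ is an arbitrary $M \times M$ orthogonal matrix.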


Limitations of PCA


PCA vs LDA (Linear Discriminant Analysis)
PCA is unsupervised learning: it ignores class labels and finds the directions of maximum variance, whereas LDA uses the labels to find directions that best separate the classes (a minimal sketch follows).
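A minimal two-class Fisher LDA sketch (illustrative only, not the lecture's code; the function name is mine):

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Fisher discriminant direction w ~ Sw^{-1} (m1 - m0) for two classes.

    X0, X1 : (N0, D) and (N1, D) arrays of labelled samples.
    Unlike PCA, this uses the class labels via the within-class scatter.
    """
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: sum of centred outer products over both classes.
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)
```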


PCA vs Kernel PCA
• PCA is a linear model: a linear manifold is a subspace.
• Kernel PCA models a nonlinear manifold (see the sketch below).
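A minimal kernel PCA sketch with an RBF kernel (the kernel choice and the gamma parameter are assumptions; the lecture does not specify them here):

```python
import numpy as np

def kernel_pca(X, M, gamma=1.0):
    """Kernel PCA with an RBF kernel (illustrative sketch).

    X : (N, D) data. Returns the (N, M) projections of the data onto
    the M leading principal components in feature space.
    """
    # RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

    # Centre the kernel matrix in feature space.
    N = X.shape[0]
    one = np.ones((N, N)) / N
    Kc = K - one @ K - K @ one + one @ K @ one

    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:M]         # M largest eigenvalues
    # Normalize so the feature-space eigenvectors have unit length.
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                       # projections of the data
```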


PCA vs ICA (Independent Component Analysis)
PCA rests on a Gaussian distribution assumption; ICA instead seeks statistically independent, non-Gaussian components. [Figure: principal components PC1, PC2 versus independent components IC1, IC2 on non-Gaussian data; a sketch follows.]
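A small sketch contrasting the two on synthetically mixed non-Gaussian sources; it assumes scikit-learn's PCA and FastICA are available (this is not the lecture's code):

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA  # assumes scikit-learn

# Two independent, non-Gaussian (uniform) sources, linearly mixed.
rng = np.random.default_rng(0)
S = rng.uniform(-1, 1, size=(2000, 2))          # independent sources
A = np.array([[2.0, 1.0], [1.0, 1.5]])          # mixing matrix
X = S @ A.T                                     # observed mixtures

pca = PCA(n_components=2).fit(X)
ica = FastICA(n_components=2, random_state=0).fit(X)

print("PC1, PC2 (orthogonal, max-variance directions):\n", pca.components_)
print("IC1, IC2 (statistically independent directions):\n", ica.components_)
```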


[Figure: basis images, also obtained by ICA]
