Face Recognition Using Eigenfaces

Face Recognition Using Eigenfaces — Kenan Gençol, presented in the course Pattern Recognition, instructed by Asst. Prof. Dr. Kemal Özkan, Department of Electrical and Electronics Engineering, Osmangazi University



TRANSCRIPT

Page 1: Face Recognition Using Eigenfaces

Face Recognition Using Eigenfaces

Kenan Gençol

presented in the course Pattern Recognition

instructed by Asst. Prof. Dr. Kemal Özkan

Department of Electrical and Electronics Engineering, Osmangazi University

Page 2: Face Recognition Using Eigenfaces

Agenda

Introduction
Principal Component Analysis (PCA)
Eigenfaces for Recognition

Page 3: Face Recognition Using Eigenfaces

Introduction

A method introduced by Turk and Pentland from MIT in 1991.

Uses Principal Component Analysis (PCA) as its mathematical framework.

Page 4: Face Recognition Using Eigenfaces

Principal Component Analysis (PCA)

What is it?

It is a powerful tool for analysing data.

Patterns can be hard to find in complex, high-dimensional data.

PCA reduces a complex data set to a lower dimension, identifies patterns in the data, and highlights their similarities and differences.

Page 5: Face Recognition Using Eigenfaces

Principal Component Analysis (PCA)

The goal of PCA is to find the most meaningful basis to re-express a data set.

PCA asks: is there another basis, a linear combination of the original basis, that best re-expresses our data set?

It uses variance and covariance for this goal.

Page 6: Face Recognition Using Eigenfaces

PCA - Mathematical Foundations

The covariance measures the degree of the linear relationship between two variables.

If positive, the data are positively correlated.
If negative, the data are negatively correlated.
If zero, the data are uncorrelated.

The absolute magnitude of the covariance measures the degree of redundancy.
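As an illustrative sketch (not part of the slides), the sign of the covariance can be checked numerically; NumPy and these variable names are assumptions:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_pos = 2 * x + 1      # moves with x: positive covariance
y_neg = -2 * x + 1     # moves against x: negative covariance

def cov(a, b):
    # sample covariance: mean of (a - mean(a)) * (b - mean(b))
    return np.mean((a - a.mean()) * (b - b.mean()))
```

Here `cov(x, y_pos)` is positive and `cov(x, y_neg)` is negative, with the same absolute magnitude since the two lines have opposite slopes.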

Page 7: Face Recognition Using Eigenfaces

PCA - Mathematical Foundations

The covariance matrix shows the relationships between higher dimensions.

For n dimensions, it is an nxn matrix.
It is a square, symmetric matrix.
The diagonal terms are the variances; the off-diagonal terms are the covariances.
Off-diagonal terms with large magnitudes correspond to high redundancy.
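A minimal sketch of these properties (NumPy and the particular variables are assumptions, not from the slides): two nearly identical variables produce a large off-diagonal entry, while an independent one produces an entry near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=1000)
b = a + 0.01 * rng.normal(size=1000)   # nearly a copy of a: redundant
c = rng.normal(size=1000)              # independent of a

C = np.cov(np.stack([a, b, c]))        # 3x3 covariance matrix
# diagonal C[i, i]: variances; off-diagonal C[i, j]: covariances
# |C[0, 1]| is large (redundancy), |C[0, 2]| is near zero
```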

Page 8: Face Recognition Using Eigenfaces

PCA - Mathematical Foundations

Our goals, re-stated:

(1) minimize redundancy, measured by the magnitude of the covariance;
(2) maximize the signal, measured by the variance.

Diagonalize the covariance matrix!
This means: decorrelate the data!

Page 9: Face Recognition Using Eigenfaces

PCA - Mathematical Foundations

The diagonalization of the covariance matrix:

All off-diagonal terms should be zero; said another way, the data are decorrelated.

Each successive dimension should be rank-ordered according to variance (large variances have important structure).

Page 10: Face Recognition Using Eigenfaces

A little linear algebra...

Some crucial theorems from linear algebra make PCA work:

A matrix is symmetric if and only if it is orthogonally diagonalizable.

A symmetric matrix is diagonalized by a matrix of its orthonormal eigenvectors.

Page 11: Face Recognition Using Eigenfaces

PCA - Mathematical Foundations

So, finally:

Find the eigenvectors of the covariance matrix!
Order them by eigenvalue, highest to lowest (this gives the order of significance).
The eigenvector with the highest eigenvalue is the principal component; then come the second, third, etc.
Ignore the components of lesser significance.

Page 12: Face Recognition Using Eigenfaces

PCA - Conclusion

Results:

The final data set will have fewer dimensions than the original.

The data are aligned in a basis whose axes point along the directions of maximal variance (each successive axis is the direction along which the remaining variance is maximized).

Rank-ordering each basis vector according to its variance shows how 'principal' each direction is.

Page 13: Face Recognition Using Eigenfaces

Discussion of PCA

Principal components with larger associated variances represent important, interesting structure, while those with lower variances represent noise. This is a strong, but sometimes incorrect, assumption.

The goal of the analysis is to decorrelate the data, or, in other terms, to remove second-order dependencies in the data. In data sets where higher-order dependencies exist, PCA is insufficient at revealing all structure in the data.

Page 14: Face Recognition Using Eigenfaces

Eigenfaces for Recognition

Simply think of it as a template matching problem:

Page 15: Face Recognition Using Eigenfaces

Computation of the Eigenfaces

Let Γ be an N²x1 vector corresponding to the NxN face image Ι.

Step 1: obtain the face images Ι_1, Ι_2, ..., Ι_M (training faces).

Step 2: represent every image Ι_i as a vector Γ_i.
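Steps 1 and 2 amount to flattening each image into a long column vector; a tiny sketch (NumPy assumed, with made-up sizes in place of real face images):

```python
import numpy as np

N, M = 4, 5                    # tiny NxN "images", M training faces (illustrative sizes)
rng = np.random.default_rng(3)
faces = [rng.random((N, N)) for _ in range(M)]      # Step 1: the images I_1..I_M

# Step 2: flatten each NxN image into an N^2 x 1 column vector Gamma_i
Gammas = [I.reshape(-1, 1) for I in faces]
```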

Page 16: Face Recognition Using Eigenfaces
Page 17: Face Recognition Using Eigenfaces

Computation of the Eigenfaces

Step 3: compute the average face vector Ψ:

Ψ = (1/M) Σ_i Γ_i

Step 4: subtract the mean face:

Φ_i = Γ_i - Ψ
Page 18: Face Recognition Using Eigenfaces

[figure: the mean face]

Page 19: Face Recognition Using Eigenfaces
Page 20: Face Recognition Using Eigenfaces

Computation of the Eigenfaces

Step 5: compute the covariance matrix C:

C = (1/M) Σ_n Φ_n Φ_n^T = A A^T, where A = [Φ_1 Φ_2 ... Φ_M]

Step 6: compute the eigenvectors u_i of A A^T.

Page 21: Face Recognition Using Eigenfaces

Computation of the Eigenfaces

The matrix A A^T is very large (N²xN²): computing its eigenvectors directly is impractical!

Consider the matrix A^T A (an MxM matrix) and compute its eigenvectors v_i instead.

The eigenvectors v_i of A^T A yield the best M eigenvectors of A A^T via u_i = A v_i.

They correspond to the M EIGENFACES!

Keep only the K eigenvectors corresponding to the K largest eigenvalues.
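The trick works because if (A^T A)v = λv, then A A^T (Av) = A(A^T A v) = λ(Av). A sketch of the mapping (NumPy assumed, random stand-in data, and u_i = A v_i is the standard mapping rather than anything stated verbatim on the slide):

```python
import numpy as np

rng = np.random.default_rng(5)
N2, M = 100, 6                  # N^2 >> M, as with real images
A = rng.random((N2, M))         # columns are the mean-subtracted faces Phi_i

# eigenvectors v_i of the small MxM matrix A^T A ...
w, V = np.linalg.eigh(A.T @ A)
# ... map to eigenvectors u_i = A v_i of the huge N^2 x N^2 matrix A A^T
U = A @ V
U = U / np.linalg.norm(U, axis=0)   # normalize each eigenface to unit length
```

Each column of `U` satisfies (A A^T) u_i = λ_i u_i with the same eigenvalue λ_i found from the small matrix, so only an MxM eigenproblem ever has to be solved.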

Page 22: Face Recognition Using Eigenfaces

[figures: the 1st and 2nd eigenfaces, and a plot of the eigenvalues in decreasing order]

Page 23: Face Recognition Using Eigenfaces
Page 24: Face Recognition Using Eigenfaces

Recognition using eigenfaces

Given an unknown face image Γ, follow these steps:

Step 1: normalize Γ: Φ = Γ - Ψ.

Step 2: project onto the eigenspace: w_i = u_i^T Φ.
Page 25: Face Recognition Using Eigenfaces

Recognition using eigenfaces

Step 3: represent Φ by its weight vector: Ω = [w_1, w_2, ..., w_K]^T.

Step 4: find the face with minimum distance: e_r = min_l || Ω - Ω_l ||.

Recognize Γ as face l from the training set!
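Step 4 is a nearest-neighbor search in the K-dimensional weight space; a sketch with made-up weight vectors (NumPy and all sizes here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
K, M = 4, 5
Omegas = rng.random((K, M))            # column l: weight vector Omega_l of training face l
Omega = Omegas[:, 2] + 0.01 * rng.random(K)   # an unknown face very close to face 2

# Step 4: e_r = min over l of || Omega - Omega_l ||
dists = np.linalg.norm(Omegas - Omega[:, None], axis=0)
match = int(np.argmin(dists))          # the recognized identity
```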

Page 26: Face Recognition Using Eigenfaces

Discussion: Eigenfaces

Performance is affected by:

Background
Lighting conditions
Scale (head size)
Orientation

Page 27: Face Recognition Using Eigenfaces

Thank you!