
Page 1:

Quantification of Facial Asymmetry for Expression-invariant Human Identification

Yanxi Liu
yanxi@cs.cmu.edu
The Robotics Institute, School of Computer Science
Carnegie Mellon University
Pittsburgh, PA USA

Page 2:

Acknowledgements
• Joint work with Drs. Karen Schmidt and Jeff Cohn (Psychology, U. of Pittsburgh).
• Students who worked on the data as research projects: Sinjini Mitra, Nicoleta Serban, and Rhiannon Weaver (statistics, CMU), Yan Karklin and Dan Bohus (computer science), and Marc Fasnacht (physics).
• Helpful discussions and advice provided by Drs. T. Minka, J. Schneider, B. Eddy, A. Moore, and G. Gordon.
• Partially funded by the DARPA HID grant to CMU entitled "Space Time Biometrics for Human Identification in Video".

Page 3:

Human Faces are Asymmetrical

[Figure: composite left-face and right-face images of the same person.]

Page 4:

Under Balanced Frontal Lighting (from the CMU PIE Database)

Page 5:

What is Facial Asymmetry?

• Intrinsic facial asymmetry in individuals is determined by biological growth, injury, age, expression, …

• Extrinsic facial asymmetry is affected by viewing orientation, illumination, shadows, highlights, …

Page 6:

Extrinsic facial asymmetry in an image is pose-variant.

[Figure: original image with its left-face and right-face composites.]

Page 7:

Facial Asymmetry Analysis
• Many studies in psychology have addressed:
  – attractiveness vs. facial asymmetry (Thornhill & Buelthoff 1999)
  – expression vs. facial movement asymmetry
• Identification:
  – Humans are extremely sensitive to facial asymmetry
  – Facial attractiveness for men is inversely related to recognition accuracy (O'Toole 1998)

Limitations: qualitative, subjective, still photos

Page 8:

Motivations

• Facial (a)symmetry is a holistic structural feature that has not been explored quantitatively before.

• It is unknown whether intrinsic facial asymmetry is characteristic of human expressions or of human identities.

Page 9:

The question to be answered in this work

How does intrinsic facial asymmetry affect human face identification?

Page 10:

DATA: Expression Videos
Cohn-Kanade AU-Coded Facial Expression Database

[Figure: joy, anger, and disgust sequences, from neutral to peak frames.]

Page 11:

Sample Facial Expression Frames

[Figure: sample neutral, joy, disgust, and anger frames.]

There are 55 subjects in total. Each subject has three distinct expression videos of varying numbers of frames, for a total of 3703 frames.

Page 12:

Face Image Normalization

Affine deformation based on 3 reference points: the two inner canthi and the philtrum.

[Figure: annotated face showing the inner canthi, the philtrum, and the face midline.]
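A minimal sketch of this normalization step, assuming OpenCV is available; the canonical landmark coordinates and the 128x128 output size are illustrative assumptions, not values from the talk.

```python
import cv2
import numpy as np

# Assumed canonical (x, y) positions in the 128x128 normalized frame:
# left inner canthus, right inner canthus, philtrum.  Chosen so the
# face midline becomes the central vertical axis of the output image.
CANONICAL = np.float32([[44, 48], [84, 48], [64, 88]])

def normalize_face(img, left_canthus, right_canthus, philtrum, size=(128, 128)):
    """Warp a face image so the three reference points land on fixed
    canonical coordinates (an affine map is fully determined by three
    point correspondences)."""
    src = np.float32([left_canthus, right_canthus, philtrum])
    M = cv2.getAffineTransform(src, CANONICAL)  # 2x3 affine matrix
    return cv2.warpAffine(img, M, size)
```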

Page 13:

Quantification of Facial Asymmetry

1. Density Difference: D-face

   D(x, y) = I(x, y) − I'(x, y)

   where I(x, y) is the normalized face image and I'(x, y) is the bilateral reflection of I(x, y) about the face midline.

2. Edge Orientation Similarity: S-face

   S(x, y) = cos θ(Ie(x, y), I'e(x, y))

   where Ie and I'e are the edge images of I and I' respectively, and θ is the angle between the two gradient vectors at each pair of corresponding points.

Page 14:

Asymmetry Faces

[Figure: original image alongside its D-face and S-face.]

One half of a D-face or S-face contains all the needed information. We call these half faces, Dh, Sh, Dhx, Dhy, Shx, and Shy, AsymFaces.
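A hypothetical extraction of these features; interpreting Dhx and Dhy as column-wise and row-wise averages of the half D-face is an assumption for illustration, not a definition taken from the talk.

```python
import numpy as np

def asym_features(D):
    """Keep one half of the D-face (it is anti-symmetric about the
    midline, so one half carries all the information) and reduce it
    to 1-D marginals.  The same slicing applies to an S-face."""
    half = D[:, : D.shape[1] // 2]   # Dh: left half of the D-face
    Dhy = np.abs(half).mean(axis=1)  # assumed: one value per row (forehead -> chin)
    Dhx = np.abs(half).mean(axis=0)  # assumed: one value per column
    return half, Dhx, Dhy
```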

Page 15:

Asymmetry Measure Dhy for Two Subjects, Each with 3 Distinct Expressions

[Figure: Dhy values over the joy, anger, and disgust sequences for each subject; the vertical axis runs from forehead to chin.]

Page 16:

[Figure: spatial (forehead to chin) and temporal plots of the asymmetry measure.]

Page 17:

[Figure: additional spatial (forehead to chin) and temporal plots of the asymmetry measure.]

Page 18:

[Figure: spatial plots (forehead to chin) of the asymmetry measure.]

Page 19:

Evaluation of the Discriminative Power of Each Dimension in AsymFace Dhy

[Plot: variance ratio for each dimension of Dhy, from forehead to chin; the bridge of the nose is marked.]
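The slide does not define the statistic, but a standard choice is the ratio of between-subject to within-subject variance, computed independently for each dimension; the sketch below assumes that definition.

```python
import numpy as np

def variance_ratio(X, labels):
    """X: (n_frames, n_dims) feature matrix (e.g. Dhy per frame);
    labels: numpy array of subject ids, one per frame.
    Returns one discriminability score per feature dimension."""
    classes = np.unique(labels)
    means = np.array([X[labels == c].mean(axis=0) for c in classes])
    within = np.mean([X[labels == c].var(axis=0) for c in classes], axis=0)
    between = means.var(axis=0)
    return between / (within + 1e-12)  # high ratio = discriminative dimension
```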

Page 20:

Most Discriminating Facial Regions Found

Page 21:

Experiment Setup

55 subjects, each with three expression video sequences (joy, anger, disgust); 3703 frames in total. The human identification test is carried out as follows:

Experiment #1: train on joy and anger, test on disgust
Experiment #2: train on joy and disgust, test on anger
Experiment #3: train on disgust and anger, test on joy
Experiment #4: train on neutral expression frames, test on peak
Experiment #5: train on peak expression frames, test on neutral

The above five experiments are carried out using (1) AsymFaces, (2) FisherFaces, and (3) AsymFaces and FisherFaces together; a sketch of one split appears below.
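A sketch of one leave-one-expression-out split (Experiment #1), using scikit-learn's LDA classifier as a stand-in for the actual identification method; the feature vectors and classifier used in the talk may differ.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def run_experiment(features, subjects, expressions, test_expr="disgust"):
    """features: (n_frames, n_dims) AsymFace vectors; subjects and
    expressions: per-frame numpy label arrays.  Trains on the other
    two expressions and reports identification accuracy on the
    held-out one."""
    test = expressions == test_expr
    clf = LinearDiscriminantAnalysis().fit(features[~test], subjects[~test])
    return clf.score(features[test], subjects[test])
```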

Page 22:

Sample Results: Combining FisherFaces (FF) with AsymFaces (AF) (Liu et al. 2002)

The data set is composed of 55 subjects, each with three expression videos. There are 1218 joy frames, 1414 anger frames, and 1071 disgust frames, for a total of 3703 frames.

Page 23:

All combinations of FF and AF features are tested and evaluated quantitatively

Page 24:

Complementing a Conventional Face Classifier

On 107 pairs of face images taken from the FERET database, it is shown that the discriminating power of the asymmetry signature:

(1) differs from chance with a p-value << 0.001;

(2) is independent of the features used in conventional classifiers, and decreases the error rate of a PCA classifier by 38% (from 15% to 9.3%).

Page 25:

Quantified Facial Asymmetry Used for Pose Estimation

Page 26:

Summary

• Quantification of facial asymmetry is computationally feasible.
• The intrinsic facial asymmetry of specific regions captures individual differences that are robust to variations in facial expression.
• AsymFaces provide discriminating information that is complementary to conventional face identification methods (FisherFaces).

Page 27:

Future Work
• (1) Construct multiple, more robust facial asymmetry measures that can capture intrinsic facial asymmetry under illumination and pose variations, using PIE as well as other publicly available facial data.
• (2) Develop computational models for studying how recognition rates are affected by facial asymmetry under gender, race, attractiveness, and hyperspectral variations.
• (3) Study pose estimation using a combination of facial asymmetry and skewed symmetry.