From Facial Features to Facial Expressions
A. Raouzaiou, K. Karpouzis and S. Kollias
Image, Video and Multimedia Systems Laboratory, National Technical University of Athens
Outline
The concept of archetypal expressions
FAPs-based description and estimation of FAPs
Expression synthesis using profiles
Synthesis of intermediate emotions
Archetypal Expressions
Source: F. Parke and K. Waters, Computer Facial Animation, A K Peters
Also termed universal because they are recognized across cultures
Archetypal Expressions (cont.)
Description of the archetypal expressions through muscle actions
Translation of facial muscle movements into FAPs
Creation of a FAPs vocabulary for every archetypal expression
Action Units (AUs) - FACS
e.g. AU1 = raise_l_i_eyebrow + raise_r_i_eyebrow
e.g. sadness: close_t_l_eyelid, close_t_r_eyelid, close_b_l_eyelid, close_b_r_eyelid, raise_l_i_eyebrow, raise_r_i_eyebrow, raise_l_m_eyebrow, raise_r_m_eyebrow, raise_l_o_eyebrow, raise_r_o_eyebrow
FAPs-based description
Discrete features offer a neat, symbolic representation of expressions
Not constrained to a specific face model
Suitable for face cloning applications
MPEG-4 compatible
Based on feature points, not complete features
FAPs-based description (cont.)
Two issues should be addressed:
choice of FAPs involved in profile formation
definition of FAP intensities
Expression synthesis
Choice of FAPs is based on psychological data
Intensities are derived from expression database images
Estimation of FAPs
Absence of a clear quantitative definition of FAPs
FAPs can be modeled through the movement of FDP feature points, using distances s(x, y)
e.g. close_t_r_eyelid (F20) - close_b_r_eyelid (F22): D13 = s(3.2, 3.4), f13 = D13 - D13_NEUTRAL
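The distance-based estimation above can be sketched as follows. This is a minimal illustration, assuming a hypothetical `points` dictionary that maps MPEG-4 FDP feature-point labels (e.g. "3.2") to detected (x, y) image coordinates; the function names are not from the slides.

```python
import math

def s(points, a, b):
    """Euclidean distance s(x, y) between two FDP feature points."""
    (xa, ya), (xb, yb) = points[a], points[b]
    return math.hypot(xa - xb, ya - yb)

def estimate_f13(points, points_neutral):
    """f13 = D13 - D13_NEUTRAL, with D13 = s(3.2, 3.4): the eye-opening
    distance used to estimate the eyelid FAPs (F20, F22)."""
    d13 = s(points, "3.2", "3.4")
    d13_neutral = s(points_neutral, "3.2", "3.4")
    return d13 - d13_neutral

# Illustrative coordinates: the right eye is more closed than in the
# neutral frame, so the estimated FAP value is negative.
neutral = {"3.2": (120.0, 80.0), "3.4": (120.0, 92.0)}
current = {"3.2": (120.0, 82.0), "3.4": (120.0, 90.0)}
print(estimate_f13(current, neutral))  # -> -4.0
```

A negative value indicates the eyelids closing relative to the neutral face; the sign convention follows the f13 = D13 - D13_NEUTRAL formula above.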
Sample FAP vocabulary
Sadness: close_t_l_eyelid (F19), close_t_r_eyelid (F20), close_b_l_eyelid (F21), close_b_r_eyelid (F22), raise_l_i_eyebrow (F31), raise_r_i_eyebrow (F32), raise_l_m_eyebrow (F33), raise_r_m_eyebrow (F34), raise_l_o_eyebrow (F35), raise_r_o_eyebrow (F36)
Archetypal Expression Profiles
Profile: a set of FAPs accompanied by the corresponding range of variation
definition of subsets of candidate FAPs
use of range variations obtained from statistics
animation of the corresponding profiles to verify appropriateness
face formations from psychological studies
Sample Profiles of Anger
A1: F4[22, 124], F31[-131, -25], F32[-136, -34], F33[-189, -109], F34[-183, -105], F35[-101, -31], F36[-108, -32], F37[29, 85], F38[27, 89]
A2: F19[-330, -200], F20[-335, -205], F21[200, 330], F22[205, 335], F31[-200, -80], F32[-194, -74], F33[-190, -70], F34[-190, -70]
A3: F19[-330, -200], F20[-335, -205], F21[200, 330], F22[205, 335], F31[-200, -80], F32[-194, -74], F33[70, 190], F34[70, 190]
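A profile can be turned into a concrete animation frame by choosing one intensity per FAP inside its allowed range. The sketch below samples each FAP of profile A2 uniformly within the ranges listed above; the sampling strategy itself is an assumption (the slides only define the ranges, not how a point is picked).

```python
import random

# Anger profile A2, as listed above: FAP -> (low, high) range of variation.
ANGER_A2 = {
    "F19": (-330, -200), "F20": (-335, -205),
    "F21": (200, 330),   "F22": (205, 335),
    "F31": (-200, -80),  "F32": (-194, -74),
    "F33": (-190, -70),  "F34": (-190, -70),
}

def instantiate(profile, rng=random):
    """Pick one concrete intensity per FAP, uniformly within its range.
    A fixed fraction of each range could be used instead for
    deterministic animation."""
    return {fap: rng.uniform(lo, hi) for fap, (lo, hi) in profile.items()}

frame = instantiate(ANGER_A2)
assert all(lo <= frame[f] <= hi for f, (lo, hi) in ANGER_A2.items())
```

Every sampled frame stays inside the profile, so any such frame should still be recognizable as anger when animated.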
Emotion representation
Emotions can be approached as points on a plane defined by activation and evaluation
Intermediate Expression Profiles: Same universal emotion category
Animation of the same FAPs using different intensities
Absence of expert knowledge for the (+, –) quadrant
worry < fear < terror
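Within the same emotion category, the same FAPs are animated at different intensities. A minimal sketch of this grading, assuming a hypothetical fear profile and illustrative activation factors (neither the ranges nor the factors are from the slides):

```python
# Hypothetical fear profile: FAP -> (low, high) range, for illustration only.
FEAR = {"F19": (-300, -180), "F31": (60, 180), "F32": (60, 180)}

def graded(profile, factor):
    """Scale the midpoint of each FAP range by an activation factor,
    so the same FAPs yield weaker or stronger expressions."""
    return {fap: factor * (lo + hi) / 2 for fap, (lo, hi) in profile.items()}

worry  = graded(FEAR, 0.4)  # attenuated version of the archetypal expression
fear   = graded(FEAR, 1.0)  # archetypal intensity
terror = graded(FEAR, 1.5)  # exaggerated version
print(worry["F31"], fear["F31"], terror["F31"])  # -> 48.0 120.0 180.0
```

The ordering worry < fear < terror then corresponds to increasing the activation factor while keeping the FAP set fixed.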
Intermediate Expression Profiles: Different universal emotion categories
In the same evaluation half-plane
Averaging of FAPs used in universal emotions
afraid + sad = depressed
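For two universal emotions in the same evaluation half-plane, an intermediate profile can be obtained by averaging the FAP ranges they employ. A sketch of the afraid + sad = depressed case, using hypothetical placeholder ranges and averaging only the FAPs shared by both vocabularies (both choices are assumptions for illustration):

```python
# Hypothetical profiles: FAP -> (low, high) ranges, placeholders only.
AFRAID = {"F19": (-320, -200), "F31": (50, 170), "F33": (60, 180)}
SAD    = {"F19": (-280, -160), "F31": (30, 130), "F35": (40, 140)}

def average_profiles(p1, p2):
    """Average the lower and upper bounds of each FAP present in both
    profiles, yielding an intermediate expression profile."""
    common = p1.keys() & p2.keys()
    return {f: ((p1[f][0] + p2[f][0]) / 2, (p1[f][1] + p2[f][1]) / 2)
            for f in common}

depressed = average_profiles(AFRAID, SAD)
print(depressed["F31"])  # -> (40.0, 150.0)
```

FAPs used by only one of the two emotions (F33, F35 here) are dropped in this sketch; they could equally be kept at half intensity, which the slides leave open.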
Conclusions
FAPs provide a compact and established means of emotion representation
Necessary input from psychological and physiological studies
Universal emotions can be used to synthesize intermediate ones
Useful for low-bitrate MPEG-4 applications
Extensions
Verification – Evaluation
Initial results: acceptable performance for expression grading
Intermediate expressions: better results for the negative evaluation half-plane
Lack of linguistic rules for the (+, –) quadrant
Extensions (cont.)
Personalized ECAs: detected facial feature points can be used to adapt a generic ECA head (FDP FPs)
Intermediate emotions based on processing real data (FAP extraction)
Processing real data: the temporal aspect of FAPs