
Page 1: Modeling Expressivity in ECAs


Modeling Expressivity in ECAs

Catherine Pelachaud, Maurizio Mancini

LINC - University of Paris 8

Page 2: Modeling Expressivity in ECAs

Behavior

Behavior is related to (Wallbott, 1998):
• the quality of the mental state (e.g. emotion) it refers to
• its quantity (somehow linked to the intensity of the mental state)

Behaviors encode:
• content information (the 'what' that is communicated)
• expressive information (the 'how' it is communicated)

Behavior expressivity refers to the manner of execution of the behavior.

Page 3: Modeling Expressivity in ECAs

Behavior Representation

Behavior = signal shape, movement, expressivity

Gesticon (Gesture Lexicon): dictionary of behavior descriptions (B. Krenn, H. Pirker, OFAI)

Modalities:
• face
• hand and arm gesture
• body movement and posture
• gaze

Page 4: Modeling Expressivity in ECAs

Behavior Representation

Face:
• facial expression
• duration: onset, apex, offset

Gesture:
• phases: preparation, pre-stroke hold, stroke, post-stroke hold, retraction
• gesture shape and movement for each phase

Head:
• head direction
• head movement

Gaze:
• eye direction

Page 5: Modeling Expressivity in ECAs

Expressivity Dimensions

Expressivity dimensions:
• Spatial: amplitude of movement
• Temporal: duration of movement
• Power: dynamic property of movement
• Fluidity: smoothness and continuity of movement
• Repetitiveness: tendency toward rhythmic repeats
• Overall Activation: quantity of movement across modalities

Implemented for gesture and facial expression
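The six dimensions above can be sketched as a parameter set. This is a minimal illustration, not the actual Greta data structure; the [-1, 1] range (0 = neutral) is an assumption consistent with the signed values shown later in the deck.

```python
from dataclasses import dataclass

@dataclass
class Expressivity:
    spatial: float = 0.0              # amplitude of movement
    temporal: float = 0.0             # duration/speed of movement
    power: float = 0.0                # dynamic property of movement
    fluidity: float = 0.0             # smoothness and continuity of movement
    repetitiveness: float = 0.0       # tendency toward rhythmic repeats
    overall_activation: float = 0.0   # quantity of movement across modalities

    def __post_init__(self):
        # Clamp every dimension to the assumed [-1, 1] range.
        for name, value in list(vars(self).items()):
            setattr(self, name, max(-1.0, min(1.0, value)))
```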

Page 6: Modeling Expressivity in ECAs

Overall Activation

• Threshold filter on atomic behaviors during APML tag matching

• Determines the number of nonverbal signals to be executed.
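A minimal sketch of such a threshold filter: each candidate signal found during APML tag matching carries an importance score (a hypothetical field for this illustration), and the activation value sets the cutoff, so higher activation keeps more signals. The mapping from activation to threshold is an assumption, not the actual Greta rule.

```python
def filter_signals(candidates, overall_activation):
    """candidates: list of (signal_name, importance) with importance in [0, 1].
    overall_activation in [-1, 1]: maps linearly to a keep-threshold in [0, 1],
    so activation 1 keeps everything and activation -1 keeps almost nothing."""
    threshold = (1.0 - overall_activation) / 2.0
    return [name for name, importance in candidates if importance >= threshold]
```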

Page 7: Modeling Expressivity in ECAs

Spatial Parameter

• Amplitude of movement controlled through asymmetric scaling of the reach space that is used to find IK goal positions
• Expands or condenses the entire space in front of the agent
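A sketch of the idea: the IK wrist goal, expressed relative to the shoulder, is pushed outward (spatial > 0) or pulled inward (spatial < 0). The asymmetric scale factors here are illustrative assumptions, not the values used in Greta.

```python
def scale_ik_goal(goal, shoulder, spatial):
    """goal, shoulder: (x, y, z) positions in cm; spatial in [-1, 1].
    Returns the scaled IK goal, moved along the shoulder-to-goal direction."""
    # Asymmetric scaling: expansion (up to +40%) is stronger than
    # condensation (down to -25%). These percentages are assumptions.
    factor = 1.0 + (0.4 * spatial if spatial >= 0 else 0.25 * spatial)
    return tuple(s + factor * (g - s) for g, s in zip(goal, shoulder))
```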

Page 8: Modeling Expressivity in ECAs

Temporal parameter

Stroke shift / velocity control of a beat gesture

[Figure: Y position of wrist w.r.t. shoulder [cm] vs. frame number]

• Determines the speed of the arm movement of a gesture's meaning-carrying stroke phase
• Modifies the speed of the stroke
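One way to realise this stroke speed control is to rescale the stroke keyframe times around the stroke's start. The mapping from the [-1, 1] temporal parameter to a speed factor is an illustrative assumption.

```python
def retime_stroke(key_times, temporal):
    """key_times: ascending times (s) of the stroke keyframes.
    temporal in [-1, 1]: > 0 makes the stroke faster (shorter duration),
    < 0 makes it slower. Returns the retimed keyframe times."""
    speed = 2.0 ** temporal  # temporal 1 -> 2x speed, -1 -> half speed
    t0 = key_times[0]
    return [t0 + (t - t0) / speed for t in key_times]
```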

Page 9: Modeling Expressivity in ECAs

Fluidity

• Continuity control of TCB interpolation splines and gesture-to-gesture coarticulation
• Continuity of the arms' trajectory paths
• Control of the velocity profile of an action

[Figure: X position of wrist w.r.t. shoulder [cm] vs. frame number]
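The tangent formulas below are the standard Kochanek-Bartels (TCB) ones; the fluidity-to-continuity mapping is an illustrative assumption (high fluidity keeps tangents smooth, low fluidity produces corners between gestures).

```python
def tcb_tangents(p_prev, p, p_next, tension=0.0, continuity=0.0, bias=0.0):
    """Incoming and outgoing tangents at control point p of a TCB spline.
    With all parameters 0 this reduces to a Catmull-Rom tangent."""
    d_in = (p - p_prev) * (1 - tension) * (1 + bias) * (1 - continuity) / 2 \
         + (p_next - p) * (1 - tension) * (1 - bias) * (1 + continuity) / 2
    d_out = (p - p_prev) * (1 - tension) * (1 + bias) * (1 + continuity) / 2 \
          + (p_next - p) * (1 - tension) * (1 - bias) * (1 - continuity) / 2
    return d_in, d_out

def continuity_from_fluidity(fluidity):
    # Assumed mapping: fluidity 1 -> c = 0 (smooth join),
    # fluidity -1 -> c = -1 (sharp corner between gestures).
    return (fluidity - 1.0) / 2.0
```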

Page 10: Modeling Expressivity in ECAs

Power

• Tension and bias control of TCB splines
• Overshoot reduction
• Acceleration and deceleration of limbs

Hand shape control for gestures that do not need a specific hand configuration to convey their meaning (beats).
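A sketch of how the power parameter could drive these dynamics: powerful movements get looser splines (lower tension) and some overshoot at the stroke end; weak movements get tense splines and none. The numeric mappings are illustrative assumptions, not Greta's actual values.

```python
def power_to_dynamics(power):
    """power in [-1, 1] -> (tcb_tension, overshoot_fraction).
    tcb_tension feeds the TCB spline; overshoot_fraction is the fraction
    of the stroke extent by which the limb overshoots before settling."""
    tension = -power * 0.5               # powerful moves use looser splines
    overshoot = max(0.0, power) * 0.15   # up to 15% overshoot (assumption)
    return tension, overshoot
```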

Page 11: Modeling Expressivity in ECAs

Repetitiveness

• Technique of stroke expansion: consecutive emphases are realized gesturally by repeating the stroke of the first gesture.
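Stroke expansion can be sketched as follows: rather than planning a new gesture per emphasis, the first gesture's stroke keyframes are replayed once per emphasis, time-shifted to land on each one. The (time, pose) keyframe schema is a simplifying assumption.

```python
def expand_stroke(stroke_keys, emphasis_times):
    """stroke_keys: [(t, pose), ...] of the first gesture's stroke, with t
    relative to the stroke's own start; emphasis_times: absolute start time
    of each consecutive emphasis. Returns the expanded keyframe list."""
    expanded = []
    for start in emphasis_times:
        # Replay the same stroke, shifted to the emphasis onset.
        expanded.extend((start + t, pose) for t, pose in stroke_keys)
    return expanded
```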

Page 12: Modeling Expressivity in ECAs

Expressivity

Expressivity values may act:
• over the whole animation: EmoTV, analysis-synthesis
• at every instant of the movement: Greta Music
• on every gesture: GEMEP corpus
• on a particular phase of the gesture: attract-attention study

Exploratory studies based on various data types:
• acted data
• real data
• 2D cartoons
• literature

Page 13: Modeling Expressivity in ECAs

Research Issue

Behavior representation:
• what to encode
• at which levels of representation
• dynamism

Implementation refinement

Page 14: Modeling Expressivity in ECAs

Expressivity over the WHOLE animation

One set of values is used, either:
• extracted manually from the annotation of a real-data video corpus
• extracted automatically from a video corpus of acted data, using image-analysis techniques

Page 15: Modeling Expressivity in ECAs

From annotations to animation
Jean-Claude Martin, Laurence Devillers, LIMSI-CNRS; Maurizio Mancini, Paris 8

Consider what is visible: annotate signals, how they are displayed, and how they are perceived.
Model what is visible: represent signals and animate them with expressivity.

Two-step approach:
• elaborate rules by analysis (video corpus)
• animate by "copy synthesis"

No model of the processes underlying the display of the signals.

Annotation → extraction → animation

Expressivity over the WHOLE animation

Page 16: Modeling Expressivity in ECAs

EmoTV: 51 clips of French TV interviews

Annotation:
• emotion labels: single emotions and blends of emotions
• multimodal behavior
• expressivity dimensions

Steps: EmoTV clip → Annotation → Extraction → Generation → Animation → GRETA animation

Expressivity over the WHOLE animation

Page 17: Modeling Expressivity in ECAs

Expressivity of gestures in a mixed emotion:

Expressivity parameter   Anger   Despair   Anger-Despair (from annotations)
Temporal Extent            1       -1          1
Fluidity                  -1        1          0.58
Power                      1       -0.5        0.11
Repetition                 1       -1          1

(Anger and Despair values obtained from the literature; Anger-Despair values obtained from annotation.)

Expressivity over the WHOLE animation
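For illustration only, a mixed emotion's expressivity could be approximated as a clamped weighted average of the two emotions' values. This is a hypothetical combination scheme: the table's Anger-Despair values come from annotation, not from this formula.

```python
def blend_expressivity(values_a, values_b, weight_a):
    """values_a / values_b: dicts mapping dimension name -> value in [-1, 1];
    weight_a in [0, 1] is the share of the first emotion in the blend."""
    return {dim: max(-1.0, min(1.0,
                weight_a * values_a[dim] + (1 - weight_a) * values_b[dim]))
            for dim in values_a}
```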

Page 18: Modeling Expressivity in ECAs

Video

Page 19: Modeling Expressivity in ECAs

Real and Virtual World Sensing
A. Raouzaiou, G. Caridakis, K. Karpouzis, ICCS; C. Peters, E. Bevacqua, M. Mancini, Paris 8

[Diagram: Real World Sensing and Virtual Sensing feed a pipeline of Sensory Storage, Perception, Attention Planning, Interpretation, and Generation, informed by Personality, Scene Ontology, and Goals]

Expressivity over the WHOLE animation

Page 20: Modeling Expressivity in ECAs

Expressivity over the WHOLE animation

Application Scenario

Page 21: Modeling Expressivity in ECAs

Interpretation:
• gesture specified by symbolic name
• facial expression:
  - emotion label, if the facial expression corresponds to one of the prototypical facial expressions of emotion
  - otherwise, FAP values

Planning:
• modulate expressivity parameters
• modulate emotional expressions

Expressivity over the WHOLE animation

Page 22: Modeling Expressivity in ECAs

Generation

Input to the ECA system:
• a symbolic description of a gesture
• an emotion label or a set of FAP values
• expressivity parameter values

Output: facial and gesture animation

Expressivity over the WHOLE animation

Page 23: Modeling Expressivity in ECAs

Video

Page 24: Modeling Expressivity in ECAs

Expressivity on Every Frame

Greta Music
Roberto Bresin, KTH; Maurizio Mancini, Paris 8

One set of expressivity parameter values is extracted automatically from acoustic data in real time and fed to the ECA system.

Page 25: Modeling Expressivity in ECAs

Design a tool for real-time visual feedback on expressive performance

Expressivity on Every Frame

Page 26: Modeling Expressivity in ECAs

From music expression to facial expression

From acoustic cues to emotion, via extraction of acoustic cues:
• tempo, sound level, articulation (staccato/legato), attack velocity, spectrum, vibrato rate, vibrato extent, pitch

From acoustic cues to facial expression, via a mapping of acoustic cues:
• music emotion → facial expression
• music volume → spatial and power
• music tempo → temporal and overall activation
• music articulation → fluidity
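The cue-to-parameter mapping above can be sketched directly. Cue values are assumed already normalised to [-1, 1], and the identity mapping (no rescaling per target parameter) is a simplifying assumption; Greta Music's actual mapping may differ.

```python
def cues_to_expressivity(volume, tempo, articulation):
    """Normalised acoustic cues -> expressivity parameters, following the
    slide's mapping. articulation: -1 = staccato, +1 = legato (assumed
    convention), so legato playing maps to fluid movement."""
    return {
        "spatial": volume,               # music volume -> spatial
        "power": volume,                 # music volume -> power
        "temporal": tempo,               # music tempo -> temporal
        "overall_activation": tempo,     # music tempo -> overall activation
        "fluidity": articulation,        # music articulation -> fluidity
    }
```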

Expressivity on Every Frame

Page 27: Modeling Expressivity in ECAs

Music version of Greta

Input: expressivity parameters

Output: FAP values (animated head)

This version of Greta allows only the following actions:
• head movement
• eye blinking
• emotional expression
• skin colouring

Expressivity on Every Frame

Page 28: Modeling Expressivity in ECAs

Video

Expressivity on Every Frame

Page 29: Modeling Expressivity in ECAs

Attraction of attention

Corpus: videos from traditional animation that illustrate different types of conversational interaction

Analysis: the modulations of gesture expressivity over time play a role in managing communication, thus serving as a pragmatic tool

France Telecom

Expressivity on Gesture Phases

Page 30: Modeling Expressivity in ECAs

Attraction of attention

Irregularities:
• the principle of anticipation: it enhances the visibility of a gesture and our propensity to gaze at it

Discontinuities:
• create a contrast between successive gestures
• function to isolate a particular gesture from a sequence of gestures

France Telecom

Expressivity on Gesture Phases

Page 31: Modeling Expressivity in ECAs

Irregularity

Expressivity on Gesture Phases

Page 32: Modeling Expressivity in ECAs

Irregularity – slow motion

Expressivity on Gesture Phases

Page 33: Modeling Expressivity in ECAs

Discontinuity – slow motion

Expressivity on Gesture Phases

Page 34: Modeling Expressivity in ECAs

Application: ECA as web presenter

Discontinuity – spatial parameter

Expressivity on Gesture Phases

Page 35: Modeling Expressivity in ECAs

Annotation of multimodal behavior

Signals on 3 modalities:
• arm gesture
• head movement
• body movement

• phases of each signal
• each phase: physical shape + timing
• expressivity of each signal

Expressivity on Each Gesture

Page 36: Modeling Expressivity in ECAs


Annotation of multimodal behavior

Expressivity on Each Gesture

Page 37: Modeling Expressivity in ECAs


Animation format

We have defined a file format for the specification of behavior for our animation engine (Greta).

We translate from the XML annotation file (ANVIL) to the engine animation file.

ANVIL annotation → animation file

Expressivity on Each Gesture
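A minimal sketch of such an XML-to-animation-file translation. The element and attribute names below are hypothetical placeholders, not the actual ANVIL schema or the Greta animation file format.

```python
import xml.etree.ElementTree as ET

def annotation_to_animation(xml_text):
    """Turn an XML annotation into lines of a plain-text animation file:
    one line per annotated signal phase, 'start end track value'."""
    root = ET.fromstring(xml_text)
    lines = []
    for el in root.iter("el"):  # hypothetical element name for one phase
        lines.append("{} {} {} {}".format(
            el.get("start"), el.get("end"),
            el.get("track"), el.get("value")))
    return "\n".join(lines)
```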

Page 38: Modeling Expressivity in ECAs


Expressivity on Each Gesture

Animation format

Page 39: Modeling Expressivity in ECAs


Page 40: Modeling Expressivity in ECAs


Page 41: Modeling Expressivity in ECAs


Demo
