Modeling Expressivity in ECAs


Catherine Pelachaud, Maurizio Mancini

LINC - University of Paris 8

Behavior

Behavior is related to (Wallbott, 1998):
• the quality of the mental state (e.g. emotion) it refers to
• the quantity (somehow linked to the intensity factor of the mental state)

Behaviors encode:
• content information (the 'what is being communicated')
• expressive information (the 'how it is communicated')

Behavior expressivity refers to the manner of execution of the behavior.

Behavior Representation

Behavior = signal: shape, movement, expressivity

Gesticon (Gesture Lexicon): dictionary of behavior descriptions (B. Krenn, H. Pirker, OFAI)

Modalities: face; hand and arm gesture; body movement and posture; gaze

Behavior Representation

Face:
• facial expression duration: onset, apex, offset

Gesture:
• phases: preparation, pre-stroke hold, stroke, post-stroke hold, retraction
• gesture shape and movement for each phase

Head:
• head direction
• head movement

Gaze:
• eye direction
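
A minimal sketch of such a behavior entry, assuming a simple Python data layout; the class and field names are illustrative, not the actual Gesticon or Greta format:

    # Illustrative only: names are assumptions, not the Gesticon schema.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class GesturePhase:
        name: str        # "preparation", "pre-stroke hold", "stroke", "post-stroke hold", "retraction"
        shape: str       # symbolic hand/arm configuration for this phase
        movement: str    # symbolic movement description for this phase

    @dataclass
    class FacialTiming:
        onset: float     # seconds to reach the apex
        apex: float      # seconds the expression is held
        offset: float    # seconds to return to neutral

    @dataclass
    class BehaviorEntry:
        name: str
        modality: str                                  # "face", "gesture", "head", "gaze"
        phases: List[GesturePhase] = field(default_factory=list)
        face_timing: Optional[FacialTiming] = None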

Expressivity Dimensions

Expressivity dimensions:
• Spatial: amplitude of movement
• Temporal: duration of movement
• Power: dynamic property of movement
• Fluidity: smoothness and continuity of movement
• Repetitiveness: tendency to rhythmic repeats
• Overall Activation: quantity of movement across modalities

Implemented for gesture and facial expression
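
As a rough sketch, the six dimensions can be carried around as one parameter set; the class name, field names and the normalized [-1, 1] range are assumptions (the range mirrors the annotation values shown later), not the engine's actual API:

    from dataclasses import dataclass, astuple

    @dataclass
    class Expressivity:
        spatial: float = 0.0             # amplitude of movement
        temporal: float = 0.0            # duration/speed of movement
        power: float = 0.0               # dynamic property of movement
        fluidity: float = 0.0            # smoothness and continuity of movement
        repetitiveness: float = 0.0      # tendency to rhythmic repeats
        overall_activation: float = 0.0  # quantity of movement across modalities

        def clamped(self):
            """Return a copy with every dimension limited to [-1, 1]."""
            return Expressivity(*(max(-1.0, min(1.0, v)) for v in astuple(self)))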

Overall Activation

• Threshold filter on atomic behaviors during APML tag matching

• Determines the number of nonverbal signals to be executed.
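
A possible reading of this filter, sketched in Python; the mapping from overall activation to a salience threshold is an assumption (the actual matcher operates on APML tags, which is not modelled here):

    def filter_signals(candidates, overall_activation):
        """candidates: (signal_name, salience) pairs with salience in [0, 1].
        Higher overall activation lowers the threshold, so more signals pass."""
        threshold = (1.0 - overall_activation) / 2.0   # activation in [-1, 1] -> cut-off in [0, 1]
        return [name for name, salience in candidates if salience >= threshold]

    candidates = [("beat_gesture", 0.3), ("head_nod", 0.8), ("eyebrow_raise", 0.6)]
    print(filter_signals(candidates, overall_activation=0.2))   # ['head_nod', 'eyebrow_raise']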

Spatial Parameter

• Amplitude of movement controlled through asymmetric scaling of the reach space that is used to find IK goal positions
• Expands or condenses the entire space in front of the agent
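
A minimal sketch of this scaling; the per-axis gains and the max_gain bound are invented for illustration:

    import numpy as np

    def scale_ik_goal(goal, shoulder, spatial, max_gain=0.4):
        """Expand (spatial > 0) or condense (spatial < 0) the wrist target
        around the shoulder; the gain differs per axis, hence 'asymmetric'."""
        gain = 1.0 + spatial * np.array([max_gain, 0.5 * max_gain, max_gain])
        return np.asarray(shoulder) + (np.asarray(goal) - np.asarray(shoulder)) * gain

    shoulder = np.array([0.0, 140.0, 0.0])              # cm, illustrative agent pose
    goal = np.array([30.0, 120.0, 25.0])                # neutral wrist target
    print(scale_ik_goal(goal, shoulder, spatial=1.0))   # wider, more expansive reach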

Temporal Parameter

Stroke shift / velocity control of a beat gesture

[Figure: Y position of wrist w.r.t. shoulder [cm] vs. frame number]

• Determine the speed of the arm movement of a gesture's meaning-carrying stroke phase

• Modify speed of stroke
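
A minimal sketch of this modulation; the sign convention and the 50% bound are assumptions:

    def stroke_duration(base_duration, temporal, max_change=0.5):
        """temporal in [-1, 1]: positive values shorten the stroke (faster
        movement), negative values lengthen it (slower movement)."""
        return base_duration * (1.0 - max_change * temporal)

    print(stroke_duration(0.6, temporal=1.0))    # 0.3 s: quick, energetic stroke
    print(stroke_duration(0.6, temporal=-1.0))   # ~0.9 s: slow, sustained stroke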

Fluidity

• Continuity control of TCB interpolation splines and of gesture-to-gesture coarticulation
• Continuity of the arms' trajectory paths
• Control of the velocity profiles of an action

[Figure: X position of wrist w.r.t. shoulder [cm] vs. frame number]
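
A sketch of how fluidity could drive the continuity term of Kochanek-Bartels (TCB) spline tangents; the tangent formulas are the standard TCB ones, while the direct fluidity-to-continuity mapping is an assumption:

    def tcb_tangents(p_prev, p, p_next, tension=0.0, continuity=0.0, bias=0.0):
        """Incoming/outgoing tangents at keyframe p (1-D values for brevity)."""
        d0, d1 = p - p_prev, p_next - p
        incoming = ((1 - tension) * (1 + bias) * (1 - continuity) / 2) * d0 \
                 + ((1 - tension) * (1 - bias) * (1 + continuity) / 2) * d1
        outgoing = ((1 - tension) * (1 + bias) * (1 + continuity) / 2) * d0 \
                 + ((1 - tension) * (1 - bias) * (1 - continuity) / 2) * d1
        return incoming, outgoing

    fluidity = 0.8                               # high fluidity -> smoother joins between keyframes
    continuity = max(-1.0, min(1.0, fluidity))   # assumed direct mapping
    print(tcb_tangents(0.0, 10.0, 12.0, continuity=continuity))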

Power

• Tension and bias control of TCB splines
• Overshoot reduction
• Acceleration and deceleration of limbs
• Hand shape control for gestures that do not need hand configuration to convey their meaning (beats)
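
A rough sketch of how power could be turned into TCB tension/bias terms and a sharper velocity profile; every scaling constant here is an illustrative assumption:

    def power_to_spline_params(power):
        """Map power in [-1, 1] to TCB tension/bias (illustrative scaling)."""
        tension = 0.5 * max(0.0, power)   # higher tension tightens the curve, reducing overshoot
        bias = 0.3 * power                # skew tangents toward the stroke direction
        return tension, bias

    def ease(t, power):
        """Time-warp on t in [0, 1]: higher power -> faster attack, harder stop."""
        return t ** (2.0 ** -power)       # power +1 -> exponent 0.5, power -1 -> exponent 2

    print(power_to_spline_params(0.8))    # roughly (0.4, 0.24)
    print(ease(0.25, 0.8))                # well above 0.25: the limb covers ground early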

Repetitivity

• Technique of stroke expansion: Consecutive emphases are realized gesturally by repeating the stroke of the first gesture.
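
A sketch of stroke expansion on a list of phase names; the data layout and the inserted hold are assumptions (the real system repeats fully specified gesture phases, not strings):

    def expand_strokes(phases, n_emphases):
        """Repeat the meaning-carrying stroke once per extra emphasis, keeping a
        single preparation and retraction; a short hold separates the repeats."""
        if "stroke" not in phases or n_emphases <= 1:
            return list(phases)
        i = phases.index("stroke") + 1
        repeats = ["post-stroke hold", "stroke"] * (n_emphases - 1)
        return phases[:i] + repeats + phases[i:]

    print(expand_strokes(["preparation", "stroke", "retraction"], n_emphases=3))
    # ['preparation', 'stroke', 'post-stroke hold', 'stroke', 'post-stroke hold', 'stroke', 'retraction']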

Expressivity

Expressivity values may act:
• over the whole animation: EmoTV, analysis-synthesis
• at every instant of the movement: Greta Music
• on every gesture: GEMEP corpus
• on a particular phase of the gesture: attract-attention study

Exploratory studies based on various data types: acted data, real data, 2D cartoons, literature

Research Issue

Behavior representation: what to encode, at which levels of representation, dynamism

Implementation refinement

Expressivity over the WHOLE animation

One set of values is set, either:
• extracted manually from the annotation of a real-data video corpus
• extracted automatically from a video corpus of acted data using image analysis techniques

From annotations to animation
Jean-Claude Martin, Laurence Devillers, LIMSI-CNRS; Maurizio Mancini, Paris 8

Consider what is visible: annotate signals, how they are displayed, how they are perceived.

Model what is visible: represent signals, animate them with expressivity.

Two-step approach:
• elaborate rules by analysis (video corpus)
• animate by "copy synthesis"

No model of the processes underlying the display of the signals.

Annotation → extraction → animation

Expressivity over the WHOLE animation

EmoTV: 51 clips of French TV interviews

Annotation:
• emotion labels: single emotions and blends of emotions
• multimodal behavior
• expressivity dimensions

[Diagram: annotation steps, from the EmoTV clip through annotation, extraction and generation to the GRETA animation]

Expressivity over the WHOLE animation

Expressivity parameter | Anger | Despair | Anger-Despair
-----------------------|-------|---------|--------------
Temporal Extent        |   1   |   -1    |  1
Fluidity               |  -1   |    1    |  0.58
Power                  |   1   |  -0.5   |  0.11
Repetition             |   1   |   -1    |  1

Anger and Despair: values obtained from the literature. Anger-Despair blend: values obtained from annotation.

Expressivity of gestures in mixed emotion

Expressivity over the WHOLE animation

Video

Real and Virtual World Sensing
A. Raouzaiou, G. Caridakis, K. Karpouzis, ICCS; C. Peters, E. Bevacqua, M. Mancini, Paris 8

[Architecture diagram with components: Real World Sensing, Virtual Sensing, Sensory Storage, Perception, Interpretation, Attention Planning, Generation, Personality, Scene Ontology, Goals]

Expressivity over the WHOLE animation


Application Scenario

Interpretation:
• gesture: specified by symbolic name
• facial expression:
  - emotion label, if the facial expression corresponds to one of the prototypical facial expressions of emotions
  - otherwise, FAP values

Planning:
• modulate expressivity parameters
• modulate emotional expressions

Expressivity over the WHOLE animation

Generation

Input to the ECA system:
• a symbolic description of a gesture
• an emotion label or a set of FAP values
• expressivity parameter values

Output: facial and gesture animation

Expressivity over the WHOLE animation

Video

Expressivity on Every Frame

Greta Music
Roberto Bresin, KTH; Maurizio Mancini, Paris 8

One set of expressivity parameter values is extracted automatically from acoustic data in real time and fed to the ECA system.

Design a tool for real-time visual feedback on expressive performance.

Expressivity on Every Frame

From music expression to facial expression

From acoustic cues to emotion: extraction of acoustic cues:
Tempo, Sound Level, Articulation (staccato/legato), Attack Velocity, Spectrum, Vibrato Rate, Vibrato Extent, Pitch

From acoustic cues to facial expression: mapping of acoustic cues:
• music emotion → facial expression
• music volume → spatial and power
• music tempo → temporal and overall activation
• music articulation → fluidity
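
A sketch of the mapping listed above; the [0, 1] normalization of the cues is an assumption, and the real-time extraction of the cues themselves is not shown:

    def music_to_expressivity(volume, tempo, articulation):
        """volume, tempo, articulation normalized to [0, 1]
        (articulation: 0 = staccato, 1 = legato); outputs lie in [-1, 1]."""
        def bipolar(x):
            return 2.0 * x - 1.0
        return {
            "spatial": bipolar(volume),
            "power": bipolar(volume),
            "temporal": bipolar(tempo),
            "overall_activation": bipolar(tempo),
            "fluidity": bipolar(articulation),
        }

    print(music_to_expressivity(volume=0.9, tempo=0.7, articulation=0.2))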

Expressivity on Every Frame

Music version of Greta

Input: expressivity parameters
Output: FAP values (animated head)

This version of Greta allows only the following actions:
• head moving
• eyes blinking
• emotional expression
• skin colouring

Expressivity on Every Frame

Video

Expressivity on Every Frame

Attraction of attention

Corpus: videos from traditional animation that illustrate different types of conversational interaction

Analysis: the modulations of gesture expressivity over time play a role in managing communication, thus serving as a pragmatic tool

France Telecom

Expressivity on Gesture Phases

Attraction of attention

Irregularities follow the principle of anticipation:
• they enhance the visibility of a gesture
• they enhance our propensity to gaze at this gesture

Discontinuities:
• create a contrast between successive gestures
• function to isolate a particular gesture from a sequence of gestures

France Telecom

Expressivity on Gesture Phases

Irregularity

Expressivity on Gesture Phases

Irregularity – slow motion

Expressivity on Gesture Phases

Discontinuity – slow motion

Expressivity on Gesture Phases

Application: ECA as web presenter

Discontinuity – spatial parameter

Expressivity on Gesture Phases


Annotation of multimodal behavior

Signals on 3 modalities:
• arm gesture
• head movement
• body movement

Phases of each signal; each phase: physical shape + timing

Expressivity of each signal

Expressivity on Each Gesture


Annotation of multimodal behavior

Expressivity on Each Gesture


Animation format

we have defined a file format for the specification of behavior for our animation engine (Greta)

we translate from the XML annotation file (ANVIL) to the engine animation file

ANVIL annotation → animation file
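
A minimal sketch of this translation step, assuming an ANVIL-like XML layout; the track name, the attribute lookup and the flat output format are placeholders, since the actual coding scheme and the Greta animation file format are not given here:

    import xml.etree.ElementTree as ET

    def anvil_to_engine(anvil_path, out_path, track_name="gesture"):
        """Write one engine line per annotated element: start, end, label."""
        root = ET.parse(anvil_path).getroot()
        with open(out_path, "w") as out:
            for el in root.iterfind(f".//track[@name='{track_name}']/el"):
                start, end = el.get("start"), el.get("end")
                label = el.findtext("attribute", default="unknown")
                out.write(f"{start} {end} {label}\n")

    # anvil_to_engine("clip01.anvil", "clip01.gesture")   # hypothetical file names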

Expressivity on Each Gesture


Expressivity on Each Gesture

Animation format


Demo

demo!