
Research Report

Electrophysiological correlates of perceiving and evaluating static and dynamic facial emotional expressions

Guillermo Recio a,⁎, Werner Sommer a, Annekathrin Schacht a,b

a Department of Psychology, Humboldt-Universität zu Berlin, Germany
b CRC Text Structures, University of Göttingen, Germany

ARTICLE INFO

Article history: Accepted 11 December 2010. Available online 21 December 2010.

Keywords: Emotion; Facial motion; Emotional facial expression; Event-related potential

ABSTRACT

Recent evidence suggests that dynamic facial expressions of emotion unfolding over time are better recognized than static images. However, the mechanisms underlying this facilitation are unclear. Here, participants performed expression categorizations for faces displaying happy, angry, or neutral emotions either in a static image or dynamically evolving within 150 ms. Performance replicated facilitation of emotion evaluation for happy expressions in dynamic over static displays. An initial emotion effect in event-related brain potentials evidenced in the early posterior negativity (EPN) was both enhanced and prolonged when participants evaluated dynamic in comparison to static facial expressions. Following the common interpretation of the EPN, this finding suggests that the facilitation for dynamic expressions is related to enhanced activation in visual areas starting as early as 200 ms after stimulus onset, presumably due to shifts of visual attention. Enhancement due to dynamic display was also found for the late positive complex (LPC), indicating a more elaborative processing of emotional expressions under this condition at subsequent stages.

© 2010 Elsevier B.V. All rights reserved.

⁎ Corresponding author. Department of Psychology, Humboldt-Universität zu Berlin, Rudower Chaussee 18, D-12489 Berlin, Germany. Fax: +49 30 2093 4888. E-mail address: [email protected] (G. Recio).

0006-8993/$ – see front matter © 2010 Elsevier B.V. All rights reserved. doi:10.1016/j.brainres.2010.12.041

1. Introduction

Social communication is a dynamic process in which rapidly changing auditory and visual inputs need to be quickly evaluated. In the context of social interactions, human faces provide an extraordinarily important source of information. For instance, lip movements support speech comprehension, gaze direction informs about spatial attention, and facial expressions communicate the emotional state of others. Thus, it seems that we are geared to quickly recognize subtle changes in the facial composure of conspecifics. Although some studies have indeed shown a particular sensitivity for dynamic facial movements, for example, in learning faces (Pilz et al., 2006), identifying persons (O'Toole et al., 2002), recognizing emotional expressions (Ambadar et al., 2005; Bassili, 1978; Kamachi et al., 2001), and in the perceived intensity of expressions (e.g., Biele and Grabowska, 2006), most studies on emotional facial expression rely on static images (e.g., Adolphs, 2002). However, in reality, emotional expressions usually occur as characteristic changes of the facial configuration when coordinated muscle contractions unfold over time.

Neuroimaging studies have shown activation in several brain areas while viewing static expressions of emotion, including the striate cortex, the fusiform face area (FFA), the superior temporal gyrus, the amygdala, the orbitofrontal cortex, the basal ganglia, and the superior temporal sulcus (STS) (for reviews, see Adolphs, 2002; Allison et al., 2000; Blake and Shiffrar, 2007). More recently, several studies have found enhanced and/or more widespread activation patterns in these networks in response to dynamic face stimuli, particularly in the amygdala, in visual areas (striate and extrastriate cortex, and V5/MT+), fusiform gyrus, STS, inferior frontal cortex, FFA, premotor area, parahippocampal regions, and supplementary motor area (e.g., Kilts et al., 2003; LaBar et al., 2003; Sato et al., 2004; Trautmann et al., 2009). The enhanced activation in striate and extrastriate visual areas has been suggested to reflect augmented selective attention to emotional relative to neutral stimuli at early stages of visual processing (e.g., Kilts et al., 2003). Trautmann et al. (2009) proposed that the higher complexity and rapidly changing cues in dynamic faces might result in activation of wider brain networks. On the other hand, the temporal pattern of structural changes in dynamic facial expressions, their greater liveliness and higher ecological validity, along with increased arousal ratings, might improve the three-dimensional perception of faces and facilitate the processing of emotional expressions.

Electrophysiological studies of dynamic facial expressions are even scarcer. Puce, Smith, and Allison (2000) found evidence that the amplitude of the face-sensitive N170 component in the event-related potential (ERP) was affected by the direction of gaze and mouth movement in a continuously presented face. Furthermore, there is evidence that dynamic emotional expressions and gaze direction affect ERP components as early as 100 ms after the onset of the event (P1 and subsequent N1 and P3 components) (Fichtenholtz et al., 2007, 2009), indicating shifts in attentional orientation. However, no direct comparison has been made between static and dynamic conditions in these studies; therefore, they are not informative about specific differences between these conditions. A recent study using a steady-state stimulation procedure that directly compared static with dynamic emotional faces found a late reduction in neural processing in the temporal lobe for dynamic faces (Mayes et al., 2009).

Summarizing previous findings, facial motion seems to improve the perception of emotional expressions (for a review, see Ambadar et al., 2005), and neuronal substrates of perceiving and evaluating facial motion have been described. However, the mechanisms underlying the motion effects and their time course of action remain largely undefined.

It has been suggested that emotional aspects of stimuli facilitate their processing by influencing early perceptual and later elaborative stages (Öhman et al., 2000). Limited attentional resources might be intentionally or reflexively allocated to a given stimulus, depending on, for instance, its salience or intensity (Wickens, 1980). Thus, emotional aspects might enhance the allocation of attention to the stimulus, facilitating perceptual and subsequent recognition processes (Schupp et al., 2003). Attention capture has also been attributed to moving objects, as already suggested by William James (1891/1950) and confirmed by recent studies for translating and looming motion (Franconeri and Simons, 2003), feature changes (Mühlenen et al., 2005), and motion onset (Abrams and Christ, 2003). Therefore, the superiority of dynamic emotional expressions might relate to the augmented capture of attentional resources as compared to static pictures, boosting – among other aspects – the evaluation of the emotional expression.

Emotions in facial expressions have been reported to elicit two ERP components: the early posterior negativity (EPN) and the late positive complex (LPC) (e.g., Holmes et al., 2009; Schacht and Sommer, 2009; Schupp et al., 2004). Both components can be best visualized when the ERP to neutral stimuli is subtracted from the ERP to emotional stimuli. The EPN emerges as early as 150 to 300 ms after stimulus onset as a negative deflection over occipito-parietal electrodes and – as its counterpart – a fronto-central positivity, and is considered to reflect attention allocation to the stimuli (Junghöfer et al., 2001). If dynamic facial expressions facilitate performance by augmenting attention capture, the EPN to emotional as compared to neutral stimuli should be more pronounced for dynamic stimuli.

The second ERP component modulated by emotional expressions, the LPC, appears at around 500 ms as a long-lasting, enhanced positivity over centro-parietal electrodes and is suggested to reflect elaborative processing and conscious recognition of the stimulus (Schupp et al., 2003). If dynamic stimuli also augment the elaborative processes following attention capture, the LPC effect might also be more prominent for dynamic than static facial expressions.

Here, we presented face stimuli with happy, angry, or neutral expressions in either static or dynamic presentation modes while participants explicitly categorized these expressions. If the dynamic presentation is responsible for improved emotion evaluation, motion should facilitate the typical emotion effect in the ERPs. Of primary interest was the time after stimulus onset at which ERPs to dynamic and static facial expressions would start to be distinguishable from each other and from ERPs to neutral expressions. Given previous evidence that both emotional images (e.g., Schupp et al., 2003) and moving objects (Abrams and Christ, 2003; Franconeri and Simons, 2003; Mühlenen et al., 2005) guide stimulus-driven selective attention, we expected enhanced EPN amplitudes for dynamic faces. Moreover, we expected this boosted visual attention to facilitate emotion evaluation, which should be reflected in more pronounced LPC effects. If motion in the face increases the intensity of the facial expression (e.g., Biele and Grabowska, 2006), which in turn facilitates the perception and evaluation of emotional expressions, the LPC effect should be enhanced in the dynamic condition. Furthermore, we expected the scalp distributions of the observed emotion effects to reflect enhanced visual processing for dynamic emotional faces at posterior electrode positions, in line with the more widespread activation patterns reported in fMRI studies suggesting shifts of attention to dynamic stimuli (e.g., Kilts et al., 2003).

2. Results

2.1. Behavioral data

Behavioral data are presented in Table 1. Relative to static pictures, dynamic emotional expressions were recognized faster, F(1,20)=41.8, p<.001, and more accurately, F(1,20)=10.9, p<.01. Moreover, interactions between presentation mode and emotional expression were significant for both RTs, F(2,40)=19.6, p<.001, and error rates, F(2,40)=15.4, p<.001. Pairwise comparisons revealed that happy faces particularly benefited from dynamic presentation: whereas responses to static faces were significantly slower for happy than for neutral and angry expressions, Fs(1,20)>11.5, ps<.05, within the dynamic condition RTs did not differ between emotional expressions, Fs(1,20)<5.0, ps>.05. Only happy faces were significantly faster, F(1,20)=38.1, p<.001, in the dynamic as compared to the static condition. Similarly, error rates in the static condition were higher for happy than for both neutral and angry faces, Fs(1,20)>21.6, ps≤.001. In the dynamic condition, error rates for positive expressions were reduced, F(1,20)=15.5, p<.01, as compared with static positive faces. No significant differences in error rates were observed between static and dynamic stimuli within neutral, F<1, or angry expressions, F(1,20)=4.0, p>.05.¹

¹ We analyzed gender effects in the ERPs in two subgroups of 14 male and 7 female participants. In the overall ANOVA over ERP amplitudes, the gender factor did not interact with emotional valence in any time window, all ps>.05. We also analyzed the data from the pilot rating study in a group of 60 participants (38 females). Results revealed similar ratings by males and females for angry, M(male)=1.5, SD=0.2; M(female)=1.5, SD=0.2; neutral, M(male)=3.1, SD=0.3; M(female)=3.0, SD=0.3; and happy, M(male)=3.8, SD=0.3; M(female)=3.8, SD=0.3, faces. A repeated-measures ANOVA revealed a main effect of emotional expression, F(2,116)=517.9, p<.001, but no significant interaction with the between-subject factor gender, F(2,116)<1, p>.05.

Table 1 – Mean RTs and error rates for the three emotional expressions and the static and dynamic display conditions (SDs in parentheses).

                      Display condition
Expression            Static            Dynamic

Mean RTs (ms)
Neutral               584.8 (83.8)      581.8 (87.3)
Angry                 604.3 (81.6)      595.9 (78.4)
Happy                 646.6 (100.5)     577.5 (78.1)

Error rates (%)
Neutral               4.9 (2.2)         5.3 (2.6)
Angry                 7.04 (2.7)        8.9 (4.1)
Happy                 28.5 (10.6)       11.24 (3.7)

2.2. Event-related potentials

Fig. 1B depicts grand average ERP waveforms. In both static and dynamic display conditions, happy and angry faces elicited larger ERP amplitudes relative to neutral faces. This visual impression was confirmed by overall ANOVAs, revealing main effects of emotional expression in all intervals between 150 and 600 ms, Fs(112,2240)>3.6, ps≤.001, εs=.070 to .111. Importantly, and as can be seen in Fig. 1B, these emotion effects appear to be considerably larger in the dynamic than in the static condition. This was statistically confirmed by significant interactions between emotional expression and presentation mode during early (200–300 ms), Fs(112,2240)>2.9, ps<.01, εs=.078 to .091, and late (350–600 ms) time segments, Fs(112,2240)>2.1, ps<.05, εs=.069 to .081. A global shift of the ERP waveforms as a function of presentation mode was reflected in main effects of this factor between 200 and 600 ms, Fs(56,1120)>11.4, ps<.001, εs=.067 to .116.

As shown in the left panel of Fig. 1B at electrode sites Fz, Cz, and PO10, angry static faces elicited enhanced amplitudes of two distinguishable ERP components as compared to neutral faces. This was statistically confirmed between 200 and 250 ms, F(56,1120)=6.8, p<.001, ε=.07. The left panel of Fig. 1C, highlighted with a dotted frame, shows that this early effect of emotional expression, that is, the difference between angry and neutral expressions, depicts a posterior negativity, which resembles the typical scalp distribution of the EPN. Between 500 and 550 ms, angry faces also differed from neutral ones, F(56,1120)=4.9, p≤.01, ε=.065. This late effect of angry faces displays a centro-parietal positivity, typical for the LPC, represented in Fig. 1B at the Cz electrode site and in Fig. 1C, highlighted with a dashed frame.

In dynamic trials, the early emotion effect started at the same latency as in static trials (200 ms), but it lasted longer and was present until 350 ms for both happy, Fs(56,1120)>6.6, ps≤.001, εs=.06 to .077, and angry faces, Fs(56,1120)>4.9, ps<.01, εs=.066 to .08, as compared with neutral faces. As can be seen in the right panel of Fig. 1C, this early effect displayed a posterior negativity and an anterior positivity, but its scalp distribution seemed to differ from the EPN to static faces, with a more pronounced frontal positivity and posterior negativity in dynamic presentations. The scalp distributions of the different emotions also seemed to differ from each other, with a more temporal negativity for angry than for happy faces. In order to validate this impression, we compared scalp topographies of ERP difference waves to happy minus neutral and to angry minus neutral faces in the 200–250 ms interval after scaling, that is, dividing ERP amplitudes by the global field power (GFP; Lehmann and Skrandies, 1980) in each condition. An ANOVA on normalized ERPs, including the factors electrode (57 levels), presentation mode (2), and emotion (2), revealed significant interactions of the factor electrode with presentation mode, F(56,1120)=3.2, p<.001, ε=.26, as well as with emotion, F(56,1120)=1.9, p≤.01, ε=.35, but no three-way interaction was obtained, F(56,1120)=.76, p>.05, ε=.25. Thus, the ANOVA confirmed the visual impressions of topographic differences for both emotion and presentation mode.

The early emotion effect in dynamic trials was immediately followed by an enhanced centro-parietal positivity, which is typical for the LPC, represented in Fig. 1B at electrode Cz and framed with a dashed line. These amplitude differences were significant between 350 and 550 ms for happy, Fs(56,1120)>5.0, ps<.01, εs=.073 to .09, and between 500 and 600 ms for angry faces, Fs(56,1120)>6.4, ps≤.001, εs=.066 to .083.

In order to rule out that the effects of presentation mode somehow relate to the specific dynamic neutral expression used here (a blink), we also compared dynamic and static conditions within each type of emotional expression. Results showed significant differences between static and dynamic faces from 200 to 600 ms within happy, Fs(56,1120)>7.4, ps<.001, εs=.076 to .127, and from 250 to 600 ms within angry emotional expressions, Fs(56,1120)>2.5, ps<.01, εs=.057 to .086. In contrast to the emotional conditions, differences between static and dynamic conditions within neutral faces were small and confined to the intervals 300–450 ms, Fs(56,1120)>5.2, ps<.01, εs=.079 to .118, and 550–600 ms, F(56,1120)=3.1, p<.05, ε=.106.

Peak latencies for the P1 and N170 components, taken at the PO8 and P10 electrode sites, respectively, did not differ between conditions in the ANOVAs, Fs<1. Region of interest (ROI) ANOVAs for P1 and N170 peak amplitudes at selected bilateral sites (see Experimental procedures) did not reveal any significant effects of emotional expression or presentation mode, Fs<1.6, ps>.05.

3. Discussion

In the present study, we compared dynamic and static faces displaying positive, negative, or neutral facial expressions that were morphed from portraits of neutral faces. Our main interest was to assess whether the emotion-related ERP components to static facial expressions obtained in previous research would also extend to dynamic displays or whether such dynamically developing expressions would lead to qualitatively different patterns of brain activity. In general, the present findings confirmed the superior processing of dynamic emotional facial expressions (Ambadar et al., 2005); however, this effect was most prominent for happy faces. Event-related potentials confirmed the presence of an EPN for static faces, which appeared to be strongly enhanced for dynamic expressions. Likewise, an LPC was present and was more pronounced for dynamic faces.

In the static condition, happy faces were evaluated more slowly and less accurately than neutral and angry faces, which seems to contradict the advantage for happy faces observed in previous studies (e.g., Leppänen and Hietanen, 2004). The high error rate for happy faces revealed a difficulty in this condition. This may be due to the fact that our stimuli were morphed with animation software that produces less emotion-typical changes around the eyes in happy than in angry faces, possibly attenuating the intensity of happy expressions and making them more difficult to classify. Although RTs and error rates for this condition were reduced when the face was presented in motion, the error rate was still higher than in other studies (e.g., Leppänen and Hietanen, 2004) and did not differ significantly from angry and neutral expressions. Although Leppänen and Hietanen (2004) obtained similar error rates for natural and schematic faces, other research using more complex synthetic face stimuli suggests an advantage for the recognition of emotional expressions in natural over synthetic faces (e.g., Kätsyri and Sams, 2008). On the other hand, the advantage observed for dynamic happy faces is in accord with studies suggesting that dynamic presentation might be particularly beneficial for the recognition of emotional expressions at low intensity (Ambadar et al., 2005) and from computer-animated synthetic faces (Kätsyri and Sams, 2008). More research considering other emotional expressions and additional variables regarding the temporal features of dynamic stimuli, such as speed or intensity of the expression, will be necessary to determine whether certain emotional expressions are better recognized than others.

Interestingly, the emotion effects started with similar latencies in the dynamic and static conditions, although the picture with maximal intensity of emotional expression appeared 100 ms later in dynamic trials and was present for a shorter time, as shown in Fig. 1A. A possible correlate on a subjective level could be that dynamic emotional expressions are perceived and rated as more intense (Biele and Grabowska, 2006). Intensity in facial expressions has been defined as the amount of muscle displacement away from the neutral state (Hess et al., 1997); hence, facial movements displayed in dynamic expressions are likely to provide richer information and higher contrast from the neutral state than static ones. Clore and Gasper (2000) claimed that the intensity of emotion activates attentional processes that are then guided by accessible concepts to concept-relevant information. According to this view, the enhanced EPN and LPC effects presented here indicate that emotional expressions dynamically emerging from the face – as they actually do in real life – seem to be perceived as more intense than static ones.

Electrophysiological recordings allowed us to elucidate some of the mechanisms underlying the performance effect. First, we found that both early and late emotion-related ERP components were enhanced in the dynamic condition and that the scalp topography of the early effect (EPN) was significantly different from the EPN to static faces. According to Schupp et al.'s (2006) functional interpretation of the EPN and LPC components in the context of Öhman's (1979, 1986) two-stage model of stimulus perception, the EPN is seen as indicating a call for attentional resources after an initial stage of perceptual encoding. However, to be represented in consciousness, stimuli must access a second stage of processing in short-term memory, which might be reflected by the LPC component. In consonance with this view, the present results – showing an increased and prolonged early effect for dynamic faces – indicate that the motion accompanying an emerging facial expression strongly enhanced the "motivated attention" (Lang et al., 1997, p. 97). This view is supported by neuroimaging studies that show augmented activity to dynamic faces in striate and extrastriate areas, presumably due to increased selective attention to dynamic emotional stimuli at early stages of processing (Kilts et al., 2003), resulting in stronger and more widespread activation patterns (e.g., Trautmann et al., 2009). Here, we were able to show for the first time that this enhanced attention occurred as early as 200 ms after stimulus onset. Furthermore, this augmented attention seems to lead to more elaborate processing at later stages, presumably reflected in the LPC. The explicit emotion categorization task used in our study probably directed participants' attention towards the emotional content of the expressions (e.g., Lange et al., 2003). Thus, instructions might have influenced the attentional shift to the movements conveyed by the dynamic expressions, which provide more and richer information about the configurational changes that constitute each expression, increasing the perceived intensity and facilitating their perception and evaluation. As indicated by Ekman and Friesen (1982), emotion identification may be more accurate when evaluations of facial expressions rely on both morphological and dynamic changes. Our results confirm this assumption, but it is still unclear whether this benefit from dynamic facial expressions extends to other tasks in which emotional aspects are processed implicitly.

Alternatively, it is conceivable that the low intensity of emotional expression in the initial frames presented at the onset of the dynamic stimuli is responsible for the shift of attention, because ambiguous stimuli probably require more attention. If this were the case, one would expect static pictures at low intensity to elicit larger responses in the ERPs than at high intensity. However, studies manipulating the intensity of emotional expressions in static photographs have found enhanced ERP amplitudes to high-intensity expressions in all time epochs from 160 to 600 ms (Sprengelmeyer and Jentzsch, 2006). Hence, it seems more plausible that the configurational changes and the augmented intensity were responsible for the enhanced EPN and LPC effects in the dynamic condition, rather than the presumed ambiguity of expressions at low intensity.

Fig. 2 – Scalp distributions of ERP difference waves between dynamic and static presentation modes, separated for angry, happy, and neutral faces, between 200 and 250 ms (top panel) and 450 and 600 ms (bottom panel).

The neutral baseline employed in our dynamic condition, a blink, might restrict the interpretation of the results, because a blink displays less motion than an emotional expression and, despite being emotionally meaningless, may be of some social relevance. In order to rule out the explanation that the emotionally neutral dynamic condition was responsible for the enhanced emotion effects, we also compared the difference between dynamic and static faces within each type of expression (see Fig. 2). If the dynamic presentation mode enhances the emotion effect itself, we should find typical emotion effects even when comparing presentation modes within a given emotion. Interestingly, as can be seen in Figs. 1C and 2, the topographic distributions of ERP differences between dynamic and static conditions were very similar to the emotion effects relative to the neutral condition in the dynamic conditions, suggesting very similar neural sources of the effects. As we obtained similar modulations in both the EPN and LPC components, the movements in the face might not only shift attention in a stimulus-driven way, as reflected in the EPN, but also enrich the subsequent recognition processes indexed by the LPC. Therefore, the early and late emotion effects in the ERPs were not only present when comparing emotional with neutral faces within a presentation mode, but also when comparing dynamic with static faces, demonstrating that the empowering effect of facial movement was independent of the particular baseline condition.

Comparisons of the topographies of the early effects revealed significant differences between the emotional expressions and between the presentation modes. Dynamic stimuli displayed a more pronounced frontal positivity and posterior negativity than static stimuli, and angry faces showed more temporal negativity than happy faces. This result suggests that angry and happy faces generate qualitatively different activation patterns at early stages of emotional face processing. Such effects might arise when several sets of areas are activated or when the activity in the same set of areas is modulated differentially. In any case, our findings imply dissociable patterns of brain activation and not just uniform changes of the gain in all systems involved. A possible explanation for these differences might be that angry faces are emotionally more arousing than happy ones, thus generating enhanced activation in areas involved in the evaluation of emotional significance. This view would be in accord with evidence showing greater response amplitudes to high- than low-arousal IAPS images (e.g., Cuthbert et al., 2000; Keil et al., 2001), and increased gamma band activity modulation for high- (angry and fearful) as compared to low-arousal (happy and neutral) faces (Balconi and Lucchiari, 2007). On the other hand, our evidence for qualitatively different activation patterns for static and dynamic faces would argue against a simple difference in gain (arousal) between these conditions.

Fig. 1 – (A) Face stimuli and trial scheme. (B) Effects of static (left) and dynamic (right) angry, happy, and neutral expressions on grand average ERP waveforms at representative electrode sites. Intervals with significant emotion effects are shaded; the EPN and LPC intervals are framed by dotted and dashed lines, respectively. Electrode Fz shows a more positive ERP for emotional as compared to neutral faces, and a relative negativity is shown at the occipital position PO10, i.e., an EPN. The right side of panel B shows that these differences were more pronounced for dynamic faces. During the LPC interval, electrode Cz shows an enhanced positivity for emotional as compared to neutral faces, which is more pronounced in dynamic than static faces. (C) Scalp distributions of ERP difference waves to angry and happy minus neutral faces within the EPN and LPC intervals.

Neuroimaging studies have shown differences in BOLD activation patterns comparing dynamic facial expressions of happiness and disgust (Trautmann et al., 2009), fear (LaBar et al., 2003; Sato et al., 2004), and anger (Kilts et al., 2003). Our results confirm these differences for anger and indicate a qualitatively different analysis of positive and negative facial expressions in the visual cortex starting 200 ms after stimulus onset. The present findings also indicate that the EPN is somewhat emotion-specific, which appears to contradict the suggestion that the EPN only represents the attention allocated by any salient stimulus, regardless of the nature of that stimulus.

On the other hand, the main effect of presentation mode in the topographic comparisons also suggests a differential processing of static and dynamic expressions. Here, the enhanced posterior negativity for dynamic faces indicates augmented visual processing, presumably due to a shift in visual attention driven by the movements within the face and the higher arousal level for dynamic faces (e.g., Trautmann et al., 2009). Moreover, this finding indicates that the boosted effect for dynamic faces implies wider neural sources, and not merely enhanced activation in the same areas as for static ones. This view is in consonance with neuroimaging studies showing more widespread activation for dynamic faces in striate and extrastriate areas (e.g., Kilts et al., 2003; Trautmann et al., 2009), fusiform gyrus (e.g., Sato et al., 2004), and superior temporal sulcus (Puce et al., 1998). In addition, studies reporting enhanced activation in premotor and supplementary motor areas (Sato et al., 2004; Trautmann et al., 2009) suggest that the greater efficiency of dynamic facial expressions and the additionally recruited brain systems might relate to the mirror neuron system, which is assumed to be the neurophysiological basis for the imitation of actions but also of facial expressions (for a review, see Iacoboni, 2009).

In contrast to other ERP studies using dynamic face stimuli (Fichtenholtz et al., 2007, 2009; Puce et al., 2000), neither the P1 nor the N170 component was affected by motion in the face. This discrepancy might be due to differences in the experimental designs. There is evidence suggesting that when two face pictures are presented consecutively, the N170 to target pictures is modulated if a change of facial expression occurs between the cue and the target (Miyoshi et al., 2004). Fichtenholtz et al. (2007, 2009) used emotional dynamic faces as cues, and in the faces employed by Puce et al. (2000), the facial movements, which did not consist of emotional expressions, were displayed in a continuously present face. In our experiment, the face stimuli were displayed without a cue, which might explain the absence of any effects in the P1 and N170 components. Besides, methodological differences with other studies (Mayes et al., 2009) make it difficult to compare findings about emotion effects in the ERPs.

There is relative consensus that the N170 indicates structural encoding of faces (Bentin et al., 1996; see Eimer, 2011, for a review). However, the emotional modulation of the N170 component is controversial. Some studies have shown that the N170 amplitude is affected by emotional expression (e.g., Batty and Taylor, 2003; Blau et al., 2007; Williams et al., 2006), whereas others did not find such effects (e.g., Eimer and Holmes, 2002, 2007; Eimer et al., 2003). The absence of emotional expression effects on the N170 in the present study is in line with the suggestion of Eimer et al. (2003) that the structural encoding represented by the N170 occurs independently and in parallel to the analysis of emotional expression.

A limitation of the present study might be that the dynamic stimuli used here had been morphed with software, controlling for duration and speed of the facial expressions. Although natural expressions probably do not follow strict motion rules, genuine facial expressions have been considered to be smooth, reflex-like, ballistic, regular, and uniform in appearance (Ekman, 1977; Hess et al., 1989; Rinn, 1984; Tomkins, 1962; Weiss et al., 1987), and recent evidence confirms a consistent relation between the amplitude and duration of spontaneous smiles (Cohn and Schmidt, 2004). Additional research using more natural stimuli, assessing the effects of temporal characteristics of the stimuli such as intensity, duration, speed, and onset and offset of the expression, will be needed in order to approximate real-life conditions of social communication, in which dynamic facial expressions are more complex because they are continuously changing. Although the use of natural or synthetic faces might influence the classification of emotional expressions (e.g., Kätsyri and Sams, 2008), many essential effects in face perception can be observed with stylized stimuli (e.g., Leppänen and Hietanen, 2004).

Together, the present results support the hypothesis that motion increases the impact of emotional expressions. Hence, we demonstrated that motion increased visual processing and subsequent processes, presumably by inducing additional activity in early visual brain areas. Importantly, the processing of dynamic faces seems to be not only quantitatively but also qualitatively different from that of static faces, indicating that dynamic faces are ecologically more valid than static ones and might have differential effects at behavioral and cognitive levels. It therefore appears advisable to take the motion dimension into consideration in future basic and applied research on emotional expression processing.

4. Experimental procedures

4.1. Participants, stimuli, and procedure

Participants were 21 healthy adults (7 female) between 20 and 34 years of age (M=24.14 years, SD=3.3) with normal or corrected-to-normal vision. Apart from one male left-hander, all participants were right-handed (Oldfield, 1971). Laterality quotients for handedness were M=84.5, SD=18.9 for female participants, and M=69.6, SD=50.4 for males.

None of them reported a history of neurological or neuropsychological problems. Prior to the experimental session, all participants gave written informed consent to the study, which was approved by a local ethics committee and performed according to the standards of the Declaration of Helsinki (1964).

Color pictures of 25 male and 25 female faces served as the starting point for stimulus construction. Facial expressions were created with 3D animation software (Singular Inversions, 2006). In the static condition, all faces displayed neutral or fully (100%) emotional – happy or angry – expressions. In the dynamic condition, emotional expressions were displayed as series of three consecutive pictures. The first and second pictures were presented for 50 ms each, whereas the third picture was shown for 500 ms. All pictures in the sequences that formed the dynamic stimuli were created by increasing the intensity of the expression on a numerical scale provided by the software (Singular Inversions, 2006), which quantifies each emotional expression between zero (neutral expression) and 100% (full expression). The emotional intensity of the first picture in these sequences was set to 33% and increased to 66% in the second and to 100% in the third image, thus attaining the same intensity as in the static condition (see Fig. 1A). Dynamic neutral expressions were created by showing the face in the second image with eyes closed, giving the impression of a blink.
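For concreteness, the timing and morph-intensity scheme just described can be represented as in the following sketch. This is illustrative Python only, not the authors' presentation code; the Frame structure and function name are invented for this example.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    intensity_pct: int     # morph intensity relative to the full expression (0 = neutral)
    duration_ms: int       # presentation duration
    eyes_closed: bool = False

def dynamic_sequence(expression: str) -> List[Frame]:
    """Three-frame sequence for dynamic trials: 33% and 66% intensity for
    50 ms each, then the full (100%) expression for 500 ms.  For the dynamic
    neutral condition, the middle frame shows the face with eyes closed (a blink)."""
    if expression == "neutral":
        return [Frame(0, 50), Frame(0, 50, eyes_closed=True), Frame(0, 500)]
    return [Frame(33, 50), Frame(66, 50), Frame(100, 500)]

# Example: the three frames of a dynamic happy trial (33%, 66%, 100%).
print(dynamic_sequence("happy"))
```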

There were 50 trials for each experimental condition (3 expressions × 2 presentation modes), resulting in a total of 300 trials that were presented in completely randomized order. Thus, each face identity was shown six times with different expressions or presentation modes, but no stimulus was repeated within the experimental session. Participants were instructed to categorize each of the presented expressions (happy, angry, or neutral) by pressing one of three adjacent buttons with their dominant hand. For neutral expressions, the centre button had to be operated with the middle finger. For happy and angry expressions, the left and right buttons had to be operated, in counterbalanced assignment across participants.


Prior to the experimental session, 60 participants (38 females, age M=23.6, SD=6.9), not overlapping with the sample of the ERP experiment, rated the emotional valence of the stimuli. Only the maximal emotion expressions were shown without time restriction for the rating. Ratings were performed on a 5-point scale, using a computerized version of the self-assessment manikin (SAM; Lang and Cuthbert, 1984). Mean valence ratings for the different expressions increased from angry (M=1.5±.02) over neutral (M=3.0±.02) to happy expressions (M=3.8±.02). ANOVA revealed a significant main effect of valence, F(2,46)=3214.7, p<.001. Errors in the evaluation task refer to discrepancies between participants' responses and the mean rating obtained for each face.

4.2. EEG recording and processing

The electroencephalogram (EEG) was recorded from 57 electrodes, mounted in a cap and referenced to the left mastoid, using Brain Vision Recorder (Brain Products GmbH, 2007). The electrooculogram was recorded from four electrodes above and below and to the right and left of the eyes. Electrode site AFz was used as ground. Impedances were kept below 5 kΩ. All channels were amplified with a bandpass of 0.05–70 Hz and sampled at 250 Hz.

Offline, the continuous EEG was recalculated to average reference and corrected for blinks by applying Surrogate Multiple Source Eye Correction (MSEC; Ille et al., 2002), as implemented in BESA (Brain Electrical Source Analysis; MEGIS Software GmbH, 2005). The continuous EEG data were segmented into 1.7-s epochs, starting 200 ms before stimulus onset, and averaged separately for each channel and experimental condition. All data processing was done using Brain Vision Analyzer (Brain Products GmbH). A prestimulus baseline was established 200 ms before stimulus onset. Trials containing incorrect responses were discarded. Epochs showing amplitudes exceeding −200 or +200 μV, or voltage steps larger than 100 μV per sampling point in any of the channels, were considered artifacts and excluded from the analyses. The maximal allowed absolute difference in amplitude within a given segment was 300 μV. The mean numbers of trials used for the final analyses were 43.8 (SD=5.9), 30.9 (SD=10.4), and 43.8 (SD=5.4) for neutral, happy, and angry static faces, respectively, and 44.0 (SD=4.7), 41.5 (SD=5.7), and 42.2 (SD=6.1) for neutral, happy, and angry faces in the dynamic condition, respectively.
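The rejection criteria listed above amount to a simple per-epoch check. A minimal NumPy sketch follows, assuming a channels × samples array in µV; this is not the Brain Vision Analyzer implementation, only an illustration of the stated thresholds.

```python
import numpy as np

def is_artifact(epoch_uv: np.ndarray,
                abs_limit=200.0,     # amplitude must stay within +/-200 microvolts
                step_limit=100.0,    # maximal voltage step per sampling point
                range_limit=300.0) -> bool:
    """Return True if any channel of the epoch violates one of the
    rejection criteria reported in the text."""
    if np.any(np.abs(epoch_uv) > abs_limit):
        return True
    if np.any(np.abs(np.diff(epoch_uv, axis=1)) > step_limit):
        return True
    if np.any(epoch_uv.max(axis=1) - epoch_uv.min(axis=1) > range_limit):
        return True
    return False
```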

4.3. Statistical analysis

Repeated measures analyses of variance (ANOVA) with the factors emotional expression (positive, negative, and neutral) and presentation mode (static and dynamic) were conducted for mean reaction times (RTs) and error rates.
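For readers who want to reproduce this kind of design with current tools, a comparable 3 × 2 repeated-measures ANOVA on reaction times could be set up as sketched below using statsmodels; the long-format column names are assumptions, not taken from the original study.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# df: one row per participant x expression x mode cell (long format), e.g.
# columns: participant, expression ('neutral'/'angry'/'happy'),
#          mode ('static'/'dynamic'), rt (mean reaction time in ms)
def rm_anova_rt(df: pd.DataFrame):
    """Repeated-measures ANOVA with within-subject factors
    emotional expression and presentation mode."""
    return AnovaRM(df, depvar="rt", subject="participant",
                   within=["expression", "mode"]).fit()

# Usage (with a suitable data frame):
# print(rm_anova_rt(df).anova_table)
```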

Average ERP amplitudes were segmented into 50-ms time intervals between stimulus onset (0 ms) and stimulus offset (600 ms). Preliminary analyses and visual inspection revealed differences in onset and offset of the typical EPN and LPC components; therefore, the ERP data were scanned in 50-ms time windows. This method is exploratory and suitable for comparing the latency of consecutive components with later onset (e.g., Werheid et al., 2005). Fs and ps refer to all F- and p-values from several consecutive intervals with significant results in the ANOVA.
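The windowing scheme amounts to taking mean amplitudes in twelve consecutive 50-ms bins. A sketch under assumed array layout and sampling parameters (250 Hz, 200-ms prestimulus baseline); variable names are illustrative.

```python
import numpy as np

def window_means(erp_uv, fs=250, onset_sample=50, n_windows=12, width_ms=50):
    """Mean amplitude per electrode in consecutive 50-ms windows between
    stimulus onset (0 ms) and offset (600 ms).  erp_uv has shape
    (n_electrodes, n_samples), with the baseline preceding onset_sample."""
    means = []
    for w in range(n_windows):
        i0 = onset_sample + round(w * width_ms * fs / 1000)
        i1 = onset_sample + round((w + 1) * width_ms * fs / 1000)
        means.append(erp_uv[:, i0:i1].mean(axis=1))
    return np.stack(means, axis=1)   # shape: (n_electrodes, n_windows)
```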

ANOVAs of ERP data included an additional repeated measures factor electrode site (57 levels). Please note that because the average reference sets the mean activity across all electrodes to zero, in ANOVAs including all electrodes only interactions of experimental factors with electrode site are meaningful. Therefore, the factor electrode was not explicitly mentioned in these analyses. Huynh–Feldt correction was applied to adjust the degrees of freedom of the F-ratios for violations of the sphericity assumption. All post-hoc comparisons were Bonferroni corrected for multiple testing.
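The zero-mean property of the average reference mentioned above follows directly from how it is computed; a minimal sketch (array name and layout assumed, not the authors' code).

```python
import numpy as np

def average_reference(eeg_uv: np.ndarray) -> np.ndarray:
    """Re-reference to the average: subtract, at each time point, the mean
    across all electrodes, so that the mean over electrodes is zero.
    eeg_uv has shape (n_electrodes, n_samples)."""
    return eeg_uv - eeg_uv.mean(axis=0, keepdims=True)
```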

Topographic comparisons were conducted by the same type of ANOVA after normalizing mean amplitudes within each condition combination and participant by dividing them by the GFP (Lehmann and Skrandies, 1980). GFP is the root mean square of the averaged voltages at all electrodes and reflects the overall activity across the scalp. According to McCarthy and Wood (1985), interactions between condition and electrode site in these ANOVAs may reflect either differences in overall ERP activity or differences in scalp distributions between experimental conditions. Scaling the data before conducting the ANOVA eliminates the differences in overall ERP activity, and thus the results refer only to differences in scalp distribution. All statistical tests conducted were two-tailed. The level of significance was established at 5%.
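The GFP scaling can be written out explicitly. The sketch below follows the definition given above, applied to the per-electrode mean amplitudes of one condition and participant; array layout and function names are assumptions.

```python
import numpy as np

def gfp(amplitudes_uv: np.ndarray) -> float:
    """Global field power: root mean square of the amplitudes
    across all electrodes (one value per condition and time window)."""
    return float(np.sqrt(np.mean(amplitudes_uv ** 2)))

def normalize_by_gfp(amplitudes_uv: np.ndarray) -> np.ndarray:
    """Divide a condition's electrode-wise mean amplitudes by its GFP, so that
    topographic comparisons are not confounded by overall amplitude differences."""
    return amplitudes_uv / gfp(amplitudes_uv)
```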

Visual inspection of the topographies showed that the P1 and N170 components were maximal at the PO8 and P10 electrode sites, respectively. Peak latencies for the P1 and N170 at these electrodes were submitted to ANOVAs. Additionally, ANOVAs were calculated for regions of interest for the P1 component over peak amplitudes at 8 equidistant electrodes in the left (P7, PO9, O1, and TP9) and right hemispheres (P8, PO10, O2, and TP10) (e.g., Itier and Taylor, 2004). A similar analysis was calculated for the N170 at left (P7, PO7, and P9) and homologous right (P8, PO8, and PO10) electrode sites (e.g., Bentin et al., 2006).
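Peak latency extraction at a single electrode, as used here for the P1 (PO8) and N170 (P10), can be sketched as follows. The search window and parameter values are assumptions for illustration, not those of the original analysis.

```python
import numpy as np

def peak_latency_ms(erp_uv, channel_idx, t_min, t_max, polarity,
                    fs=250, onset_sample=50):
    """Latency (ms after stimulus onset) of the most positive (polarity=+1)
    or most negative (polarity=-1) deflection at one electrode within a
    search window given in seconds relative to onset."""
    i0 = onset_sample + int(t_min * fs)
    i1 = onset_sample + int(t_max * fs)
    segment = polarity * erp_uv[channel_idx, i0:i1]
    peak = i0 + int(np.argmax(segment))
    return (peak - onset_sample) / fs * 1000.0

# Example (hypothetical window): N170 latency at channel index of P10,
# searched between 120 and 220 ms with negative polarity.
# lat = peak_latency_ms(erp_uv, p10_idx, 0.120, 0.220, polarity=-1)
```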

Acknowledgments

This research was supported by the Cluster of Excellence 302 "Languages of Emotion", Grant 209 to AS and WS. We thank Marina Palazova, Julian Rellecke, and Olga Shmuilovich for assistance in data collection, and Thomas Pinkpank and Rainer Kniesche for technical support.

References

Abrams, R.A., Christ, S.E., 2003. Motion onset captures attention. Psychol. Sci. 14, 427–432.

Adolphs, R., 2002. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav. Cogn. Neurosci. Rev. 1, 21–61.

Allison, T., Puce, A., McCarthy, G., 2000. Social perception from visual cues: role of the STS region. Trends Cogn. Sci. 4, 267–278.

Ambadar, Z., Schooler, J.W., Cohn, J.F., 2005. Deciphering the enigmatic face: the importance of facial dynamics in interpreting subtle facial expressions. Psychol. Sci. 16 (5), 403–410.

Balconi, M., Lucchiari, C., 2007. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis. Int. J. Psychophysiol. 67, 41–46.


Bassili, J.N., 1978. Facial motion in the perception of faces and of emotional expression. J. Exp. Psychol. Hum. Percept. Perform. 4, 373–379.

Batty, M., Taylor, M.J., 2003. Early processing of the six basic facial emotional expressions. Brain Res. Cogn. Brain Res. 17, 613–620.

Bentin, S., Allison, T., Puce, A., Perez, E., McCarthy, G., 1996. Electrophysiological studies of face perception in humans. J. Cogn. Neurosci. 8, 551–565.

Bentin, S., Golland, Y., Flevaris, A., Robertson, L.C., Moscovitch, M., 2006. Processing the trees and the forest during initial stages of face perception: electrophysiological evidence. J. Cogn. Neurosci. 18, 1406–1421.

Biele, C., Grabowska, A., 2006. Sex differences in perception of emotion intensity in dynamic and static facial expressions. Exp. Brain Res. 171 (1), 1–6.

Blake, R., Shiffrar, M., 2007. Perception of human motion. Annu. Rev. Psychol. 58, 47–73.

Blau, V.C., Maurer, U., Tottenham, N., McCandliss, B.D., 2007. The face-specific N170 component is modulated by emotional facial expression. Behav. Brain Funct. 3, 1–13.

Brain Products GmbH, 2007. Brain Vision Recorder (Version 1.03), Brain Vision Analyzer (Version 1.05) [Computer software].

Clore, G.L., Gasper, K., 2000. Some affective influences on belief. In: Frijda, N.H., Manstead, A.S.R., Bem, S. (Eds.), Emotions and Beliefs. Cambridge University Press, Paris, pp. 2–44.

Cohn, J.F., Schmidt, K.L., 2004. The timing of facial motion in posed and spontaneous smiles. J. Wavelets, Multi-resolution and Information Processing 2 (2), 121–132.

Cuthbert, B.N., Schupp, H.T., Bradley, M.M., Birbaumer, N., Lang, P.J., 2000. Brain potentials in affective picture processing: covariation with autonomic arousal and affective report. Biol. Psychol. 52, 95–111.

Eimer, M., 2011. The face-sensitive N170 component of the event-related brain potential. In: Calder, A.J., Rhodes, G., Johnson, M.H., Haxby, J.V. (Eds.), Oxford Handbook of Face Perception. Oxford University Press, Oxford.

Eimer, M., Holmes, A., 2002. An ERP study on the time course of emotional face processing. NeuroReport 13, 427–431.

Eimer, M., Holmes, A., 2007. Event-related brain potential correlates of emotional face processing. Neuropsychologia 45, 15–31.

Eimer, M., Holmes, A., McGlone, F.P., 2003. The role of spatial attention in the processing of facial expression: an ERP study of rapid brain responses to six basic emotions. Cogn. Affect. Behav. Neurosci. 3, 97–110.

Ekman, P., 1977. Biological and cultural contributions to body and facial movement. In: Blacking, J. (Ed.), The Anthropology of the Body. Academic Press, London.

Ekman, P., Friesen, W., 1982. Felt, false, and miserable smiles. J. Nonverbal Behav. 6, 238–252.

Fichtenholtz, H.M., Hopfinger, J.B., Graham, R., Detwiler, J.M., LaBar, K.S., 2007. Facial expressions and emotional targets produce separable ERP effects in a gaze-directed attention study. Soc. Cogn. Affect. Neurosci. 2, 323–333.

Fichtenholtz, H.M., Hopfinger, J.B., Graham, R., Detwiler, J.M., LaBar, K.S., 2009. Event-related potentials reveal temporal staging of dynamic facial expression and gaze shift effects on attentional orienting. Soc. Neurosci. 4 (4), 317–331.

Franconeri, S.L., Simons, D.J., 2003. Moving and looming stimuli capture attention. Percept. Psychophys. 65 (7), 999–1010.

Hess, U., Blairy, S., Kleck, R.E., 1997. The intensity of emotional facial expressions and decoding accuracy. J. Nonverbal Behav. 21, 241–257.

Hess, U., Kappas, A., McHugo, G.J., Kleck, R.E., Lanzetta, J.T., 1989. An analysis of the encoding and decoding of spontaneous and posed smiles: the use of facial electromyography. J. Nonverbal Behav. 13, 121–137.

Holmes, A., Bradley, B.P., Nielsen, M.K., Mogg, K., 2009. Attentional selectivity for emotional faces: evidence from human electrophysiology. Psychophysiology 46 (1), 62–68.

Iacoboni, M., 2009. Imitation, empathy, and mirror neurons. Annu. Rev. Psychol. 60, 653–670.

Ille, N., Berg, P., Scherg, M., 2002. Artifact correction of the ongoing EEG using spatial filters based on artifact and brain signal topographies. J. Clin. Neurophysiol. 19, 113–124.

Itier, R.J., Taylor, M.J., 2004. N170 or N1? Spatiotemporal differences between object and face processing using ERPs. Cereb. Cortex 14, 132–142.

James, W., 1891/1950. The Principles of Psychology, Vol. 1. Dover Publications, New York (original work published 1891).

Junghöfer, M., Bradley, M.M., Elbert, T.R., Lang, P.J., 2001. Fleeting images: a new look at early emotion discrimination. Psychophysiology 38 (2), 175–178.

Kätsyri, J., Sams, M., 2008. The effect of dynamics on identifying basic emotions from synthetic and natural faces. Int. J. Hum. Comput. Stud. 66, 233–242.

Kamachi, M., Bruce, V., Mukaida, S., Gyoba, J., Yoshikawa, S., Akamatsu, S., 2001. Dynamic properties influence the perception of facial expressions. Perception 30 (7), 875–887.

Keil, A., Müller, M.M., Gruber, T., Wienbruch, C., Stolarova, M., Elbert, T., 2001. Effects of emotional arousal in the cerebral hemispheres: a study of oscillatory brain activity and event-related potentials. Clin. Neurophysiol. 112, 2057–2068.

Kilts, C.D., Egan, G., Gideon, D.A., Ely, T.D., Hoffman, J.M., 2003. Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage 18 (1), 156–168.

LaBar, K.S., Crupain, M.J., Voyvodic, J.T., McCarthy, G., 2003. Dynamic perception of facial affect and identity in the human brain. Cereb. Cortex 13 (10), 1023–1033.

Lang, P.J., Bradley, M.M., Cuthbert, B.N., 1997. Motivated attention: affect, activation, and action. In: Lang, P.J., Simons, R.F., Balaban, M. (Eds.), Attention and Emotion: Sensory and Motivational Processes. Erlbaum, Mahwah, pp. 97–135.

Lang, P.J., Cuthbert, B.N., 1984. Affective information processing and the assessment of anxiety. J. Behav. Assess. 6 (4), 369–395.

Lange, K., Williams, L.M., Young, A.W., Bullmore, E.T., Brammer, M.J., Williams, S.C.R., et al., 2003. Task instructions modulate neural responses to fearful facial expressions. Biol. Psychiatry 53, 226–232.

Lehmann, D., Skrandies, W., 1980. Reference-free identification of components of checkerboard-evoked multichannel potential fields. Electroencephalogr. Clin. Neurophysiol. 48 (6), 609–621.

Leppänen, J.M., Hietanen, J.K., 2004. Positive facial expressions are recognized faster than negative facial expressions, but why? Psychol. Res. 69, 22–29.

Mayes, A.K., Pipingas, A., Silberstein, R.B., Johnston, P., 2009. Steady state visually evoked potential correlates of static and dynamic emotional face processing. Brain Topogr. 22 (3), 145–157.

McCarthy, G., Wood, C.C., 1985. Scalp distributions of event-related potentials: an ambiguity associated with analysis of variance models. Electroencephalogr. Clin. Neurophysiol. 62, 203–208.

MEGIS Software GmbH, 2005. Brain Electrical Source Analysis BESA (Version 5.1) [Computer software].

Miyoshi, M., Katayama, J., Morotomi, T., 2004. Face-specific N170 component is modulated by facial expressional change. NeuroReport 15 (5), 911–914.

Mühlenen, A., Rempel, M.I., Enns, J.T., 2005. Unique temporal change is the key to attentional capture. Psychol. Sci. 16 (12), 979–986.

O'Toole, A., Roark, D., Abdi, H., 2002. Recognizing moving faces: a psychological and neural synthesis. Trends Cogn. Sci. 6 (6), 261–266.

Öhman, A., 1979. The orienting response, attention, and learning: an information-processing perspective. In: Kimmel, H.D., van Olst, E.H., Orlebek, J.F. (Eds.), The Orienting Reflex in Humans. Erlbaum, Hillsdale.


Öhman, A., 1986. Face the beast and fear the face: animal and social fears as prototypes for evolutionary analyses of emotion. Psychophysiology 23 (2), 123–145.

Öhman, A., Flykt, A., Lundqvist, D., 2000. Unconscious emotion: evolutionary perspectives, psychological data and neuropsychological mechanisms. In: Lane, R.D., Nadel, L. (Eds.), Cognitive Neuroscience of Emotion. Oxford University Press, Oxford, pp. 296–327.

Oldfield, R., 1971. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9, 97–113.

Pilz, K.S., Thornton, I.M., Bülthoff, H.H., 2006. A search advantage for faces learned in motion. Exp. Brain Res. 171 (4), 436–447.

Puce, A., Allison, T., Bentin, S., Gore, J.C., McCarthy, G., 1998. Temporal cortex activation in humans viewing eye and mouth movements. J. Neurosci. 18 (6), 2188–2199.

Puce, A., Smith, A., Allison, T., 2000. ERPs evoked by viewing facial movements. In: Moscovitch, N.K.M. (Ed.), The Cognitive Neuroscience of Face Processing, Vol. 17. Psychology Press, Hove, pp. 221–239.

Rinn, W.E., 1984. The neuropsychology of facial expression: a review of the neurological and psychological mechanisms for producing facial expressions. Psychol. Bull. 95 (1), 52–77.

Singular Inversions, 2006. (Version 2.2) [Computer software]. Available from http://www.facegen.com/modeller.htm.

Sato, W., Kochiyama, T., Yoshikawa, S., Naito, E., Matsumura, M., 2004. Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Cogn. Brain Res. 20 (1), 81–91.

Schacht, A., Sommer, W., 2009. Emotions in word and face processing: early and late cortical responses. Brain Cogn. 69 (3), 538–550.

Schupp, H.T., Flaisch, T., Stockburger, J., Junghöfer, M., 2006. Emotion and attention: event-related brain potential studies. In: Anders, S., Ende, G., Junghöfer, M., Kissler, J., Wildgruber, D. (Eds.), Understanding Emotions. Elsevier, Amsterdam, pp. 31–51.

Schupp, H.T., Junghöfer, M., Weike, A., Hamm, A., 2003. Attention and emotion: an ERP analysis of facilitated emotional stimulus processing. NeuroReport 14 (8), 1107–1110.

Schupp, H.T., Öhman, A., Junghöfer, M., Weike, A.I., Stockburger, J., Hamm, A.O., 2004. The facilitated processing of threatening faces: an ERP analysis. Emotion 4, 189–200.

Sprengelmeyer, R., Jentzsch, I., 2006. Event-related potentials and the perception of intensity in facial expressions. Neuropsychologia 44 (14), 2899–2906.

Tomkins, S.S., 1962. Affect, Imagery, Consciousness: Vol. 1. The Positive Affects. Springer, New York.

Trautmann, S.A., Fehr, T., Herrmann, M., 2009. Emotions in motion: dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Res. 1284, 100–115.

Weiss, F., Blum, G.S., Gleberman, L., 1987. Anatomically based measurement of facial expressions in simulated versus hypnotically induced affect. Motiv. Emotion 11, 67–81.

Werheid, K., Alpay, G., Jentzsch, I., Sommer, W., 2005. Priming the processing of facial affect: event-related potentials reveal early detection of emotional expression. Int. J. Psychophysiol. 55, 209–219.

Wickens, C., 1980. The structure of attentional resources. In: Nickerson, R.S. (Ed.), Attention and Performance VIII. Erlbaum, Hillsdale, pp. 239–257.

Williams, L.M., Palmer, D., Liddell, B.J., Song, L., Gordon, E., 2006. The 'when' and 'where' of perceiving signals of threat versus non-threat. Neuroimage 31, 458–467.