Cognitive, Affective, & Behavioral Neuroscience
2003, 3 (2), 97-110
Copyright 2003 Psychonomic Society, Inc.

The role of spatial attention in the processing of facial expression:
An ERP study of rapid brain responses to six basic emotions

MARTIN EIMER and AMANDA HOLMES
University of London, London, England

and

FRANCIS P. McGLONE
Unilever Research, Port Sunlight Laboratories, Wirral, England

To investigate the time course of emotional expression processing, we recorded ERP responses to stimulus arrays containing neutral versus angry, disgusted, fearful, happy, sad, or surprised faces. In one half of the experiment, the task was to discriminate emotional and neutral facial expressions. Here, an enhanced early frontocentral positivity was elicited in response to emotional as opposed to neutral faces, followed by a broadly distributed positivity and an enhanced negativity at lateral posterior sites. These emotional expression effects were very similar for all six basic emotional expressions. In the other half of the experiment, attention was directed away from the faces toward a demanding perceptual discrimination task. Under these conditions, emotional expression effects were completely eliminated, demonstrating that brain processes involved in the detection and analysis of facial expression require focal attention. The face-specific N170 component was unaffected by any emotional expression, supporting the hypothesis that structural encoding and expression analysis are independent processes.

This study was supported by Unilever Research. We thank two anonymous referees for valuable comments, and Heijo Van de Werf for technical assistance. Correspondence should be addressed to M. Eimer, Department of Psychology, Birkbeck College, University of London, Malet Street, London WC1E 7HX, England (e-mail: m.eimer@bbk.ac.uk).

Emotions play a crucial role in the regulation of interactions between humans and their environment. Emotional states produce specific bodily responses, aimed at preparing the organism for survival-related behavior, and specialized neural systems have evolved for the rapid perceptual analysis of emotionally salient external events, such as emotional facial expressions (Damasio, 1994; Le Doux, 1996; Öhman, Flykt, & Lundqvist, 2000). Attentional biases toward emotional stimuli have been found in many behavioral studies, using paradigms such as visual search (Eastwood, Smilek, & Merikle, 2001; Fox et al., 2000; Hansen & Hansen, 1988; Öhman, Flykt, & Esteves, 2001; Öhman, Lundqvist, & Esteves, 2001) and dot probe detection tasks (Mogg & Bradley, 1999; Mogg et al., 2000).

Numerous studies have demonstrated an important role of the amygdala in detecting emotionally salient events and in mediating responses to these stimuli. Emotional stimuli, particularly fearful facial expressions, activate the amygdala and other connected limbic structures (Liu, Ioannides, & Streit, 1999; Morris et al., 1996). Reentrant projections from the amygdala back to occipital cortex may be involved in an enhancement of visual processing of emotionally salient stimuli (Amaral & Price, 1984; Amaral, Price, Pitkanen, & Carmichael, 1992; see also Armony & Dolan, 2002; Lang et al., 1998; Morris et al., 1998, for supportive evidence from functional imaging studies).

Given these findings, it is often assumed that emotional stimuli are detected preattentively and then automatically trigger attentional shifts toward their location. However, recent studies investigating the relationship between spatial attention and the processing of emotionally salient events have yielded conflicting findings. On the one hand, amygdala responses to fearful faces in humans appear to be unaffected by spatial attention (Vuilleumier, Armony, Driver, & Dolan, 2001), and amygdala activations triggered by highly arousing emotional scenes are not modulated by a secondary task (Lane, Chua, & Dolan, 1999). In addition, neglect and extinction patients are more likely to detect emotionally significant relative to neutral pictures when these are presented in the affected visual hemifield (Vuilleumier & Schwartz, 2001a, 2001b). These results suggest that emotional stimuli capture attention automatically. On the other hand, amygdala responses to fearful or happy facial expressions have been found to be modulated by focal attention (Pessoa, McKenna, Gutierrez, & Ungerleider, 2002), and increased responses to attended versus unattended fearful faces have been observed in the anterior temporal pole and anterior cingulate gyrus (Vuilleumier et al., 2001).

In the present study, we used event-related brain potential (ERP) measures to further investigate the role of spatial attention in the processing of emotionally significant events. Because of their excellent temporal resolution, ERPs are particularly suited for studying the time course of emotional processes and investigating whether and when the processing of emotional stimuli is modulated by selective attention. For example, a positive slow wave starting at about 300 msec after stimulus onset in response to pictures with emotional content (Cuthbert, Schupp, Bradley, Birbaumer, & Lang, 2000; Diedrich, Naumann, Maier, & Becker, 1997) has been interpreted as reflecting the allocation of attention to motivationally relevant input (Cuthbert et al., 2000). More recently, we have shown that an enhanced positivity in response to foveally presented fearful relative to neutral faces can be elicited over prefrontal areas as early as 120 msec after stimulus onset (Eimer & Holmes, 2002). This early emotional expression effect suggests that cortical circuits involved in the detection of emotionally significant events can be triggered rapidly by emotional facial expressions (see also Kawasaki et al., 2001; Pizzagalli, Regard, & Lehmann, 1999; Sato, Kochiyama, Yoshikawa, & Matsumura, 2001, for similar results from ERP and single-unit studies).

In another recent ERP study (Holmes, Vuilleumier, & Eimer, 2003), we investigated for the first time whether and how emotional expression effects elicited by fearful relative to neutral faces are affected by spatial attention. On each trial, arrays consisting of two faces and two houses, arranged in horizontal and vertical pairs, were presented. Participants had to attend either to the two vertical or to the two horizontal locations (as indicated by a precue presented at the beginning of each trial) in order to detect infrequent identical stimuli at the cued location. When faces were attended, fearful faces elicited an enhanced positivity relative to neutral faces, with an early frontal effect followed by a more broadly distributed emotional positivity. These emotional expression effects were completely eliminated on trials where faces were presented at uncued (unattended) locations. This finding challenges the hypothesis that the detection and processing of emotional facial expression occurs preattentively, and suggests that the processes reflected by ERP modulations sensitive to emotional facial expression are gated by spatial attention.

The present study was designed to confirm and extend these surprising results. In our previous study (Holmes et al., 2003), only one emotional facial expression (fear) was employed. Although fearful faces are generally regarded to be highly salient emotional stimuli, the hypothesis that the processing of emotional facial expression depends on spatial attention clearly needs to be substantiated by investigating whether differential ERP responses to facial expressions other than fear are gated by spatial attention.

In the present experiment, all six basic emotional facial expressions were shown in separate experimental blocks. Face stimuli were photographs of 10 different individuals (Ekman & Friesen, 1976), with facial expression neutral or angry, disgusted, fearful, happy, sad, or surprised (Figure 1, top panel). Each block contained an equal number of trials with emotional or neutral face pairs presented bilaterally to the left and right of fixation. In one half of the experiment (lines task), attention was actively directed away from these face stimuli toward a demanding perceptual judgment task. Participants had to monitor a pair of vertical lines presented bilaterally close to fixation (Figure 1, bottom panel) in order to decide on each trial whether the two lines were identical or differed in length. Faces had to be entirely ignored. The other half of the experiment (emotion task) was physically identical to the lines task, but participants now had to decide on each trial whether facial expression was emotional or neutral. Here, lines could be entirely ignored.

ERP modulations sensitive to emotional facial expression were identified by comparing ERPs elicited by arrays containing emotional faces to ERPs in response to arrays with neutral faces, separately for experimental blocks including angry, disgusted, fearful, happy, sad, and surprised faces. To investigate the impact of attention on the processing of emotional facial expression, these comparisons were conducted separately for the emotion task, where emotional expression was task relevant, and for the lines task, where faces were irrelevant and thus could be entirely ignored. If emotional facial expressions were detected preattentively and attracted attention automatically, systematic ERP modulations in response to arrays containing emotional versus neutral faces should be found not only in the emotion task but also, although perhaps in an attenuated fashion, in the lines task. In contrast, if the detection and processing of emotional faces requires focal attention (as suggested by Holmes et al., 2003), ERP correlates of emotional face processing should be entirely absent in the lines task.

In addition to investigating the role of spatial attention in the processing of facial expression, the design of the present study also allowed the systematic comparison of ERP responses elicited by each of the six basic emotional facial expressions. A number of lesion and neuroimaging studies argue for the existence of neural systems that are specialized for processing distinct emotions (Adolphs, 2002). For example, a disproportionate activation of the amygdala has been observed in response to facial expressions of fear (Breiter et al., 1996; Morris et al., 1996; Phillips et al., 1998; Whalen et al., 2001; but see Rapcsak et al., 2000). Prefrontal cortex has been specifically implicated in the recognition of angry facial expressions (Blair, Morris, Frith, Perrett, & Dolan, 1999; Harmer, Thilo, Rothwell, & Goodwin, 2001), and the insula and basal ganglia appear to be particularly involved in processing facial expressions of disgust (Adolphs, Tranel, & Damasio, 2003; Calder, Keane, Manes, Antoun, & Young, 2000; Calder, Lawrence, & Young, 2001; Phillips et al., 1998; Phillips et al., 1997; Sprengelmeyer, Rausch, Eysel, & Przuntek, 1998). If the detection and analysis of specific facial emotional expressions is mediated by distinct brain processes, this might be reflected in systematic differences in emotional expression effects on ERP waveforms elicited in response to different facial expressions.

Another aim in the present study was to investigate whether early stages in the perceptual encoding of face stimuli are affected by emotional facial expression. The face-specific N170 component is assumed to reflect the precategorical structural encoding of faces prior to their recognition (Bentin, Allison, Puce, Perez, & McCarthy, 1996; Eimer, 1998, 2000). In two recent ERP studies (Eimer & Holmes, 2002; Holmes et al., 2003), we have found that the N170 is not modulated by emotional facial expression. This suggests that the structural encoding of faces and the processing of emotional expression are parallel and independent processes (Bruce & Young, 1986). However, to date, this conclusion has been based only on a comparison of N170 components elicited in response to fearful versus neutral faces, obtained under conditions where facial expression was task irrelevant. To investigate whether the face-specific N170 is unaffected by any emotional facial expression, even when expression is task relevant, we compared the N170 elicited by emotional versus neutral faces in the emotion task, separately for all six basic facial expressions. Any systematic emotional expression effects on the N170 component would challenge the hypothesis that the structural encoding of faces is completely independent of facial expression analysis.

Figure 1. Top panel: Examples of face stimuli used in the present experiment. Faces of 10 different individuals were used, with facial expression either neutral (central) or (clockwise from top) disgusted, fearful, happy, sad, surprised, or angry. Bottom panel: Illustration of the stimulus array presented on each trial. Two identical emotional or neutral faces were presented bilaterally, with two vertical lines located close to fixation. In the trial shown here, a happy face pair is presented together with two lines of different lengths.

METHOD

Participants
Fifteen participants took part in this study. One had to be excluded because of excessive eye blinks, so 14 participants (7 female and 7 male, 18-54 years old, average age 29.6 years) remained in the sample. One participant was left-handed, all others right-handed by self-report. The experiment was performed in compliance with relevant institutional guidelines and was approved by the Birkbeck College School of Psychology ethics committee.

Stimuli
The face stimuli were photographs of faces of 10 different individuals, all taken from a standard set of pictures of facial affect (Ekman & Friesen, 1976). Facial expression was angry, disgusted, fearful, happy, sad, surprised, or neutral, resulting in a total of 70 different face stimuli (see Figure 1, top panel, for examples). All face stimuli covered a visual angle of about 3.4° × 2.4°. Each display also contained a pair of gray vertical lines (0.1° width), and each line was either short (0.4°) or slightly longer (0.5°). All stimuli were presented on a computer screen in front of a black background. A white fixation cross was continuously present at the center of the screen.

Procedure
Participants were seated in a dimly lit, sound-attenuated cabin, and a computer screen was placed at a viewing distance of 70 cm. The experiment consisted of 24 experimental blocks, each containing 80 trials. On each trial, two identical faces were presented together with two line stimuli in front of a black background (Figure 1, bottom). Faces were located 2.2° to the left and right of fixation (measured as the distance between the fixation cross and the center of each face stimulus), and the bilateral lines were presented close to the fixation cross (0.4° eccentricity). All stimuli were presented simultaneously for 300 msec, and the interval between two successive stimulus presentations was 2,000 msec.
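As an illustrative aside added for this transcript (not part of the original Method), the physical dimensions implied by these visual angles follow from the usual size-distance relation; only the 70-cm viewing distance and the angles quoted above are taken from the text:

S = 2D\tan(\theta/2)
2 \times 70\,\text{cm} \times \tan(3.4^{\circ}/2) \approx 4.2\,\text{cm}, \qquad 2 \times 70\,\text{cm} \times \tan(2.4^{\circ}/2) \approx 2.9\,\text{cm}
\text{face-center eccentricity: } 70\,\text{cm} \times \tan(2.2^{\circ}) \approx 2.7\,\text{cm}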

In 12 successive blocks, participants had to indicate with a left-hand or right-hand buttonpress whether the face pair presented on any given trial showed an emotional or neutral expression (emotion task). The mapping of emotional valence to response hand was counterbalanced across participants. In 40 trials per block, emotional faces were presented; in the other 40 randomly intermingled trials, facial expression was neutral. Long and short lines, which were irrelevant in these blocks, appeared randomly and with equal probability to the left and right of fixation. Emotional expression was varied across blocks, with angry, disgusted, fearful, happy, sad, and surprised faces each shown in two blocks. The order in which these blocks were presented was randomized for each participant.

In the other 12 successive blocks, participants were instructed to direct their attention to the pair of lines presented close to fixation and to indicate with a left-hand or right-hand buttonpress whether these lines differed in length or were identical (lines task). The mapping of line length to response hand was counterbalanced across participants. Again, short and long lines appeared randomly and equiprobably on the left or right side. Faces, which were now task irrelevant, were emotional on 40 trials and neutral on the other 40 trials, with emotional expression varied across blocks (two blocks each with angry, disgusted, fearful, happy, sad, and surprised faces). The order in which these blocks were presented was again randomized for each participant.

Seven participants performed the emotion task prior to the lines task, and this order was reversed for the other 7 participants. Participants were instructed to keep their gaze directed at the central fixation cross throughout each block and to respond as fast and accurately as possible on each trial.
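To make the block structure concrete, the following sketch (hypothetical Python written for this transcript, not the authors' experiment code; names such as build_task_half are our own) assembles one participant's session: two task halves of 12 blocks each, every expression assigned to two blocks per half in random order, and 40 emotional plus 40 neutral face-pair trials, with line lengths drawn at random, shuffled within each 80-trial block.

import random

EMOTIONS = ["angry", "disgusted", "fearful", "happy", "sad", "surprised"]

def build_task_half(task, rng):
    """Return 12 blocks for one task half: each expression appears in two blocks
    (block order randomized); each block mixes 40 emotional and 40 neutral
    face-pair trials, with the two line lengths drawn at random on every trial."""
    block_expressions = EMOTIONS * 2
    rng.shuffle(block_expressions)
    blocks = []
    for expression in block_expressions:
        trials = [
            {"task": task,
             "face": expression if i < 40 else "neutral",
             "left_line": rng.choice(["short", "long"]),
             "right_line": rng.choice(["short", "long"])}
            for i in range(80)
        ]
        rng.shuffle(trials)
        blocks.append(trials)
    return blocks

rng = random.Random(2003)
task_order = ["emotion", "lines"] if rng.random() < 0.5 else ["lines", "emotion"]
session = [block for task in task_order for block in build_task_half(task, rng)]
assert len(session) == 24 and all(len(block) == 80 for block in session)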

ERP procedures and data analysis
EEG was recorded with Ag-AgCl electrodes and linked-earlobe reference from Fpz, F7, F3, Fz, F4, F8, FC5, FC6, T7, C3, Cz, C4, T8, CP5, CP6, T5, P3, Pz, P4, T6, and Oz (according to the 10-20 system), and from OL and OR (located halfway between O1 and P7, and between O2 and P8, respectively). Horizontal EOG (HEOG) was recorded bipolarly from the outer canthi of both eyes. The impedance for all electrodes was kept below 5 kΩ. The amplifier bandpass was 0.1 to 40 Hz, and no additional filters were applied to the averaged data. EEG and EOG were sampled with a digitization rate of 200 Hz and stored on disk. Reaction times (RTs) were measured on each trial.

EEG and HEOG were epoched off-line into 800-msec periods, starting 100 msec prior to stimulus onset and ending 700 msec after stimulus onset. Trials with horizontal eye movements (HEOG exceeding ±30 µV), eyeblinks (Fpz exceeding ±60 µV), or other artifacts (a voltage exceeding ±80 µV at any electrode) measured after stimulus onset were excluded from analysis. The EEG obtained was averaged relative to a 100-msec baseline preceding stimulus onset. Only trials with correct behavioral responses were included in the averages. Separate averages were computed for the emotion task and the lines task, for all combinations of block type (experimental blocks including angry vs. disgusted vs. fearful vs. happy vs. sad vs. surprised faces) and valence (emotional vs. neutral faces), resulting in 24 average waveforms for each electrode and participant.
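As a minimal sketch of the epoching and artifact rejection steps just described (illustrative NumPy code, not the authors' original analysis pipeline; the data layout, channel indices, and function names are assumptions), the per-condition averaging can be expressed as follows:

import numpy as np

FS = 200                        # digitization rate (Hz)
PRE_S = int(0.1 * FS)           # 100-msec prestimulus baseline -> 20 samples
POST_S = int(0.7 * FS)          # 700 msec poststimulus -> 140 samples

def average_condition(eeg, heog, fpz_idx, onsets, correct):
    """eeg: (n_channels, n_samples) continuous EEG in microvolts; heog: (n_samples,)
    bipolar HEOG; onsets: stimulus-onset sample indices for one condition;
    correct: booleans marking trials with correct behavioral responses.
    Returns the artifact-free average waveform (n_channels, PRE_S + POST_S)."""
    kept = []
    for onset, ok in zip(onsets, correct):
        if not ok:
            continue                                          # only correct trials are averaged
        epoch = eeg[:, onset - PRE_S:onset + POST_S].copy()   # 800-msec epoch
        epoch -= epoch[:, :PRE_S].mean(axis=1, keepdims=True) # prestimulus baseline correction
        post = epoch[:, PRE_S:]                               # artifacts assessed after stimulus onset
        if np.abs(heog[onset:onset + POST_S]).max() > 30:     # horizontal eye movements: +/-30 microvolts
            continue
        if np.abs(post[fpz_idx]).max() > 60:                  # eyeblinks at Fpz: +/-60 microvolts
            continue
        if np.abs(post).max() > 80:                           # any other artifact: +/-80 microvolts
            continue
        kept.append(epoch)
    return np.mean(kept, axis=0)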

The first set of analyses was based on mean amplitudes obtained at lateral posterior electrodes T5 and T6 (where the N170 is maximal) within a time window centered on the mean latency of the face-specific posterior N170 component (160-200 msec poststimulus). Repeated measures analyses of variance (ANOVAs) were conducted for the factors task (emotion task vs. lines task), block type, and valence. Additional analyses were conducted separately for the emotion and the lines tasks. The second set of analyses was based on mean amplitude values computed within five successive poststimulus time windows (120-155 msec, 160-215 msec, 220-315 msec, 320-495 msec, and 500-700 msec), which covered the interval where systematic emotional expression effects were observed in our previous experiments (Eimer & Holmes, 2002; Holmes et al., 2003). Mean amplitude values were computed for frontal (F3, Fz, F4), central (C3, Cz, C4), parietal (P3, Pz, P4), lateral temporal (T5, T6), and lateral occipital sites (OL, OR). Again, ANOVAs were conducted for the factors task, block type, and valence, followed by further analyses conducted separately for ERPs obtained in the emotion task and the lines task.
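Continuing the same hypothetical sketch, the mean amplitude measures entering these ANOVAs are simply averages of each condition waveform over an electrode cluster and a poststimulus time window (window and cluster definitions below are taken from the text; the function itself is our own illustration):

FS, PRE_S = 200, 20   # sampling rate and prestimulus samples, as in the sketch above

WINDOWS_MS = {"N170": (160, 200), "120-155": (120, 155), "160-215": (160, 215),
              "220-315": (220, 315), "320-495": (320, 495), "500-700": (500, 700)}
CLUSTERS = {"frontal": ["F3", "Fz", "F4"], "central": ["C3", "Cz", "C4"],
            "parietal": ["P3", "Pz", "P4"], "lateral temporal": ["T5", "T6"],
            "lateral occipital": ["OL", "OR"]}

def mean_amplitude(erp, channel_names, cluster, window):
    """Mean amplitude of an averaged waveform (n_channels, n_samples) over one
    electrode cluster and one poststimulus window, timed from stimulus onset."""
    rows = [channel_names.index(ch) for ch in CLUSTERS[cluster]]
    start = PRE_S + int(WINDOWS_MS[window][0] * FS / 1000)
    stop = PRE_S + int(WINDOWS_MS[window][1] * FS / 1000)
    return float(erp[rows, start:stop].mean())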

For keypress responses, repeated measures ANOVAs were performed on the latencies of correct responses and on error rates, separately for the emotion task and the lines task, for the factors block type and valence. In the analysis of behavioral performance in the lines task, the additional factor of target type (identical lines vs. different lines) was included. For all analyses, Greenhouse-Geisser adjustments to the degrees of freedom were performed when appropriate.
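The repeated measures ANOVAs themselves could be run, for example, with the AnovaRM class from statsmodels (again an illustrative sketch: the file name and column names are assumptions, and the Greenhouse-Geisser adjustment reported in the paper would still have to be applied on top of the uncorrected degrees of freedom that AnovaRM returns):

import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Assumed layout: one row per participant x block type x valence cell, with columns
# participant, block_type (anger ... surprise), valence (emotional/neutral),
# rt (mean correct RT in msec), and error_rate (percentage of incorrect responses).
behavior = pd.read_csv("emotion_task_behavior.csv")   # hypothetical file name

rt_anova = AnovaRM(data=behavior, depvar="rt", subject="participant",
                   within=["block_type", "valence"]).fit()
print(rt_anova)

error_anova = AnovaRM(data=behavior, depvar="error_rate", subject="participant",
                      within=["block_type", "valence"]).fit()
print(error_anova)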


RESULTS

Behavioral Results
Participants failed to respond on less than 3% of all trials. Correct responses were faster in the emotion task (622 msec) than in the lines task (695 msec), and this difference was significant [t(14) = 4.74, p < .001]. Figure 2 shows mean RTs (top panel) and the percentage of incorrect responses (bottom panel) obtained in the emotion task, displayed separately for the six different block types and for trials with emotional and neutral faces, respectively. For RTs, main effects of block type [F(5,65) = 21.4, p < .001, ε = .788] and of valence [F(1,13) = 19.0, p < .001] were present. RTs differed systematically between block types, being fastest in blocks including happy faces and slowest in blocks including sad faces. In addition, responses were generally faster to emotional than to neutral faces. No interaction between block type and valence was obtained, indicating that this RT advantage for emotional faces was equivalent across all six block types.

For error rates, main effects of block type [F(5,65) = 13.7, p < .001, ε = .282] and valence [F(1,13) = 15.4, p < .002] were again present for the emotion task. As can be seen from Figure 2 (bottom panel), incorrect responses were most frequent in blocks including sad faces and least frequent in blocks including surprised faces. Also, it was more likely that emotional faces would be incorrectly classified as neutral than that neutral faces would be erroneously judged as emotional. No block type × valence interaction was present.

In the lines task, no main effects of block type or valence were obtained for RT or error rate (all Fs < 1), indicating that the emotional expression of task-irrelevant faces did not interfere with perceptual identification performance. Target type did not affect RT, but had a significant effect on error rate [F(1,13) = 15.9, p < .002]. It was more likely that lines of different length would be classified as identical (23.2%) than that identical lines would be judged as different (10.2%).

Figure 2. Reaction times (top panel) and percentage of incorrect responses (bottom panel) to emotional and neutral faces in the emotion task, displayed separately for experimental blocks where neutral faces were intermixed with angry, disgusted, fearful, happy, sad, or surprised faces.

Electrophysiological Results
Figure 3 shows ERPs obtained in the emotion task in response to stimulus arrays containing either neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. Figure 4 shows corresponding ERP waveforms obtained in the lines task. A sustained positivity was elicited in response to arrays containing emotional faces in the emotion task. This emotional expression effect was first visible at frontocentral sites at about 180 msec poststimulus (overlapping with the P2 component) and appeared at parietal electrodes around 300 msec poststimulus (Figure 3). At lateral temporal and occipital electrodes, emotional expression effects appeared at about 250 msec poststimulus as an enhanced negativity for emotional relative to neutral faces in the emotion task. In contrast, no systematic emotional expression effects were found for the lines task (Figure 4).

The difference between emotional and neutral faces appears to leave the face-specific N170 component at lateral temporal sites T5 and T6 entirely unaffected. This was observed not only in the lines task (Figure 4) but also in the emotion task (Figure 3), where facial expression was task relevant. These informal observations were substantiated by statistical analyses.

Figure 3. Grand-averaged ERP waveforms elicited in the emotion task in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines), collapsed across blocks including each of the six different emotional facial expressions.

N170 component. In the N170 time range (160-200 msec poststimulus), N170 amplitudes elicited at T5 and T6 in response to neutral versus emotional faces showed neither a main effect of valence nor a task × valence interaction (both Fs < 1), demonstrating that the N170 is not modulated by facial expression (Figures 3 and 4). To further ascertain that this component is unaffected by emotional expression even when expression is task relevant, we conducted additional analyses on N170 amplitudes observed in the emotion task (Figure 3). No main effect of valence (F < 1.2) or interaction between block type and valence (F < 1) was observed, indicating that the N170 was similarly insensitive to emotional facial expression for all six basic emotions employed here, even though participants had to discriminate between emotional and neutral faces in this task. This is illustrated in Figure 5, which displays ERPs in response to neutral and emotional faces elicited in the emotion task at right lateral temporal electrode T6, shown separately for each of the six facial expressions, which were presented in different blocks. No systematic differential effects of any facial expression on the N170 are apparent, and this was confirmed by additional planned paired comparisons of N170 amplitudes at T5 and T6 in response to emotional versus neutral faces, conducted separately for all six basic emotions. None of these comparisons even approached statistical significance [all ts(13) < 1.5].

Figure 4. Grand-averaged ERP waveforms elicited in the lines task in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). Data are collapsed across blocks including each of the six different emotional facial expressions, as well as across trials with identical and different line pairs.

Emotional expression effects. No main effects of valence or task × valence interactions were observed in the 120- to 155-msec time window. In the 160- to 215-msec analysis window, a task × valence interaction was present at frontal sites [F(1,13) = 5.2, p < .05].1 Main effects of valence were found at frontal and central sites [both Fs(1,14) > 9.1, both ps < .01] in the emotion task, reflecting an enhanced positivity elicited in response to arrays containing emotional faces (Figure 3). These effects were completely absent in the lines task (both Fs < 1). No interactions between block type and valence were found at frontal and central sites in the emotion task (both Fs < 1.1), demonstrating that this early emotional positivity was elicited in response to emotional versus neutral faces irrespective of which of the six basic emotions was included in a given block. This fact is further illustrated in Figure 6, which shows ERPs in response to neutral and emotional faces elicited in the emotion task at Fz, displayed separately for all emotional expressions used in this experiment. Emotional expression effects were very similar across expressions and started at approximately the same time for all six basic emotions. No significant emotional expression effects were present between 160 and 215 msec poststimulus at parietal and occipital electrodes.

Between 220 and 315 msec poststimulus, task × valence interactions were present at frontal and central electrodes, as well as at lateral temporal and occipital sites [all Fs(1,13) > 7.2, all ps < .02], indicating that emotional expression affected ERPs in the emotion task but not in the lines task. At frontal and central sites, main effects of valence in the emotion task [both Fs(1,13) > 15.0, both ps < .02] reflected enhanced positivities for emotional relative to neutral faces (Figure 3). No block type × valence interactions were present (both Fs < 1), demonstrating that this effect was elicited in similar fashion for all six basic emotions (Figure 6). Again, no frontocentral emotional expression effects were observed in the lines task (both Fs < 1.6). At lateral temporal and occipital sites, an enhanced negativity was observed in the 220- to 315-msec latency window for emotional relative to neutral faces in the emotion task [both Fs(1,13) > 6.1, both ps < .03], but not in the lines task (both Fs < 1). Again, no block type × valence interactions were present for the emotion task (both Fs < 1.6), indicating that this lateral posterior emotional negativity was elicited in response to all six basic emotions (Figure 5).

In the final two analysis windows (320-495 msec and 500-700 msec poststimulus, respectively), highly significant task × valence interactions were present at frontal, central, and parietal electrodes [all Fs(1,13) > 10.0, all ps < .01], again reflecting the presence of emotional expression effects in the emotion task (Figure 3) and their absence in the lines task (Figure 4). Main effects of valence at frontal and central as well as at parietal electrodes in the emotion task [all Fs(1,13) > 11.9, all ps < .01], without any significant interactions between valence and block type, demonstrated that enhanced positivities for emotional faces were elicited at these sites in a similar fashion for all six basic emotions (Figure 6). Again, effects of valence were entirely absent in the lines task.2

Figure 5. Grand-averaged ERP waveforms elicited in the emotion task at right lateral temporal electrode T6 in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.

Figure 6. Grand-averaged ERP waveforms elicited in the emotion task at midline electrode Fz in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.

DISCUSSION

The primary aim of the present ERP experiment was to extend previous findings (Holmes et al., 2003) that the detection and processing of emotional information delivered by facial expressions requires focal attention. We recorded ERPs to stimulus arrays containing emotional or neutral bilateral faces under conditions when facial expression was task relevant and therefore attended (emotion task), or when attention was actively directed away from these faces toward a demanding perceptual judgment (lines task). In our previous ERP study (Holmes et al., 2003), spatial attention was manipulated on a trial-by-trial basis by precues presented at the start of each trial, facial expression was not task relevant (participants had to detect infrequent identical stimulus pairs, regardless of expression), and only one emotional expression (fear) was tested. In the present experiment, a sustained attention paradigm was employed (with emotion and lines tasks delivered in separate experimental halves), facial expression was task relevant in the emotion task, and, most important, all six basic facial emotional expressions were included in different blocks.

ERP correlates of emotional facial expression processing were identified by comparing ERPs elicited on trials with emotional faces with ERPs in response to neutral faces. This was done separately for the emotion task and the lines task, and for blocks including angry, disgusted, fearful, happy, sad, and surprised faces. In the emotion task, where attention was directed toward task-relevant facial expressions, an enhanced positivity for emotional relative to neutral faces was elicited, similar to previous observations from studies comparing ERP responses to fearful versus neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003). This emotional expression effect started at about 160 msec poststimulus and was initially distributed frontocentrally, whereas a more broadly distributed positivity was observed beyond 300 msec (Figure 3). In addition, an enhanced negativity for emotional relative to neutral faces was elicited at lateral posterior electrodes between 220 and 320 msec poststimulus.

The onset of the early frontocentral emotional expression effect was slightly later in the present experiment than in our previous experiment (Holmes et al., 2003), where significant frontal differences between ERPs to fearful and neutral faces were already present at about 120 msec poststimulus. In the present study, vertical lines were presented close to fixation simultaneously with the bilateral faces, whereas no such stimuli were included in our earlier experiment. The presence of these additional central events may have slightly delayed the onset of early emotional expression effects. It should also be noted that an attenuation of amygdala responses to emotional facial expressions has been observed when the demand for explicit emotion recognition was increased (Critchley et al., 2000; Hariri, Bookheimer, & Mazziotta, 2000). It is possible that the demand for explicit emotion recognition in the emotion task contributed to the delayed onset of the early emotional expression effect.

In marked contrast to these ERP results obtained in the emotion task, emotional expression effects were entirely absent in the lines task (Figure 4), demonstrating that ERP correlates of facial expression processing are strongly dependent on spatial attention. With sustained spatial attention directed away from face stimuli toward another demanding perceptual task, the presence of emotional versus neutral faces had no effect whatsoever on ERP waveforms. That is, emotional expression effects were completely eliminated for all six basic emotions included in this experiment. In line with this ERP result, performance in the lines task was entirely unaffected by the expression of the faces presented simultaneously with the task-relevant line pairs. Overall, these findings extend and confirm the observations of our previous ERP experiment, which compared ERPs in response to fearful versus neutral faces (Holmes et al., 2003). Clearly, these results challenge the hypothesis that the detection and/or processing of emotional facial expression occurs preattentively. If this were the case, at least some systematic ERP differences should have been elicited in response to emotional versus neutral faces in the lines task, reflecting the automatic detection of emotionally significant events.

Covert attention toward emotional faces under conditions when they were task relevant may have enhanced their visual-perceptual representation (e.g., Carrasco, Penpeci-Talgar, & Eckstein, 2000), thereby enabling the extraction of features relating to the affective valence of these faces, and thus their subsequent encoding and analysis (as reflected by the emotion-specific ERP effects observed in the emotion task). The early frontocentrally distributed emotional expression effects may be mediated by connections from the superior temporal sulcus (STS) and amygdala to orbitofrontal cortex (Rolls, 1999). The STS has been implicated in the early discrimination of visual features relating to emotional facial expressions (e.g., Sprengelmeyer et al., 1998). In addition, efferent feedback projections from the amygdala and related structures (see Lang et al., 1998; Morris et al., 1998) may have produced the more broadly distributed emotional expression effects observed in the present experiment at longer latencies.

One could argue that the absence of emotional expression effects under conditions where faces were unattended may have been due to the fact that the presentation of specific emotional expressions was blocked, and that each expression was presented repeatedly in two separate blocks. Repeated exposure to a specific emotional expression may have resulted in a gradual habituation of emotion-specific responses, thus potentially attenuating any emotional expression effects that may have been present in the lines task. To investigate this possibility, we computed separate averages for the first block and for the second block including angry, disgusted, fearful, happy, sad, or surprised faces, separately for the emotion and for the lines task. These data were then analyzed with the additional factor of block position (first vs. second block containing a specific emotional facial expression). If emotional expression effects were subject to habituation, one would expect to find larger emotional expression effects for the first relative to the second block in the emotion task, and potentially also a residual emotional expression effect for the first block in the lines task.

Figure 7 shows ERPs elicited at Fz in response to neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. ERPs are displayed separately for the emotion task (top panel) and the lines task (bottom panel), and for the first block (left) or second block (right) including one of the six emotional expressions. As can be seen from Figure 7 (top), there was no evidence whatsoever for any habituation of emotional expression effects as a function of block position in the emotion task. This was confirmed by the absence of any block position × valence or block position × block type × valence interactions for all latency windows employed in the analyses reported above [all Fs(1,13) < 1]. Along similar lines, Figure 7 (bottom panel) suggests that there was no residual emotional expression effect for the first block including a specific emotional expression in the lines task. This was confirmed by the absence of any interactions involving block position [all Fs(1,13) < 1.6]. Thus, the fact that emotional expression effects were absent in response to unattended faces in the lines task is unlikely to have been the result of a habituation of emotion-specific brain responses.

Figure 7. Grand-averaged ERP waveforms elicited in the emotion task (top panel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are collapsed across blocks including each of the six different emotional facial expressions and are shown separately for the first block (left) and the second block (right) including one specific emotional expression.

The conclusion that the processing of emotional facial expression, as reflected by ERP facial expression effects, is gated by spatial attention appears to be inconsistent with neuroimaging studies demonstrating that fearful faces result in amygdala activations even when these faces are outside the focus of attention (Vuilleumier et al., 2001; see also Morris et al., 1996; Whalen et al., 1998). However, it is extremely unlikely that the ERP effects observed in the present study are directly linked to amygdala activations. Due to its nuclear structure of clustered neurones, the amygdala is electrically closed and thus largely inaccessible to ERP measures. The early emotional expression effects observed in response to attended faces are more likely to be generated in prefrontal cortex, where emotion-specific single-cell responses have recently been recorded at short latencies (Kawasaki et al., 2001). Such prefrontal responses may reflect stages in emotional processing that could be contingent upon, but functionally separate from, prior amygdala activations (see Le Doux, 1996; Rolls, 1999). It is possible that amygdala responses can be triggered by unattended emotional stimuli (although these responses may be attenuated), whereas subsequent neocortical stages of emotional processing (as reflected by the ERP effects observed in the present experiment) are fully dependent on focal attention. An alternative possibility is that amygdala responses to emotional stimuli may also require attention (see Pessoa, Kastner, & Ungerleider, 2002; Pessoa, McKenna, et al., 2002), and that the elimination of emotional expression effects in the lines task reflects an earlier attentional gating of such subcortical processing.

Another important new finding of the present experiment was that the onset, time course, and scalp distribution of emotional expression effects obtained in the emotion task were remarkably similar for all six basic facial expressions used here (Figures 5 and 6). The absence of any differential ERP responses to different emotional expressions was reflected by the absence of any significant interactions between block type (blocks with angry, disgusted, fearful, happy, sad, or surprised faces) and valence (emotional vs. neutral expression). In line with these observations, the size of the RT advantage for emotional relative to neutral faces in the emotion task was similar for all six emotional facial expressions (Figure 2, top panel). The similarity in the time course of emotional expression effects across all six emotional expressions observed here suggests that emotionally relevant information delivered by facial expression is available to neocortical processes within less than 200 msec after stimulus onset, and at approximately the same time for all basic emotional expressions.

These observations do not seem to support the idea, suggested by recent fMRI results, that distinct neural subsystems specialize in the processing of specific emotions (Adolphs, 2002). If this were the case, one might have expected some systematic differences between ERP emotional expression effects elicited by different facial expressions. However, it should be noted that although some neuroimaging data show emotion-specific differential activation of brain regions such as the amygdala or insula, few studies point to differential activation within surface cortical structures (where the ERP effects observed in the present experiments are likely to be generated; see also Pizzagalli et al., 1999; Sato et al., 2001, for related results from recent ERP studies).

Thus, one could argue that early stages in the processing of emotionally relevant information, subserved by limbic structures or the basal ganglia, and subsequent neocortical emotional processing stages differ not only in their dependence on focal attention (see above), but also in their specificity. Early processes may be differentially engaged by specific emotional expressions, thus providing a rapid classification of emotionally significant events. Data in support of this view come from single-unit recordings, which reveal a rapid emergence of differential effects to emotional expressions in the human amygdala (Liu et al., 1999). Conversely, later stages might be involved in the in-depth processing of various kinds of affective information, and thus would be much less selective with respect to different facial expressions.

This suggestion is consistent with some recent evidence that subcortical and neocortical routes for visual processing are involved differentially in emotional expression analysis. A subcortical magnocellular pathway to the amygdala would appear to support valence discrimination processes, whereas parvocellular subsystems of ventral visual cortices may be preferentially involved in emotional intensity evaluation, irrespective of emotional valence (Schyns & Oliva, 1999; Vuilleumier, Armony, Driver, & Dolan, 2003). Recent neuroimaging results (Vuilleumier et al., 2003) suggest that low and high spatial frequency components of fearful faces selectively drive amygdala and visual cortical responses, respectively. However, although enhanced amygdala activation was found in response to low-spatial-frequency fearful face stimuli, explicit judgments relating to the perceived intensity of fearfulness were increased by the presence of high-spatial-frequency cues. These results support the view that coarse visual information may be directed via magnocellular channels from the retina to the amygdala through a tectopulvinar pathway (e.g., Bisti & Sireteanu, 1976; Jones & Burton, 1976), enabling the fast appraisal of the affective significance of a stimulus (e.g., Morris, Öhman, & Dolan, 1999).3

Another aim of the present study was to investigate whether the face-specific N170 component, which is assumed to reflect the structural encoding of faces, is sensitive to emotional facial expressions. In previous ERP studies, which have not found any modulations of the N170 elicited by fearful relative to neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003), facial expression was always task irrelevant. In contrast, participants' responses were contingent upon facial expression in the present emotion task. In spite of this fact, the N170 was found to be completely unaffected by facial expressions in the emotion task, and this was consistently the case for all six emotional expressions used in the present study (Figure 5).

In line with earlier findings from depth electrodes (McCarthy, Puce, Belger, & Allison, 1999), this pattern of results now demonstrates comprehensively that the structural encoding of faces, as reflected by the N170, is entirely insensitive to information derived from emotional facial expression. Thus, the rapid detection of emotional facial expression appears to occur independently of, and in parallel with, the construction of a detailed perceptual representation of a face. The absence of systematic early emotional expression effects at posterior sites, and the presence of such ERP effects at frontocentral electrodes at about 160 msec poststimulus, suggests that higher order visual processing stages involved in face processing are affected by emotional facial expression only after this information has been processed in prefrontal cortex. This is consistent with the face processing model proposed by Bruce and Young (1986), in which the extraction of perceptual information for emotional expression processing occurs independently of, and simultaneously with, structural encoding for face recognition.

In summary, the present ERP results demonstrate that the neocortical processing of emotional facial expression is strongly dependent on focal attention. When faces were attended, systematic emotional expression effects were elicited by emotional relative to neutral faces, and these effects were strikingly similar in terms of their timing and morphology for all six basic facial expressions. In contrast, when attention was actively directed away from these faces, emotional expression effects were completely eliminated. The rapid and automatic encoding of emotionally significant events occurring outside the focus of attention may be adaptively advantageous, because it prepares the organism for fight or flight through subcortically mediated autonomic activation (e.g., Öhman, Flykt, & Lundqvist, 2000). However, it is equally important that irrelevant affective stimuli do not continuously divert attention. This suggests a division of labor between limbic structures involved in the obligatory detection of emotional information, preparing the organism for rapid action (Morris et al., 1999; Whalen et al., 1998), and subsequent neocortical emotional processing stages. Limbic structures may be responsible for establishing a readiness to respond to any environmental threat that could become the focus of attention, presumably through heightened autonomic activation. However, neocortical stages appear to be protected by efficient attentional gating mechanisms, which reduce distractibility by emotional stimuli, so that ongoing goals and plans can be accomplished without interference from irrelevant events.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral & Cognitive Neuroscience Reviews, 1, 21-61.
Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.
Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fascicularis). Journal of Comparative Neurology, 230, 465-496.
Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.
Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 7, 817-826.
Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.
Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.
Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.
Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.
Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.
Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.
Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.
Critchley, H. [D.], Daly, E. [M.], Phillips, M., Brammer, M., Bullmore, E., Williams, S. [C.], van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. [G. M.] (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.
Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.
Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.
Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.
Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.
Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.
Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.
Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.
Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.
Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.
Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.
Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.
Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.
Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.
Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal, and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.
Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.
Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.
Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.
McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.
Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.
Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.
Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.
Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.
Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.
Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.
Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.
Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.
Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.
Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.
Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.
Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.
Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.
Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.
Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.

Sato W Kochiyama T Yoshikawa S amp Matsumura M (2001)Emotional expression boosts early visual processing of the face ERPrecording and its decomposition by independent component analysisNeuroReport 12 709-714

Schyns P G amp Oliva A (1999) Dr Angry and Mr Smile Whencategorization flexibly modifies the perception of faces in rapid vi-sual presentations Cognition 69 243-265

Sprengelmeyer R Rausch M Eysel U T amp Przuntek H (1998)Neural structures associated with recognition of facial expressions of

110 EIMER HOLMES AND MCGLONE

basic emotions Proceedings of the Royal Society London Series B265 1927-1931

Stepniewska I Qi H X amp Kaas J H (1999) Do superior collicu-lus projection zones in the inferior pulvinar project to MT in pri-mates European Journal of Neuroscience 11 469-480

Vuilleumier P Armony J L Driver J amp Dolan R J (2001) Ef-fects of attention and emotion on face processing in the human brainAn event-related fMRI study Neuron 30 829-841

Vuilleumier P Armony J L Driver J amp Dolan R J (2003)Distinct spatial frequency sensitivities for processing faces and emo-tional expressions Nature Neuroscience 6 624-631

Vuilleumier P amp Schwartz S (2001a) Beware and be aware Cap-ture of spatial attention by fear-related stimuli in neglect Neuro-Report 12 1119-1122

Vuilleumier P amp Schwartz S (2001b) Emotional expressionscapture attention Neurology 56 153-158

Whalen P J Rach S L Etcoff N L McInerney S C LeeM B amp Jenike M A (1998) Masked presentations of emotionalfacial expressions modulate amygdala activity without explicitknowledge Journal of Neuroscience 18 411-418

Whalen P J Shin L M McInerney S C Fischer H WrightC I amp Rauch S L (2001) A functional MRI study of humanamygdala responses to facial expressions of fear versus anger Emo-tion 1 70-83

Yeterian E H amp Pandya D N (1991) Corticothalamic connec-tions of the superior temporal sulcus in rhesus monkeys Experimen-tal Brain Research 83 268-284

NOTES

1 In spite of the fact that significant valence effects were present atcentral electrodes in the emotion task but were absent in the lines taskthis interaction failed to reach significance at central sites

2 At lateral occipital electrodes a significantly enhanced positivityfor emotional relative to neutral faces was present between 320 and495 msec in the emotion task [F(113) = 62 p lt 03] but not in thelines task and this was reflected in a nearly significant task 3 valenceinteraction [F(113) = 46 p lt 06]

3 It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking since the medial pulvinarwhich projects to the amygdala does not receive a significant directinput from the superior colliculus (eg Stepniewska Qi amp Kaas1999) However possible connections between the inferior pulvinar(which receives visual inputs from the superior colliculus) and the me-dial nucleus may support the transmission of information to the amyg-dala through a colliculo-pulvinar route Alternatively cortical inputmay be involved since STS (implicated in facial expression processingSprengelmeyer et al 1998) is known to project to the medial pulvinar(Yeterian amp Pandya 1991) Our thanks to an anonymous reviewer forraising this important point

(Manuscript received December 10 2002revision accepted for publication June 11 2003)

98 EIMER HOLMES AND MCGLONE

ERPs are particularly suited for studying the time course of emotional processes and investigating whether and when the processing of emotional stimuli is modulated by selective attention. For example, a positive slow wave starting at about 300 msec after stimulus onset in response to pictures with emotional content (Cuthbert, Schupp, Bradley, Birbaumer, & Lang, 2000; Diedrich, Naumann, Maier, & Becker, 1997) has been interpreted as reflecting the allocation of attention to motivationally relevant input (Cuthbert et al., 2000). More recently, we have shown that an enhanced positivity in response to foveally presented fearful relative to neutral faces can be elicited over prefrontal areas as early as 120 msec after stimulus onset (Eimer & Holmes, 2002). This early emotional expression effect suggests that cortical circuits involved in the detection of emotionally significant events can be triggered rapidly by emotional facial expressions (see also Kawasaki et al., 2001; Pizzagalli, Regard, & Lehmann, 1999; Sato, Kochiyama, Yoshikawa, & Matsumura, 2001, for similar results from ERP and single-unit studies).

In another recent ERP study (Holmes, Vuilleumier, & Eimer, 2003), we investigated for the first time whether and how emotional expression effects elicited by fearful relative to neutral faces are affected by spatial attention. On each trial, arrays consisting of two faces and two houses, arranged in horizontal and vertical pairs, were presented. Participants had to attend either to the two vertical or to the two horizontal locations (as indicated by a precue presented at the beginning of each trial) in order to detect infrequent identical stimuli at the cued location. When faces were attended, fearful faces elicited an enhanced positivity relative to neutral faces, with an early frontal effect followed by a more broadly distributed emotional positivity. These emotional expression effects were completely eliminated on trials where faces were presented at uncued (unattended) locations. This finding challenges the hypothesis that the detection and processing of emotional facial expression occurs preattentively, and it suggests that the processes reflected by ERP modulations sensitive to emotional facial expression are gated by spatial attention.

The present study was designed to confirm and extend these surprising results. In our previous study (Holmes et al., 2003), only one emotional facial expression (fear) was employed. Although fearful faces are generally regarded as highly salient emotional stimuli, the hypothesis that the processing of emotional facial expression depends on spatial attention clearly needs to be substantiated by investigating whether differential ERP responses to facial expressions other than fear are also gated by spatial attention.

In the present experiment, all six basic emotional facial expressions were shown in separate experimental blocks. Face stimuli were photographs of 10 different individuals (Ekman & Friesen, 1976), with facial expression neutral or angry, disgusted, fearful, happy, sad, or surprised (Figure 1, top panel). Each block contained an equal number of trials with emotional or neutral face pairs presented bilaterally to the left and right of fixation. In one half of the experiment (lines task), attention was actively directed away from these face stimuli toward a demanding perceptual judgment task. Participants had to monitor a pair of vertical lines presented bilaterally close to fixation (Figure 1, bottom panel) in order to decide on each trial whether the two lines were identical or differed in length. Faces had to be entirely ignored. The other half of the experiment (emotion task) was physically identical to the lines task, but participants now had to decide on each trial whether facial expression was emotional or neutral. Here, lines could be entirely ignored.

ERP modulations sensitive to emotional facial expression were identified by comparing ERPs elicited by arrays containing emotional faces with ERPs in response to arrays with neutral faces, separately for experimental blocks including angry, disgusted, fearful, happy, sad, and surprised faces. To investigate the impact of attention on the processing of emotional facial expression, these comparisons were conducted separately for the emotion task, where emotional expression was task relevant, and for the lines task, where faces were irrelevant and thus could be entirely ignored. If emotional facial expressions were detected preattentively and attracted attention automatically, systematic ERP modulations in response to arrays containing emotional versus neutral faces should be found not only in the emotion task but also, although perhaps in an attenuated fashion, in the lines task. In contrast, if the detection and processing of emotional faces requires focal attention (as suggested by Holmes et al., 2003), ERP correlates of emotional face processing should be entirely absent in the lines task.

In addition to investigating the role of spatial attention in the processing of facial expression, the design of the present study also allowed the systematic comparison of ERP responses elicited by each of the six basic emotional facial expressions. A number of lesion and neuroimaging studies argue for the existence of neural systems that are specialized for processing distinct emotions (Adolphs, 2002). For example, a disproportionate activation of the amygdala has been observed in response to facial expressions of fear (Breiter et al., 1996; Morris et al., 1996; Phillips et al., 1998; Whalen et al., 2001; but see Rapcsak et al., 2000). Prefrontal cortex has been specifically implicated in the recognition of angry facial expressions (Blair, Morris, Frith, Perrett, & Dolan, 1999; Harmer, Thilo, Rothwell, & Goodwin, 2001), and the insula and basal ganglia appear to be particularly involved in processing facial expressions of disgust (Adolphs, Tranel, & Damasio, 2003; Calder, Keane, Manes, Antoun, & Young, 2000; Calder, Lawrence, & Young, 2001; Phillips et al., 1998; Phillips et al., 1997; Sprengelmeyer, Rausch, Eysel, & Przuntek, 1998). If the detection and analysis of specific facial emotional expressions is mediated by distinct brain processes, this might be reflected in systematic differences in emotional expression effects on ERP waveforms elicited in response to different facial expressions.

Another aim in the present study was to investigate whether early stages in the perceptual encoding of face stimuli are affected by emotional facial expression. The face-specific N170 component is assumed to reflect the precategorical structural encoding of faces prior to their recognition (Bentin, Allison, Puce, Perez, & McCarthy, 1996; Eimer, 1998, 2000). In two recent ERP studies (Eimer & Holmes, 2002; Holmes et al., 2003), we have found that the N170 is not modulated by emotional facial expression. This suggests that the structural encoding of faces and the processing of emotional expression are parallel and independent processes (Bruce & Young, 1986). However, to date, this conclusion has been based only on a comparison of N170 components elicited in response to fearful versus neutral faces, obtained under conditions where facial expression was task irrelevant. To investigate whether the face-specific N170 is unaffected by any emotional facial expression, even when expression is task relevant, we compared the N170 elicited by emotional versus neutral faces in the emotion task, separately for all six basic facial expressions. Any systematic emotional expression effects on the N170 component would challenge the hypothesis that the structural encoding of faces is completely independent of facial expression analysis.

Figure 1. Top panel: Examples of face stimuli used in the present experiment. Faces of 10 different individuals were used, with facial expression either neutral (central) or (clockwise from top) disgusted, fearful, happy, sad, surprised, or angry. Bottom panel: Illustration of the stimulus array presented on each trial. Two identical emotional or neutral faces were presented bilaterally, with two vertical lines located close to fixation. In the trial shown here, a happy face pair is presented together with two lines of different lengths.

METHOD

Participants
Fifteen participants took part in this study. One had to be excluded because of excessive eye blinks, so 14 participants (7 female and 7 male; 18-54 years old; average age, 29.6 years) remained in the sample. One participant was left-handed; all others were right-handed by self-report. The experiment was performed in compliance with relevant institutional guidelines and was approved by the Birkbeck College School of Psychology ethics committee.

Stimuli
The face stimuli were photographs of faces of 10 different individuals, all taken from a standard set of pictures of facial affect (Ekman & Friesen, 1976). Facial expression was angry, disgusted, fearful, happy, sad, surprised, or neutral, resulting in a total of 70 different face stimuli (see Figure 1, top panel, for examples). All face stimuli covered a visual angle of about 3.4° × 2.4°. Each display also contained a pair of gray vertical lines (0.1° width), and each line was either short (0.4°) or slightly longer (0.5°). All stimuli were presented on a computer screen in front of a black background. A white fixation cross was continuously present at the center of the screen.
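For concreteness, the visual angles listed above can be converted into approximate on-screen sizes with the standard viewing-geometry relation, taking the roughly 3.4° × 2.4° face size at face value and using the 70-cm viewing distance reported in the Procedure section below. The centimeter values are our own back-of-the-envelope illustration, not figures given by the authors:

\[
s = 2d\tan\!\left(\tfrac{\theta}{2}\right), \qquad
s_{\mathrm{face\ height}} \approx 2\,(70\,\mathrm{cm})\tan(1.7^\circ) \approx 4.2\,\mathrm{cm}, \qquad
s_{\mathrm{face\ width}} \approx 2\,(70\,\mathrm{cm})\tan(1.2^\circ) \approx 2.9\,\mathrm{cm}.
\]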

Procedure
Participants were seated in a dimly lit, sound-attenuated cabin, and a computer screen was placed at a viewing distance of 70 cm. The experiment consisted of 24 experimental blocks, each containing 80 trials. On each trial, two identical faces were presented together with two line stimuli in front of a black background (Figure 1, bottom). Faces were located 2.2° to the left and right of fixation (measured as the distance between the fixation cross and the center of each face stimulus), and the bilateral lines were presented close to the fixation cross (0.4° eccentricity). All stimuli were presented simultaneously for 300 msec, and the interval between two successive stimulus presentations was 2,000 msec.

In 12 successive blocks, participants had to indicate with a left-hand or right-hand buttonpress whether the face pair presented on any given trial showed an emotional or a neutral expression (emotion task). The mapping of emotional valence to response hand was counterbalanced across participants. In 40 trials per block, emotional faces were presented; in the other, randomly intermingled, 40 trials, facial expression was neutral. Long and short lines, which were irrelevant in these blocks, appeared randomly and with equal probability to the left and right of fixation. Emotional expression was varied across blocks, with angry, disgusted, fearful, happy, sad, and surprised faces each shown in two blocks. The order in which these blocks were presented was randomized for each participant.

In the other 12 successive blocks, participants were instructed to direct their attention to the pair of lines presented close to fixation and to indicate with a left-hand or right-hand buttonpress whether these lines differed in length or were identical (lines task). The mapping of line length to response hand was counterbalanced across participants. Again, short and long lines appeared randomly and equiprobably on the left or right side. Faces, which were now task irrelevant, were emotional on 40 trials and neutral on the other 40 trials, with emotional expression varied across blocks (two blocks each with angry, disgusted, fearful, happy, sad, and surprised faces). The order in which these blocks were presented was again randomized for each participant.

Seven participants performed the emotion task prior to the lines task, and this order was reversed for the other 7 participants. Participants were instructed to keep their gaze directed at the central fixation cross throughout each block and to respond as fast and as accurately as possible on each trial.
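The block and trial structure described above can be summarized in a short script. The following Python sketch is purely illustrative: it is not the authors' experimental code, and the 50/50 split of identical versus different line pairs is our assumption, since the exact proportion is not stated. It simply generates, for one hypothetical participant, the 12 blocks of each task half, with two blocks per emotion, 40 emotional and 40 neutral trials per block, and a randomized block order.

```python
# Illustrative sketch of the block/trial design described above
# (not the authors' code; proportions of line trial types are assumed).
import random

EMOTIONS = ["angry", "disgusted", "fearful", "happy", "sad", "surprised"]

def make_session(task, rng):
    """Return the list of 12 blocks for one task half ('emotion' or 'lines')."""
    block_types = EMOTIONS * 2           # each emotion appears in two blocks
    rng.shuffle(block_types)             # block order randomized per participant
    blocks = []
    for emotion in block_types:
        # 40 emotional and 40 neutral trials, randomly intermingled
        trials = [{"face": emotion, "task": task} for _ in range(40)] + \
                 [{"face": "neutral", "task": task} for _ in range(40)]
        rng.shuffle(trials)
        for t in trials:
            # identical vs. different line pair; 0.5 is an assumption
            t["lines_identical"] = rng.random() < 0.5
        blocks.append({"emotion": emotion, "trials": trials})
    return blocks

rng = random.Random(1)                   # hypothetical participant seed
emotion_half = make_session("emotion", rng)
lines_half = make_session("lines", rng)
print(len(emotion_half), len(emotion_half[0]["trials"]))   # 12 blocks x 80 trials
```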

ERP procedures and data analysis
EEG was recorded with Ag-AgCl electrodes and a linked-earlobe reference from Fpz, F7, F3, Fz, F4, F8, FC5, FC6, T7, C3, Cz, C4, T8, CP5, CP6, T5, P3, Pz, P4, T6, and Oz (according to the 10-20 system), and from OL and OR (located halfway between O1 and P7, and between O2 and P8, respectively). Horizontal EOG (HEOG) was recorded bipolarly from the outer canthi of both eyes. The impedance for all electrodes was kept below 5 kΩ. The amplifier bandpass was 0.1 to 40 Hz, and no additional filters were applied to the averaged data. EEG and EOG were sampled with a digitization rate of 200 Hz and stored on disk. Reaction times (RTs) were measured on each trial.

EEG and HEOG were epoched off-line into 800-msec periods, starting 100 msec prior to stimulus onset and ending 700 msec after stimulus onset. Trials with horizontal eye movements (HEOG exceeding ±30 μV), eyeblinks (Fpz exceeding ±60 μV), or other artifacts (a voltage exceeding ±80 μV at any electrode) measured after stimulus onset were excluded from analysis. The EEG obtained was averaged relative to a 100-msec baseline preceding stimulus onset. Only trials with correct behavioral responses were included in the averages. Separate averages were computed for the emotion task and the lines task, for all combinations of block type (experimental blocks including angry vs. disgusted vs. fearful vs. happy vs. sad vs. surprised faces) and valence (emotional vs. neutral faces), resulting in 24 average waveforms for each electrode and participant.
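As an illustration of the epoching, artifact rejection, baseline correction, and selective averaging just described, a minimal NumPy sketch for one participant and one condition is given below. The array layout, the channel indexing, and the assumption that amplitudes are already expressed in microvolts are ours; this is not the authors' analysis code.

```python
# Minimal epoching/averaging sketch under assumed data layout (amplitudes in microvolts).
import numpy as np

FS = 200                                  # sampling rate (Hz)
N_PRE, N_POST = int(0.1 * FS), int(0.7 * FS)   # 100 msec pre-, 700 msec poststimulus

def epoch_and_average(eeg, heog, fpz, events, correct):
    """eeg: (n_channels, n_samples); heog, fpz: (n_samples,);
    events: stimulus-onset sample indices; correct: bool per trial."""
    epochs = []
    post = slice(N_PRE, None)             # artifact checks apply after stimulus onset
    for onset, ok in zip(events, correct):
        if not ok:                        # correct behavioral responses only
            continue
        sl = slice(onset - N_PRE, onset + N_POST)
        seg = eeg[:, sl]
        if (np.abs(heog[sl][post]) > 30).any():   # horizontal eye movements
            continue
        if (np.abs(fpz[sl][post]) > 60).any():    # eyeblinks at Fpz
            continue
        if (np.abs(seg[:, post]) > 80).any():     # other artifacts, any electrode
            continue
        baseline = seg[:, :N_PRE].mean(axis=1, keepdims=True)
        epochs.append(seg - baseline)     # 100-msec prestimulus baseline correction
    return np.mean(epochs, axis=0)        # condition average, one waveform per channel
```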

The first set of analyses was based on mean amplitudes obtained at the lateral posterior electrodes T5 and T6 (where the N170 is maximal) within a time window centered on the mean latency of the face-specific posterior N170 component (160-200 msec poststimulus). Repeated measures analyses of variance (ANOVAs) were conducted for the factors task (emotion task vs. lines task), block type, and valence. Additional analyses were conducted separately for the emotion and the lines tasks. The second set of analyses was based on mean amplitude values computed within five successive poststimulus time windows (120-155 msec, 160-215 msec, 220-315 msec, 320-495 msec, and 500-700 msec), which covered the interval where systematic emotional expression effects were observed in our previous experiments (Eimer & Holmes, 2002; Holmes et al., 2003). Mean amplitude values were computed for frontal (F3, Fz, F4), central (C3, Cz, C4), parietal (P3, Pz, P4), lateral temporal (T5, T6), and lateral occipital sites (OL, OR). Again, ANOVAs were conducted for the factors task, block type, and valence, followed by further analyses conducted separately for ERPs obtained in the emotion task and the lines task.
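The dependent measure in both sets of analyses is a mean amplitude within a time window, averaged over a small set of electrodes. The sketch below (again illustrative only, with hypothetical array shapes and condition labels) shows that measure together with a simple paired contrast across participants. For a two-level factor such as valence, a paired t-test is equivalent to the corresponding F(1, n-1) test (F = t^2); the full task × block type × valence ANOVAs reported in the Results would require a dedicated repeated-measures routine.

```python
# Windowed mean-amplitude measure and a paired valence contrast (illustrative only).
import numpy as np
from scipy import stats

FS = 200
TIMES = np.arange(-0.1, 0.7, 1 / FS)          # epoch time axis in seconds

def mean_amplitude(erp, window, channels):
    """erp: (n_channels, n_samples) average waveform; window: (t0, t1) in seconds."""
    t0, t1 = window
    mask = (TIMES >= t0) & (TIMES <= t1)
    return erp[channels][:, mask].mean()       # mean over channels and time points

def valence_contrast(averages, window, channels):
    """averages: dict mapping 'emotional'/'neutral' to per-participant ERP arrays."""
    emo = [mean_amplitude(e, window, channels) for e in averages["emotional"]]
    neu = [mean_amplitude(e, window, channels) for e in averages["neutral"]]
    return stats.ttest_rel(emo, neu)            # paired test across participants
```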

For keypress responses, repeated measures ANOVAs were performed on the latencies of correct responses and on error rates, separately for the emotion task and the lines task, for the factors block type and valence. In the analysis of behavioral performance in the lines task, the additional factor of target type (identical lines vs. different lines) was included. For all analyses, Greenhouse-Geisser adjustments to the degrees of freedom were performed when appropriate.
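For reference, the Greenhouse-Geisser procedure mentioned here (and the ε values quoted in the Results below) leaves the F ratio itself unchanged and only rescales its degrees of freedom by the estimated sphericity parameter:

\[
F \;\sim\; F\!\left(\hat{\varepsilon}\,(k-1),\; \hat{\varepsilon}\,(k-1)(n-1)\right),
\qquad \frac{1}{k-1} \,\le\, \hat{\varepsilon} \,\le\, 1,
\]

where k is the number of levels of the repeated measures factor (six block types here) and n is the number of participants; \(\hat{\varepsilon} = 1\) corresponds to perfect sphericity and no adjustment.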


RESULTS

Behavioral Results
Participants failed to respond on less than 3% of all trials. Correct responses were faster in the emotion task (622 msec) than in the lines task (695 msec), and this difference was significant [t(14) = 4.74, p < .001]. Figure 2 shows mean RTs (top panel) and the percentage of incorrect responses (bottom panel) obtained in the emotion task, displayed separately for the six different block types and for trials with emotional and neutral faces, respectively. For RTs, main effects of block type [F(5,65) = 21.4, p < .001, ε = .788] and of valence [F(1,13) = 19.0, p < .001] were present. RTs differed systematically between block types, being fastest in blocks including happy faces and slowest in blocks including sad faces. In addition, responses were generally faster to emotional than to neutral faces. No interaction between block type and valence was obtained, indicating that this RT advantage for emotional faces was equivalent across all six block types.

For error rates, main effects of block type [F(5,65) = 13.7, p < .001, ε = .282] and valence [F(1,13) = 15.4, p < .002] were again present for the emotion task. As can be seen from Figure 2 (bottom panel), incorrect responses were most frequent in blocks including sad faces and least frequent in blocks including surprised faces. Also, it was more likely that emotional faces would be incorrectly classified as neutral than that neutral faces would be erroneously judged as emotional. No block type × valence interaction was present.

In the lines task, no main effects of block type or valence were obtained for RT or error rate (all Fs < 1), indicating that the emotional expression of the task-irrelevant faces did not interfere with perceptual identification performance. Target type did not affect RT, but it had a significant effect on error rate [F(1,13) = 15.9, p < .002]: It was more likely that lines of different lengths would be classified as identical (23.2%) than that identical lines would be judged as different (10.2%).

Figure 2. Reaction times (top panel) and percentage of incorrect responses (bottom panel) to emotional and neutral faces in the emotion task, displayed separately for experimental blocks where neutral faces were intermixed with angry, disgusted, fearful, happy, sad, or surprised faces.

Electrophysiological Results
Figure 3 shows ERPs obtained in the emotion task in response to stimulus arrays containing either neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. Figure 4 shows the corresponding ERP waveforms obtained in the lines task. A sustained positivity was elicited in response to arrays containing emotional faces in the emotion task. This emotional expression effect was first visible at frontocentral sites at about 180 msec poststimulus (overlapping with the P2 component) and appeared at parietal electrodes around 300 msec poststimulus (Figure 3). At lateral temporal and occipital electrodes, emotional expression effects appeared at about 250 msec poststimulus as an enhanced negativity for emotional relative to neutral faces in the emotion task. In contrast, no systematic emotional expression effects were found for the lines task (Figure 4).

The difference between emotional and neutral faces appears to leave the face-specific N170 component at the lateral temporal sites T5 and T6 entirely unaffected. This was observed not only in the lines task (Figure 4), but also in the emotion task (Figure 3), where facial expression was task relevant. These informal observations were substantiated by statistical analyses.

N170 component. In the N170 time range (160-200 msec poststimulus), N170 amplitudes elicited at T5 and T6 in response to neutral versus emotional faces showed neither a main effect of valence nor a task × valence interaction (both Fs < 1), demonstrating that the N170 is not modulated by facial expression (Figures 3 and 4). To further ascertain that this component is unaffected by emotional expression even when expression is task relevant, we conducted additional analyses on N170 amplitudes observed in the emotion task (Figure 3). No main effect of valence (F < 1.2) or interaction between block type and valence (F < 1) was observed, indicating that the N170 was similarly insensitive to emotional facial expression for all six basic emotions employed here, even though participants had to discriminate between emotional and neutral faces in this task. This is illustrated in Figure 5, which displays ERPs in response to neutral and emotional faces elicited in the emotion task at the right lateral temporal electrode T6, shown separately for each of the six facial expressions, which were presented in different blocks. No systematic differential effects of any facial expression on the N170 are apparent, and this was confirmed by additional planned paired comparisons of N170 amplitudes at T5 and T6 in response to emotional versus neutral faces, conducted separately for all six basic emotions. None of these comparisons even approached statistical significance [all ts(13) < 1.5].

Figure 3. Grand-averaged ERP waveforms elicited in the emotion task in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines), collapsed across blocks including each of the six different emotional facial expressions.

Emotional expression effects. No main effects of valence or task × valence interactions were observed in the 120- to 155-msec time window. In the 160- to 215-msec analysis window, a task × valence interaction was present at frontal sites [F(1,13) = 5.2, p < .05] (see Note 1). Main effects of valence were found at frontal and central sites [both Fs(1,13) > 9.1, both ps < .01] in the emotion task, reflecting an enhanced positivity elicited in response to arrays containing emotional faces (Figure 3). These effects were completely absent in the lines task (both Fs < 1). No interactions between block type and valence were found at frontal and central sites in the emotion task (both Fs < 1.1), demonstrating that this early emotional positivity was elicited in response to emotional versus neutral faces irrespective of which of the six basic emotions was included in a given block. This fact is further illustrated in Figure 6, which shows ERPs in response to neutral and emotional faces elicited in the emotion task at Fz, displayed separately for all emotional expressions used in this experiment. Emotional expression effects were very similar across expressions and started at approximately the same time for all six basic emotions. No significant emotional expression effects were present between 160 and 215 msec poststimulus at parietal and occipital electrodes.

Between 220 and 315 msec poststimulus, task × valence interactions were present at frontal and central electrodes, as well as at lateral temporal and occipital sites [all Fs(1,13) > 7.2, all ps < .02], indicating that emotional expression affected ERPs in the emotion task but not in the lines task. At frontal and central sites, main effects of valence in the emotion task [both Fs(1,13) > 15.0, both ps < .02] reflected enhanced positivities for emotional relative to neutral faces (Figure 3). No block type × valence interactions were present (both Fs < 1), demonstrating that this effect was elicited in a similar fashion for all six basic emotions (Figure 6). Again, no frontocentral emotional expression effects were observed in the lines task (both Fs < 1.6). At lateral temporal and occipital sites, an enhanced negativity was observed in the 220- to 315-msec latency window for emotional relative to neutral faces in the emotion task [both Fs(1,13) > 6.1, both ps < .03], but not in the lines task (both Fs < 1). Again, no block type × valence interactions were present for the emotion task (both Fs < 1.6), indicating that this lateral posterior emotional negativity was elicited in response to all six basic emotions (Figure 5).

Figure 4. Grand-averaged ERP waveforms elicited in the lines task in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). Data are collapsed across blocks including each of the six different emotional facial expressions, as well as across trials with identical and different line pairs.

In the final two analysis windows (320-495 msec and 500-700 msec poststimulus, respectively), highly significant task × valence interactions were present at frontal, central, and parietal electrodes [all Fs(1,13) > 10.0, all ps < .01], again reflecting the presence of emotional expression effects in the emotion task (Figure 3) and their absence in the lines task (Figure 4). Main effects of valence at frontal and central, as well as at parietal, electrodes in the emotion task [all Fs(1,13) > 11.9, all ps < .01], without any significant interactions between valence and block type, demonstrated that enhanced positivities for emotional faces were elicited at these sites in a similar fashion for all six basic emotions (Figure 6). Again, effects of valence were entirely absent in the lines task (see Note 2).

DISCUSSION

The primary aim of the present ERP experiment was to extend previous findings (Holmes et al., 2003) that the detection and processing of emotional information delivered by facial expressions requires focal attention. We recorded ERPs to stimulus arrays containing emotional or neutral bilateral faces under conditions in which facial expression was task relevant and therefore attended (emotion task), or in which attention was actively directed away from these faces toward a demanding perceptual judgment (lines task). In our previous ERP study (Holmes et al., 2003), spatial attention was manipulated on a trial-by-trial basis by precues presented at the start of each trial, facial expression was not task relevant (participants had to detect infrequent identical stimulus pairs, regardless of expression), and only one emotional expression (fear) was tested. In the present experiment, a sustained attention paradigm was employed (with the emotion and lines tasks delivered in separate experimental halves), facial expression was task relevant in the emotion task, and, most important, all six basic facial emotional expressions were included in different blocks.

Figure 5. Grand-averaged ERP waveforms elicited in the emotion task at the right lateral temporal electrode T6 in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.

ERP correlates of emotional facial expression processing were identified by comparing ERPs elicited on trials with emotional faces with ERPs in response to neutral faces. This was done separately for the emotion task and the lines task, and for blocks including angry, disgusted, fearful, happy, sad, and surprised faces. In the emotion task, where attention was directed toward the task-relevant facial expressions, an enhanced positivity for emotional relative to neutral faces was elicited, similar to previous observations from studies comparing ERP responses to fearful versus neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003). This emotional expression effect started at about 160 msec poststimulus and was initially distributed frontocentrally, whereas a more broadly distributed positivity was observed beyond 300 msec (Figure 3). In addition, an enhanced negativity for emotional relative to neutral faces was elicited at lateral posterior electrodes between 220 and 320 msec poststimulus.

The onset of the early frontocentral emotional expression effect was slightly later in the present experiment than in our previous experiment (Holmes et al., 2003), where significant frontal differences between ERPs to fearful and neutral faces were already present at about 120 msec poststimulus. In the present study, vertical lines were presented close to fixation simultaneously with the bilateral faces, whereas no such stimuli were included in our earlier experiment. The presence of these additional central events may have slightly delayed the onset of early emotional expression effects. It should also be noted that an attenuation of amygdala responses to emotional facial expressions has been observed when the demand for explicit emotion recognition was increased (Critchley et al., 2000; Hariri, Bookheimer, & Mazziotta, 2000). It is possible that the demand for explicit emotion recognition in the emotion task contributed to the delayed onset of the early emotional expression effect.

In marked contrast to these ERP results obtained in the emotion task, emotional expression effects were entirely absent in the lines task (Figure 4), demonstrating that ERP correlates of facial expression processing are strongly dependent on spatial attention. With sustained spatial attention directed away from the face stimuli toward another demanding perceptual task, the presence of emotional versus neutral faces had no effect whatsoever on ERP waveforms. That is, emotional expression effects were completely eliminated for all six basic emotions included in this experiment. In line with this ERP result, performance in the lines task was entirely unaffected by the expression of the faces presented simultaneously with the task-relevant line pairs. Overall, these findings extend and confirm the observations of our previous ERP experiment, which compared ERPs in response to fearful versus neutral faces (Holmes et al., 2003). Clearly, these results challenge the hypothesis that the detection and/or processing of emotional facial expression occurs preattentively. If this were the case, at least some systematic ERP differences should have been elicited in response to emotional versus neutral faces in the lines task, reflecting the automatic detection of emotionally significant events.

Figure 6. Grand-averaged ERP waveforms elicited in the emotion task at the midline electrode Fz in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.

Covert attention toward emotional faces under conditions in which they were task relevant may have enhanced their visual-perceptual representation (e.g., Carrasco, Penpeci-Talgar, & Eckstein, 2000), thereby enabling the extraction of features relating to the affective valence of these faces, and thus their subsequent encoding and analysis (as reflected by the emotion-specific ERP effects observed in the emotion task). The early, frontocentrally distributed emotional expression effects may be mediated by connections from the superior temporal sulcus (STS) and amygdala to orbitofrontal cortex (Rolls, 1999). The STS has been implicated in the early discrimination of visual features relating to emotional facial expressions (e.g., Sprengelmeyer et al., 1998). In addition, efferent feedback projections from the amygdala and related structures (see Lang et al., 1998; Morris et al., 1998) may have produced the more broadly distributed emotional expression effects observed in the present experiment at longer latencies.

One could argue that the absence of emotional expression effects under conditions where faces were unattended may have been due to the fact that the presentation of specific emotional expressions was blocked, and that each expression was presented repeatedly in two separate blocks. Repeated exposure to a specific emotional expression may have resulted in a gradual habituation of emotion-specific responses, thus potentially attenuating any emotional expression effects that may have been present in the lines task. To investigate this possibility, we computed separate averages for the first block and for the second block including angry, disgusted, fearful, happy, sad, or surprised faces, separately for the emotion task and for the lines task. These data were then analyzed with the additional factor of block position (first vs. second block containing a specific emotional facial expression). If emotional expression effects were subject to habituation, one would expect to find larger emotional expression effects for the first relative to the second block in the emotion task, and potentially also a residual emotional expression effect for the first block in the lines task.

Figure 7 shows ERPs elicited at Fz in response to neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. ERPs are displayed separately for the emotion task (top panel) and the lines task (bottom panel), and for the first block (left) or second block (right) including one of the six emotional expressions. As can be seen from Figure 7 (top), there was no evidence whatsoever for any habituation of emotional expression effects as a function of block position in the emotion task. This was confirmed by the absence of any block position × valence or block position × block type × valence interactions for all latency windows employed in the analyses reported above [all Fs(1,13) < 1]. Along similar lines, Figure 7 (bottom panel) suggests that there was no residual emotional expression effect for the first block including a specific emotional expression in the lines task. This was confirmed by the absence of any interactions involving block position [all Fs(1,13) < 1.6]. Thus, the fact that emotional expression effects were absent in response to unattended faces in the lines task is unlikely to have been the result of a habituation of emotion-specific brain responses.

The conclusion that the processing of emotional facial expression, as reflected by ERP facial expression effects, is gated by spatial attention appears to be inconsistent with neuroimaging studies demonstrating that fearful faces result in amygdala activations even when these faces are outside the focus of attention (Vuilleumier et al., 2001; see also Morris et al., 1996; Whalen et al., 1998). However, it is extremely unlikely that the ERP effects observed in the present study are directly linked to amygdala activations. Because of its nuclear structure of clustered neurones, the amygdala is electrically closed and thus largely inaccessible to ERP measures. The early emotional expression effects observed in response to attended faces are more likely to be generated in prefrontal cortex, where emotion-specific single-cell responses have recently been recorded at short latencies (Kawasaki et al., 2001). Such prefrontal responses may reflect stages in emotional processing that could be contingent upon, but functionally separate from, prior amygdala activations (see Le Doux, 1996; Rolls, 1999). It is possible that amygdala responses can be triggered by unattended emotional stimuli (although these responses may be attenuated), whereas subsequent neocortical stages of emotional processing (as reflected by the ERP effects observed in the present experiment) are fully dependent on focal attention. An alternative possibility is that amygdala responses to emotional stimuli may also require attention (see Pessoa, Kastner, & Ungerleider, 2002; Pessoa, McKenna, et al., 2002), and that the elimination of emotional expression effects in the lines task reflects an earlier attentional gating of such subcortical processing.

Another important new finding of the present experiment was that the onset, time course, and scalp distribution of the emotional expression effects obtained in the emotion task were remarkably similar for all six basic facial expressions used here (Figures 5 and 6). The absence of any differential ERP responses to different emotional expressions was reflected in the absence of any significant interactions between block type (blocks with angry, disgusted, fearful, happy, sad, or surprised faces) and valence (emotional vs. neutral expression). In line with these observations, the size of the RT advantage for emotional relative to neutral faces in the emotion task was similar for all six emotional facial expressions (Figure 2, top panel). The similarity in the time course of emotional expression effects across all six emotional expressions observed here suggests that emotionally relevant information delivered by facial expression is available to neocortical processes within less than 200 msec after stimulus onset, and at approximately the same time for all basic emotional expressions.

These observations do not seem to support the idea, suggested by recent fMRI results, that distinct neural subsystems specialize in the processing of specific emotions (Adolphs, 2002). If this were the case, one might have expected some systematic differences between the ERP emotional expression effects elicited by different facial expressions. However, it should be noted that although some neuroimaging data show emotion-specific differential activation of brain regions such as the amygdala or the insula, few studies point to differential activation within surface cortical structures (where the ERP effects observed in the present experiment are likely to be generated; see also Pizzagalli et al., 1999, and Sato et al., 2001, for related results from recent ERP studies).

Thus, one could argue that early stages in the processing of emotionally relevant information, subserved by limbic structures or the basal ganglia, and subsequent neocortical emotional processing stages differ not only in their dependence on focal attention (see above), but also in their specificity. Early processes may be differentially engaged by specific emotional expressions, thus providing a rapid classification of emotionally significant events. Data in support of this view come from single-unit recordings, which reveal a rapid emergence of differential effects to emotional expressions in the human amygdala (Liu et al., 1999). Conversely, later stages might be involved in the in-depth processing of various kinds of affective information, and thus would be much less selective with respect to different facial expressions.

Figure 7. Grand-averaged ERP waveforms elicited in the emotion task (top panel) and in the lines task (bottom panel) at the midline electrode Fz in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are collapsed across blocks including each of the six different emotional facial expressions and are shown separately for the first block (left) and the second block (right) including one specific emotional expression.

This suggestion is consistent with some recent evidence that subcortical and neocortical routes for visual processing are involved differentially in emotional expression analysis. A subcortical magnocellular pathway to the amygdala would appear to support valence discrimination processes, whereas parvocellular subsystems of ventral visual cortices may be preferentially involved in emotional intensity evaluation, irrespective of emotional valence (Schyns & Oliva, 1999; Vuilleumier, Armony, Driver, & Dolan, 2003). Recent neuroimaging results (Vuilleumier et al., 2003) suggest that low and high spatial frequency components of fearful faces selectively drive amygdala and visual cortical responses, respectively. However, although enhanced amygdala activation was found in response to low-spatial-frequency fearful face stimuli, explicit judgments relating to the perceived intensity of fearfulness were increased by the presence of high-spatial-frequency cues. These results support the view that coarse visual information may be directed via magnocellular channels from the retina to the amygdala through a tectopulvinar pathway (e.g., Bisti & Sireteanu, 1976; Jones & Burton, 1976), enabling the fast appraisal of the affective significance of a stimulus (e.g., Morris, Öhman, & Dolan, 1999) (see Note 3).

Another aim of the present study was to investigate whether the face-specific N170 component, which is assumed to reflect the structural encoding of faces, is sensitive to emotional facial expressions. In previous ERP studies, which have not found any modulations of the N170 elicited by fearful relative to neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003), facial expression was always task irrelevant. In contrast, participants' responses were contingent upon facial expression in the present emotion task. In spite of this fact, the N170 was found to be completely unaffected by facial expression in the emotion task, and this was consistently the case for all six emotional expressions used in the present study (Figure 5).

In line with earlier findings from depth electrodes (McCarthy, Puce, Belger, & Allison, 1999), this pattern of results now demonstrates comprehensively that the structural encoding of faces, as reflected by the N170, is entirely insensitive to information derived from emotional facial expression. Thus, the rapid detection of emotional facial expression appears to occur independently of, and in parallel with, the construction of a detailed perceptual representation of a face. The absence of systematic early emotional expression effects at posterior sites and the presence of such ERP effects at frontocentral electrodes at about 160 msec poststimulus suggest that higher order visual processing stages involved in face processing are affected by emotional facial expression only after this information has been processed in prefrontal cortex. This is consistent with the face processing model proposed by Bruce and Young (1986), in which the extraction of perceptual information for emotional expression processing occurs independently of, and simultaneously with, structural encoding for face recognition.

In summary, the present ERP results demonstrate that the neocortical processing of emotional facial expression is strongly dependent on focal attention. When faces were attended, systematic emotional expression effects were elicited by emotional relative to neutral faces, and these effects were strikingly similar in terms of their timing and morphology for all six basic facial expressions. In contrast, when attention was actively directed away from these faces, emotional expression effects were completely eliminated. The rapid and automatic encoding of emotionally significant events occurring outside the focus of attention may be adaptively advantageous, because it prepares the organism for fight or flight through subcortically mediated autonomic activation (e.g., Öhman, Flykt, & Lundqvist, 2000). However, it is equally important that irrelevant affective stimuli do not continuously divert attention. This suggests a division of labor between limbic structures involved in the obligatory detection of emotional information, preparing the organism for rapid action (Morris et al., 1999; Whalen et al., 1998), and subsequent neocortical emotional processing stages. Limbic structures may be responsible for establishing a readiness to respond to any environmental threat that could become the focus of attention, presumably through heightened autonomic activation. However, neocortical stages appear to be protected by efficient attentional gating mechanisms, which reduce distractibility by emotional stimuli so that ongoing goals and plans can be accomplished without interference from irrelevant events.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral Cognitive Neuroscience Review, 1, 21-61.
Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.
Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fascicularis). Journal of Comparative Neurology, 230, 465-496.
Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.
Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 7, 817-826.
Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.
Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.
Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.
Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.
Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.
Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.
Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.
Critchley, H. [D.], Daly, E. [M.], Phillips, M., Brammer, M., Bullmore, E., Williams, S. [C.], van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. [G. M.] (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.
Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.
Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.
Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.
Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.
Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.
Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.
Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.
Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.
Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.
Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.
Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.
Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.
Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.
Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.
Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.
Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.
Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.
McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.
Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.
Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.
Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.
Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.
Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.
Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.
Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.
Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.
Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.
Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.
Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.
Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.
Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.
Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.
Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12, 709-714.
Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: When categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69, 243-265.
Sprengelmeyer, R., Rausch, M., Eysel, U. T., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society of London: Series B, 265, 1927-1931.
Stepniewska, I., Qi, H. X., & Kaas, J. H. (1999). Do superior colliculus projection zones in the inferior pulvinar project to MT in primates? European Journal of Neuroscience, 11, 469-480.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624-631.
Vuilleumier, P., & Schwartz, S. (2001a). Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect. NeuroReport, 12, 1119-1122.
Vuilleumier, P., & Schwartz, S. (2001b). Emotional expressions capture attention. Neurology, 56, 153-158.
Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.
Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1, 70-83.
Yeterian, E. H., & Pandya, D. N. (1991). Corticothalamic connections of the superior temporal sulcus in rhesus monkeys. Experimental Brain Research, 83, 268-284.

NOTES

1 In spite of the fact that significant valence effects were present atcentral electrodes in the emotion task but were absent in the lines taskthis interaction failed to reach significance at central sites

2 At lateral occipital electrodes a significantly enhanced positivityfor emotional relative to neutral faces was present between 320 and495 msec in the emotion task [F(113) = 62 p lt 03] but not in thelines task and this was reflected in a nearly significant task 3 valenceinteraction [F(113) = 46 p lt 06]

3 It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking since the medial pulvinarwhich projects to the amygdala does not receive a significant directinput from the superior colliculus (eg Stepniewska Qi amp Kaas1999) However possible connections between the inferior pulvinar(which receives visual inputs from the superior colliculus) and the me-dial nucleus may support the transmission of information to the amyg-dala through a colliculo-pulvinar route Alternatively cortical inputmay be involved since STS (implicated in facial expression processingSprengelmeyer et al 1998) is known to project to the medial pulvinar(Yeterian amp Pandya 1991) Our thanks to an anonymous reviewer forraising this important point

(Manuscript received December 10 2002revision accepted for publication June 11 2003)

ATTENTION AND FACIAL EXPRESSION PROCESSING 99

face-specific N170 component is assumed to reflect the precategorical structural encoding of faces prior to their recognition (Bentin, Allison, Puce, Perez, & McCarthy, 1996; Eimer, 1998, 2000). In two recent ERP studies (Eimer & Holmes, 2002; Holmes et al., 2003), we have found that the N170 is not modulated by emotional facial expression. This suggests that the structural encoding of faces and the processing of emotional expression are

Figure 1. Top panel: Examples of face stimuli used in the present experiment. Faces of 10 different individuals were used, with facial expression either neutral (center) or (clockwise from top) disgusted, fearful, happy, sad, surprised, or angry. Bottom panel: Illustration of the stimulus array presented on each trial. Two identical emotional or neutral faces were presented bilaterally, with two vertical lines located close to fixation. In the trial shown here, a happy face pair is presented together with two lines of different lengths.


parallel and independent processes (Bruce & Young, 1986). However, to date, this conclusion has been based only on a comparison of N170 components elicited in response to fearful versus neutral faces, obtained under conditions where facial expression was task irrelevant. To investigate whether the face-specific N170 is unaffected by any emotional facial expression, even when expression is task relevant, we compared the N170 elicited by emotional versus neutral faces in the emotion task, separately for all six basic facial expressions. Any systematic emotional expression effects on the N170 component would challenge the hypothesis that the structural encoding of faces is completely independent of facial expression analysis.

METHOD

Participants
Fifteen participants took part in this study. One had to be excluded because of excessive eye blinks, so 14 participants (7 female and 7 male; 18-54 years old; average age, 29.6 years) remained in the sample. One participant was left-handed; all others were right-handed by self-report. The experiment was performed in compliance with relevant institutional guidelines and was approved by the Birkbeck College School of Psychology ethics committee.

Stimuli
The face stimuli were photographs of faces of 10 different individuals, all taken from a standard set of pictures of facial affect (Ekman & Friesen, 1976). Facial expression was angry, disgusted, fearful, happy, sad, surprised, or neutral, resulting in a total of 70 different face stimuli (see Figure 1, top panel, for examples). All face stimuli covered a visual angle of about 3.4° × 2.4°. Each display also contained a pair of gray vertical lines (0.1° width), and each line was either short (0.4°) or slightly longer (0.5°). All stimuli were presented on a computer screen in front of a black background. A white fixation cross was continuously present at the center of the screen.

Procedure
Participants were seated in a dimly lit, sound-attenuated cabin, and a computer screen was placed at a viewing distance of 70 cm. The experiment consisted of 24 experimental blocks, each containing 80 trials. On each trial, two identical faces were presented together with two line stimuli in front of a black background (Figure 1, bottom). Faces were located 2.2° to the left and right of fixation (measured as the distance between the fixation cross and the center of each face stimulus), and the bilateral lines were presented close to the fixation cross (0.4° eccentricity). All stimuli were presented simultaneously for 300 msec, and the interval between two successive stimulus presentations was 2,000 msec.
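For illustration only, the display geometry can be reconstructed from the 70-cm viewing distance and the visual angles reported above; the following Python sketch (the function name and the printed values are ours, not part of the original report) converts the reported angles into approximate on-screen sizes.

```python
import math

def deg_to_cm(angle_deg: float, viewing_distance_cm: float = 70.0) -> float:
    """On-screen extent (cm) subtending a given visual angle at a given distance."""
    return 2 * viewing_distance_cm * math.tan(math.radians(angle_deg) / 2)

# Approximate sizes implied by the reported specifications (values in cm)
print(round(deg_to_cm(3.4), 2))  # larger face dimension, ~4.15 cm
print(round(deg_to_cm(2.4), 2))  # smaller face dimension, ~2.93 cm
print(round(deg_to_cm(2.2), 2))  # face eccentricity from fixation, ~2.69 cm
print(round(deg_to_cm(0.4), 2))  # line eccentricity / short line length, ~0.49 cm
```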

In 12 successive blocks, participants had to indicate with a left-hand or right-hand buttonpress whether the face pair presented on any given trial showed an emotional or neutral expression (emotion task). The mapping of emotional valence to response hand was counterbalanced across participants. In 40 trials per block, emotional faces were presented; in the other, randomly intermingled, 40 trials, facial expression was neutral. Long and short lines, which were irrelevant in these blocks, appeared randomly and with equal probability to the left and right of fixation. Emotional expression was varied across blocks, with angry, disgusted, fearful, happy, sad, and surprised faces each shown in two blocks. The order in which these blocks were presented was randomized for each participant.

In the other 12 successive blocks, participants were instructed to direct their attention to the pair of lines presented close to fixation and to indicate with a left-hand or right-hand buttonpress whether these lines differed in length or were identical (lines task). The mapping of line length to response hand was counterbalanced across participants. Again, short and long lines appeared randomly and equiprobably on the left or right side. Faces, which were now task irrelevant, were emotional on 40 trials and neutral on the other 40 trials, with emotional expression varied across blocks (two blocks each with angry, disgusted, fearful, happy, sad, and surprised faces). The order in which these blocks were presented was again randomized for each participant.

Seven participants performed the emotion task prior to the lines task, and this order was reversed for the other 7 participants. Participants were instructed to keep their gaze directed at the central fixation cross throughout each block and to respond as fast and accurately as possible on each trial.
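As a concrete summary of this design, the sketch below generates one participant's session under the constraints just described (two blocks per emotion, block order randomized within each task half, 40 emotional and 40 neutral trials intermingled per block, line lengths varying randomly). All names and data structures are illustrative and are not taken from the original experiment code.

```python
import random

EMOTIONS = ["angry", "disgusted", "fearful", "happy", "sad", "surprised"]

def build_session(task_first: str = "emotion") -> list:
    """One participant's 24-block session: 12 emotion-task and 12 lines-task blocks."""
    halves = ["emotion", "lines"] if task_first == "emotion" else ["lines", "emotion"]
    session = []
    for task in halves:
        block_emotions = EMOTIONS * 2          # each expression appears in two blocks
        random.shuffle(block_emotions)         # block order randomized per participant
        for emotion in block_emotions:
            trials = [{"face_valence": valence,
                       "block_emotion": emotion,
                       # each line is randomly short or long, so line pairs are
                       # identical or different with roughly equal probability
                       "lines": (random.choice(["short", "long"]),
                                 random.choice(["short", "long"]))}
                      for valence in ["emotional"] * 40 + ["neutral"] * 40]
            random.shuffle(trials)             # emotional and neutral trials intermingled
            session.append({"task": task, "block_emotion": emotion, "trials": trials})
    return session

session = build_session("emotion")
assert len(session) == 24 and all(len(b["trials"]) == 80 for b in session)
```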

ERP procedures and data analysis
EEG was recorded with Ag-AgCl electrodes and linked-earlobe reference from Fpz, F7, F3, Fz, F4, F8, FC5, FC6, T7, C3, Cz, C4, T8, CP5, CP6, T5, P3, Pz, P4, T6, and Oz (according to the 10-20 system) and from OL and OR (located halfway between O1 and P7 and between O2 and P8, respectively). Horizontal EOG (HEOG) was recorded bipolarly from the outer canthi of both eyes. The impedance for all electrodes was kept below 5 kΩ. The amplifier bandpass was 0.1 to 40 Hz, and no additional filters were applied to the averaged data. EEG and EOG were sampled with a digitization rate of 200 Hz and stored on disk. Reaction times (RTs) were measured on each trial.

EEG and HEOG were epoched off-line into 800-msec periods, starting 100 msec prior to stimulus onset and ending 700 msec after stimulus onset. Trials with horizontal eye movements (HEOG exceeding ±30 µV), eyeblinks (Fpz exceeding ±60 µV), or other artifacts (a voltage exceeding ±80 µV at any electrode) measured after stimulus onset were excluded from analysis. The EEG obtained was averaged relative to a 100-msec baseline preceding stimulus onset. Only trials with correct behavioral responses were included in the averages. Separate averages were computed for the emotion task and the lines task, for all combinations of block type (experimental blocks including angry vs. disgusted vs. fearful vs. happy vs. sad vs. surprised faces) and valence (emotional vs. neutral faces), resulting in 24 average waveforms for each electrode and participant.
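The epoching and artifact-rejection steps just described can be summarized in a few lines; the following is a minimal NumPy sketch, assuming the continuous EEG (channels × samples), the HEOG and Fpz channels, the stimulus-onset samples, and a correct-response flag per trial have already been loaded. All variable and function names are ours, not the authors' analysis code.

```python
import numpy as np

FS = 200                       # digitization rate (Hz)
PRE, POST = 0.1, 0.7           # epoch: -100 to +700 msec around stimulus onset

def average_erp(eeg, heog, fpz, onsets, correct):
    """Epoch, reject artifact trials, baseline-correct, and average (channels x samples)."""
    n_pre, n_post = int(PRE * FS), int(POST * FS)
    kept = []
    for onset, ok in zip(onsets, correct):
        if not ok:                                    # correct responses only
            continue
        sl = slice(onset - n_pre, onset + n_post)
        ep, ep_heog, ep_fpz = eeg[:, sl], heog[sl], fpz[sl]
        post = slice(n_pre, None)                     # artifacts assessed after onset
        if (np.abs(ep_heog[post]) > 30).any():        # horizontal eye movements (µV)
            continue
        if (np.abs(ep_fpz[post]) > 60).any():         # eyeblinks (µV)
            continue
        if (np.abs(ep[:, post]) > 80).any():          # other artifacts, any electrode (µV)
            continue
        baseline = ep[:, :n_pre].mean(axis=1, keepdims=True)
        kept.append(ep - baseline)                    # 100-msec prestimulus baseline
    return np.mean(kept, axis=0)                      # one average waveform per channel
```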

The first set of analyses was based on mean amplitudes obtained at lateral posterior electrodes T5 and T6 (where the N170 is maximal), within a time window centered on the mean latency of the face-specific posterior N170 component (160-200 msec poststimulus). Repeated measures analyses of variance (ANOVAs) were conducted for the factors task (emotion task vs. lines task), block type, and valence. Additional analyses were conducted separately for the emotion and the lines tasks. The second set of analyses was based on mean amplitude values computed within five successive poststimulus time windows (120-155 msec, 160-215 msec, 220-315 msec, 320-495 msec, and 500-700 msec), which covered the interval where systematic emotional expression effects were observed in our previous experiments (Eimer & Holmes, 2002; Holmes et al., 2003). Mean amplitude values were computed for frontal (F3, Fz, F4), central (C3, Cz, C4), parietal (P3, Pz, P4), lateral temporal (T5, T6), and lateral occipital sites (OL, OR). Again, ANOVAs were conducted for the factors task, block type, and valence, followed by further analyses conducted separately for ERPs obtained in the emotion task and the lines task.
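To make the dependent measures explicit, the sketch below extracts the mean amplitude for every analysis window and electrode grouping from one average waveform, ready to enter the repeated measures ANOVAs. The window boundaries and site groupings follow the text; the function itself is an illustrative reconstruction rather than the authors' analysis code.

```python
import numpy as np
import pandas as pd

WINDOWS = {"120-155": (0.120, 0.155), "160-215": (0.160, 0.215),
           "220-315": (0.220, 0.315), "320-495": (0.320, 0.495),
           "500-700": (0.500, 0.700)}
REGIONS = {"frontal": ["F3", "Fz", "F4"], "central": ["C3", "Cz", "C4"],
           "parietal": ["P3", "Pz", "P4"], "lateral temporal": ["T5", "T6"],
           "lateral occipital": ["OL", "OR"]}

def mean_amplitudes(erp, channel_names, fs=200, prestim=0.1):
    """Mean amplitude per time window and electrode region for one average ERP
    (channels x samples array starting 100 msec before stimulus onset)."""
    rows = []
    for window, (start, end) in WINDOWS.items():
        a, b = int((prestim + start) * fs), int((prestim + end) * fs)
        for region, sites in REGIONS.items():
            idx = [channel_names.index(ch) for ch in sites]
            rows.append({"window": window, "region": region,
                         "mean_uV": float(erp[idx, a:b].mean())})
    return pd.DataFrame(rows)
```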

For keypress responses, repeated measures ANOVAs were performed on the latencies of correct responses and on error rates, separately for the emotion task and the lines task, for the factors block type and valence. In the analysis of behavioral performance in the lines task, the additional factor of target type (identical lines vs. different lines) was included. For all analyses, Greenhouse-Geisser adjustments to the degrees of freedom were performed when appropriate.


RESULTS

Behavioral Results
Participants failed to respond on less than 3% of all trials. Correct responses were faster in the emotion task (622 msec) than in the lines task (695 msec), and this difference was significant [t(14) = 4.74, p < .001]. Figure 2 shows mean RTs (top panel) and the percentage of incorrect responses (bottom panel) obtained in the emotion task, displayed separately for the six different block types and for trials with emotional and neutral faces, respectively. For RTs, main effects of block type [F(5,65) = 21.4, p < .001, ε = .788] and of valence [F(1,13) = 19.0, p < .001] were present. RTs differed systematically between block types, being fastest in blocks including happy faces and slowest in blocks including sad faces. In addition, responses were generally faster to emotional than to neutral faces. No interaction between block type and valence was obtained, indicating that this RT advantage for emotional faces was equivalent across all six block types.

For error rates, main effects of block type [F(5,65) = 13.7, p < .001, ε = .282] and valence [F(1,13) = 15.4, p < .002] were again present for the emotion task. As can be seen from Figure 2 (bottom panel), incorrect responses were most frequent in blocks including sad faces and least frequent in blocks including surprised faces. Also, it was more likely that emotional faces would be incorrectly classified as neutral than that neutral faces would be erroneously judged as emotional. No block type × valence interaction was present.
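For readers wishing to run the same style of analysis, the sketch below fits the block type × valence repeated measures ANOVA on simulated RT data with statsmodels. Note that AnovaRM reports uncorrected F tests, so the Greenhouse-Geisser adjustment mentioned in the Method section would have to be applied separately; all data and column names here are placeholders.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
EMOTIONS = ["angry", "disgusted", "fearful", "happy", "sad", "surprised"]

# Simulated long-format table: one mean correct RT per participant, block type
# (the emotion shown in that block), and valence (emotional vs. neutral faces).
rt_data = pd.DataFrame([{"participant": p, "block_type": emo, "valence": val,
                         "rt": rng.normal(640 if val == "emotional" else 660, 30)}
                        for p in range(14)
                        for emo in EMOTIONS
                        for val in ["emotional", "neutral"]])

anova = AnovaRM(data=rt_data, depvar="rt", subject="participant",
                within=["block_type", "valence"]).fit()
print(anova)  # F tests for block type, valence, and their interaction
```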

In the lines task, no main effects of block type or valence were obtained for RT or error rate (all Fs < 1), indicating that the emotional expression of the task-irrelevant faces did not interfere with perceptual identification performance.


Figure 2. Reaction times (top panel) and percentage of incorrect responses (bottom panel) to emotional and neutral faces in the emotion task, displayed separately for experimental blocks where neutral faces were intermixed with angry, disgusted, fearful, happy, sad, or surprised faces.


Target type did not affect RT, but it had a significant effect on error rate [F(1,13) = 15.9, p < .002]: it was more likely that lines of different length would be classified as identical (23.2%) than that identical lines would be judged as different (10.2%).

Electrophysiological Results
Figure 3 shows ERPs obtained in the emotion task in response to stimulus arrays containing either neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. Figure 4 shows the corresponding ERP waveforms obtained in the lines task. A sustained positivity was elicited in response to arrays containing emotional faces in the emotion task. This emotional expression effect was first visible at frontocentral sites at about 180 msec poststimulus (overlapping with the P2 component) and appeared at parietal electrodes around 300 msec poststimulus (Figure 3). At lateral temporal and occipital electrodes, emotional expression effects appeared at about 250 msec poststimulus, as an enhanced negativity for emotional relative to neutral faces in the emotion task. In contrast, no systematic emotional expression effects were found for the lines task (Figure 4).

The difference between emotional and neutral faces appears to leave the face-specific N170 component at lateral temporal sites T5 and T6 entirely unaffected. This was observed not only in the lines task (Figure 4), but also in the emotion task (Figure 3), where facial expression was task relevant. These informal observations were substantiated by statistical analyses.

N170 component. In the N170 time range (160-200 msec poststimulus), N170 amplitudes elicited at T5 and T6 in response to neutral versus emotional faces showed neither a main effect of valence nor a task × valence interaction (both Fs < 1), demonstrating that the N170 is not modulated by facial expression (Figures 3 and 4).

Figure 3. Grand-averaged ERP waveforms elicited in the emotion task in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines), collapsed across blocks including each of the six different emotional facial expressions.


To further ascertain that this component is unaffected by emotional expression even when expression is task relevant, we conducted additional analyses on N170 amplitudes observed in the emotion task (Figure 3). No main effect of valence (F < 1.2) or interaction between block type and valence (F < 1) was observed, indicating that the N170 was similarly insensitive to emotional facial expression for all six basic emotions employed here, even though participants had to discriminate between emotional and neutral faces in this task. This is illustrated in Figure 5, which displays ERPs in response to neutral and emotional faces elicited in the emotion task at the right lateral temporal electrode T6, shown separately for each of the six facial expressions, which were presented in different blocks. No systematic differential effects of any facial expression on the N170 are apparent, and this was confirmed by additional planned paired comparisons of N170 amplitudes at T5 and T6 in response to emotional versus neutral faces, conducted separately for all six basic emotions. None of these comparisons even approached statistical significance [all ts(13) < 1.5].
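These planned comparisons amount to one paired t test per emotion on the participant-level N170 means; the sketch below illustrates the computation on simulated amplitudes (the data and names are placeholders, not the reported measurements).

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
EMOTIONS = ["angry", "disgusted", "fearful", "happy", "sad", "surprised"]

# Placeholder data: per-participant N170 mean amplitudes (160-200 msec,
# averaged over T5/T6), one value per emotion block and valence condition.
n170 = {(emo, val): rng.normal(-6.0, 1.5, size=14)          # 14 participants, µV
        for emo in EMOTIONS for val in ("emotional", "neutral")}

for emo in EMOTIONS:
    t, p = ttest_rel(n170[(emo, "emotional")], n170[(emo, "neutral")])
    print(f"{emo}: t(13) = {t:.2f}, p = {p:.3f}")
```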

Emotional expression effects. No main effects of valence or task × valence interactions were observed in the 120- to 155-msec time window. In the 160- to 215-msec analysis window, a task × valence interaction was present at frontal sites [F(1,13) = 5.2, p < .05].1 Main effects of valence were found at frontal and central sites [both Fs(1,14) > 9.1, both ps < .01] in the emotion task, reflecting an enhanced positivity elicited in response to arrays containing emotional faces (Figure 3). These effects were completely absent in the lines task (both Fs < 1). No interactions between block type and valence were found at frontal and central sites in the emotion task (both Fs < 1.1), demonstrating that this early emotional positivity was elicited in response to emotional versus neutral faces

Figure 4. Grand-averaged ERP waveforms elicited in the lines task in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). Data are collapsed across blocks including each of the six different emotional facial expressions, as well as across trials with identical and different line pairs.


irrespective of which of the six basic emotions was included in a given block. This fact is further illustrated in Figure 6, which shows ERPs in response to neutral and emotional faces elicited in the emotion task at Fz, displayed separately for all emotional expressions used in this experiment. Emotional expression effects were very similar across expressions and started at approximately the same time for all six basic emotions. No significant emotional expression effects were present between 160 and 215 msec poststimulus at parietal and occipital electrodes.

Between 220 and 315 msec poststimulus, task × valence interactions were present at frontal and central electrodes, as well as at lateral temporal and occipital sites [all Fs(1,13) > 7.2, all ps < .02], indicating that emotional expression affected ERPs in the emotion task, but not in the lines task. At frontal and central sites, main effects of valence in the emotion task [both Fs(1,13) > 15.0, both ps < .02] reflected enhanced positivities for emotional relative to neutral faces (Figure 3). No block type × valence interactions were present (both Fs < 1), demonstrating that this effect was elicited in similar fashion for all six basic emotions (Figure 6). Again, no frontocentral emotional expression effects were observed in the lines task (both Fs < 1.6). At lateral temporal and occipital sites, an enhanced negativity was observed in the 220- to 315-msec latency window for emotional relative to neutral faces in the emotion task [both Fs(1,13) > 6.1, both ps < .03], but not in the lines task (both Fs < 1). Again, no block type × valence interactions were present for the emotion task (both Fs < 1.6), indicating that this lateral posterior emotional negativity was elicited in response to all six basic emotions (Figure 5).

In the final two analysis windows (320-495 msec and 500-700 msec poststimulus, respectively), highly significant task × valence interactions were present at frontal, central, and parietal electrodes [all Fs(1,13) > 10.0, all ps < .01], again reflecting the presence of emotional expression effects in the emotion task (Figure 3) and their absence in the lines task (Figure 4). Main effects of valence at frontal and central, as well as at parietal, electrodes in the emotion task [all Fs(1,13) > 11.9, all ps < .01], without any significant interactions between valence and block type, demonstrated that enhanced positivities for emotional faces were elicited at these sites in a similar fashion for all six basic emotions (Figure 6). Again, effects of valence were entirely absent in the lines task.2

DISCUSSION

The primary aim of the present ERP experiment was to extend previous findings (Holmes et al., 2003) that the detection and processing of emotional information delivered by facial expressions require focal attention. We recorded ERPs to stimulus arrays containing emotional or neutral bilateral faces under conditions when facial expression was task relevant and therefore attended (emotion task), or when attention was actively directed away from these faces toward a demanding perceptual judgment

Figure 5. Grand-averaged ERP waveforms elicited in the emotion task at the right lateral temporal electrode T6 in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.


(lines task). In our previous ERP study (Holmes et al., 2003), spatial attention was manipulated on a trial-by-trial basis by precues presented at the start of each trial, facial expression was not task relevant (participants had to detect infrequent identical stimulus pairs, regardless of expression), and only one emotional expression (fear) was tested. In the present experiment, a sustained attention paradigm was employed (with the emotion and lines tasks delivered in separate experimental halves), facial expression was task relevant in the emotion task, and, most important, all six basic facial emotional expressions were included in different blocks.

ERP correlates of emotional facial expression processing were identified by comparing ERPs elicited on trials with emotional faces with ERPs in response to neutral faces. This was done separately for the emotion task and the lines task, and for blocks including angry, disgusted, fearful, happy, sad, and surprised faces. In the emotion task, where attention was directed toward the task-relevant facial expressions, an enhanced positivity for emotional relative to neutral faces was elicited, similar to previous observations from studies comparing ERP responses to fearful versus neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003). This emotional expression effect started at about 160 msec poststimulus and was initially distributed frontocentrally, whereas a more broadly distributed positivity was observed beyond 300 msec (Figure 3). In addition, an enhanced negativity for emotional relative to neutral faces was elicited at lateral posterior electrodes between 220 and 320 msec poststimulus.

The onset of the early frontocentral emotional expression effect was slightly later in the present experiment than in our previous experiment (Holmes et al., 2003), where significant frontal differences between ERPs to fearful and neutral faces were already present at about 120 msec poststimulus. In the present study, vertical lines were presented close to fixation simultaneously with the bilateral faces, whereas no such stimuli were included in our earlier experiment. The presence of these additional central events may have slightly delayed the onset of early emotional expression effects. It should also be noted that an attenuation of amygdala responses to emotional facial expressions has been observed when the demand for explicit emotion recognition was increased (Critchley et al., 2000; Hariri, Bookheimer, & Mazziotta, 2000). It is possible that the demand for explicit emotion recognition in the emotion task contributed to the delayed onset of the early emotional expression effect.

In marked contrast to these ERP results obtained in the emotion task, emotional expression effects were entirely absent in the lines task (Figure 4), demonstrating that ERP correlates of facial expression processing are strongly dependent on spatial attention. With sustained spatial attention directed away from the face stimuli toward another demanding perceptual task, the presence of emotional versus neutral faces had no effect whatsoever on

Figure 6. Grand-averaged ERP waveforms elicited in the emotion task at midline electrode Fz in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.


ERP waveforms. That is, emotional expression effects were completely eliminated for all six basic emotions included in this experiment. In line with this ERP result, performance in the lines task was entirely unaffected by the expression of the faces presented simultaneously with the task-relevant line pairs. Overall, these findings extend and confirm the observations of our previous ERP experiment, which compared ERPs in response to fearful versus neutral faces (Holmes et al., 2003). Clearly, these results challenge the hypothesis that the detection and/or processing of emotional facial expression occurs preattentively. If this were the case, at least some systematic ERP differences should have been elicited in response to emotional versus neutral faces in the lines task, reflecting the automatic detection of emotionally significant events.

Covert attention toward emotional faces under conditions when they were task relevant may have enhanced their visual-perceptual representation (e.g., Carrasco, Penpeci-Talgar, & Eckstein, 2000), thereby enabling the extraction of features relating to the affective valence of these faces, and thus their subsequent encoding and analysis (as reflected by the emotion-specific ERP effects observed in the emotion task). The early, frontocentrally distributed emotional expression effects may be mediated by connections from the superior temporal sulcus (STS) and amygdala to orbitofrontal cortex (Rolls, 1999). The STS has been implicated in the early discrimination of visual features relating to emotional facial expressions (e.g., Sprengelmeyer et al., 1998). In addition, efferent feedback projections from the amygdala and related structures (see Lang et al., 1998; Morris et al., 1998) may have produced the more broadly distributed emotional expression effects observed in the present experiment at longer latencies.

One could argue that the absence of emotional expression effects under conditions where faces were unattended may have been due to the fact that the presentation of specific emotional expressions was blocked, and that each expression was presented repeatedly in two separate blocks. Repeated exposure to a specific emotional expression may have resulted in a gradual habituation of emotion-specific responses, thus potentially attenuating any emotional expression effects that may have been present in the lines task. To investigate this possibility, we computed separate averages for the first block and for the second block including angry, disgusted, fearful, happy, sad, or surprised faces, separately for the emotion and for the lines task. These data were then analyzed with the additional factor of block position (first vs. second block containing a specific emotional facial expression). If emotional expression effects were subject to habituation, one would expect to find larger emotional expression effects for the first relative to the second block in the emotion task, and potentially also a residual emotional expression effect for the first block in the lines task.

Figure 7 shows ERPs elicited at Fz in response to neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. ERPs are displayed separately for the emotion task (top panel) and the lines task (bottom panel), and for the first block (left) or second block (right) including one of the six emotional expressions. As can be seen from Figure 7 (top), there was no evidence whatsoever for any habituation of emotional expression effects as a function of block position in the emotion task. This was confirmed by the absence of any block position × valence or block position × block type × valence interactions for all latency windows employed in the analyses reported above [all Fs(1,13) < 1]. Along similar lines, Figure 7 (bottom panel) suggests that there was no residual emotional expression effect for the first block including a specific emotional expression in the lines task. This was confirmed by the absence of any interactions involving block position [all Fs(1,13) < 1.6]. Thus, the fact that emotional expression effects were absent in response to unattended faces in the lines task is unlikely to have been the result of a habituation of emotion-specific brain responses.

The conclusion that the processing of emotional facial expression, as reflected by ERP facial expression effects, is gated by spatial attention appears to be inconsistent with neuroimaging studies demonstrating that fearful faces result in amygdala activations even when these faces are outside the focus of attention (Vuilleumier et al., 2001; see also Morris et al., 1996; Whalen et al., 1998). However, it is extremely unlikely that the ERP effects observed in the present study are directly linked to amygdala activations. Due to its nuclear structure of clustered neurones, the amygdala is electrically closed and thus largely inaccessible to ERP measures. The early emotional expression effects observed in response to attended faces are more likely to be generated in prefrontal cortex, where emotion-specific single-cell responses have recently been recorded at short latencies (Kawasaki et al., 2001). Such prefrontal responses may reflect stages in emotional processing that could be contingent upon, but functionally separate from, prior amygdala activations (see Le Doux, 1996; Rolls, 1999). It is possible that amygdala responses can be triggered by unattended emotional stimuli (although these responses may be attenuated), whereas subsequent neocortical stages of emotional processing (as reflected by the ERP effects observed in the present experiment) are fully dependent on focal attention. An alternative possibility is that amygdala responses to emotional stimuli may also require attention (see Pessoa, Kastner, & Ungerleider, 2002; Pessoa, McKenna, et al., 2002), and that the elimination of emotional expression effects in the lines task reflects an earlier attentional gating of such subcortical processing.

Another important new finding of the present experiment was that the onset, time course, and scalp distribution of the emotional expression effects obtained in the emotion task were remarkably similar for all six basic facial expressions used here (Figures 5 and 6). The absence of any differential ERP responses to different emotional expressions was reflected by the absence of any significant


interactions between block type (blocks with angry, disgusted, fearful, happy, sad, or surprised faces) and valence (emotional vs. neutral expression). In line with these observations, the size of the RT advantage for emotional relative to neutral faces in the emotion task was similar for all six emotional facial expressions (Figure 2, top panel). The similarity in the time course of emotional expression effects across all six emotional expressions observed here suggests that emotionally relevant information delivered by facial expression is available to neocortical processes within less than 200 msec after stimulus onset, and at approximately the same time for all basic emotional expressions.

These observations do not seem to support the idea, suggested by recent fMRI results, that distinct neural subsystems specialize in the processing of specific emotions (Adolphs, 2002). If this were the case, one might have expected some systematic differences between ERP emotional expression effects elicited by different facial

expressions. However, it should be noted that, although some neuroimaging data show emotion-specific differential activation of brain regions such as the amygdala or insula, few studies point to differential activation within surface cortical structures (where the ERP effects observed in the present experiments are likely to be generated; see also Pizzagalli et al., 1999, and Sato et al., 2001, for related results from recent ERP studies).

Thus, one could argue that early stages in the processing of emotionally relevant information, subserved by limbic structures or the basal ganglia, and subsequent neocortical emotional processing stages differ not only in their dependence on focal attention (see above), but also in their specificity. Early processes may be differentially engaged by specific emotional expressions, thus providing a rapid classification of emotionally significant events. Data in support of this view come from single-unit recordings, which reveal a rapid emergence of differential effects to emotional expressions in the human

Figure 7. Grand-averaged ERP waveforms elicited in the emotion task (top panel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset, in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are collapsed across blocks including each of the six different emotional facial expressions and are shown separately for the first block (left) and the second block (right) including one specific emotional expression.


amygdala (Liu et al., 1999). Conversely, later stages might be involved in the in-depth processing of various kinds of affective information, and thus would be much less selective with respect to different facial expressions.

This suggestion is consistent with some recent evidence that subcortical and neocortical routes for visual processing are involved differentially in emotional expression analysis. A subcortical magnocellular pathway to the amygdala would appear to support valence discrimination processes, whereas parvocellular subsystems of ventral visual cortices may be preferentially involved in emotional intensity evaluation, irrespective of emotional valence (Schyns & Oliva, 1999; Vuilleumier, Armony, Driver, & Dolan, 2003). Recent neuroimaging results (Vuilleumier et al., 2003) suggest that low and high spatial frequency components of fearful faces selectively drive amygdala and visual cortical responses, respectively. However, although enhanced amygdala activation was found in response to low-spatial-frequency fearful face stimuli, explicit judgments relating to the perceived intensity of fearfulness were increased by the presence of high-spatial-frequency cues. These results support the view that coarse visual information may be directed via magnocellular channels from the retina to the amygdala through a tectopulvinar pathway (e.g., Bisti & Sireteanu, 1976; Jones & Burton, 1976), enabling the fast appraisal of the affective significance of a stimulus (e.g., Morris, Öhman, & Dolan, 1999).3

Another aim of the present study was to investigate whether the face-specific N170 component, which is assumed to reflect the structural encoding of faces, is sensitive to emotional facial expressions. In previous ERP studies, which have not found any modulations of the N170 elicited by fearful relative to neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003), facial expression was always task irrelevant. In contrast, participants' responses were contingent upon facial expression in the present emotion task. In spite of this fact, the N170 was found to be completely unaffected by facial expressions in the emotion task, and this was consistently the case for all six emotional expressions used in the present study (Figure 5).

In line with earlier findings from depth electrodes (McCarthy, Puce, Belger, & Allison, 1999), this pattern of results now demonstrates comprehensively that the structural encoding of faces, as reflected by the N170, is entirely insensitive to information derived from emotional facial expression. Thus, the rapid detection of emotional facial expression appears to occur independently of, and in parallel to, the construction of a detailed perceptual representation of a face. The absence of systematic early emotional expression effects at posterior sites, and the presence of such ERP effects at frontocentral electrodes at about 160 msec poststimulus, suggests that higher order visual processing stages involved in face processing are affected by emotional facial expression only after this information has been processed in prefrontal cortex. This is consistent with the face processing model proposed by Bruce and Young (1986), in which the extraction of perceptual information for emotional expression processing occurs independently of, and simultaneously with, structural encoding for face recognition.

In summary, the present ERP results demonstrate that the neocortical processing of emotional facial expression is strongly dependent on focal attention. When faces were attended, systematic emotional expression effects were elicited by emotional relative to neutral faces, and these effects were strikingly similar in terms of their timing and morphology for all six basic facial expressions. In contrast, when attention was actively directed away from these faces, emotional expression effects were completely eliminated. The rapid and automatic encoding of emotionally significant events occurring outside the focus of attention may be adaptively advantageous, because it prepares the organism for fight or flight through subcortically mediated autonomic activation (e.g., Öhman, Flykt, & Lundqvist, 2000). However, it is equally important that irrelevant affective stimuli do not continuously divert attention. This suggests a division of labor between limbic structures involved in the obligatory detection of emotional information, preparing the organism for rapid action (Morris et al., 1999; Whalen et al., 1998), and subsequent neocortical emotional processing stages. Limbic structures may be responsible for establishing a readiness to respond to any environmental threat that could become the focus of attention, presumably through heightened autonomic activation. However, neocortical stages appear to be protected by efficient attentional gating mechanisms, which reduce distractibility by emotional stimuli, so that ongoing goals and plans can be accomplished without interference from irrelevant events.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral & Cognitive Neuroscience Reviews, 1, 21-61.

Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.

Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fascicularis). Journal of Comparative Neurology, 230, 465-496.

Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.

Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 40, 817-826.

Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.

Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.

Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.

Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.

Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.

Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.

Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.

Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.

Critchley, H. D., Daly, E. M., Phillips, M., Brammer, M., Bullmore, E., Williams, S. C., van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. G. M. (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.

Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.

Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.

Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.

Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.

Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.

Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.

Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.

Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.

Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.

Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.

Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.

Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.

Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.

Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.

Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.

Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.

Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.

Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.

Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.

McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.

Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.

Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.

Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.

Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.

Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.

Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.

Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.

Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.

Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.

Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.

Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.

Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.

Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.

Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.

Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.

Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12, 709-714.

Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: When categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69, 243-265.

Sprengelmeyer, R., Rausch, M., Eysel, U. T., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society of London: Series B, 265, 1927-1931.

Stepniewska, I., Qi, H. X., & Kaas, J. H. (1999). Do superior colliculus projection zones in the inferior pulvinar project to MT in primates? European Journal of Neuroscience, 11, 469-480.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624-631.

Vuilleumier, P., & Schwartz, S. (2001a). Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect. NeuroReport, 12, 1119-1122.

Vuilleumier, P., & Schwartz, S. (2001b). Emotional expressions capture attention. Neurology, 56, 153-158.

Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.

Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1, 70-83.

Yeterian, E. H., & Pandya, D. N. (1991). Corticothalamic connections of the superior temporal sulcus in rhesus monkeys. Experimental Brain Research, 83, 268-284.

NOTES

1. Significant valence effects were present at central electrodes in the emotion task but were absent in the lines task; nevertheless, the task × valence interaction failed to reach significance at central sites.

2. At lateral occipital electrodes, a significantly enhanced positivity for emotional relative to neutral faces was present between 320 and 495 msec in the emotion task [F(1,13) = 6.2, p < .03], but not in the lines task, and this was reflected in a nearly significant task × valence interaction [F(1,13) = 4.6, p < .06].

3. It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking, since the medial pulvinar, which projects to the amygdala, does not receive a significant direct input from the superior colliculus (e.g., Stepniewska, Qi, & Kaas, 1999). However, possible connections between the inferior pulvinar (which receives visual inputs from the superior colliculus) and the medial nucleus may support the transmission of information to the amygdala through a colliculo-pulvinar route. Alternatively, cortical input may be involved, since the STS (implicated in facial expression processing; Sprengelmeyer et al., 1998) is known to project to the medial pulvinar (Yeterian & Pandya, 1991). Our thanks to an anonymous reviewer for raising this important point.

(Manuscript received December 10, 2002; revision accepted for publication June 11, 2003.)

100 EIMER HOLMES AND MCGLONE

parallel and independent processes (Bruce amp Young1986) However to date this conclusion has been basedonly on a comparison of N170 components elicited in re-sponse to fearful as versus neutral faces obtained underconditions where facial expression was task irrelevantTo investigate whether the face-specific N170 is unaf-fected by any emotional facial expression even when ex-pression is task relevant we compared the N170 elicitedby emotional versus neutral faces in the emotion taskseparately for all six basic facial expressions Any sys-tematic emotional expression effects on the N170 com-ponent would challenge the hypothesis that the structuralencoding of faces is completely independent of facial ex-pression analysis

METHOD

ParticipantsFifteen participants participated in this study One had to be ex-

cluded because of excessive eye blinks so 14 participants (7 femaleand 7 male 18ndash54 years old average age 296 years) remained inthe sample One participant was left-handed all others right-handedby self-report The experiment was performed in compliance withrelevant institutional guidelines and was approved by the BirkbeckCollege School of Psychology ethics committee

StimuliThe face stimuli were photographs of faces of 10 different individ-

uals all taken from a standard set of pictures of facial affect (Ekmanamp Friesen 1976) Facial expression was angry disgusted fearfulhappy sad surprised or neutral resulting in a total of 70 differentface stimuli (see Figure 1 top panel for examples) All face stim-uli covered a visual angle of about 34ordm 3 24ordm Each display alsocontained a pair of gray vertical lines (01ordm width) and each linewas either short (04ordm) or slightly longer (05ordm) All stimuli were pre-sented on a computer screen in front of a black background A whitefixation cross was continuously present at the center of the screen

ProcedureParticipants were seated in a dimly lit sound-attenuated cabin

and a computer screen was placed at a viewing distance of 70 cmThe experiment consisted of 24 experimental blocks each contain-ing 80 trials On each trial two identical faces were presented to-gether with two line stimuli in front of a black background (Fig-ure 1 bottom) Faces were located 22 ordm to the left and right offixation (measured as the distance between the fixation cross andthe center of each face stimulus) and the bilateral lines were pre-sented close to the f ixation cross (04ordm eccentricity) All stimuliwere presented simultaneously for 300 msec and the interval be-tween two successive stimulus presentations was 2000 msec

In 12 successive blocks participants had to indicate with a left-hand or right-hand buttonpress whether the face pair presented onany given trial showed an emotional or neutral expression (emotiontask) The mapping of emotional valence to response hand wascounterbalanced across participants In 40 trials per block emo-tional faces were presented in the other randomly intermingled 40trials facial expression was neutral Long and short lines whichwere irrelevant in these blocks appeared randomly and with equalprobability to the left and right of f ixation Emotional expressionwas varied across blocks with angry disgusted fearful happy sadand surprised faces each shown in two blocks The order in whichthese blocks were presented was randomized for each participant

In the other 12 successive blocks participants were instructed todirect their attention to the pair of lines presented close to f ixation

and to indicate with a left-hand or right-hand buttonpress whetherthese lines differed in length or were identical (lines task) The map-ping of line length to response hand was counterbalanced acrossparticipants Again short and long lines appeared randomly andequiprobably on the left or right side Faces which were now taskirrelevant were emotional on 40 trials and neutral on the other 40trials with emotional expression varied across blocks (two blockseach with angry disgusted fearful happy sad and surprisedfaces) The order in which these blocks were presented was againrandomized for each participant

Seven participants performed the emotion task prior to the linestask and this order was reversed for the other 7 participants Par-ticipants were instructed to keep their gaze directed at the centralfixation cross throughout each block and to respond as fast and ac-curately as possible on each trial

ERP procedures and data analysisEEG was recorded with Ag-AgCl electrodes and linked-earlobe

reference from Fpz F7 F3 Fz F4 F8 FC5 FC6 T7 C3 Cz C4T8 CP5 CP6 T5 P3 Pz P4 T6 and Oz (according to the 10-20system) and from OL and OR (located halfway between O1 and P7and O2 and P8 respectively) Horizontal EOG (HEOG) was recordedbipolarly from the outer canthi of both eyes The impedance for allelectrodes was kept below 5 kW The amplifier bandpass was 01 to40 Hz and no additional filters were applied to the averaged dataEEG and EOG were sampled with a digitization rate of 200 Hz andstored on disk Reaction times (RTs) were measured on each trial

EEG and HEOG were epoched off-line into 800-msec periods, starting 100 msec prior to stimulus onset and ending 700 msec after stimulus onset. Trials with horizontal eye movements (HEOG exceeding ±30 μV), eyeblinks (Fpz exceeding ±60 μV), or other artifacts (a voltage exceeding ±80 μV at any electrode) measured after stimulus onset were excluded from analysis. The EEG obtained was averaged relative to a 100-msec baseline preceding stimulus onset. Only trials with correct behavioral responses were included in the averages. Separate averages were computed for the emotion task and the lines task, for all combinations of block type (experimental blocks including angry vs. disgusted vs. fearful vs. happy vs. sad vs. surprised faces) and valence (emotional vs. neutral faces), resulting in 24 average waveforms for each electrode and participant.
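A minimal NumPy sketch of these epoching, artifact rejection, baseline correction, and averaging steps is given below. It assumes continuous data already expressed in microvolts; the function and variable names are ours, and the original analysis software is not specified in the text.

```python
import numpy as np

FS = 200                      # digitization rate (Hz)
EPOCH = (-0.1, 0.7)           # epoch window relative to stimulus onset (sec)
# Rejection thresholds from the text (microvolts).
HEOG_MAX, BLINK_MAX, ARTIFACT_MAX = 30.0, 60.0, 80.0

def epoch_and_average(eeg, heog, onsets, correct, fpz_index):
    """Sketch of the epoching, artifact rejection, and averaging steps.

    eeg: (n_channels, n_samples) continuous EEG in microvolts
    heog: (n_samples,) bipolar horizontal EOG in microvolts
    onsets: stimulus-onset sample indices for the trials to be averaged
    correct: boolean sequence marking trials with correct responses
    fpz_index: row of the Fpz channel, used for blink detection
    """
    pre, post = int(-EPOCH[0] * FS), int(EPOCH[1] * FS)
    kept = []
    for onset, ok in zip(onsets, correct):
        if not ok:
            continue                              # correct responses only
        seg = eeg[:, onset - pre:onset + post]    # 800-msec epoch
        post_seg = seg[:, pre:]                   # artifact checks after stimulus onset
        if (np.abs(heog[onset:onset + post]).max() > HEOG_MAX
                or np.abs(post_seg[fpz_index]).max() > BLINK_MAX
                or np.abs(post_seg).max() > ARTIFACT_MAX):
            continue                              # reject eye movements, blinks, other artifacts
        baseline = seg[:, :pre].mean(axis=1, keepdims=True)
        kept.append(seg - baseline)               # 100-msec prestimulus baseline
    return np.mean(kept, axis=0)                  # average waveform per channel
```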

The first set of analyses was based on mean amplitudes obtained at lateral posterior electrodes T5 and T6 (where the N170 is maximal) within a time window centered on the mean latency of the face-specific posterior N170 component (160–200 msec poststimulus). Repeated measures analyses of variance (ANOVAs) were conducted for the factors task (emotion task vs. lines task), block type, and valence. Additional analyses were conducted separately for the emotion and the lines tasks. The second set of analyses was based on mean amplitude values computed within five successive poststimulus time windows (120–155 msec, 160–215 msec, 220–315 msec, 320–495 msec, and 500–700 msec), which covered the interval where systematic emotional expression effects were observed in our previous experiments (Eimer & Holmes, 2002; Holmes et al., 2003). Mean amplitude values were computed for frontal (F3, Fz, F4), central (C3, Cz, C4), parietal (P3, Pz, P4), lateral temporal (T5, T6), and lateral occipital sites (OL, OR). Again, ANOVAs were conducted for the factors task, block type, and valence, followed by further analyses conducted separately for ERPs obtained in the emotion task and the lines task.
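The dependent measures entering these ANOVAs can be illustrated with the sketch below, which computes the mean amplitude of an averaged waveform within one analysis window, pooled over one electrode grouping. Pooling across electrodes is a simplification on our part (the analyses may have treated electrode site as a separate factor), and the function and dictionary names are ours.

```python
import numpy as np

FS = 200
BASELINE_SAMPLES = 20   # epochs start 100 msec before stimulus onset

# Analysis windows (msec poststimulus) and electrode groupings from the text.
WINDOWS = {
    "120-155": (120, 155), "160-215": (160, 215), "220-315": (220, 315),
    "320-495": (320, 495), "500-700": (500, 700),
}
SITES = {
    "frontal": ["F3", "Fz", "F4"], "central": ["C3", "Cz", "C4"],
    "parietal": ["P3", "Pz", "P4"], "lateral temporal": ["T5", "T6"],
    "lateral occipital": ["OL", "OR"],
}

def mean_amplitude(average, channel_index, electrodes, window_ms):
    """Mean amplitude of a baseline-corrected averaged waveform, pooled over
    a set of electrodes, within one poststimulus window.

    average: (n_channels, n_samples) averaged ERP
    channel_index: dict mapping electrode labels to row numbers
    """
    start, end = window_ms
    a = BASELINE_SAMPLES + int(start * FS / 1000)
    b = BASELINE_SAMPLES + int(end * FS / 1000)
    rows = [channel_index[ch] for ch in electrodes]
    return average[rows, a:b + 1].mean()
```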

For keypress responses, repeated measures ANOVAs were performed on the latencies of correct responses and on error rates, separately for the emotion task and the lines task, for the factors block type and valence. In the analysis of behavioral performance in the lines task, the additional factor of target type (identical lines vs. different lines) was included. For all analyses, Greenhouse–Geisser adjustments to the degrees of freedom were performed when appropriate.
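As an illustration of the Greenhouse–Geisser adjustment mentioned above, the sketch below computes the standard ε̂ estimate for a one-way repeated measures factor. This is the generic textbook formula, not necessarily the routine used by the analysis software; the function name and data layout are ours.

```python
import numpy as np

def greenhouse_geisser_epsilon(data):
    """Greenhouse-Geisser epsilon for one repeated measures factor.

    data: (n_subjects, k_levels) array of condition means per participant.
    The corrected test evaluates F against df1 = eps * (k - 1) and
    df2 = eps * (k - 1) * (n - 1) instead of the nominal degrees of freedom.
    """
    n, k = data.shape
    S = np.cov(data, rowvar=False)          # k x k covariance of the k levels
    C = np.eye(k) - np.ones((k, k)) / k     # centering matrix
    S_c = C @ S @ C                         # double-centered covariance
    eps = np.trace(S_c) ** 2 / ((k - 1) * np.trace(S_c @ S_c))
    # epsilon is bounded between 1/(k-1) (maximal nonsphericity) and 1
    return float(np.clip(eps, 1.0 / (k - 1), 1.0))
```

For the six-level block type factor, for example, the ε values reported in the Results section would scale the nominal degrees of freedom (5, 65) accordingly.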


RESULTS

Behavioral Results
Participants failed to respond on less than 3% of all

trials. Correct responses were faster in the emotion task (622 msec) than in the lines task (695 msec), and this difference was significant [t(14) = 4.74, p < .001]. Figure 2 shows mean RTs (top panel) and the percentage of incorrect responses (bottom panel) obtained in the emotion task, displayed separately for the six different block types and for trials with emotional and neutral faces, respectively. For RTs, main effects of block type [F(5,65) = 21.4, p < .001, ε = .788] and of valence [F(1,13) = 19.0, p < .001] were present. RTs differed systematically between block types, being fastest in blocks including happy faces and slowest in blocks including sad faces. In addition,

responses were generally faster to emotional than to neutral faces. No interaction between block type and valence was obtained, indicating that this RT advantage for emotional faces was equivalent across all six block types.

For error rates, main effects of block type [F(5,65) = 13.7, p < .001, ε = .282] and valence [F(1,13) = 15.4, p < .002] were again present for the emotion task. As can be seen from Figure 2 (bottom panel), incorrect responses were most frequent in blocks including sad faces and least frequent in blocks including surprised faces. Also, it was more likely that emotional faces would be incorrectly classified as neutral than that neutral faces would be erroneously judged as emotional. No block type × valence interaction was present.

In the lines task, no main effects of block type or valence were obtained for RT or error rate (all Fs < 1),


Figure 2. Reaction times (top panel) and percentage of incorrect responses (bottom panel) to emotional and neutral faces in the emotion task, displayed separately for experimental blocks where neutral faces were intermixed with angry, disgusted, fearful, happy, sad, or surprised faces.


indicating that the emotional expression of task-irrelevant faces did not interfere with perceptual identification performance. Target type did not affect RT but had a significant effect on error rate [F(1,13) = 15.9, p < .002]. It was more likely that lines of different length would be classified as identical (23.2%) than that identical lines would be judged as different (10.2%).

Electrophysiological Results
Figure 3 shows ERPs obtained in the emotion task in

response to stimulus arrays containing either neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. Figure 4 shows corresponding ERP waveforms obtained in the lines task. A sustained positivity was elicited in response to arrays containing emotional faces in the emotion task. This emotional expression effect was first visible at frontocentral sites at about 180 msec poststimulus (overlapping with the P2 component) and appeared at parietal

electrodes around 300 msec poststimulus (Figure 3). At lateral temporal and occipital electrodes, emotional expression effects appeared at about 250 msec poststimulus as an enhanced negativity for emotional relative to neutral faces in the emotion task. In contrast, no systematic emotional expression effects were found for the lines task (Figure 4).

The difference between emotional and neutral faces appears to leave the face-specific N170 component at lateral temporal sites T5 and T6 entirely unaffected. This was observed not only in the lines task (Figure 4), but also in the emotion task (Figure 3), where facial expression was task relevant. These informal observations were substantiated by statistical analyses.

N170 component. In the N170 time range (160–200 msec poststimulus), N170 amplitudes elicited at T5 and T6 in response to neutral versus emotional faces showed neither a main effect of valence nor a task × valence interaction (both Fs < 1), demonstrating that the N170 is not modulated by facial expression (Figures 3

Figure 3. Grand-averaged ERP waveforms elicited in the emotion task in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines), collapsed across blocks including each of the six different emotional facial expressions.


and 4). To further ascertain that this component is unaffected by emotional expression even when expression is task relevant, we conducted additional analyses on N170 amplitudes observed in the emotion task (Figure 3). No main effect of valence (F < 1.2) or interaction between block type and valence (F < 1) was observed, indicating that the N170 was similarly insensitive to emotional facial expression for all six basic emotions employed here, even though participants had to discriminate between emotional and neutral faces in this task. This is illustrated in Figure 5, which displays ERPs in response to neutral and emotional faces elicited in the emotion task at right lateral temporal electrode T6, shown separately for each of the six facial expressions, which were presented in different blocks. No systematic differential effects of any facial expression on the N170 are apparent, and this was confirmed by additional planned paired comparisons of N170 amplitudes at T5 and T6 in

response to emotional versus neutral faces, conducted separately for all six basic emotions. None of these comparisons even approached statistical significance [all ts(13) < 1.5].

Emotional expression effects. No main effects of valence or task × valence interactions were observed in the 120- to 155-msec time window. In the 160- to 215-msec analysis window, a task × valence interaction was present at frontal sites [F(1,13) = 5.2, p < .05].¹ Main effects of valence were found at frontal and central sites [both Fs(1,14) > 9.1, both ps < .01] in the emotion task, reflecting an enhanced positivity elicited in response to arrays containing emotional faces (Figure 3). These effects were completely absent in the lines task (both Fs < 1). No interactions between block type and valence were found at frontal and central sites in the emotion task (both Fs < 1.1), demonstrating that this early emotional positivity was elicited in response to emotional versus neutral faces

Figure 4. Grand-averaged ERP waveforms elicited in the lines task in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). Data are collapsed across blocks including each of the six different emotional facial expressions, as well as across trials with identical and different line pairs.


irrespective of which of the six basic emotions was included in a given block. This fact is further illustrated in Figure 6, which shows ERPs in response to neutral and emotional faces elicited in the emotion task at Fz, displayed separately for all emotional expressions used in this experiment. Emotional expression effects were very similar across expressions and started at approximately the same time for all six basic emotions. No significant emotional expression effects were present between 160 and 215 msec poststimulus at parietal and occipital electrodes.

Between 220 and 315 msec poststimulus, task × valence interactions were present at frontal and central electrodes, as well as at lateral temporal and occipital sites [all Fs(1,13) > 7.2, all ps < .02], indicating that emotional expression affected ERPs in the emotion task but not in the lines task. At frontal and central sites, main effects of valence in the emotion task [both Fs(1,13) > 15.0, both ps < .02] reflected enhanced positivities for emotional relative to neutral faces (Figure 3). No block type × valence interactions were present (both Fs < 1), demonstrating that this effect was elicited in similar fashion for all six basic emotions (Figure 6). Again, no frontocentral emotional expression effects were observed in the lines task (both Fs < 1.6). At lateral temporal and occipital sites, an enhanced negativity was observed in the 220- to 315-msec latency window for emotional relative to neutral faces in the emotion task [both Fs(1,13) > 6.1, both ps < .03], but not in the lines task (both Fs < 1). Again, no block type × valence interactions were

present for the emotion task (both Fs < 1.6), indicating that this lateral posterior emotional negativity was elicited in response to all six basic emotions (Figure 5).

In the final two analysis windows (320–495 msec and 500–700 msec poststimulus, respectively), highly significant task × valence interactions were present at frontal, central, and parietal electrodes [all Fs(1,13) > 10.0, all ps < .01], again reflecting the presence of emotional expression effects in the emotion task (Figure 3) and their absence in the lines task (Figure 4). Main effects of valence at frontal and central, as well as at parietal, electrodes in the emotion task [all Fs(1,13) > 11.9, all ps < .01], without any significant interactions between valence and block type, demonstrated that enhanced positivities for emotional faces were elicited at these sites in a similar fashion for all six basic emotions (Figure 6). Again, effects of valence were entirely absent in the lines task.²

DISCUSSION

The primary aim of the present ERP experiment was to extend previous findings (Holmes et al., 2003) that the detection and processing of emotional information delivered by facial expressions require focal attention. We recorded ERPs to stimulus arrays containing emotional or neutral bilateral faces under conditions when facial expression was task relevant and therefore attended (emotion task), or when attention was actively directed away from these faces toward a demanding perceptual judgment

Figure 5. Grand-averaged ERP waveforms elicited in the emotion task at right lateral temporal electrode T6 in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.


(lines task). In our previous ERP study (Holmes et al., 2003), spatial attention was manipulated on a trial-by-trial basis by precues presented at the start of each trial, facial expression was not task relevant (participants had to detect infrequent identical stimulus pairs, regardless of expression), and only one emotional expression (fear) was tested. In the present experiment, a sustained attention paradigm was employed (with emotion and lines tasks delivered in separate experimental halves), facial expression was task relevant in the emotion task, and, most important, all six basic facial emotional expressions were included in different blocks.

ERP correlates of emotional facial expression processing were identified by comparing ERPs elicited on trials with emotional faces with ERPs in response to neutral faces. This was done separately for the emotion task and the lines task, and for blocks including angry, disgusted, fearful, happy, sad, and surprised faces. In the emotion task, where attention was directed toward task-relevant facial expressions, an enhanced positivity for emotional relative to neutral faces was elicited, similar to previous observations from studies comparing ERP responses to fearful versus neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003). This emotional expression effect started at about 160 msec poststimulus and was initially distributed frontocentrally, whereas a more broadly distributed positivity was observed beyond 300 msec (Figure 3). In addition, an enhanced negativity for emotional relative to neutral faces was elicited at lateral

posterior electrodes between 220 and 320 msec poststimulus.

The onset of the early frontocentral emotional expression effect was slightly later in the present experiment than in our previous experiment (Holmes et al., 2003), where significant frontal differences between ERPs to fearful and neutral faces were already present at about 120 msec poststimulus. In the present study, vertical lines were presented close to fixation simultaneously with the bilateral faces, whereas no such stimuli were included in our earlier experiment. The presence of these additional central events may have slightly delayed the onset of early emotional expression effects. It should also be noted that an attenuation of amygdala responses to emotional facial expressions has been observed when the demand for explicit emotion recognition was increased (Critchley et al., 2000; Hariri, Bookheimer, & Mazziotta, 2000). It is possible that the demand for explicit emotion recognition in the emotion task contributed to the delayed onset of the early emotional expression effect.

In marked contrast to these ERP results obtained in the emotion task, emotional expression effects were entirely absent in the lines task (Figure 4), demonstrating that ERP correlates of facial expression processing are strongly dependent on spatial attention. With sustained spatial attention directed away from face stimuli toward another demanding perceptual task, the presence of emotional versus neutral faces had no effect whatsoever on

Figure 6. Grand-averaged ERP waveforms elicited in the emotion task at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.


ERP waveforms. That is, emotional expression effects were completely eliminated for all six basic emotions included in this experiment. In line with this ERP result, performance in the lines task was entirely unaffected by the expression of the faces presented simultaneously with the task-relevant line pairs. Overall, these findings extend and confirm the observations of our previous ERP experiment, which compared ERPs in response to fearful versus neutral faces (Holmes et al., 2003). Clearly, these results challenge the hypothesis that the detection and/or processing of emotional facial expression occurs preattentively. If this were the case, at least some systematic ERP differences should have been elicited in response to emotional versus neutral faces in the lines task, reflecting the automatic detection of emotionally significant events.

Covert attention toward emotional faces under conditions when they were task relevant may have enhanced their visual–perceptual representation (e.g., Carrasco, Penpeci-Talgar, & Eckstein, 2000), thereby enabling the extraction of features relating to the affective valence of these faces, and thus their subsequent encoding and analysis (as reflected by the emotion-specific ERP effects observed in the emotion task). The early, frontocentrally distributed emotional expression effects may be mediated by connections from the superior temporal sulcus (STS) and amygdala to orbitofrontal cortex (Rolls, 1999). The STS has been implicated in the early discrimination of visual features relating to emotional facial expressions (e.g., Sprengelmeyer et al., 1998). In addition, efferent feedback projections from the amygdala and related structures (see Lang et al., 1998; Morris et al., 1998) may have produced the more broadly distributed emotional expression effects observed in the present experiment at longer latencies.

One could argue that the absence of emotional expression effects under conditions where faces were unattended may have been due to the fact that the presentation of specific emotional expressions was blocked and that each expression was presented repeatedly in two separate blocks. Repeated exposure to a specific emotional expression may have resulted in a gradual habituation of emotion-specific responses, thus potentially attenuating any emotional expression effects that may have been present in the lines task. To investigate this possibility, we computed separate averages for the first block and for the second block including angry, disgusted, fearful, happy, sad, or surprised faces, separately for the emotion and for the lines task. These data were then analyzed with the additional factor of block position (first vs. second block containing a specific emotional facial expression). If emotional expression effects were subject to habituation, one would expect to find larger emotional expression effects for the first relative to the second block in the emotion task, and potentially also a residual emotional expression effect for the first block in the lines task.
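A sketch of this block-position split is given below. The epoch dictionary layout is our own construction, not the authors' data format; it simply illustrates how averages can be formed separately for the first and the second block containing each expression.

```python
from collections import defaultdict
import numpy as np

def averages_by_block_position(epochs):
    """For each task, expression, valence, and block position (1 or 2),
    average the corresponding epochs.

    `epochs` is assumed to be a list of dicts with keys 'task', 'expression',
    'valence', 'block_position', and 'data' (a channels x samples array).
    """
    bins = defaultdict(list)
    for ep in epochs:
        key = (ep["task"], ep["expression"], ep["valence"], ep["block_position"])
        bins[key].append(ep["data"])
    return {key: np.mean(stack, axis=0) for key, stack in bins.items()}
```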

Figure 7 shows ERPs elicited at Fz in response to neutral faces (solid lines) or emotional faces (dashed lines),

collapsed across all six different emotional expressions. ERPs are displayed separately for the emotion task (top panel) and the lines task (bottom panel), and for the first block (left) or second block (right) including one of the six emotional expressions. As can be seen from Figure 7 (top), there was no evidence whatsoever for any habituation of emotional expression effects as a function of block position in the emotion task. This was confirmed by the absence of any block position × valence or block position × block type × valence interactions for all latency windows employed in the analyses reported above [all Fs(1,13) < 1]. Along similar lines, Figure 7 (bottom panel) suggests that there was no residual emotional expression effect for the first block including a specific emotional expression in the lines task. This was confirmed by the absence of any interactions involving block position [all Fs(1,13) < 1.6]. Thus, the fact that emotional expression effects were absent in response to unattended faces in the lines task is unlikely to have been the result of a habituation of emotion-specific brain responses.

The conclusion that the processing of emotional facial expression, as reflected by ERP facial expression effects, is gated by spatial attention appears to be inconsistent with neuroimaging studies demonstrating that fearful faces result in amygdala activations even when these faces are outside the focus of attention (Vuilleumier et al., 2001; see also Morris et al., 1996; Whalen et al., 1998). However, it is extremely unlikely that the ERP effects observed in the present study are directly linked to amygdala activations. Due to its nuclear structure of clustered neurones, the amygdala is electrically closed and thus largely inaccessible to ERP measures. The early emotional expression effects observed in response to attended faces are more likely to be generated in prefrontal cortex, where emotion-specific single-cell responses have recently been recorded at short latencies (Kawasaki et al., 2001). Such prefrontal responses may reflect stages in emotional processing that could be contingent upon, but functionally separate from, prior amygdala activations (see Le Doux, 1996; Rolls, 1999). It is possible that amygdala responses can be triggered by unattended emotional stimuli (although these responses may be attenuated), whereas subsequent neocortical stages of emotional processing (as reflected by the ERP effects observed in the present experiment) are fully dependent on focal attention. An alternative possibility is that amygdala responses to emotional stimuli may also require attention (see Pessoa, Kastner, & Ungerleider, 2002; Pessoa, McKenna, et al., 2002) and that the elimination of emotional expression effects in the lines task reflects an earlier attentional gating of such subcortical processing.

Another important new finding of the present experiment was that the onset, time course, and scalp distribution of emotional expression effects obtained in the emotion task were remarkably similar for all six basic facial expressions used here (Figures 5 and 6). The absence of any differential ERP responses to different emotional expressions was reflected by the absence of any significant


interactions between block type (blocks with angry, disgusted, fearful, happy, sad, or surprised faces) and valence (emotional vs. neutral expression). In line with these observations, the size of the RT advantage for emotional relative to neutral faces in the emotion task was similar for all six emotional facial expressions (Figure 2, top panel). The similarity in the time course of emotional expression effects across all six emotional expressions observed here suggests that emotionally relevant information delivered by facial expression is available to neocortical processes within less than 200 msec after stimulus onset, and at approximately the same time for all basic emotional expressions.

These observations do not seem to support the idea, suggested by recent fMRI results, that distinct neural subsystems specialize in the processing of specific emotions (Adolphs, 2002). If this were the case, one might have expected some systematic differences between ERP emotional expression effects elicited by different facial

expressions. However, it should be noted that, although some neuroimaging data show emotion-specific differential activation of brain regions such as the amygdala or insula, few studies point to differential activation within surface cortical structures (where the ERP effects observed in the present experiments are likely to be generated; see also Pizzagalli et al., 1999, and Sato et al., 2001, for related results from recent ERP studies).

Thus, one could argue that early stages in the processing of emotionally relevant information, subserved by limbic structures or the basal ganglia, and subsequent neocortical emotional processing stages differ not only in their dependence on focal attention (see above), but also in their specificity. Early processes may be differentially engaged by specific emotional expressions, thus providing a rapid classification of emotionally significant events. Data in support of this view come from single-unit recordings, which reveal a rapid emergence of differential effects to emotional expressions in the human

Figure 7. Grand-averaged ERP waveforms elicited in the emotion task (top panel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are collapsed across blocks including each of the six different emotional facial expressions and are shown separately for the first block (left) and the second block (right) including one specific emotional expression.


amygdala (Liu et al., 1999). Conversely, later stages might be involved in the in-depth processing of various kinds of affective information and thus would be much less selective with respect to different facial expressions.

This suggestion is consistent with some recent evidence that subcortical and neocortical routes for visual processing are involved differentially in emotional expression analysis. A subcortical magnocellular pathway to the amygdala would appear to support valence discrimination processes, whereas parvocellular subsystems of ventral visual cortices may be preferentially involved in emotional intensity evaluation, irrespective of emotional valence (Schyns & Oliva, 1999; Vuilleumier, Armony, Driver, & Dolan, 2003). Recent neuroimaging results (Vuilleumier et al., 2003) suggest that low and high spatial frequency components of fearful faces selectively drive amygdala and visual cortical responses, respectively. However, although enhanced amygdala activation was found in response to low-spatial-frequency fearful face stimuli, explicit judgments relating to the perceived intensity of fearfulness were increased by the presence of high-spatial-frequency cues. These results support the view that coarse visual information may be directed via magnocellular channels from the retina to the amygdala through a tectopulvinar pathway (e.g., Bisti & Sireteanu, 1976; Jones & Burton, 1976), enabling the fast appraisal of the affective significance of a stimulus (e.g., Morris, Öhman, & Dolan, 1999).³

Another aim of the present study was to investigate whether the face-specific N170 component, which is assumed to reflect the structural encoding of faces, is sensitive to emotional facial expressions. In previous ERP studies, which have not found any modulations of the N170 elicited by fearful relative to neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003), facial expression was always task irrelevant. In contrast, participants' responses were contingent upon facial expression in the present emotion task. In spite of this fact, the N170 was found to be completely unaffected by facial expressions in the emotion task, and this was consistently the case for all six emotional expressions used in the present study (Figure 5).

In line with earlier findings from depth electrodes (McCarthy, Puce, Belger, & Allison, 1999), this pattern of results now demonstrates comprehensively that the structural encoding of faces, as reflected by the N170, is entirely insensitive to information derived from emotional facial expression. Thus, the rapid detection of emotional facial expression appears to occur independently of, and in parallel with, the construction of a detailed perceptual representation of a face. The absence of systematic early emotional expression effects at posterior sites and the presence of such ERP effects at frontocentral electrodes at about 160 msec poststimulus suggest that higher order visual processing stages involved in face processing are affected by emotional facial expression only after this information has been processed in prefrontal cortex. This is consistent with the face processing model proposed by Bruce and Young (1986), in which the extraction of

perceptual information for emotional expression processing occurs independently of, and simultaneously with, structural encoding for face recognition.

In summary, the present ERP results demonstrate that the neocortical processing of emotional facial expression is strongly dependent on focal attention. When faces were attended, systematic emotional expression effects were elicited by emotional relative to neutral faces, and these effects were strikingly similar in terms of their timing and morphology for all six basic facial expressions. In contrast, when attention was actively directed away from these faces, emotional expression effects were completely eliminated. The rapid and automatic encoding of emotionally significant events occurring outside the focus of attention may be adaptively advantageous, because it prepares the organism for fight or flight through subcortically mediated autonomic activation (e.g., Öhman, Flykt, & Lundqvist, 2000). However, it is equally important that irrelevant affective stimuli do not continuously divert attention. This suggests a division of labor between limbic structures involved in the obligatory detection of emotional information, preparing the organism for rapid action (Morris et al., 1999; Whalen et al., 1998), and subsequent neocortical emotional processing stages. Limbic structures may be responsible for establishing a readiness to respond to any environmental threat that could become the focus of attention, presumably through heightened autonomic activation. However, neocortical stages appear to be protected by efficient attentional gating mechanisms, which reduce distractibility by emotional stimuli, so that ongoing goals and plans can be accomplished without interference from irrelevant events.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral & Cognitive Neuroscience Reviews, 1, 21-61.
Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.
Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fascicularis). Journal of Comparative Neurology, 230, 465-496.
Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.
Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 7, 817-826.
Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.
Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.
Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.
Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.
Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.
Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.
Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.
Critchley, H. [D.], Daly, E. [M.], Phillips, M., Brammer, M., Bullmore, E., Williams, S. [C.], van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. [G. M.] (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.
Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.
Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.
Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.
Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.
Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.
Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.
Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.
Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.
Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.
Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.
Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.
Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.
Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.
Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.
Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.
Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.
Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.
McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.
Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.
Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.
Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.
Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.
Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.
Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.
Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.
Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.
Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.
Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.
Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.
Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.
Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.
Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.
Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12, 709-714.
Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: When categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69, 243-265.
Sprengelmeyer, R., Rausch, M., Eysel, U. T., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society of London: Series B, 265, 1927-1931.
Stepniewska, I., Qi, H. X., & Kaas, J. H. (1999). Do superior colliculus projection zones in the inferior pulvinar project to MT in primates? European Journal of Neuroscience, 11, 469-480.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624-631.
Vuilleumier, P., & Schwartz, S. (2001a). Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect. NeuroReport, 12, 1119-1122.
Vuilleumier, P., & Schwartz, S. (2001b). Emotional expressions capture attention. Neurology, 56, 153-158.
Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.
Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1, 70-83.
Yeterian, E. H., & Pandya, D. N. (1991). Corticothalamic connections of the superior temporal sulcus in rhesus monkeys. Experimental Brain Research, 83, 268-284.

NOTES

1. In spite of the fact that significant valence effects were present at central electrodes in the emotion task but were absent in the lines task, this interaction failed to reach significance at central sites.

2. At lateral occipital electrodes, a significantly enhanced positivity for emotional relative to neutral faces was present between 320 and 495 msec in the emotion task [F(1,13) = 6.2, p < .03], but not in the lines task, and this was reflected in a nearly significant task × valence interaction [F(1,13) = 4.6, p < .06].

3. It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking, since the medial pulvinar, which projects to the amygdala, does not receive a significant direct input from the superior colliculus (e.g., Stepniewska, Qi, & Kaas, 1999). However, possible connections between the inferior pulvinar (which receives visual inputs from the superior colliculus) and the medial nucleus may support the transmission of information to the amygdala through a colliculo-pulvinar route. Alternatively, cortical input may be involved, since the STS (implicated in facial expression processing; Sprengelmeyer et al., 1998) is known to project to the medial pulvinar (Yeterian & Pandya, 1991). Our thanks to an anonymous reviewer for raising this important point.

(Manuscript received December 10, 2002; revision accepted for publication June 11, 2003.)

ATTENTION AND FACIAL EXPRESSION PROCESSING 101

RESULTS

Behavioral ResultsParticipants failed to respond on less than 3 of all tri-

als Correct responses were faster in the emotion task(622 msec) than in the lines task (695 msec) and this dif-ference was significant [t(14) = 474 p lt 001] Figure 2shows mean RTs (top panel) and the percentage of incor-rect responses (bottom panel) obtained in the emotiontask displayed separately for the six different block typesand for trials with emotional and neutral faces respec-tively For RTs main effects of block type [F(565) = 214p lt 001 e = 788] and of valence [F(113) = 190 p lt001] were present RTs differed systematically betweenblock types being fastest in blocks including happy facesand slowest in blocks including sad faces In addition re-

sponses were generally faster to emotional than to neutralfaces No interaction between block type and valence wasobtained indicating that this RT advantage for emotionalfaces was equivalent across all six block types

For error rates main effects of block type [F(565) =137 p lt 001 e = 282] and valence [F(113) = 154 p lt002] were again present for the emotion task As can beseen from Figure 2 (bottom panel) incorrect responseswere most frequent in blocks including sad faces andleast frequent in blocks including surprised faces Alsoit was more likely that emotional faces would be incor-rectly classified as neutral than that neutral faces wouldbe erroneously judged as emotional No block type 3 va-lence interaction was present

In the lines task no main effects of block type or va-lence were obtained for RT or error rate (all Fs lt 1) in-

Emotional FacesNeutral Faces

Emotional FacesNeutral Faces

Anger Disgust Fear Happiness Sadness Surprise

Anger Disgust Fear Happiness Sadness Surprise

RT

(mse

c)E

rror

rate

s (

)

750

700

650

600

550

500

30

20

10

0

Figure 2 Reaction times (top panel) and percentage of incorrect responses(bottom panel) to emotional and neutral faces in the emotion task displayedseparately for experimental blocks where neutral faces were intermixed withangry disgusted fearful happy sad or surprised faces

102 EIMER HOLMES AND MCGLONE

dicating that the emotional expression of task-irrelevantfaces did not interfere with perceptual identification per-formance Target type did not affect RT but had a signif-icant effect on error rate [F(113) = 159 p lt 002] Itwas more likely that lines of different length would beclassified as identical (232) than that identical lineswould be judged as different (102)

Electrophysiological ResultsFigure 3 shows ERPs obtained in the emotion task in

response to stimulus arrays containing either neutral faces(solid lines) or emotional faces (dashed lines) collapsedacross all six different emotional expressions Figure 4shows corresponding ERP waveforms obtained in the linestask A sustained positivity was elicited in response toarrays containing emotional faces in the emotion taskThis emotional expression effect was first visible at fronto-central sites at about 180 msec poststimulus (overlappingwith the P2 component) and appeared at parietal elec-

trodes around 300 msec poststimulus (Figure 3) At lateraltemporal and occipital electrodes emotional expressioneffects appeared at about 250 msec poststimulus as an en-hanced negativity for emotional relative to neutral facesin the emotion task In contrast no systematic emotionalexpression effects were found for the lines task (Figure 4)

The difference between emotional and neutral facesappears to leave the face-specific N170 component atlateral temporal sites T5 and T6 entirely unaffected Thiswas observed not only in the lines task (Figure 4) butalso in the emotion task (Figure 3) where facial expres-sion was task relevant These informal observations weresubstantiated by statistical analyses

N170 component In the N170 time range (160ndash200 msec poststimulus) N170 amplitudes elicited at T5and T6 in response to neutral versus emotional facesshowed neither a main effect of valence nor a task 3 va-lence interaction (both Fs lt 1) demonstrating that theN170 is not modulated by facial expression (Figures 3

Figure 3 Grand-averaged ERP waveforms elicited in the emotion task in the 700-msec in-terval following stimulus onset in response to stimulus arrays containing neutral faces (solidlines) or emotional faces (dashed lines) collapsed across blocks including each of the six dif-ferent emotional facial expressions

ATTENTION AND FACIAL EXPRESSION PROCESSING 103

and 4) To further ascertain that this component is unaf-fected by emotional expression even when expression istask relevant we conducted additional analyses on N170amplitudes observed in the emotion task (Figure 3) Nomain effect of valence (F lt 12) or interaction betweenblock type and valence (F lt 1) was observed indicatingthat the N170 was similarly insensitive to emotional fa-cial expression for all six basic emotions employed hereeven though participants had to discriminate betweenemotional and neutral faces in this task This is illus-trated in Figure 5 which displays ERPs in response toneutral and emotional faces elicited in the emotion taskat right lateral temporal electrode T6 shown separatelyfor each of the six facial expressions which were pre-sented in different blocks No systematic differential ef-fects of any facial expression on the N170 are apparentand this was confirmed by additional planned pairedcomparisons of N170 amplitudes at T5 and T6 in re-

sponse to emotional versus neutral faces conducted sep-arately for all six basic emotions None of these com-parisons even approached statistical significance [allts(13) lt 15]

Emotional expression effects No main effects of va-lence or task 3 valence interactions were observed in the120- to 155-msec time window In the 160- to 215-msecanalysis window a task 3 valence interaction was pres-ent at frontal sites [F(113) = 52 p lt 05]1 Main effectsof valence were found at frontal and central sites [bothFs(114) gt 91 both ps lt 01] in the emotion task reflect-ing an enhanced positivity elicited in response to arrayscontaining emotional faces (Figure 3) These effects werecompletely absent in the lines task (both Fs lt 1) No in-teractions between block type and valence were found atfrontal and central sites in the emotion task (both Fs lt11) demonstrating that this early emotional positivitywas elicited in response to emotional versus neutral faces

Figure 4 Grand-averaged ERP waveforms elicited in the lines task in the 700-msec inter-val following stimulus onset in response to stimulus arrays containing neutral faces (solidlines) or emotional faces (dashed lines) Data are collapsed across blocks including each ofthe six different emotional facial expressions as well as across trials with identical and dif-ferent line pairs

104 EIMER HOLMES AND MCGLONE

irrespective of which of the six basic emotions was in-cluded in a given block This fact is further illustrated inFigure 6 which shows ERPs in response to neutral andemotional faces elicited in the emotion task at Fz dis-played separately for all emotional expressions used inthis experiment Emotional expression effects were verysimilar across expressions and started at approximatelythe same time for all six basic emotions No significantemotional expression effects were present between 160and 215 msec poststimulus at parietal and occipital elec-trodes

Between 220 and 315 msec poststimulus task 3 va-lence interactions were present at frontal and centralelectrodes as well as at lateral temporal and occipitalsites [all Fs(113) gt 72 all ps lt 02] indicating thatemotional expression affected ERPs in the emotion taskbut not in the lines task At frontal and central sites maineffects of valence in the emotion task [both Fs(113) gt150 both ps lt 02] reflected enhanced positivities foremotional relative to neutral faces (Figure 3) No blocktype 3 valence interactions were present (both Fs lt 1)demonstrating that this effect was elicited in similarfashion for all six basic emotions (Figure 6) Again nofrontocentral emotional expression effects were observedin the lines task (both Fs lt 16) At lateral temporal andoccipital sites an enhanced negativity was observed inthe 220- to 315-msec latency window for emotional rel-ative to neutral faces in the emotion task [both Fs(113) gt61 both ps lt 03] but not in the lines task (both Fs lt 1)Again no block type 3 valence interactions were pres-

ent for the emotion task (both Fs lt 16) indicating thatthis lateral posterior emotional negativity was elicited inresponse to all six basic emotions (Figure 5)

In the final two analysis windows (320ndash495 msec and500ndash700 msec poststimulus respectively) highly sig-nificant task 3 valence interactions were present atfrontal central and parietal electrodes [all Fs(113) gt100 all ps lt 01] again reflecting the presence of emo-tional expression effects in the emotion task (Figure 3)and their absence in the lines task (Figure 4) Main ef-fects of valence at frontal and central as well as at pari-etal electrodes in the emotion task [all Fs(113) gt 119all ps lt 01] without any significant interactions betweenvalence and block type demonstrated that enhanced pos-itivities for emotional faces were elicited at these sites ina similar fashion for all six basic emotions (Figure 6)Again effects of valence were entirely absent in the linestask2

DISCUSSION

The primary aim of the present ERP experiment wasto extend previous findings (Holmes et al 2003) that thedetection and processing of emotional information de-livered by facial expressions requires focal attention Werecorded ERPs to stimulus arrays containing emotionalor neutral bilateral faces under conditions when facialexpression was task relevant and therefore attended (emo-tion task) or when attention was actively directed awayfrom these faces toward a demanding perceptual judgment

Figure 5 Grand-averaged ERP waveforms elicited in the emotion task at right lateral tem-poral electrode T6 in the 700-msec interval following stimulus onset in response to stimulusarrays containing neutral faces (solid lines) or emotional faces (dashed lines) ERPs are shownseparately for blocks containing angry disgusted fearful happy sad or surprised faces

ATTENTION AND FACIAL EXPRESSION PROCESSING 105

(lines task) In our previous ERP study (Holmes et al2003) spatial attention was manipulated on a trial-by-trial basis by precues presented at the start of each trialfacial expression was not task relevant (participants hadto detect infrequent identical stimulus pairs regardlessof expression) and only one emotional expression (fear)was tested In the present experiment a sustained attentionparadigm was employed (with emotion and lines tasksdelivered in separate experimental halves) facial expres-sion was task relevant in the emotion task and most im-portant all six basic facial emotional expressions wereincluded in different blocks

ERP correlates of emotional facial expression pro-cessing were identified by comparing ERPs elicited ontrials with emotional faces with ERPs in response to neu-tral faces This was done separately for the emotion taskand the lines task and for blocks including angry dis-gusted fearful happy sad and surprised faces In theemotion task where attention was directed toward task-relevant facial expressions an enhanced positivity foremotional relative to neutral faces was elicited similar toprevious observations from studies comparing ERP re-sponses to fearful versus neutral faces (Eimer amp Holmes2002 Holmes et al 2003) This emotional expressioneffect started at about 160 msec poststimulus and wasinitially distributed frontocentrally whereas a morebroadly distributed positivity was observed beyond300 msec (Figure 3) In addition an enhanced negativityfor fearful relative to neutral faces was elicited at lateral

posterior electrodes between 220 and 320 msec post-stimulus

The onset of the early frontocentral emotional expres-sion effect was slightly later in the present experimentthan in our previous experiment (Holmes et al 2003)where significant frontal differences between ERPs tofearful and neutral faces were already present at about120 msec poststimulus In the present study verticallines were presented close to fixation simultaneouslywith the bilateral faces whereas no such stimuli were in-cluded in our earlier experiment The presence of theseadditional central events may have slightly delayed theonset of early emotional expression effects It shouldalso be noted that an attenuation of amygdala responsesto emotional facial expressions has been observed whenthe demand for explicit emotion recognition was in-creased (Critchley et al 2000 Hariri Bookheimer ampMazziotta 2000) It is possible that the demand for ex-plicit emotion recognition in the emotion task con-tributed to the delayed onset of the early emotional ex-pression effect

In marked contrast to these ERP results obtained inthe emotion task emotional expression effects were en-tirely absent in the lines task (Figure 4) demonstratingthat ERP correlates of facial expression processing arestrongly dependent on spatial attention With sustainedspatial attention directed away from face stimuli towardanother demanding perceptual task the presence of emo-tional versus neutral faces had no effect whatsoever on

Figure 6 Grand-averaged ERP waveforms elicited in the emotion task at midline electrodeFz in the 700-msec interval following stimulus onset in response to stimulus arrays contain-ing neutral faces (solid lines) or emotional faces (dashed lines) ERPs are shown separatelyfor blocks containing angry disgusted fearful happy sad or surprised faces

106 EIMER HOLMES AND MCGLONE

ERP waveforms That is emotional expression effectswere completely eliminated for all six basic emotions in-cluded in this experiment In line with this ERP resultperformance in the lines task was entirely unaffected bythe expression of the faces presented simultaneouslywith the task-relevant line pairs Overall these findingsextend and confirm the observations of our previousERP experiment which compared ERPs in response tofearful versus neutral faces (Holmes et al 2003) Clearlythese results challenge the hypothesis that the detectionandor processing of emotional facial expression occurspreattentively If this were the case at least some system-atic ERP differences should have been elicited in responseto emotional versus neutral faces in the lines task reflect-ing the automatic detection of emotionally significantevents

Covert attention toward emotional faces under condi-tions when they were task relevant may have enhancedtheir visualndashperceptual representation (eg CarrascoPenpeci-Talgar amp Eckstein 2000) thereby enabling theextraction of features relating to the affective valence ofthese faces and thus their subsequent encoding andanalysis (as reflected by the emotion-specific ERP effectsobserved in the emotion task) The early frontocentrallydistributed emotional expression effects may be mediatedby connections from the superior temporal sulcus (STS)and amygdala to orbitofrontal cortex (Rolls 1999) TheSTS has been implicated in the early discrimination ofvisual features relating to emotional facial expressions(eg Sprengelmeyer et al 1998) In addition efferentfeedback projections from the amygdala and relatedstructures (see Lang et al 1998 Morris et al 1998)may have produced the more broadly distributed emo-tional expression effects observed in the present experi-ment at longer latencies

One could argue that the absence of emotional expression effects under conditions where faces were unattended may have been due to the fact that the presentation of specific emotional expressions was blocked, and that each expression was presented repeatedly in two separate blocks. Repeated exposure to a specific emotional expression may have resulted in a gradual habituation of emotion-specific responses, thus potentially attenuating any emotional expression effects that may have been present in the lines task. To investigate this possibility, we computed separate averages for the first block and for the second block including angry, disgusted, fearful, happy, sad, or surprised faces, separately for the emotion and for the lines task. These data were then analyzed with the additional factor of block position (first vs. second block containing a specific emotional facial expression). If emotional expression effects were subject to habituation, one would expect to find larger emotional expression effects for the first relative to the second block in the emotion task, and potentially also a residual emotional expression effect for the first block in the lines task.

Figure 7 shows ERPs elicited at Fz in response to neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. ERPs are displayed separately for the emotion task (top panel) and the lines task (bottom panel), and for the first block (left) or second block (right) including one of the six emotional expressions. As can be seen from Figure 7 (top), there was no evidence whatsoever for any habituation of emotional expression effects as a function of block position in the emotion task. This was confirmed by the absence of any block position × valence or block position × block type × valence interactions for all latency windows employed in the analyses reported above [all Fs(1,13) < 1]. Along similar lines, Figure 7 (bottom panel) suggests that there was no residual emotional expression effect for the first block including a specific emotional expression in the lines task. This was confirmed by the absence of any interactions involving block position [all Fs(1,13) < 1.6]. Thus, the fact that emotional expression effects were absent in response to unattended faces in the lines task is unlikely to have been the result of a habituation of emotion-specific brain responses.
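
To make the structure of this control analysis concrete, the following sketch (not part of the original study) shows how such condition means could be submitted to a block position × block type × valence repeated measures ANOVA in Python with statsmodels; the file name and column labels are hypothetical.

    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Hypothetical long-format table: one row per participant and condition cell,
    # containing a precomputed mean ERP amplitude (e.g., at Fz, 160-215 msec).
    # Assumed columns: subject, block_position (first/second), block_type
    # (one of the six emotions), valence (emotional/neutral), mean_amp.
    df = pd.read_csv("mean_amplitudes.csv")

    # Repeated measures ANOVA with three within-subjects factors; habituation
    # would surface as block_position x valence (or higher-order) interactions.
    anova = AnovaRM(
        data=df,
        depvar="mean_amp",
        subject="subject",
        within=["block_position", "block_type", "valence"],
    ).fit()
    print(anova)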

The conclusion that the processing of emotional facial expression, as reflected by ERP facial expression effects, is gated by spatial attention appears to be inconsistent with neuroimaging studies demonstrating that fearful faces result in amygdala activations even when these faces are outside the focus of attention (Vuilleumier et al., 2001; see also Morris et al., 1996; Whalen et al., 1998). However, it is extremely unlikely that the ERP effects observed in the present study are directly linked to amygdala activations. Due to its nuclear structure of clustered neurones, the amygdala is electrically closed and thus largely inaccessible to ERP measures. The early emotional expression effects observed in response to attended faces are more likely to be generated in prefrontal cortex, where emotion-specific single-cell responses have recently been recorded at short latencies (Kawasaki et al., 2001). Such prefrontal responses may reflect stages in emotional processing that could be contingent upon, but functionally separate from, prior amygdala activations (see Le Doux, 1996; Rolls, 1999). It is possible that amygdala responses can be triggered by unattended emotional stimuli (although these responses may be attenuated), whereas subsequent neocortical stages of emotional processing (as reflected by the ERP effects observed in the present experiment) are fully dependent on focal attention. An alternative possibility is that amygdala responses to emotional stimuli may also require attention (see Pessoa, Kastner, & Ungerleider, 2002; Pessoa, McKenna, et al., 2002), and that the elimination of emotional expression effects in the lines task reflects an earlier attentional gating of such subcortical processing.

Another important new finding of the present experiment was that the onset, time course, and scalp distribution of emotional expression effects obtained in the emotion task were remarkably similar for all six basic facial expressions used here (Figures 5 and 6). The absence of any differential ERP responses to different emotional expressions was reflected by the absence of any significant interactions between block type (blocks with angry, disgusted, fearful, happy, sad, or surprised faces) and valence (emotional vs. neutral expression). In line with these observations, the size of the RT advantage for emotional relative to neutral faces in the emotion task was similar for all six emotional facial expressions (Figure 2, top panel). The similarity in the time course of emotional expression effects across all six emotional expressions observed here suggests that emotionally relevant information delivered by facial expression is available to neocortical processes within less than 200 msec after stimulus onset, and at approximately the same time for all basic emotional expressions.

These observations do not seem to support the idea, suggested by recent fMRI results, that distinct neural subsystems specialize in the processing of specific emotions (Adolphs, 2002). If this were the case, one might have expected some systematic differences between ERP emotional expression effects elicited by different facial expressions. However, it should be noted that although some neuroimaging data show emotion-specific differential activation of brain regions such as the amygdala or insula, few studies point to differential activation within surface cortical structures (where the ERP effects observed in the present experiments are likely to be generated; see also Pizzagalli et al., 1999, and Sato et al., 2001, for related results from recent ERP studies).

Thus, one could argue that early stages in the processing of emotionally relevant information, subserved by limbic structures or the basal ganglia, and subsequent neocortical emotional processing stages differ not only in their dependence on focal attention (see above), but also in their specificity. Early processes may be differentially engaged by specific emotional expressions, thus providing a rapid classification of emotionally significant events. Data in support of this view come from single-unit recordings, which reveal a rapid emergence of differential effects to emotional expressions in the human amygdala (Liu et al., 1999). Conversely, later stages might be involved in the in-depth processing of various kinds of affective information, and thus would be much less selective with respect to different facial expressions.

Figure 7. Grand-averaged ERP waveforms elicited in the emotion task (top panel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are collapsed across blocks including each of the six different emotional facial expressions, and are shown separately for the first block (left) and the second block (right) including one specific emotional expression.

This suggestion is consistent with some recent evidence that subcortical and neocortical routes for visual processing are involved differentially in emotional expression analysis. A subcortical magnocellular pathway to the amygdala would appear to support valence discrimination processes, whereas parvocellular subsystems of ventral visual cortices may be preferentially involved in emotional intensity evaluation, irrespective of emotional valence (Schyns & Oliva, 1999; Vuilleumier, Armony, Driver, & Dolan, 2003). Recent neuroimaging results (Vuilleumier et al., 2003) suggest that low and high spatial frequency components of fearful faces selectively drive amygdala and visual cortical responses, respectively. However, although enhanced amygdala activation was found in response to low-spatial-frequency fearful face stimuli, explicit judgments relating to the perceived intensity of fearfulness were increased by the presence of high-spatial-frequency cues. These results support the view that coarse visual information may be directed via magnocellular channels from the retina to the amygdala through a tectopulvinar pathway (e.g., Bisti & Sireteanu, 1976; Jones & Burton, 1976), enabling the fast appraisal of the affective significance of a stimulus (e.g., Morris, Öhman, & Dolan, 1999).3

Another aim of the present study was to investigate whether the face-specific N170 component, which is assumed to reflect the structural encoding of faces, is sensitive to emotional facial expressions. In previous ERP studies, which have not found any modulations of the N170 elicited by fearful relative to neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003), facial expression was always task irrelevant. In contrast, participants' responses were contingent upon facial expression in the present emotion task. In spite of this fact, the N170 was found to be completely unaffected by facial expressions in the emotion task, and this was consistently the case for all six emotional expressions used in the present study (Figure 5).
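
As an illustration only, a time-window measure of the kind underlying this N170 comparison might be computed as in the following MNE-Python sketch, which averages the signal at T5/T6 between 160 and 200 msec for each condition; the epochs file and event labels are assumptions rather than materials from this study.

    import mne

    # Assumed pre-segmented, artifact-corrected epochs for one participant,
    # with event labels "neutral" and "emotional".
    epochs = mne.read_epochs("participant01-epo.fif")

    n170_mean_amp = {}
    for condition in ("neutral", "emotional"):
        evoked = epochs[condition].average()
        # Restrict to the lateral temporal electrodes and the N170 time window.
        window = evoked.copy().pick(["T5", "T6"]).crop(tmin=0.160, tmax=0.200)
        n170_mean_amp[condition] = window.data.mean() * 1e6  # volts -> microvolts

    # Comparable values for the two conditions would mirror the absent N170
    # modulation reported here.
    print(n170_mean_amp)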

In line with earlier findings from depth electrodes (McCarthy, Puce, Belger, & Allison, 1999), this pattern of results now demonstrates comprehensively that the structural encoding of faces, as reflected by the N170, is entirely insensitive to information derived from emotional facial expression. Thus, the rapid detection of emotional facial expression appears to occur independently of, and in parallel with, the construction of a detailed perceptual representation of a face. The absence of systematic early emotional expression effects at posterior sites, and the presence of such ERP effects at frontocentral electrodes at about 160 msec poststimulus, suggests that higher order visual processing stages involved in face processing are affected by emotional facial expression only after this information has been processed in prefrontal cortex. This is consistent with the face processing model proposed by Bruce and Young (1986), in which the extraction of perceptual information for emotional expression processing occurs independently of, and simultaneously with, structural encoding for face recognition.

In summary, the present ERP results demonstrate that the neocortical processing of emotional facial expression is strongly dependent on focal attention. When faces were attended, systematic emotional expression effects were elicited by emotional relative to neutral faces, and these effects were strikingly similar in terms of their timing and morphology for all six basic facial expressions. In contrast, when attention was actively directed away from these faces, emotional expression effects were completely eliminated. The rapid and automatic encoding of emotionally significant events occurring outside the focus of attention may be adaptively advantageous, because it prepares the organism for fight or flight through subcortically mediated autonomic activation (e.g., Öhman, Flykt, & Lundqvist, 2000). However, it is equally important that irrelevant affective stimuli do not continuously divert attention. This suggests a division of labor between limbic structures involved in the obligatory detection of emotional information, preparing the organism for rapid action (Morris et al., 1999; Whalen et al., 1998), and subsequent neocortical emotional processing stages. Limbic structures may be responsible for establishing a readiness to respond to any environmental threat that could become the focus of attention, presumably through heightened autonomic activation. However, neocortical stages appear to be protected by efficient attentional gating mechanisms, which reduce distractibility by emotional stimuli, so that ongoing goals and plans can be accomplished without interference from irrelevant events.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews, 1, 21-61.

Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.

Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fasicularis). Journal of Comparative Neurology, 230, 465-496.

Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.

Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 7, 817-826.

Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.

Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.

Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.

Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.

Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.

Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.

Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.

Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.

Critchley, H. [D.], Daly, E. [M.], Phillips, M., Brammer, M., Bullmore, E., Williams, S. [C.], van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. [G. M.] (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.

Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.

Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.

Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.

Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.

Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.

Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.

Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.

Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.

Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.

Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.

Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.

Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.

Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.

Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.

Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.

Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal, and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.

Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.

Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.

Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.

McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.

Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.

Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.

Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.

Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.

Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.

Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.

Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.

Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.

Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.

Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.

Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.

Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.

Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.

Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.

Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.

Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12, 709-714.

Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: When categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69, 243-265.

Sprengelmeyer, R., Rausch, M., Eysel, U. T., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society of London: Series B, 265, 1927-1931.

Stepniewska, I., Qi, H. X., & Kaas, J. H. (1999). Do superior colliculus projection zones in the inferior pulvinar project to MT in primates? European Journal of Neuroscience, 11, 469-480.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624-631.

Vuilleumier, P., & Schwartz, S. (2001a). Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect. NeuroReport, 12, 1119-1122.

Vuilleumier, P., & Schwartz, S. (2001b). Emotional expressions capture attention. Neurology, 56, 153-158.

Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.

Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1, 70-83.

Yeterian, E. H., & Pandya, D. N. (1991). Corticothalamic connections of the superior temporal sulcus in rhesus monkeys. Experimental Brain Research, 83, 268-284.

NOTES

1. In spite of the fact that significant valence effects were present at central electrodes in the emotion task but were absent in the lines task, this interaction failed to reach significance at central sites.

2. At lateral occipital electrodes, a significantly enhanced positivity for emotional relative to neutral faces was present between 320 and 495 msec in the emotion task [F(1,13) = 6.2, p < .03], but not in the lines task, and this was reflected in a nearly significant task × valence interaction [F(1,13) = 4.6, p < .06].

3. It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking, since the medial pulvinar, which projects to the amygdala, does not receive a significant direct input from the superior colliculus (e.g., Stepniewska, Qi, & Kaas, 1999). However, possible connections between the inferior pulvinar (which receives visual inputs from the superior colliculus) and the medial nucleus may support the transmission of information to the amygdala through a colliculo-pulvinar route. Alternatively, cortical input may be involved, since STS (implicated in facial expression processing; Sprengelmeyer et al., 1998) is known to project to the medial pulvinar (Yeterian & Pandya, 1991). Our thanks to an anonymous reviewer for raising this important point.

(Manuscript received December 10, 2002; revision accepted for publication June 11, 2003.)


(Manuscript received December 10 2002revision accepted for publication June 11 2003)

ATTENTION AND FACIAL EXPRESSION PROCESSING 103

and 4). To further ascertain that this component is unaffected by emotional expression even when expression is task relevant, we conducted additional analyses on N170 amplitudes observed in the emotion task (Figure 3). No main effect of valence (F < 1.2) or interaction between block type and valence (F < 1) was observed, indicating that the N170 was similarly insensitive to emotional facial expression for all six basic emotions employed here, even though participants had to discriminate between emotional and neutral faces in this task. This is illustrated in Figure 5, which displays ERPs in response to neutral and emotional faces elicited in the emotion task at right lateral temporal electrode T6, shown separately for each of the six facial expressions, which were presented in different blocks. No systematic differential effects of any facial expression on the N170 are apparent, and this was confirmed by additional planned paired comparisons of N170 amplitudes at T5 and T6 in response to emotional versus neutral faces, conducted separately for all six basic emotions. None of these comparisons even approached statistical significance [all ts(13) < 1.5].
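
As an illustration of how such planned comparisons can be organized, the sketch below computes mean amplitudes in a fixed N170 measurement window at one lateral temporal site and runs a paired t test of emotional versus neutral faces separately for each expression block. The sampling rate, the 150- to 190-msec window, and the placeholder arrays are our own assumptions for the example and are not taken from the processing pipeline used in this study.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-participant average waveforms at electrode T6,
# shape (n_participants, n_timepoints), sampled at 250 Hz from stimulus onset.
FS = 250                  # assumed sampling rate (Hz)
WINDOW = (0.150, 0.190)   # assumed N170 measurement window (seconds)

def mean_amplitude(erp, fs=FS, window=WINDOW):
    """Mean amplitude within the measurement window, one value per participant."""
    start, stop = (int(round(edge * fs)) for edge in window)
    return erp[:, start:stop].mean(axis=1)

rng = np.random.default_rng(0)
expressions = ["angry", "disgusted", "fearful", "happy", "sad", "surprised"]
for expression in expressions:
    # Placeholder arrays standing in for the real averages (14 participants).
    erp_emotional = rng.normal(size=(14, 175))
    erp_neutral = rng.normal(size=(14, 175))
    t, p = ttest_rel(mean_amplitude(erp_emotional), mean_amplitude(erp_neutral))
    print(f"{expression:>10}: t(13) = {t:.2f}, p = {p:.3f}")
```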

Emotional expression effects. No main effects of valence or task × valence interactions were observed in the 120- to 155-msec time window. In the 160- to 215-msec analysis window, a task × valence interaction was present at frontal sites [F(1,13) = 5.2, p < .05].¹ Main effects of valence were found at frontal and central sites [both Fs(1,14) > 9.1, both ps < .01] in the emotion task, reflecting an enhanced positivity elicited in response to arrays containing emotional faces (Figure 3). These effects were completely absent in the lines task (both Fs < 1). No interactions between block type and valence were found at frontal and central sites in the emotion task (both Fs < 1.1), demonstrating that this early emotional positivity was elicited in response to emotional versus neutral faces irrespective of which of the six basic emotions was included in a given block. This fact is further illustrated in Figure 6, which shows ERPs in response to neutral and emotional faces elicited in the emotion task at Fz, displayed separately for all emotional expressions used in this experiment. Emotional expression effects were very similar across expressions and started at approximately the same time for all six basic emotions. No significant emotional expression effects were present between 160 and 215 msec poststimulus at parietal and occipital electrodes.

Figure 4. Grand-averaged ERP waveforms elicited in the lines task in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). Data are collapsed across blocks including each of the six different emotional facial expressions, as well as across trials with identical and different line pairs.

Between 220 and 315 msec poststimulus, task × valence interactions were present at frontal and central electrodes, as well as at lateral temporal and occipital sites [all Fs(1,13) > 7.2, all ps < .02], indicating that emotional expression affected ERPs in the emotion task but not in the lines task. At frontal and central sites, main effects of valence in the emotion task [both Fs(1,13) > 15.0, both ps < .02] reflected enhanced positivities for emotional relative to neutral faces (Figure 3). No block type × valence interactions were present (both Fs < 1), demonstrating that this effect was elicited in similar fashion for all six basic emotions (Figure 6). Again, no frontocentral emotional expression effects were observed in the lines task (both Fs < 1.6). At lateral temporal and occipital sites, an enhanced negativity was observed in the 220- to 315-msec latency window for emotional relative to neutral faces in the emotion task [both Fs(1,13) > 6.1, both ps < .03], but not in the lines task (both Fs < 1). Again, no block type × valence interactions were present for the emotion task (both Fs < 1.6), indicating that this lateral posterior emotional negativity was elicited in response to all six basic emotions (Figure 5).

In the final two analysis windows (320–495 msec and 500–700 msec poststimulus, respectively), highly significant task × valence interactions were present at frontal, central, and parietal electrodes [all Fs(1,13) > 10.0, all ps < .01], again reflecting the presence of emotional expression effects in the emotion task (Figure 3) and their absence in the lines task (Figure 4). Main effects of valence at frontal and central, as well as at parietal, electrodes in the emotion task [all Fs(1,13) > 11.9, all ps < .01], without any significant interactions between valence and block type, demonstrated that enhanced positivities for emotional faces were elicited at these sites in a similar fashion for all six basic emotions (Figure 6). Again, effects of valence were entirely absent in the lines task.²
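
A minimal sketch of how mean amplitudes in successive latency windows can be submitted to a repeated measures analysis with the factors task and valence is given below. It uses randomly generated values in place of real data, and the column names and the use of statsmodels' AnovaRM are illustrative assumptions rather than the software actually used for the analyses reported here.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Illustrative long-format table: one mean amplitude per participant, task
# (emotion vs. lines), valence (emotional vs. neutral), and latency window.
rng = np.random.default_rng(1)
windows = ["120-155", "160-215", "220-315", "320-495", "500-700"]
rows = []
for subject in range(1, 15):                      # 14 participants
    for task in ("emotion", "lines"):
        for valence in ("emotional", "neutral"):
            for window in windows:
                rows.append({"subject": subject, "task": task,
                             "valence": valence, "window": window,
                             "amplitude": rng.normal()})
df = pd.DataFrame(rows)

# Test the task x valence interaction separately within each latency window.
for window in windows:
    subset = df[df["window"] == window]
    result = AnovaRM(subset, depvar="amplitude", subject="subject",
                     within=["task", "valence"]).fit()
    print(f"{window} msec\n{result.anova_table}\n")
```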

DISCUSSION

The primary aim of the present ERP experiment was to extend previous findings (Holmes et al., 2003) that the detection and processing of emotional information delivered by facial expressions requires focal attention. We recorded ERPs to stimulus arrays containing emotional or neutral bilateral faces under conditions when facial expression was task relevant and therefore attended (emotion task) or when attention was actively directed away from these faces toward a demanding perceptual judgment (lines task). In our previous ERP study (Holmes et al., 2003), spatial attention was manipulated on a trial-by-trial basis by precues presented at the start of each trial, facial expression was not task relevant (participants had to detect infrequent identical stimulus pairs, regardless of expression), and only one emotional expression (fear) was tested. In the present experiment, a sustained attention paradigm was employed (with emotion and lines tasks delivered in separate experimental halves), facial expression was task relevant in the emotion task, and, most important, all six basic facial emotional expressions were included in different blocks.

Figure 5. Grand-averaged ERP waveforms elicited in the emotion task at right lateral temporal electrode T6 in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.

ERP correlates of emotional facial expression processing were identified by comparing ERPs elicited on trials with emotional faces with ERPs in response to neutral faces. This was done separately for the emotion task and the lines task, and for blocks including angry, disgusted, fearful, happy, sad, and surprised faces. In the emotion task, where attention was directed toward task-relevant facial expressions, an enhanced positivity for emotional relative to neutral faces was elicited, similar to previous observations from studies comparing ERP responses to fearful versus neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003). This emotional expression effect started at about 160 msec poststimulus and was initially distributed frontocentrally, whereas a more broadly distributed positivity was observed beyond 300 msec (Figure 3). In addition, an enhanced negativity for emotional relative to neutral faces was elicited at lateral posterior electrodes between 220 and 320 msec poststimulus.
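
The comparison described above amounts to forming emotional-minus-neutral difference waves for every combination of task and expression block. The sketch below shows one way to organize that step; the nested dictionary layout and the variable names are our own assumptions, not the authors' actual data structures.

```python
import numpy as np

# Grand averages assumed to be stored as nested dictionaries of arrays:
# erps[task][expression][valence] -> waveform of shape (n_timepoints,).
def difference_waves(erps):
    """Emotional-minus-neutral difference wave for every task and expression block."""
    return {task: {expression: conditions["emotional"] - conditions["neutral"]
                   for expression, conditions in by_expression.items()}
            for task, by_expression in erps.items()}

rng = np.random.default_rng(3)
expressions = ["angry", "disgusted", "fearful", "happy", "sad", "surprised"]
erps = {task: {expression: {"emotional": rng.normal(size=175),
                            "neutral": rng.normal(size=175)}
               for expression in expressions}
        for task in ("emotion", "lines")}
diffs = difference_waves(erps)
```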

The onset of the early frontocentral emotional expression effect was slightly later in the present experiment than in our previous experiment (Holmes et al., 2003), where significant frontal differences between ERPs to fearful and neutral faces were already present at about 120 msec poststimulus. In the present study, vertical lines were presented close to fixation simultaneously with the bilateral faces, whereas no such stimuli were included in our earlier experiment. The presence of these additional central events may have slightly delayed the onset of early emotional expression effects. It should also be noted that an attenuation of amygdala responses to emotional facial expressions has been observed when the demand for explicit emotion recognition was increased (Critchley et al., 2000; Hariri, Bookheimer, & Mazziotta, 2000). It is possible that the demand for explicit emotion recognition in the emotion task contributed to the delayed onset of the early emotional expression effect.

In marked contrast to these ERP results obtained in the emotion task, emotional expression effects were entirely absent in the lines task (Figure 4), demonstrating that ERP correlates of facial expression processing are strongly dependent on spatial attention. With sustained spatial attention directed away from face stimuli toward another demanding perceptual task, the presence of emotional versus neutral faces had no effect whatsoever on ERP waveforms. That is, emotional expression effects were completely eliminated for all six basic emotions included in this experiment. In line with this ERP result, performance in the lines task was entirely unaffected by the expression of the faces presented simultaneously with the task-relevant line pairs. Overall, these findings extend and confirm the observations of our previous ERP experiment, which compared ERPs in response to fearful versus neutral faces (Holmes et al., 2003). Clearly, these results challenge the hypothesis that the detection and/or processing of emotional facial expression occurs preattentively. If this were the case, at least some systematic ERP differences should have been elicited in response to emotional versus neutral faces in the lines task, reflecting the automatic detection of emotionally significant events.

Figure 6. Grand-averaged ERP waveforms elicited in the emotion task at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are shown separately for blocks containing angry, disgusted, fearful, happy, sad, or surprised faces.

Covert attention toward emotional faces under conditions when they were task relevant may have enhanced their visual–perceptual representation (e.g., Carrasco, Penpeci-Talgar, & Eckstein, 2000), thereby enabling the extraction of features relating to the affective valence of these faces and thus their subsequent encoding and analysis (as reflected by the emotion-specific ERP effects observed in the emotion task). The early frontocentrally distributed emotional expression effects may be mediated by connections from the superior temporal sulcus (STS) and amygdala to orbitofrontal cortex (Rolls, 1999). The STS has been implicated in the early discrimination of visual features relating to emotional facial expressions (e.g., Sprengelmeyer et al., 1998). In addition, efferent feedback projections from the amygdala and related structures (see Lang et al., 1998; Morris et al., 1998) may have produced the more broadly distributed emotional expression effects observed in the present experiment at longer latencies.

One could argue that the absence of emotional expression effects under conditions where faces were unattended may have been due to the fact that the presentation of specific emotional expressions was blocked and that each expression was presented repeatedly in two separate blocks. Repeated exposure to a specific emotional expression may have resulted in a gradual habituation of emotion-specific responses, thus potentially attenuating any emotional expression effects that may have been present in the lines task. To investigate this possibility, we computed separate averages for the first block and for the second block including angry, disgusted, fearful, happy, sad, or surprised faces, separately for the emotion and for the lines task. These data were then analyzed with the additional factor of block position (first vs. second block containing a specific emotional facial expression). If emotional expression effects were subject to habituation, one would expect to find larger emotional expression effects for the first relative to the second block in the emotion task, and potentially also a residual emotional expression effect for the first block in the lines task.
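
A minimal sketch of this block-position split is given below; the epoch array, the block index coding, and the function name are hypothetical stand-ins for the averaging step described above.

```python
import numpy as np

# Hypothetical single-trial epochs at Fz for one expression in one task,
# together with the block in which each trial occurred (1 = first, 2 = second).
def averages_by_block_position(epochs, block_position):
    """Separate average waveforms for the first and second block of an expression."""
    first = epochs[block_position == 1].mean(axis=0)
    second = epochs[block_position == 2].mean(axis=0)
    return first, second

rng = np.random.default_rng(2)
epochs = rng.normal(size=(120, 175))             # 120 illustrative trials x 175 samples
block_position = rng.integers(1, 3, size=120)    # block position label per trial
first_avg, second_avg = averages_by_block_position(epochs, block_position)
```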

Figure 7 shows ERPs elicited at Fz in response to neutral faces (solid lines) or emotional faces (dashed lines), collapsed across all six different emotional expressions. ERPs are displayed separately for the emotion task (top panel) and the lines task (bottom panel), and for the first block (left) or second block (right) including one of the six emotional expressions. As can be seen from Figure 7 (top), there was no evidence whatsoever for any habituation of emotional expression effects as a function of block position in the emotion task. This was confirmed by the absence of any block position × valence or block position × block type × valence interactions for all latency windows employed in the analyses reported above [all Fs(1,13) < 1]. Along similar lines, Figure 7 (bottom panel) suggests that there was no residual emotional expression effect for the first block including a specific emotional expression in the lines task. This was confirmed by the absence of any interactions involving block position [all Fs(1,13) < 1.6]. Thus, the fact that emotional expression effects were absent in response to unattended faces in the lines task is unlikely to have been the result of a habituation of emotion-specific brain responses.

The conclusion that the processing of emotional facial expression, as reflected by ERP facial expression effects, is gated by spatial attention appears to be inconsistent with neuroimaging studies demonstrating that fearful faces result in amygdala activations even when these faces are outside the focus of attention (Vuilleumier et al., 2001; see also Morris et al., 1996; Whalen et al., 1998). However, it is extremely unlikely that the ERP effects observed in the present study are directly linked to amygdala activations. Due to its nuclear structure of clustered neurones, the amygdala is electrically closed and thus largely inaccessible to ERP measures. The early emotional expression effects observed in response to attended faces are more likely to be generated in prefrontal cortex, where emotion-specific single-cell responses have recently been recorded at short latencies (Kawasaki et al., 2001). Such prefrontal responses may reflect stages in emotional processing that could be contingent upon, but functionally separate from, prior amygdala activations (see Le Doux, 1996; Rolls, 1999). It is possible that amygdala responses can be triggered by unattended emotional stimuli (although these responses may be attenuated), whereas subsequent neocortical stages of emotional processing (as reflected by the ERP effects observed in the present experiment) are fully dependent on focal attention. An alternative possibility is that amygdala responses to emotional stimuli may also require attention (see Pessoa, Kastner, & Ungerleider, 2002; Pessoa, McKenna, et al., 2002) and that the elimination of emotional expression effects in the lines task reflects an earlier attentional gating of such subcortical processing.

Another important new finding of the present experiment was that the onset, time course, and scalp distribution of emotional expression effects obtained in the emotion task were remarkably similar for all six basic facial expressions used here (Figures 5 and 6). The absence of any differential ERP responses to different emotional expressions was reflected by the absence of any significant interactions between block type (blocks with angry, disgusted, fearful, happy, sad, or surprised faces) and valence (emotional vs. neutral expression). In line with these observations, the size of the RT advantage for emotional relative to neutral faces in the emotion task was similar for all six emotional facial expressions (Figure 2, top panel). The similarity in the time course of emotional expression effects across all six emotional expressions observed here suggests that emotionally relevant information delivered by facial expression is available to neocortical processes within less than 200 msec after stimulus onset, and at approximately the same time for all basic emotional expressions.

These observations do not seem to support the idea, suggested by recent fMRI results, that distinct neural subsystems specialize in the processing of specific emotions (Adolphs, 2002). If this were the case, one might have expected some systematic differences between ERP emotional expression effects elicited by different facial expressions. However, it should be noted that although some neuroimaging data show emotion-specific differential activation of brain regions such as the amygdala or insula, few studies point to differential activation within surface cortical structures (where the ERP effects observed in the present experiments are likely to be generated; see also Pizzagalli et al., 1999, and Sato et al., 2001, for related results from recent ERP studies).

Thus, one could argue that early stages in the processing of emotionally relevant information, subserved by limbic structures or the basal ganglia, and subsequent neocortical emotional processing stages differ not only in their dependence on focal attention (see above), but also in their specificity. Early processes may be differentially engaged by specific emotional expressions, thus providing a rapid classification of emotionally significant events. Data in support of this view come from single-unit recordings, which reveal a rapid emergence of differential effects to emotional expressions in the human amygdala (Liu et al., 1999). Conversely, later stages might be involved in the in-depth processing of various kinds of affective information, and thus would be much less selective with respect to different facial expressions.

Figure 7. Grand-averaged ERP waveforms elicited in the emotion task (top panel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays containing neutral faces (solid lines) or emotional faces (dashed lines). ERPs are collapsed across blocks including each of the six different emotional facial expressions and are shown separately for the first block (left) and the second block (right) including one specific emotional expression.

This suggestion is consistent with some recent evidence that subcortical and neocortical routes for visual processing are involved differentially in emotional expression analysis. A subcortical magnocellular pathway to the amygdala would appear to support valence discrimination processes, whereas parvocellular subsystems of ventral visual cortices may be preferentially involved in emotional intensity evaluation, irrespective of emotional valence (Schyns & Oliva, 1999; Vuilleumier, Armony, Driver, & Dolan, 2003). Recent neuroimaging results (Vuilleumier et al., 2003) suggest that low and high spatial frequency components of fearful faces selectively drive amygdala and visual cortical responses, respectively. However, although enhanced amygdala activation was found in response to low-spatial-frequency fearful face stimuli, explicit judgments relating to the perceived intensity of fearfulness were increased by the presence of high-spatial-frequency cues. These results support the view that coarse visual information may be directed via magnocellular channels from the retina to the amygdala through a tectopulvinar pathway (e.g., Bisti & Sireteanu, 1976; Jones & Burton, 1976), enabling the fast appraisal of the affective significance of a stimulus (e.g., Morris, Öhman, & Dolan, 1999).³

Another aim of the present study was to investigate whether the face-specific N170 component, which is assumed to reflect the structural encoding of faces, is sensitive to emotional facial expressions. In previous ERP studies, which have not found any modulations of the N170 elicited by fearful relative to neutral faces (Eimer & Holmes, 2002; Holmes et al., 2003), facial expression was always task irrelevant. In contrast, participants' responses were contingent upon facial expression in the present emotion task. In spite of this fact, the N170 was found to be completely unaffected by facial expressions in the emotion task, and this was consistently the case for all six emotional expressions used in the present study (Figure 5).

In line with earlier findings from depth electrodes (McCarthy, Puce, Belger, & Allison, 1999), this pattern of results now demonstrates comprehensively that the structural encoding of faces, as reflected by the N170, is entirely insensitive to information derived from emotional facial expression. Thus, the rapid detection of emotional facial expression appears to occur independently of, and in parallel with, the construction of a detailed perceptual representation of a face. The absence of systematic early emotional expression effects at posterior sites and the presence of such ERP effects at frontocentral electrodes at about 160 msec poststimulus suggest that higher order visual processing stages involved in face processing are affected by emotional facial expression only after this information has been processed in prefrontal cortex. This is consistent with the face processing model proposed by Bruce and Young (1986), in which the extraction of perceptual information for emotional expression processing occurs independently of, and simultaneously with, structural encoding for face recognition.

In summary, the present ERP results demonstrate that the neocortical processing of emotional facial expression is strongly dependent on focal attention. When faces were attended, systematic emotional expression effects were elicited by emotional relative to neutral faces, and these effects were strikingly similar in terms of their timing and morphology for all six basic facial expressions. In contrast, when attention was actively directed away from these faces, emotional expression effects were completely eliminated. The rapid and automatic encoding of emotionally significant events occurring outside the focus of attention may be adaptively advantageous, because it prepares the organism for fight or flight through subcortically mediated autonomic activation (e.g., Öhman, Flykt, & Lundqvist, 2000). However, it is equally important that irrelevant affective stimuli do not continuously divert attention. This suggests a division of labor between limbic structures involved in the obligatory detection of emotional information (preparing the organism for rapid action; Morris et al., 1999; Whalen et al., 1998) and subsequent neocortical emotional processing stages. Limbic structures may be responsible for establishing a readiness to respond to any environmental threat that could become the focus of attention, presumably through heightened autonomic activation. However, neocortical stages appear to be protected by efficient attentional gating mechanisms, which reduce distractibility by emotional stimuli, so that ongoing goals and plans can be accomplished without interference from irrelevant events.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral & Cognitive Neuroscience Reviews, 1, 21-61.

Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.

Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fascicularis). Journal of Comparative Neurology, 230, 465-496.

Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.

Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 7, 817-826.

Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.

Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.

Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.

Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.

Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.

Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.

Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.

Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.

Critchley, H. [D.], Daly, E. [M.], Phillips, M., Brammer, M., Bullmore, E., Williams, S. [C.], van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. [G. M.] (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.

Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.

Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.

Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.

Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.

Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.

Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.

Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.

Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.

Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.

Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.

Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.

Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.

Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.

Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.

Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.

Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.

Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.

Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.

Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.

McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.

Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.

Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.

Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.

Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.

Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.

Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.

Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data, and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.

Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.

Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.

Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.

Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.

Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.

Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.

Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.

Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.

Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12, 709-714.

Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: When categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69, 243-265.

Sprengelmeyer, R., Rausch, M., Eysel, U. T., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society of London: Series B, 265, 1927-1931.

Stepniewska, I., Qi, H. X., & Kaas, J. H. (1999). Do superior colliculus projection zones in the inferior pulvinar project to MT in primates? European Journal of Neuroscience, 11, 469-480.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624-631.

Vuilleumier, P., & Schwartz, S. (2001a). Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect. NeuroReport, 12, 1119-1122.

Vuilleumier, P., & Schwartz, S. (2001b). Emotional expressions capture attention. Neurology, 56, 153-158.

Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.

Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1, 70-83.

Yeterian, E. H., & Pandya, D. N. (1991). Corticothalamic connections of the superior temporal sulcus in rhesus monkeys. Experimental Brain Research, 83, 268-284.

NOTES

1. In spite of the fact that significant valence effects were present at central electrodes in the emotion task but were absent in the lines task, this interaction failed to reach significance at central sites.

2. At lateral occipital electrodes, a significantly enhanced positivity for emotional relative to neutral faces was present between 320 and 495 msec in the emotion task [F(1,13) = 6.2, p < .03], but not in the lines task, and this was reflected in a nearly significant task × valence interaction [F(1,13) = 4.6, p < .06].

3. It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking, since the medial pulvinar, which projects to the amygdala, does not receive a significant direct input from the superior colliculus (e.g., Stepniewska, Qi, & Kaas, 1999). However, possible connections between the inferior pulvinar (which receives visual inputs from the superior colliculus) and the medial nucleus may support the transmission of information to the amygdala through a colliculo-pulvinar route. Alternatively, cortical input may be involved, since the STS (implicated in facial expression processing; Sprengelmeyer et al., 1998) is known to project to the medial pulvinar (Yeterian & Pandya, 1991). Our thanks to an anonymous reviewer for raising this important point.

(Manuscript received December 10, 2002; revision accepted for publication June 11, 2003.)

104 EIMER HOLMES AND MCGLONE

irrespective of which of the six basic emotions was in-cluded in a given block This fact is further illustrated inFigure 6 which shows ERPs in response to neutral andemotional faces elicited in the emotion task at Fz dis-played separately for all emotional expressions used inthis experiment Emotional expression effects were verysimilar across expressions and started at approximatelythe same time for all six basic emotions No significantemotional expression effects were present between 160and 215 msec poststimulus at parietal and occipital elec-trodes

Between 220 and 315 msec poststimulus task 3 va-lence interactions were present at frontal and centralelectrodes as well as at lateral temporal and occipitalsites [all Fs(113) gt 72 all ps lt 02] indicating thatemotional expression affected ERPs in the emotion taskbut not in the lines task At frontal and central sites maineffects of valence in the emotion task [both Fs(113) gt150 both ps lt 02] reflected enhanced positivities foremotional relative to neutral faces (Figure 3) No blocktype 3 valence interactions were present (both Fs lt 1)demonstrating that this effect was elicited in similarfashion for all six basic emotions (Figure 6) Again nofrontocentral emotional expression effects were observedin the lines task (both Fs lt 16) At lateral temporal andoccipital sites an enhanced negativity was observed inthe 220- to 315-msec latency window for emotional rel-ative to neutral faces in the emotion task [both Fs(113) gt61 both ps lt 03] but not in the lines task (both Fs lt 1)Again no block type 3 valence interactions were pres-

ent for the emotion task (both Fs lt 16) indicating thatthis lateral posterior emotional negativity was elicited inresponse to all six basic emotions (Figure 5)

In the final two analysis windows (320ndash495 msec and500ndash700 msec poststimulus respectively) highly sig-nificant task 3 valence interactions were present atfrontal central and parietal electrodes [all Fs(113) gt100 all ps lt 01] again reflecting the presence of emo-tional expression effects in the emotion task (Figure 3)and their absence in the lines task (Figure 4) Main ef-fects of valence at frontal and central as well as at pari-etal electrodes in the emotion task [all Fs(113) gt 119all ps lt 01] without any significant interactions betweenvalence and block type demonstrated that enhanced pos-itivities for emotional faces were elicited at these sites ina similar fashion for all six basic emotions (Figure 6)Again effects of valence were entirely absent in the linestask2

DISCUSSION

The primary aim of the present ERP experiment wasto extend previous findings (Holmes et al 2003) that thedetection and processing of emotional information de-livered by facial expressions requires focal attention Werecorded ERPs to stimulus arrays containing emotionalor neutral bilateral faces under conditions when facialexpression was task relevant and therefore attended (emo-tion task) or when attention was actively directed awayfrom these faces toward a demanding perceptual judgment

Figure 5 Grand-averaged ERP waveforms elicited in the emotion task at right lateral tem-poral electrode T6 in the 700-msec interval following stimulus onset in response to stimulusarrays containing neutral faces (solid lines) or emotional faces (dashed lines) ERPs are shownseparately for blocks containing angry disgusted fearful happy sad or surprised faces

ATTENTION AND FACIAL EXPRESSION PROCESSING 105

(lines task) In our previous ERP study (Holmes et al2003) spatial attention was manipulated on a trial-by-trial basis by precues presented at the start of each trialfacial expression was not task relevant (participants hadto detect infrequent identical stimulus pairs regardlessof expression) and only one emotional expression (fear)was tested In the present experiment a sustained attentionparadigm was employed (with emotion and lines tasksdelivered in separate experimental halves) facial expres-sion was task relevant in the emotion task and most im-portant all six basic facial emotional expressions wereincluded in different blocks

ERP correlates of emotional facial expression pro-cessing were identified by comparing ERPs elicited ontrials with emotional faces with ERPs in response to neu-tral faces This was done separately for the emotion taskand the lines task and for blocks including angry dis-gusted fearful happy sad and surprised faces In theemotion task where attention was directed toward task-relevant facial expressions an enhanced positivity foremotional relative to neutral faces was elicited similar toprevious observations from studies comparing ERP re-sponses to fearful versus neutral faces (Eimer amp Holmes2002 Holmes et al 2003) This emotional expressioneffect started at about 160 msec poststimulus and wasinitially distributed frontocentrally whereas a morebroadly distributed positivity was observed beyond300 msec (Figure 3) In addition an enhanced negativityfor fearful relative to neutral faces was elicited at lateral

posterior electrodes between 220 and 320 msec post-stimulus

The onset of the early frontocentral emotional expres-sion effect was slightly later in the present experimentthan in our previous experiment (Holmes et al 2003)where significant frontal differences between ERPs tofearful and neutral faces were already present at about120 msec poststimulus In the present study verticallines were presented close to fixation simultaneouslywith the bilateral faces whereas no such stimuli were in-cluded in our earlier experiment The presence of theseadditional central events may have slightly delayed theonset of early emotional expression effects It shouldalso be noted that an attenuation of amygdala responsesto emotional facial expressions has been observed whenthe demand for explicit emotion recognition was in-creased (Critchley et al 2000 Hariri Bookheimer ampMazziotta 2000) It is possible that the demand for ex-plicit emotion recognition in the emotion task con-tributed to the delayed onset of the early emotional ex-pression effect

In marked contrast to these ERP results obtained inthe emotion task emotional expression effects were en-tirely absent in the lines task (Figure 4) demonstratingthat ERP correlates of facial expression processing arestrongly dependent on spatial attention With sustainedspatial attention directed away from face stimuli towardanother demanding perceptual task the presence of emo-tional versus neutral faces had no effect whatsoever on

Figure 6 Grand-averaged ERP waveforms elicited in the emotion task at midline electrodeFz in the 700-msec interval following stimulus onset in response to stimulus arrays contain-ing neutral faces (solid lines) or emotional faces (dashed lines) ERPs are shown separatelyfor blocks containing angry disgusted fearful happy sad or surprised faces

106 EIMER HOLMES AND MCGLONE

ERP waveforms That is emotional expression effectswere completely eliminated for all six basic emotions in-cluded in this experiment In line with this ERP resultperformance in the lines task was entirely unaffected bythe expression of the faces presented simultaneouslywith the task-relevant line pairs Overall these findingsextend and confirm the observations of our previousERP experiment which compared ERPs in response tofearful versus neutral faces (Holmes et al 2003) Clearlythese results challenge the hypothesis that the detectionandor processing of emotional facial expression occurspreattentively If this were the case at least some system-atic ERP differences should have been elicited in responseto emotional versus neutral faces in the lines task reflect-ing the automatic detection of emotionally significantevents

Covert attention toward emotional faces under condi-tions when they were task relevant may have enhancedtheir visualndashperceptual representation (eg CarrascoPenpeci-Talgar amp Eckstein 2000) thereby enabling theextraction of features relating to the affective valence ofthese faces and thus their subsequent encoding andanalysis (as reflected by the emotion-specific ERP effectsobserved in the emotion task) The early frontocentrallydistributed emotional expression effects may be mediatedby connections from the superior temporal sulcus (STS)and amygdala to orbitofrontal cortex (Rolls 1999) TheSTS has been implicated in the early discrimination ofvisual features relating to emotional facial expressions(eg Sprengelmeyer et al 1998) In addition efferentfeedback projections from the amygdala and relatedstructures (see Lang et al 1998 Morris et al 1998)may have produced the more broadly distributed emo-tional expression effects observed in the present experi-ment at longer latencies

One could argue that the absence of emotional ex-pression effects under conditions where faces were un-attended may have been due to the fact that the presen-tation of specific emotional expressions was blocked andthat each expression was presented repeatedly in twoseparate blocks Repeated exposure to a specific emo-tional expression may have resulted in a gradual habitu-ation of emotion-specific responses thus potentially at-tenuating any emotional expression effects that may havebeen present in the lines task To investigate this possi-bility we computed separate averages for the first blockand for the second block including angry disgustedfearful happy sad or surprised faces separately for theemotion and for the lines task These data were then an-alyzed with the additional factor of block position (firstvs second block containing a specific emotional facialexpression) If emotional expression effects were subjectto habituation one would expect to find larger emotionalexpression effects for the first relative to the second blockin the emotion task and potentially also a residual emo-tional expression effect for the first block in the lines task

Figure 7 shows ERPs elicited at Fz in response to neu-tral faces (solid lines) or emotional faces (dashed lines)

collapsed across all six different emotional expressionsERPs are displayed separately for the emotion task (toppanel) and the lines task (bottom panel) and for the firstblock (left) or second block (right) including one of thesix emotional expressions As can be seen from Figure 7(top) there was no evidence whatsoever for any habitu-ation of emotional expression effects as a function ofblock position in the emotion task This was confirmedby the absence of any block position 3 valence or blockposition 3 block type 3 valence interactions for all latencywindows employed in the analyses reported above [allFs(113) lt 1] Along similar lines Figure 7 (bottom panel)suggests that there was no residual emotional expressioneffect for the first block including a specific emotionalexpression in the lines task This was confirmed by theabsence of any interactions involving block position [allFs(113) lt 16] Thus the fact that emotional expressioneffects were absent in response to unattended faces in thelines task is unlikely to have been the result of a habitu-ation of emotion-specific brain responses

The conclusion that the processing of emotional facialexpression as reflected by ERP facial expression effectsis gated by spatial attention appears to be inconsistentwith neuroimaging studies demonstrating that fearfulfaces result in amygdala activations even when thesefaces are outside the focus of attention (Vuilleumieret al 2001 see also Morris et al 1996 Whalen et al1998) However it is extremely unlikely that the ERP ef-fects observed in the present study are directly linked toamygdala activations Due to its nuclear structure ofclustered neurones the amygdala is electrically closedand thus largely inaccessible to ERP measures The earlyemotional expression effects observed in response to at-tended faces are more likely to be generated in prefrontalcortex where emotion-specific single-cell responseshave recently been recorded at short latencies (Kawasakiet al 2001) Such prefrontal responses may reflect stagesin emotional processing that could be contingent uponbut functionally separate from prior amygdala activa-tions (see Le Doux 1996 Rolls 1999) It is possible thatamygdala responses can be triggered by unattendedemotional stimuli (although these responses may be at-tenuated) whereas subsequent neocortical stages ofemotional processing (as reflected by the ERP effects ob-served in the present experiment) are fully dependent onfocal attention An alternative possibility is that amyg-dala responses to emotional stimuli may also require at-tention (see Pessoa Kastner amp Ungerleider 2002 Pes-soa McKenna et al 2002) and that the elimination ofemotional expression effects in the lines task reflects anearlier attentional gating of such subcortical processing

Another important new finding of the present experi-ment was that the onset time course and scalp distribu-tion of emotional expression effects obtained in the emo-tion task were remarkably similar for all six basic facialexpressions used here (Figures 5 and 6) The absence ofany differential ERP responses to different emotional ex-pressions was reflected by the absence of any significant

ATTENTION AND FACIAL EXPRESSION PROCESSING 107

interactions between block type (blocks with angry dis-gusted fearful happy sad or surprised faces) and va-lence (emotional vs neutral expression) In line withthese observations the size of the RT advantage for emo-tional relative to neutral faces in the emotion task wassimilar for all six emotional facial expressions (Figure 2top panel) The similarity in the time course of emotionalexpression effects across all six emotional expressionsobserved here suggests that emotionally relevant infor-mation delivered by facial expression is available to neo-cortical processes within less then 200 msec after stim-ulus onset and at approximately the same time for allbasic emotional expressions

These observations do not seem to support the ideasuggested by recent f MRI results that distinct neuralsubsystems specialize in the processing of specific emo-tions (Adolphs 2002) If this were the case one mighthave expected some systematic differences between ERPemotional expression effects elicited by different facial

expressions However it should be noted that althoughsome neuroimaging data show emotion-specific differ-ential activation of brain regions such as the amygdala orinsula few studies point to differential activation withinsurface cortical structures (where the ERP effects ob-served in the present experiments are likely to be gener-ated see also Pizzagalli et al 1999 Sato et al 2001for related results from recent ERP studies)

Thus one could argue that early stages in the processingof emotionally relevant information subserved by lim-bic structures or the basal ganglia and subsequent neo-cortical emotional processing stages differ not only intheir dependence on focal attention (see above) but alsoin their specificity Early processes may be differentiallyengaged by specific emotional expressions thus provid-ing a rapid classif ication of emotionally significantevents Data in support of this view come from single-unit recordings which reveal a rapid emergence of dif-ferential effects to emotional expressions in the human

Figure 7 Grand-averaged ERP waveforms elicited in the emotion task (toppanel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays contain-ing neutral faces (solid lines) or emotional faces (dashed lines) ERPs are col-lapsed across blocks including each of the six different emotional facialexpressions and are shown separately for the first block (left) and the secondblock (right) including one specific emotional expression

108 EIMER HOLMES AND MCGLONE

amygdala (Liu et al 1999) Conversely later stagesmight be involved in the in-depth processing of variouskinds of affective information and thus would be muchless selective with respect to different facial expressions

This suggestion is consistent with some recent evi-dence that subcortical and neocortical routes for visualprocessing are involved differentially in emotional ex-pression analysis A subcortical magnocellular pathwayto the amygdala would appear to support valence dis-crimination processes whereas parvocellular subsys-tems of ventral visual cortices may be preferentially in-volved in emotional intensity evaluation irrespective ofemotional valence (Schyns amp Oliva 1999 VuilleumierArmony Driver amp Dolan 2003) Recent neuroimagingresults (Vuilleumier et al 2003) suggest that low andhigh spatial frequency components of fearful faces se-lectively drive amygdala and visual cortical responsesrespectively However although enhanced amygdala ac-tivation was found in response to low-spatial-frequencyfearful face stimuli explicit judgments relating to theperceived intensity of fearfulness were increased by thepresence of high-spatial-frequency cues These resultssupport the view that coarse visual information may bedirected via magnocellular channels from the retina tothe amygdala through a tectopulvinar pathway (eg Bistiamp Sireteanu 1976 Jones amp Burton 1976) enabling thefast appraisal of the affective significance of a stimulus(eg Morris Oumlhman amp Dolan 1999)3

Another aim of the present study was to investigatewhether the face-specific N170 component which is as-sumed to reflect the structural encoding of faces is sen-sitive to emotional facial expressions In previous ERPstudies which have not found any modulations of theN170 elicited by fearful relative to neutral faces (Eimeramp Holmes 2002 Holmes et al 2003) facial expressionwas always task irrelevant In contrast participantsrsquo re-sponses were contingent upon facial expression in thepresent emotion task In spite of this fact the N170 wasfound to be completely unaffected by facial expressionsin the emotion task and this was consistently the casefor all six emotional expressions used in the presentstudy (Figure 5)

In line with earlier findings from depth electrodes (McCarthy, Puce, Belger, & Allison, 1999), this pattern of results now demonstrates comprehensively that the structural encoding of faces, as reflected by the N170, is entirely insensitive to information derived from emotional facial expression. Thus, the rapid detection of emotional facial expression appears to occur independently and in parallel to the construction of a detailed perceptual representation of a face. The absence of systematic early emotional expression effects at posterior sites, and the presence of such ERP effects at frontocentral electrodes at about 160 msec poststimulus, suggests that higher order visual processing stages involved in face processing are affected by emotional facial expression only after this information has been processed in prefrontal cortex. This is consistent with the face processing model proposed by Bruce and Young (1986), in which the extraction of perceptual information for emotional expression processing occurs independently and simultaneously with structural encoding for face recognition.
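The dissociation described above (an emotional-minus-neutral difference at frontocentral sites, together with an unchanged N170 at lateral posterior sites) can be expressed as simple mean-amplitude measurements on condition averages. The sketch below is illustrative only and is not the authors' analysis code: the sampling rate, the electrode labels (Fz, P8), the latency windows, and the simulated waveforms are all assumptions introduced for the example.

```python
import numpy as np

SFREQ = 250  # assumed sampling rate in Hz

def mean_amplitude(waveform, tmin_ms, tmax_ms, sfreq=SFREQ):
    """Mean amplitude of a 1-D average waveform in a latency window.

    The waveform is assumed to be time-locked to stimulus onset at sample 0.
    """
    i0 = int(round(tmin_ms / 1000 * sfreq))
    i1 = int(round(tmax_ms / 1000 * sfreq))
    return float(waveform[i0:i1].mean())

# Hypothetical condition averages: dict mapping (electrode, valence) -> waveform
rng = np.random.default_rng(1)
averages = {(ch, val): rng.normal(size=int(0.7 * SFREQ))   # 700-msec epochs
            for ch in ("Fz", "P8") for val in ("emotional", "neutral")}

# Frontocentral emotional expression effect: emotional minus neutral at Fz,
# measured here in an illustrative 160-300 msec window
fc_effect = (mean_amplitude(averages[("Fz", "emotional")], 160, 300)
             - mean_amplitude(averages[("Fz", "neutral")], 160, 300))

# N170 check at a lateral posterior site (illustrative 150-190 msec window):
# the two valence conditions should not differ if structural encoding is unaffected
n170_diff = (mean_amplitude(averages[("P8", "emotional")], 150, 190)
             - mean_amplitude(averages[("P8", "neutral")], 150, 190))

print(f"frontocentral effect: {fc_effect:.2f} uV, N170 difference: {n170_diff:.2f} uV")
```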

In summary, the present ERP results demonstrate that the neocortical processing of emotional facial expression is strongly dependent on focal attention. When faces were attended, systematic emotional expression effects were elicited by emotional relative to neutral faces, and these effects were strikingly similar in terms of their timing and morphology for all six basic facial expressions. In contrast, when attention was actively directed away from these faces, emotional expression effects were completely eliminated. The rapid and automatic encoding of emotionally significant events occurring outside the focus of attention may be adaptively advantageous, because it prepares the organism for fight or flight through subcortically mediated autonomic activation (e.g., Öhman, Flykt, & Lundqvist, 2000). However, it is equally important that irrelevant affective stimuli do not continuously divert attention. This suggests a division of labor between limbic structures involved in the obligatory detection of emotional information, preparing the organism for rapid action (Morris et al., 1999; Whalen et al., 1998), and subsequent neocortical emotional processing stages. Limbic structures may be responsible for establishing a readiness to respond to any environmental threat that could become the focus of attention, presumably through heightened autonomic activation. However, neocortical stages appear to be protected by efficient attentional gating mechanisms, which reduce distractibility by emotional stimuli, so that ongoing goals and plans can be accomplished without interference from irrelevant events.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral & Cognitive Neuroscience Reviews, 1, 21-61.
Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.
Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fascicularis). Journal of Comparative Neurology, 230, 465-496.
Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.
Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 7, 817-826.
Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.
Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.
Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.
Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.
Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.
Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.
Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.
Critchley, H. [D.], Daly, E. [M.], Phillips, M., Brammer, M., Bullmore, E., Williams, S. [C.], van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. [G. M.] (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.
Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.
Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.
Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.
Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.
Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.
Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.
Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.
Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.
Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.
Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.
Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.
Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.
Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.
Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.
Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.
Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.
Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.
McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.
Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.
Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.
Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.
Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.
Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.
Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.
Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.
Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.
Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.
Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.
Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.
Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.
Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.
Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.
Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12, 709-714.
Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: When categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69, 243-265.
Sprengelmeyer, R., Rausch, M., Eysel, U. T., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society of London: Series B, 265, 1927-1931.
Stepniewska, I., Qi, H. X., & Kaas, J. H. (1999). Do superior colliculus projection zones in the inferior pulvinar project to MT in primates? European Journal of Neuroscience, 11, 469-480.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624-631.
Vuilleumier, P., & Schwartz, S. (2001a). Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect. NeuroReport, 12, 1119-1122.
Vuilleumier, P., & Schwartz, S. (2001b). Emotional expressions capture attention. Neurology, 56, 153-158.
Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.
Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1, 70-83.
Yeterian, E. H., & Pandya, D. N. (1991). Corticothalamic connections of the superior temporal sulcus in rhesus monkeys. Experimental Brain Research, 83, 268-284.

NOTES

1. In spite of the fact that significant valence effects were present at central electrodes in the emotion task but were absent in the lines task, this interaction failed to reach significance at central sites.

2. At lateral occipital electrodes, a significantly enhanced positivity for emotional relative to neutral faces was present between 320 and 495 msec in the emotion task [F(1,13) = 6.2, p < .03], but not in the lines task, and this was reflected in a nearly significant task × valence interaction [F(1,13) = 4.6, p < .06].
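As an illustration of how a test like the task × valence interaction in note 2 could be computed, the sketch below assumes a long-format table of mean amplitudes (one row per participant, task, and valence condition for the 320-495 msec window) with hypothetical column names and simulated values; it is not the authors' analysis script. A two-way repeated-measures ANOVA with 14 participants yields the F(1,13) degrees of freedom reported above.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: mean amplitude (in microvolts) in the
# 320-495 msec window for each participant in each task (emotion vs. lines)
# and valence (emotional vs. neutral) condition. Values are simulated.
rng = np.random.default_rng(2)
rows = []
for subj in range(1, 15):                      # 14 participants assumed
    for task in ("emotion", "lines"):
        for valence in ("emotional", "neutral"):
            effect = 1.0 if (task == "emotion" and valence == "emotional") else 0.0
            rows.append({"subject": subj, "task": task, "valence": valence,
                         "amplitude": effect + rng.normal(scale=1.0)})
data = pd.DataFrame(rows)

# Two-way repeated-measures ANOVA; the task x valence interaction term tests
# whether the emotional-minus-neutral difference depends on the task
result = AnovaRM(data, depvar="amplitude", subject="subject",
                 within=["task", "valence"]).fit()
print(result)
```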

3. It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking, since the medial pulvinar, which projects to the amygdala, does not receive a significant direct input from the superior colliculus (e.g., Stepniewska, Qi, & Kaas, 1999). However, possible connections between the inferior pulvinar (which receives visual inputs from the superior colliculus) and the medial nucleus may support the transmission of information to the amygdala through a colliculo-pulvinar route. Alternatively, cortical input may be involved, since STS (implicated in facial expression processing; Sprengelmeyer et al., 1998) is known to project to the medial pulvinar (Yeterian & Pandya, 1991). Our thanks to an anonymous reviewer for raising this important point.

(Manuscript received December 10, 2002; revision accepted for publication June 11, 2003.)


Figure 7 Grand-averaged ERP waveforms elicited in the emotion task (toppanel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays contain-ing neutral faces (solid lines) or emotional faces (dashed lines) ERPs are col-lapsed across blocks including each of the six different emotional facialexpressions and are shown separately for the first block (left) and the secondblock (right) including one specific emotional expression

108 EIMER HOLMES AND MCGLONE

amygdala (Liu et al 1999) Conversely later stagesmight be involved in the in-depth processing of variouskinds of affective information and thus would be muchless selective with respect to different facial expressions

This suggestion is consistent with some recent evi-dence that subcortical and neocortical routes for visualprocessing are involved differentially in emotional ex-pression analysis A subcortical magnocellular pathwayto the amygdala would appear to support valence dis-crimination processes whereas parvocellular subsys-tems of ventral visual cortices may be preferentially in-volved in emotional intensity evaluation irrespective ofemotional valence (Schyns amp Oliva 1999 VuilleumierArmony Driver amp Dolan 2003) Recent neuroimagingresults (Vuilleumier et al 2003) suggest that low andhigh spatial frequency components of fearful faces se-lectively drive amygdala and visual cortical responsesrespectively However although enhanced amygdala ac-tivation was found in response to low-spatial-frequencyfearful face stimuli explicit judgments relating to theperceived intensity of fearfulness were increased by thepresence of high-spatial-frequency cues These resultssupport the view that coarse visual information may bedirected via magnocellular channels from the retina tothe amygdala through a tectopulvinar pathway (eg Bistiamp Sireteanu 1976 Jones amp Burton 1976) enabling thefast appraisal of the affective significance of a stimulus(eg Morris Oumlhman amp Dolan 1999)3

Another aim of the present study was to investigatewhether the face-specific N170 component which is as-sumed to reflect the structural encoding of faces is sen-sitive to emotional facial expressions In previous ERPstudies which have not found any modulations of theN170 elicited by fearful relative to neutral faces (Eimeramp Holmes 2002 Holmes et al 2003) facial expressionwas always task irrelevant In contrast participantsrsquo re-sponses were contingent upon facial expression in thepresent emotion task In spite of this fact the N170 wasfound to be completely unaffected by facial expressionsin the emotion task and this was consistently the casefor all six emotional expressions used in the presentstudy (Figure 5)

In line with earlier findings from depth electrodes (Mc-Carthy Puce Belger amp Allison 1999) this pattern of re-sults now demonstrates comprehensively that the structuralencoding of faces as reflected by the N170 is entirely in-sensitive to information derived from emotional facial ex-pression Thus the rapid detection of emotional facialexpression appears to occur independently and in paral-lel to the construction of a detailed perceptual representa-tion of a face The absence of systematic early emotionalexpression effects at posterior sites and the presence ofsuch ERP effects at frontocentral electrodes at about160 msec poststimulus suggests that higher order visualprocessing stages involved in face processing are af-fected by emotional facial expression only after this in-formation has been processed in prefrontal cortex Thisis consistent with the face processing model proposed byBruce and Young (1986) in which the extraction of per-

ceptual information for emotional expression processingoccurs independently and simultaneously with structuralencoding for face recognition

In summary the present ERP results demonstrate thatthe neocortical processing of emotional facial expres-sion is strongly dependent on focal attention Whenfaces were attended systematic emotional expression ef-fects were elicited by emotional relative to neutral facesand these effects were strikingly similar in terms of theirtiming and morphology for all six basic facial expres-sions In contrast when attention was actively directedaway from these faces emotional expression effectswere completely eliminated The rapid and automatic en-coding of emotionally significant events occurring out-side the focus of attention may be adaptively advanta-geous because it prepares the organism for f ight orflight through subcortically mediated autonomic activa-tion (eg Oumlhman Flykt amp Lundqvist 2000) Howeverit is equally important that irrelevant affective stimuli donot continuously divert attention This suggests a divi-sion of labor between limbic structures involved in theobligatory detection of emotional informationmdashprepar-ing the organism for rapid action (Morris et al 1999Whalen et al 1998)mdashand subsequent neocortical emo-tional processing stages Limbic structures may be re-sponsible for establishing a readiness to respond to anyenvironmental threat that could become the focus of at-tention presumably through heightened autonomic acti-vation However neocortical stages appear to be pro-tected by efficient attentional gating mechanisms whichreduce distractibility by emotional stimuli so that ongo-ing goals and plans can be accomplished without inter-ference from irrelevant events

REFERENCES

Adolphs R (2002) Recognizing emotion from facial expressionsPsychological and neurological mechanisms Behavioral CognitiveNeuroscience Review 1 21-61

Adolphs R Tranel D amp Damasio A R ( 2003) Dissociableneural systems for recognizing emotions Brain amp Cognition 52 61-69

Amaral D G amp Price J L (1984) Amygdalo-cortical projectionsin the monkey (Macaca fasicularis) Journal of Comparative Neu-rology 230 465-496

Amaral D G Price J L Pitkanen A amp Carmichael S T(1992) Anatomical organization of the primate amygdaloid complexIn J P Aggleton (Ed) The amygdala Neurobiological aspects ofemotion memory and mental dysfunction (pp 1-66) New YorkWiley-Liss

Armony J L amp Dolan R J (2002) Modulation of spatial attentionby fear-conditioned stimuli An event-related fMRI study Neuro-psychologia 7 817-826

Bentin S Allison T Puce A Perez E amp McCarthy G (1996)Electrophysiological studies of face perception in humans Journal ofCognitive Neuroscience 8 551-565

Bisti S amp Sireteanu R C (1976) Sensitivity to spatial frequencyand contrast of visual cells in the cat superior colliculus Vision Re-search 16 247-251

Blair R J R Morris J S Frith C D Perrett D I amp DolanR J (1999) Dissociable neural responses to facial expressions ofsadness and anger Brain 122 883-893

Breiter H C Etcoff N L Whalen P J Kennedy W ARauch S L Buckner R L Strauss M M Hyman S E amp

ATTENTION AND FACIAL EXPRESSION PROCESSING 109

Rosen B R (1996) Response and habituation of the human amyg-dala during visual processing of facial expression Neuron 17 875-887

Bruce V amp Young A (1986) Understanding face recognitionBritish Journal of Psychology 77 305-327

Calder A J Keane J Manes F Antoun N amp Young A W(2000) Impaired recognition and experience of disgust followingbrain injury Nature Neuroscience 3 1077-1078

Calder A J Lawrence A D amp Young A W (2001) Neuropsy-chology of fear and loathing Nature Reviews Neuroscience 2 352-363

Carrasco M Penpeci-Talgar C amp Eckstein M (2000) Spatialcovert attention increases contrast sensitivity across the CSF Sup-port for signal enhancement Vision Research 40 1203-1215

Critchley H [D] Daly E [M] Phillips M Brammer M Bull-more E Williams S [C] van Amelsvoort T Robertson DDavid A amp Murphy D [G M] (2000) Explicit and implicit neuralmechanisms for processing of social information from facial expres-sions A functional magnetic resonance imaging study Human BrainMapping 9 93-105

Cuthbert B N Schupp H T Bradley M M Birbaumer N ampLang P J (2000) Brain potentials in affective picture processingCovariation with autonomic arousal and affective report BiologicalPsychology 52 95-111

Damasio A R (1994) Descartesrsquo error Emotion reason and the humanbrain New York G P Putnamrsquos Sons

Diedrich O Naumann E Maier S amp Becker G (1997) Afrontal slow wave in the ERP associated with emotional slides Jour-nal of Psychophysiology 11 71-84

Eastwood J D Smilek D amp Merikle P M (2001) Differential at-tentional guidance by unattended faces expressing positive and neg-ative emotion Perception amp Psychophysics 63 1004-1013

Eimer M (1998) Does the face-specif ic N170 component reflect theactivity of a specialized eye detector NeuroReport 9 2945-2948

Eimer M (2000) The face-specific N170 component reflects late stagesin the structural encoding of faces NeuroReport 11 2319-2324

Eimer M amp Holmes A (2002) An ERP study on the time course ofemotional face processing NeuroReport 13 427-431

Ekman P amp Friesen W V (1976) Pictures of facial affect Palo AltoCA Consulting Psychologists Press

Fox E Lester V Russo R Bowles R J Pichler A amp Dutton K(2000) Facial expressions of emotion Are angry faces detected moreefficiently Cognition amp Emotion 14 61-92

Hansen C H amp Hansen R D (1988) Finding the face in the crowdAn anger superiority effect Journal of Personality amp Social Psy-chology 54 917-924

Hariri A R Bookheimer S Y amp Mazziotta J C (2000) Modu-lating emotional responses Effects of a neocortical network on thelimbic system NeuroReport 11 43-48

Harmer C J Thilo K V Rothwell J C amp Goodwin G M(2001) Transcranial magnetic stimulation of medial-frontal corteximpairs the processing of angry facial expressions Nature Neuro-science 4 17-18

Holmes A Vuilleumier P amp Eimer M (2003) The processing ofemotional facial expression is gated by spatial attention Evidencefrom event-related brain potentials Cognitive Brain Research 16174-184

Jones E G amp Burton H (1976) A projection from the medial pul-vinar to the amygdala in primates Brain Research 104 142-147

Kawasaki H Kaufman O Damasio H Damasio A R Gran-ner M Bakken H Hori T Howard M A III amp Adolphs R(2001) Single-neuron responses to emotional visual stimuli recordedin human ventral prefrontal cortex Nature Neuroscience 4 15-16

Lane R D Chua P M amp Dolan R J (1999) Common effects ofemotional valence arousal and attention on neural activation duringvisual processing of pictures Neuropsychologia 37 989-997

Lang P J Bradley M M Fitzsimmons J R Cuthbert B NScott J D Moulder B amp Nangia V (1998) Emotional arousaland activation of the visual cortex An fMRI analysis Psychophysi-ology 35 199-210

Le Doux J E (1996) The emotional brain New York Simon ampSchuster

Liu L Ioannides A A amp Streit M (1999) Single trial analysis ofneurophysiological correlates of the recognition of complex objectsand facial expressions of emotion Brain Topography 11 291-303

McCarthy G Puce A Belger A amp Allison T (1999) Electro-physiological studies of human face perception II Response proper-ties of face-specific potentials generated in occipitotemporal cortexCerebral Cortex 9 431-444

Mogg K amp Bradley B P (1999) Orienting of attention to threaten-ing facial expressions presented under conditions of restricted aware-ness Cognition amp Emotion 13 713-740

Mogg K McNamara J Powys M Rawlinson H Seiffer A ampBradley B P (2000) Selective attention to threat A test of twocognitive models of anxiety Cognition amp Emotion 14 375-399

Morris J S Friston K J Buechel C Frith C D Young A WCalder A J amp Dolan R J (1998) A neuromodulatory role forthe human amygdala in processing emotional facial expressionsBrain 121 47-57

Morris J S Frith C D Perrett D I Rowland D YoungA W Calder A J amp Dolan R J (1996) A differential neural re-sponse in the human amygdala to fearful and happy facial expres-sions Nature 383 812-815

Morris J S Oumlhman A amp Dolan R J (1999) A subcortical path-way to the right amygdala mediating ldquounseenrdquo fear Proceedings ofthe National Academy of Sciences 96 1680-1685

Oumlhman A Flykt A amp Esteves F (2001) Emotion drives attentionDetecting the snake in the grass Journal of Experimental Psychol-ogy General 130 466-478

Oumlhman A Flykt A amp Lundqvist D (2000) Unconscious emo-tion Evolutionary perspectives psychophysiological data and neu-ropsychological mechanisms In R D Lane amp L Nadel (Eds) Cog-nitive neuroscience of emotion (pp 296-327) New York OxfordUniversity Press

Oumlhman A Lundqvist D amp Esteves F (2001) The face in thecrowd revisited A threat advantage with schematic stimuli Journalof Personality amp Social Psychology 80 381-396

Pessoa L Kastner S amp Ungerleider LG (2002) Attentionalcontrol of the processing of neutral and emotional stimuli CognitiveBrain Research 15 31-45

Pessoa L McKenna M Gutierrez E amp Ungerleider LG(2002) Neural processing of emotional faces requires attention Pro-ceedings of the National Academy of Sciences 99 11458-11463

Phillips M L Young A W Scott S K Calder A J Andrew CGiampietro V Williams S C R Bullmore E T BrammerMamp Gray J A (1998) Neural responses to facial and vocal expres-sions of fear and disgust Proceedings of the Royal Society LondonSeries B 265 1809-1817

Phillips M L Young A W Senior C Brammer M Andrew CCalder A J Bullmore E T Perrett D I Rowland D ampWilliams S C R et al (1997) A specif ic neural substrate forperceiving facial expressions of disgust Nature 389 495-498

Pizzagalli D Regard M amp Lehmann D (1999) Rapid emotionalface processing in the human right and left brain hemispheres AnERP study NeuroReport 10 2691-2698

Rapcsak S Z Galper S R Comer J F Reminger S L NielsenLKaszniak A W Verfaellie M Laguna J F Labiner D Mamp Cohen R A (2000) Fear recognition deficits after focal braindamage Neurology 54 575-581

Rolls E T (1999) The brain and emotion Oxford Oxford UniversityPress

Sato W Kochiyama T Yoshikawa S amp Matsumura M (2001)Emotional expression boosts early visual processing of the face ERPrecording and its decomposition by independent component analysisNeuroReport 12 709-714

Schyns P G amp Oliva A (1999) Dr Angry and Mr Smile Whencategorization flexibly modifies the perception of faces in rapid vi-sual presentations Cognition 69 243-265

Sprengelmeyer R Rausch M Eysel U T amp Przuntek H (1998)Neural structures associated with recognition of facial expressions of

110 EIMER HOLMES AND MCGLONE

basic emotions Proceedings of the Royal Society London Series B265 1927-1931

Stepniewska I Qi H X amp Kaas J H (1999) Do superior collicu-lus projection zones in the inferior pulvinar project to MT in pri-mates European Journal of Neuroscience 11 469-480

Vuilleumier P Armony J L Driver J amp Dolan R J (2001) Ef-fects of attention and emotion on face processing in the human brainAn event-related fMRI study Neuron 30 829-841

Vuilleumier P Armony J L Driver J amp Dolan R J (2003)Distinct spatial frequency sensitivities for processing faces and emo-tional expressions Nature Neuroscience 6 624-631

Vuilleumier P amp Schwartz S (2001a) Beware and be aware Cap-ture of spatial attention by fear-related stimuli in neglect Neuro-Report 12 1119-1122

Vuilleumier P amp Schwartz S (2001b) Emotional expressionscapture attention Neurology 56 153-158

Whalen P J Rach S L Etcoff N L McInerney S C LeeM B amp Jenike M A (1998) Masked presentations of emotionalfacial expressions modulate amygdala activity without explicitknowledge Journal of Neuroscience 18 411-418

Whalen P J Shin L M McInerney S C Fischer H WrightC I amp Rauch S L (2001) A functional MRI study of humanamygdala responses to facial expressions of fear versus anger Emo-tion 1 70-83

Yeterian E H amp Pandya D N (1991) Corticothalamic connec-tions of the superior temporal sulcus in rhesus monkeys Experimen-tal Brain Research 83 268-284

NOTES

1 In spite of the fact that significant valence effects were present atcentral electrodes in the emotion task but were absent in the lines taskthis interaction failed to reach significance at central sites

2 At lateral occipital electrodes a significantly enhanced positivityfor emotional relative to neutral faces was present between 320 and495 msec in the emotion task [F(113) = 62 p lt 03] but not in thelines task and this was reflected in a nearly significant task 3 valenceinteraction [F(113) = 46 p lt 06]

3 It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking since the medial pulvinarwhich projects to the amygdala does not receive a significant directinput from the superior colliculus (eg Stepniewska Qi amp Kaas1999) However possible connections between the inferior pulvinar(which receives visual inputs from the superior colliculus) and the me-dial nucleus may support the transmission of information to the amyg-dala through a colliculo-pulvinar route Alternatively cortical inputmay be involved since STS (implicated in facial expression processingSprengelmeyer et al 1998) is known to project to the medial pulvinar(Yeterian amp Pandya 1991) Our thanks to an anonymous reviewer forraising this important point

(Manuscript received December 10 2002revision accepted for publication June 11 2003)

106 EIMER HOLMES AND MCGLONE

ERP waveforms That is emotional expression effectswere completely eliminated for all six basic emotions in-cluded in this experiment In line with this ERP resultperformance in the lines task was entirely unaffected bythe expression of the faces presented simultaneouslywith the task-relevant line pairs Overall these findingsextend and confirm the observations of our previousERP experiment which compared ERPs in response tofearful versus neutral faces (Holmes et al 2003) Clearlythese results challenge the hypothesis that the detectionandor processing of emotional facial expression occurspreattentively If this were the case at least some system-atic ERP differences should have been elicited in responseto emotional versus neutral faces in the lines task reflect-ing the automatic detection of emotionally significantevents

Covert attention toward emotional faces under condi-tions when they were task relevant may have enhancedtheir visualndashperceptual representation (eg CarrascoPenpeci-Talgar amp Eckstein 2000) thereby enabling theextraction of features relating to the affective valence ofthese faces and thus their subsequent encoding andanalysis (as reflected by the emotion-specific ERP effectsobserved in the emotion task) The early frontocentrallydistributed emotional expression effects may be mediatedby connections from the superior temporal sulcus (STS)and amygdala to orbitofrontal cortex (Rolls 1999) TheSTS has been implicated in the early discrimination ofvisual features relating to emotional facial expressions(eg Sprengelmeyer et al 1998) In addition efferentfeedback projections from the amygdala and relatedstructures (see Lang et al 1998 Morris et al 1998)may have produced the more broadly distributed emo-tional expression effects observed in the present experi-ment at longer latencies

One could argue that the absence of emotional ex-pression effects under conditions where faces were un-attended may have been due to the fact that the presen-tation of specific emotional expressions was blocked andthat each expression was presented repeatedly in twoseparate blocks Repeated exposure to a specific emo-tional expression may have resulted in a gradual habitu-ation of emotion-specific responses thus potentially at-tenuating any emotional expression effects that may havebeen present in the lines task To investigate this possi-bility we computed separate averages for the first blockand for the second block including angry disgustedfearful happy sad or surprised faces separately for theemotion and for the lines task These data were then an-alyzed with the additional factor of block position (firstvs second block containing a specific emotional facialexpression) If emotional expression effects were subjectto habituation one would expect to find larger emotionalexpression effects for the first relative to the second blockin the emotion task and potentially also a residual emo-tional expression effect for the first block in the lines task

Figure 7 shows ERPs elicited at Fz in response to neu-tral faces (solid lines) or emotional faces (dashed lines)

collapsed across all six different emotional expressionsERPs are displayed separately for the emotion task (toppanel) and the lines task (bottom panel) and for the firstblock (left) or second block (right) including one of thesix emotional expressions As can be seen from Figure 7(top) there was no evidence whatsoever for any habitu-ation of emotional expression effects as a function ofblock position in the emotion task This was confirmedby the absence of any block position 3 valence or blockposition 3 block type 3 valence interactions for all latencywindows employed in the analyses reported above [allFs(113) lt 1] Along similar lines Figure 7 (bottom panel)suggests that there was no residual emotional expressioneffect for the first block including a specific emotionalexpression in the lines task This was confirmed by theabsence of any interactions involving block position [allFs(113) lt 16] Thus the fact that emotional expressioneffects were absent in response to unattended faces in thelines task is unlikely to have been the result of a habitu-ation of emotion-specific brain responses

The conclusion that the processing of emotional facialexpression as reflected by ERP facial expression effectsis gated by spatial attention appears to be inconsistentwith neuroimaging studies demonstrating that fearfulfaces result in amygdala activations even when thesefaces are outside the focus of attention (Vuilleumieret al 2001 see also Morris et al 1996 Whalen et al1998) However it is extremely unlikely that the ERP ef-fects observed in the present study are directly linked toamygdala activations Due to its nuclear structure ofclustered neurones the amygdala is electrically closedand thus largely inaccessible to ERP measures The earlyemotional expression effects observed in response to at-tended faces are more likely to be generated in prefrontalcortex where emotion-specific single-cell responseshave recently been recorded at short latencies (Kawasakiet al 2001) Such prefrontal responses may reflect stagesin emotional processing that could be contingent uponbut functionally separate from prior amygdala activa-tions (see Le Doux 1996 Rolls 1999) It is possible thatamygdala responses can be triggered by unattendedemotional stimuli (although these responses may be at-tenuated) whereas subsequent neocortical stages ofemotional processing (as reflected by the ERP effects ob-served in the present experiment) are fully dependent onfocal attention An alternative possibility is that amyg-dala responses to emotional stimuli may also require at-tention (see Pessoa Kastner amp Ungerleider 2002 Pes-soa McKenna et al 2002) and that the elimination ofemotional expression effects in the lines task reflects anearlier attentional gating of such subcortical processing

Another important new finding of the present experi-ment was that the onset time course and scalp distribu-tion of emotional expression effects obtained in the emo-tion task were remarkably similar for all six basic facialexpressions used here (Figures 5 and 6) The absence ofany differential ERP responses to different emotional ex-pressions was reflected by the absence of any significant

ATTENTION AND FACIAL EXPRESSION PROCESSING 107

interactions between block type (blocks with angry dis-gusted fearful happy sad or surprised faces) and va-lence (emotional vs neutral expression) In line withthese observations the size of the RT advantage for emo-tional relative to neutral faces in the emotion task wassimilar for all six emotional facial expressions (Figure 2top panel) The similarity in the time course of emotionalexpression effects across all six emotional expressionsobserved here suggests that emotionally relevant infor-mation delivered by facial expression is available to neo-cortical processes within less then 200 msec after stim-ulus onset and at approximately the same time for allbasic emotional expressions

These observations do not seem to support the ideasuggested by recent f MRI results that distinct neuralsubsystems specialize in the processing of specific emo-tions (Adolphs 2002) If this were the case one mighthave expected some systematic differences between ERPemotional expression effects elicited by different facial

expressions However it should be noted that althoughsome neuroimaging data show emotion-specific differ-ential activation of brain regions such as the amygdala orinsula few studies point to differential activation withinsurface cortical structures (where the ERP effects ob-served in the present experiments are likely to be gener-ated see also Pizzagalli et al 1999 Sato et al 2001for related results from recent ERP studies)

Thus one could argue that early stages in the processingof emotionally relevant information subserved by lim-bic structures or the basal ganglia and subsequent neo-cortical emotional processing stages differ not only intheir dependence on focal attention (see above) but alsoin their specificity Early processes may be differentiallyengaged by specific emotional expressions thus provid-ing a rapid classif ication of emotionally significantevents Data in support of this view come from single-unit recordings which reveal a rapid emergence of dif-ferential effects to emotional expressions in the human

Figure 7 Grand-averaged ERP waveforms elicited in the emotion task (toppanel) and in the lines task (bottom panel) at midline electrode Fz in the 700-msec interval following stimulus onset in response to stimulus arrays contain-ing neutral faces (solid lines) or emotional faces (dashed lines) ERPs are col-lapsed across blocks including each of the six different emotional facialexpressions and are shown separately for the first block (left) and the secondblock (right) including one specific emotional expression

108 EIMER HOLMES AND MCGLONE

amygdala (Liu et al 1999) Conversely later stagesmight be involved in the in-depth processing of variouskinds of affective information and thus would be muchless selective with respect to different facial expressions

This suggestion is consistent with some recent evi-dence that subcortical and neocortical routes for visualprocessing are involved differentially in emotional ex-pression analysis A subcortical magnocellular pathwayto the amygdala would appear to support valence dis-crimination processes whereas parvocellular subsys-tems of ventral visual cortices may be preferentially in-volved in emotional intensity evaluation irrespective ofemotional valence (Schyns amp Oliva 1999 VuilleumierArmony Driver amp Dolan 2003) Recent neuroimagingresults (Vuilleumier et al 2003) suggest that low andhigh spatial frequency components of fearful faces se-lectively drive amygdala and visual cortical responsesrespectively However although enhanced amygdala ac-tivation was found in response to low-spatial-frequencyfearful face stimuli explicit judgments relating to theperceived intensity of fearfulness were increased by thepresence of high-spatial-frequency cues These resultssupport the view that coarse visual information may bedirected via magnocellular channels from the retina tothe amygdala through a tectopulvinar pathway (eg Bistiamp Sireteanu 1976 Jones amp Burton 1976) enabling thefast appraisal of the affective significance of a stimulus(eg Morris Oumlhman amp Dolan 1999)3

Another aim of the present study was to investigatewhether the face-specific N170 component which is as-sumed to reflect the structural encoding of faces is sen-sitive to emotional facial expressions In previous ERPstudies which have not found any modulations of theN170 elicited by fearful relative to neutral faces (Eimeramp Holmes 2002 Holmes et al 2003) facial expressionwas always task irrelevant In contrast participantsrsquo re-sponses were contingent upon facial expression in thepresent emotion task In spite of this fact the N170 wasfound to be completely unaffected by facial expressionsin the emotion task and this was consistently the casefor all six emotional expressions used in the presentstudy (Figure 5)

In line with earlier findings from depth electrodes (Mc-Carthy Puce Belger amp Allison 1999) this pattern of re-sults now demonstrates comprehensively that the structuralencoding of faces as reflected by the N170 is entirely in-sensitive to information derived from emotional facial ex-pression Thus the rapid detection of emotional facialexpression appears to occur independently and in paral-lel to the construction of a detailed perceptual representa-tion of a face The absence of systematic early emotionalexpression effects at posterior sites and the presence ofsuch ERP effects at frontocentral electrodes at about160 msec poststimulus suggests that higher order visualprocessing stages involved in face processing are af-fected by emotional facial expression only after this in-formation has been processed in prefrontal cortex Thisis consistent with the face processing model proposed byBruce and Young (1986) in which the extraction of per-

ceptual information for emotional expression processingoccurs independently and simultaneously with structuralencoding for face recognition

In summary the present ERP results demonstrate thatthe neocortical processing of emotional facial expres-sion is strongly dependent on focal attention Whenfaces were attended systematic emotional expression ef-fects were elicited by emotional relative to neutral facesand these effects were strikingly similar in terms of theirtiming and morphology for all six basic facial expres-sions In contrast when attention was actively directedaway from these faces emotional expression effectswere completely eliminated The rapid and automatic en-coding of emotionally significant events occurring out-side the focus of attention may be adaptively advanta-geous because it prepares the organism for f ight orflight through subcortically mediated autonomic activa-tion (eg Oumlhman Flykt amp Lundqvist 2000) Howeverit is equally important that irrelevant affective stimuli donot continuously divert attention This suggests a divi-sion of labor between limbic structures involved in theobligatory detection of emotional informationmdashprepar-ing the organism for rapid action (Morris et al 1999Whalen et al 1998)mdashand subsequent neocortical emo-tional processing stages Limbic structures may be re-sponsible for establishing a readiness to respond to anyenvironmental threat that could become the focus of at-tention presumably through heightened autonomic acti-vation However neocortical stages appear to be pro-tected by efficient attentional gating mechanisms whichreduce distractibility by emotional stimuli so that ongo-ing goals and plans can be accomplished without inter-ference from irrelevant events

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews, 1, 21-61.
Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain & Cognition, 52, 61-69.
Amaral, D. G., & Price, J. L. (1984). Amygdalo-cortical projections in the monkey (Macaca fascicularis). Journal of Comparative Neurology, 230, 465-496.
Amaral, D. G., Price, J. L., Pitkanen, A., & Carmichael, S. T. (1992). Anatomical organization of the primate amygdaloid complex. In J. P. Aggleton (Ed.), The amygdala: Neurobiological aspects of emotion, memory, and mental dysfunction (pp. 1-66). New York: Wiley-Liss.
Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study. Neuropsychologia, 7, 817-826.
Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.
Bisti, S., & Sireteanu, R. C. (1976). Sensitivity to spatial frequency and contrast of visual cells in the cat superior colliculus. Vision Research, 16, 247-251.
Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883-893.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., & Rosen, B. R. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875-887.
Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.
Calder, A. J., Keane, J., Manes, F., Antoun, N., & Young, A. W. (2000). Impaired recognition and experience of disgust following brain injury. Nature Neuroscience, 3, 1077-1078.
Calder, A. J., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352-363.
Carrasco, M., Penpeci-Talgar, C., & Eckstein, M. (2000). Spatial covert attention increases contrast sensitivity across the CSF: Support for signal enhancement. Vision Research, 40, 1203-1215.
Critchley, H. D., Daly, E. M., Phillips, M., Brammer, M., Bullmore, E., Williams, S. C., van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. G. M. (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study. Human Brain Mapping, 9, 93-105.
Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52, 95-111.
Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: G. P. Putnam's Sons.
Diedrich, O., Naumann, E., Maier, S., & Becker, G. (1997). A frontal slow wave in the ERP associated with emotional slides. Journal of Psychophysiology, 11, 71-84.
Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.
Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye detector? NeuroReport, 9, 2945-2948.
Eimer, M. (2000). The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport, 11, 2319-2324.
Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13, 427-431.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.
Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.
Hariri, A. R., Bookheimer, S. Y., & Mazziotta, J. C. (2000). Modulating emotional responses: Effects of a neocortical network on the limbic system. NeuroReport, 11, 43-48.
Harmer, C. J., Thilo, K. V., Rothwell, J. C., & Goodwin, G. M. (2001). Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nature Neuroscience, 4, 17-18.
Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.
Jones, E. G., & Burton, H. (1976). A projection from the medial pulvinar to the amygdala in primates. Brain Research, 104, 142-147.
Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., Hori, T., Howard, M. A., III, & Adolphs, R. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.
Lane, R. D., Chua, P. M., & Dolan, R. J. (1999). Common effects of emotional valence, arousal and attention on neural activation during visual processing of pictures. Neuropsychologia, 37, 989-997.
Lang, P. J., Bradley, M. M., Fitzsimmons, J. R., Cuthbert, B. N., Scott, J. D., Moulder, B., & Nangia, V. (1998). Emotional arousal and activation of the visual cortex: An fMRI analysis. Psychophysiology, 35, 199-210.
Le Doux, J. E. (1996). The emotional brain. New York: Simon & Schuster.
Liu, L., Ioannides, A. A., & Streit, M. (1999). Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion. Brain Topography, 11, 291-303.
McCarthy, G., Puce, A., Belger, A., & Allison, T. (1999). Electrophysiological studies of human face perception: II. Response properties of face-specific potentials generated in occipitotemporal cortex. Cerebral Cortex, 9, 431-444.
Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.
Mogg, K., McNamara, J., Powys, M., Rawlinson, H., Seiffer, A., & Bradley, B. P. (2000). Selective attention to threat: A test of two cognitive models of anxiety. Cognition & Emotion, 14, 375-399.
Morris, J. S., Friston, K. J., Buechel, C., Frith, C. D., Young, A. W., Calder, A. J., & Dolan, R. J. (1998). A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47-57.
Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812-815.
Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.
Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466-478.
Öhman, A., Flykt, A., & Lundqvist, D. (2000). Unconscious emotion: Evolutionary perspectives, psychophysiological data, and neuropsychological mechanisms. In R. D. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 296-327). New York: Oxford University Press.
Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.
Pessoa, L., Kastner, S., & Ungerleider, L. G. (2002). Attentional control of the processing of neutral and emotional stimuli. Cognitive Brain Research, 15, 31-45.
Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences, 99, 11458-11463.
Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., Williams, S. C. R., Bullmore, E. T., Brammer, M., & Gray, J. A. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London: Series B, 265, 1809-1817.
Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C. R., et al. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495-498.
Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.
Rapcsak, S. Z., Galper, S. R., Comer, J. F., Reminger, S. L., Nielsen, L., Kaszniak, A. W., Verfaellie, M., Laguna, J. F., Labiner, D. M., & Cohen, R. A. (2000). Fear recognition deficits after focal brain damage. Neurology, 54, 575-581.
Rolls, E. T. (1999). The brain and emotion. Oxford: Oxford University Press.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. NeuroReport, 12, 709-714.
Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: When categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69, 243-265.
Sprengelmeyer, R., Rausch, M., Eysel, U. T., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society of London: Series B, 265, 1927-1931.
Stepniewska, I., Qi, H. X., & Kaas, J. H. (1999). Do superior colliculus projection zones in the inferior pulvinar project to MT in primates? European Journal of Neuroscience, 11, 469-480.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624-631.
Vuilleumier, P., & Schwartz, S. (2001a). Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect. NeuroReport, 12, 1119-1122.
Vuilleumier, P., & Schwartz, S. (2001b). Emotional expressions capture attention. Neurology, 56, 153-158.
Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.
Whalen, P. J., Shin, L. M., McInerney, S. C., Fischer, H., Wright, C. I., & Rauch, S. L. (2001). A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion, 1, 70-83.
Yeterian, E. H., & Pandya, D. N. (1991). Corticothalamic connections of the superior temporal sulcus in rhesus monkeys. Experimental Brain Research, 83, 268-284.

NOTES

1. In spite of the fact that significant valence effects were present at central electrodes in the emotion task but were absent in the lines task, this interaction failed to reach significance at central sites.

2. At lateral occipital electrodes, a significantly enhanced positivity for emotional relative to neutral faces was present between 320 and 495 msec in the emotion task [F(1,13) = 6.2, p < .03] but not in the lines task, and this was reflected in a nearly significant task × valence interaction [F(1,13) = 4.6, p < .06].

3. It should be noted that anatomical evidence for a colliculo-pulvinar-amygdalar pathway is currently lacking, since the medial pulvinar, which projects to the amygdala, does not receive a significant direct input from the superior colliculus (e.g., Stepniewska, Qi, & Kaas, 1999). However, possible connections between the inferior pulvinar (which receives visual input from the superior colliculus) and the medial nucleus may support the transmission of information to the amygdala through a colliculo-pulvinar route. Alternatively, cortical input may be involved, since the STS (implicated in facial expression processing; Sprengelmeyer et al., 1998) is known to project to the medial pulvinar (Yeterian & Pandya, 1991). Our thanks to an anonymous reviewer for raising this important point.

(Manuscript received December 10, 2002; revision accepted for publication June 11, 2003.)
