Effects of touch on emotional face processing: A study of event-related potentials, facial EMG and cardiac activity
Spapé, M.M. (1,2), Harjunen (2,3), & Ravaja, N. (2,3,4)
1. Department of Psychology, Liverpool Hope University
2. Helsinki Institute for Information Technology, Aalto University
3. Aalto University, School of Business
4. University of Helsinki, Department of Social Research
Running head: Touch and Emotional Face ERPs
Word count
Text and references: 9581 words
Abstract: 154 words
Keywords: Emotion, EEG, ERP, EMG, emotional faces, touch, non-verbal communication
Correspondence: Michiel M. Spapé
Liverpool Hope University
Department of Psychology
Hope Park
Liverpool
L16 9JD
United Kingdom
This work was supported by the Academy of Finland, project number 268999.
© 2017. This manuscript version is made available under the CC-BY-NC-ND 4.0
license http://creativecommons.org/licenses/by-nc-nd/4.0/
Abstract
Being touched is known to affect emotion, and even a casual touch can elicit positive feelings and affinity.
Psychophysiological studies have recently shown that tactile primes affect visual evoked potentials to emotional
stimuli, suggesting altered affective stimulus processing. As, however, these studies approached emotion from a
purely unidimensional perspective, it remains unclear whether touch biases emotional evaluation or a more
general feature such as salience. Here, we investigated how simple tactile primes modulate event related
potentials (ERPs), facial EMG and cardiac response to pictures of facial expressions of emotion. All measures
replicated known effects of emotional face processing: Disgust and fear modulated early ERPs, anger increased
the cardiac orienting response, and expressions elicited emotion-congruent facial EMG activity. Tactile primes
also affected these measures, but priming never interacted with the type of emotional expression. Thus, touch
may additively affect general stimulus processing, but it does not bias or modulate immediate affective
evaluation.
Introduction
Touch is well-known to affect emotions and cognition. Early research focused on the critical role of
touch in the formation of social bonds that is necessary for the emotional development of both humans (Spitz,
1945) and other primates (Harlow, 1958). This idea of a deep relationship between touch and closeness remains
current in the conceptualization of touch, with recent studies suggesting that the degree of allowed touch can
predict the closeness of a relationship (Suvilehto, Glerean, Dunbar, Hari, & Nummenmaa, 2015), and that the
degree of actual touch can predict functional connectivity of the social brain (Brauer, Xiao, Poulain, Friederici,
& Schirmer, 2016). However, even a brief, seemingly trivial touch can positively bias emotions. In a landmark
study by Fisher, Rytting, & Heslin (1976), people were shown to evaluate a library more favorably if they had previously been briefly touched while receiving a new membership card. Further studies suggested that the action of
touching, possibly by signaling a positive social bond, not only increases affiliation but can result in social
compliance, yielding tangible gains after tactile interactions, such as higher tips for waiters, free dimes for
strangers and free bus-rides for potential passengers (Crusco & Wetzel, 1984; Guéguen & Fischer-Lokou, 2003;
Kleinke, 1977). As such outcomes had a clear financial value, the effect became commonly known as the Midas
touch, after the famous Phrygian king from Greek mythology, whose touch could transform anything into solid
gold.
Although the effect continues to be replicated in various settings (Dolinski, 2010; Haans & IJsselsteijn,
2009), there have been varying, even conflicting, explanations as to which social, affective, and cognitive
functions elicit it. In the social domain, Henley (1973) observed more interpersonal touch from men to women
than vice versa and suggested that touch may signify warmth towards one who has a lower social status than the
sender. Psychologists have suggested that touch, whether innately or as shaped through childhood, is
intrinsically a positive reinforcement, and may thus directly cause positive affect (Hornik, 1992). Touch may
also act as an embodied cue that symbolizes similarity between a sender and a receiver, thus leading to more
positive attitudes towards the sender (Smith, 2008). For instance, being touched by an outgroup member has
been found to reduce implicit negative attitudes towards ethnic minorities (Seger, Smith, Percy & Conrey,
2014). Neither hypothesis, however, explains why touch does not consistently lead to positive effects
and can even yield negative outcomes (Dolinski, 2010). Of course, in social psychological experiments, the
physical characteristics of touch are rarely controlled, making it possible that positive and negative evaluations
simply follow pleasant and unpleasant haptic parameters. Furthermore, studies generally focused on
consequences of touch that follow minutes after the event, leaving it unclear whether touch
immediately and directly alters affective appraisal and cognitive processing. Event related potential techniques
have been demonstrated to be invaluable in providing information when examining the temporal dynamics of
human cognition and emotions (Hajcak, MacNamara, & Olvet, 2010), but have thus far rarely been applied
when it comes to touch.
Effects of tactile stimuli on emotional processing
Recently, Schirmer et al. (2010) showed direct effects of tactile modulation on emotional processing.
They presented emotional stimuli from the international affective picture system (IAPS; Lang, Bradley, &
Cuthbert, 1997) while participants occasionally received a touch. Interestingly, they found similar effects
whether a touch was applied by a friend, applied by a tactile device but attributed to a friend, or applied by a
tactile device and attributed to a computer. In each condition, touch was found to alter the event related
potentials to visual stimuli, suggesting the tactile effects were independent of the source of the stimulation.
Early potentials were found to be amplified after touch, regardless of the emotionality of the picture, possibly
indicating that touch enhances visual attention. However, these effects were replicated after replacing the touch
with a simple auditory stimulus, so it could be argued that the mere presence of a stimulus in an irrelevant
modality may have influenced (e.g. primed) visual stimulus processing. Of more interest, therefore, were the
effects on late potentials: tactile, but not auditory primes (although these were not directly contrasted), were
found to amplify the difference between negative and neutral emotional pictures. This suggests that touch
modulates emotional processing, making recipients more attuned to the affective part of the message.
In a study by our own group (Spapé, Hoggan, Jacucci, & Ravaja, 2015), vibrotactile stimuli were
presented (applied using a device but attributed to another person) before the onset of emotional stimuli. Here,
instead of images from the IAPS database, touch was presented before the onset of symbolic cues (smiley faces)
that conveyed winning or losing in a decision making paradigm. To investigate the effect of the tactile primes
on emotion, we measured the feedback related negativity (FRN), an ERP component that has been related to
evaluative appraisal (Cohen, Elger, & Ranganath, 2007). Touch had some positive effects on decision making
(accepting offers) and the FRN, but these were found to be indistinguishable from the effects of auditory control
primes. Thus, we concluded that the tactile primes had no direct influences on the affective evaluation of
subsequent information. However, touch was found to amplify the late component of the P3 wave of both
negative and positive cues, which we interpreted as potentially reflecting enhanced encoding into memory. In
order to investigate this hypothesis, we analyzed how tactile communication in one trial influenced decision
making in the next, and observed patterns suggesting that touch modulated reciprocating and generosity. Thus,
touch was found not to immediately or automatically affect emotional evaluation, but instead to have a general
influence on cognition that may bias later interpretation and decision making.
Both studies conclude that touch affects emotions to some extent, but strictly speaking, this inference is
premature. In the first study, negative affect evoking pictures were contrasted with non-emotional stimuli,
making it unclear whether touch affected arousal – which can result from positive and negative images alike –
or perceived emotionality. The second study similarly focused on the positive/negative dimension of emotion by
solely contrasting negative (losses) and positive (gains) conditions. This dimension, valence, is of course
important in describing emotion but cannot singularly account for the emotional spectrum (Schlosberg, 1954).
More importantly, because of the simple contrasts in the studies, it cannot be inferred whether the effects of
touch concern a bias of valence evaluation, or a more general facilitation of cognitive processing. For example,
a suddenly appearing stimulus in an irrelevant modality can facilitate an early phase of the response selection
stage, an effect which is commonly known as the accessory stimulus effect (Hackley & Valle-Inclán, 1999).
In order to investigate whether qualitatively different emotional dimensions are distinctly affected by
tactile stimulation, emotional face stimuli could be more informative than economic or IAPS stimuli. Emotional
faces have long been the focus of much emotion research (Ekman & Friesen, 1976), resulting in some consensus
on how different emotional expressions affect event related potentials and peripheral psychophysiological
activity. In terms of the former, event related potentials have been related to distinct stages of cognitive
processing. As for the latter, measurements of facial electromyography and electrocardiography have long been
known to provide reliable information on the later consequences of affective evaluation.
Early processing of emotion in faces
The temporal accuracy of EEG has proven remarkably informative in the study of emotion processing.
Early research showed that from 170 ms, a dissociation in the visual evoked potential emerges which
distinguishes between faces and other visual stimuli (Bentin, Allison, Puce, Perez, & McCarthy, 1996; Seeck &
Grüsser, 1992). This face-sensitive potential came to be called the vertex positive potential (VPP, maximal over
frontal electrodes) or N170 (minimal over lateral parietal P7/P8), depending on the reference (Joyce & Rossion,
2005). The consistency across studies and the timing of the potentials, being well before potentials that are
strongly affected by task-relevance, such as the P300 (Polich & Kok, 1995), suggested the VPP/N170 is task-independent and possibly pre-attentive (Cauquil, Edmonds, & Taylor, 2000). As such, it is associated with an
early, automatic visual facial pattern recognition process and the operation of the fusiform face area that fMRI
studies commonly associate with face identification (Itier & Taylor, 2004b; Watanabe, Kakigi, & Puce, 2003)
although the degree of face-specificity of the N170 remains debated (Itier & Taylor, 2004a; Thierry, Martin,
Downing, & Pegna, 2007).
Interestingly, several studies suggest that information regarding the emotion of the facial expression is
processed either in parallel with, or prior to, the identification of the stimulus as a face. That is, earlier findings provide
evidence that identification of threat occurs almost at the same time as the face selective activity, with such
stimuli incurring an early, negative shift at around 230 ms (Vanderploeg, Brown & Marsh, 1987). It was later
found that multiple arousing or threatening emotional expressions (e.g. anger) incur this negativity near the
N200 (Balconi & Pozzoli, 2003; Sato, Kochiyama, Yoshikawa, & Matsumura, 2001; Schupp et al., 2004). A
more recent study showed that under conditions of limited attentional resources, the emotional modulation may
even happen prior to the identification of faces. Thus, Luo, Feng, He, Wang, & Luo (2010) observed greater
frontal N1s and posterior P1s for fearful than happy faces, as well as stronger VPPs for both fearful and happy
than neutral faces (but see Eimer, Holmes, & McGlone, 2003; Herrmann et al., 2002). The extreme immediacy
of these effects has been taken to indicate that facial emotions may be extracted even pre-attentively, before
conscious processing. While the field continues to debate the degree of control over facial expression perception
(Palermo & Rhodes, 2007), it is clear that emotional processing of a facial expression starts early, and probably
before the image is identified as a face.
Late processing of facial emotions
While facilitating the early processing of stimuli that indicate threat may be an adaptive trait (Fox, etc),
positive facial expressions tend to have stronger effects in later potentials. According to Luo et al. (2010), an
initial stage of emotional face processing consists of extracting the coarse emotional content in order to identify
threat while the more fine-grained emotional content becomes only relevant at a relatively late (ca. 300 ms)
stage. In their account, this late stage is indexed by an N300 over lateral sites that is more pronounced for
fearful than neutral expressions, while simultaneously a positivity starts to affect the event related potential,
distinguishing the happy from the neutral expressions. The latter effect may reflect the P3, a general longer-
lasting positive potential over centro-parietal sensors from 300 ms onwards, which has been related to functions
of late perceptual processing, or early memory (Polich, 2007). A variety of studies have shown the P3, and in
particular its later aspect, the late positive potential (LPP) to be amplified for positively valenced stimuli such as
happy expressions (Schupp et al., 2004, Luo et al. 2010), and pictures of beloved others (Langeslag, Jansma,
Franken, Van Strien, 2007). However, as the P3 has long been known to be affected by stimulus relevance
(Sutton, Braren, Zubin, & John, 1965), it is not clear whether the late effects reflect an intrinsic effect of
valence, or are more related to the specific task and motivation (Olofsson, Nordin, Sequeira, & Polich) as the
direction of the effects depends on the task (Pastor et al., 2008).
Peripheral psychophysiology of facial emotions
Once the brain has emotionally evaluated face stimuli, distinct reactions follow in the facial muscles and heart rate. Positive affect has generally been shown to result in activity over the zygomaticus major
(ZM, smiling) muscle area, while negative affect has been associated with activity over the corrugator supercilii
(CS, frowning) area (Fridlund & Cacioppo, 1986). Perceiving an emotion in another person may also induce the
congruent feeling in oneself, and/or mimicking of the emotional expression, an effect that is sometimes referred
to as emotional contagion (Hatfield, Cacioppo, & Rapson, 1993) or linkage (Levenson & Gottman, 1983; Spapé
et al., 2013). Perception of emotional expressions thus has predictable consequences: smiles elicit activity over the ZM area, and disgust elicits activity over the levator labii superioris alaeque nasi (LN, nose-wrinkling) muscle area
(Blairy, Herrera, & Hess, 1999). Finally, negative emotional expressions, such as angry and fearful faces, have
been shown to result in enhanced attentional alertness, associated with the brief, parasympathetically-mediated,
deceleration of cardiac activity (Bradley, 2009; Jönsson & Sonnby-Borgström, 2003). This so-called orienting
response is followed by increased sympathetic control indexed by cardiac acceleration (Bradley, Codispoti,
Cuthbert & Lang, 2001). Contrary to the simple fight-or-flight type of response, cardiac acceleration results
from both arousing negative and positive events and is thus related to stimulus arousal rather than valence
(Balconi, Vanutelli, & Finocchiaro, 2014; for a review see Kreibig, 2010).
Present Study
In the present study, we set out to investigate whether simple, tactile stimuli change subsequent
processing of emotional expressions as measured by their effects on ERPs and peripheral psychophysiology. To
find out if tactile primes biased emotional evaluation, as opposed to a more general type of stimulus-processing,
we used four different types of emotional expression. The study principally focuses on whether tactile primes
interact with the quality of the emotion, operationalized as the interaction between the type of emotional
expression and the type of prime. In order to control for other priming effects, we employed both a control
condition without prime, and a control condition with a similar prime, but delivered via a different modality.
Firstly, if touch amplifies the affective content directly, as would be expected from the observations by
Schirmer et al. (2010), one would expect interactions between emotional expression- and prime-type already at
an early ERP stage, for example resulting in touch amplifying the difference between the N170s for angry and
happy facial expressions. In contrast, if, as we suggested (Spapé et al., 2015), touch affects only the significance
of the emotional stimulus, then the primes may have additive effects on early stimulus processing, but may
interact only with emotional content (i.e. the type of expression) at a later stage.
Secondly, if touch induces positive affect, as is the traditional interpretation of the Midas Touch effect,
one should predict a bias towards a more positive evaluation of facial expressions. Accordingly, one would
predict tactile primes to increase ZM muscle activity in response to happy faces, but to decrease CS and LN
activity in response to negatively valenced facial expressions. Similarly, one could predict the cardiac deceleration related to negative expressions to be attenuated as a result of touch. However, if, as another theory of touch
suggests, touch acts as an embodied cue related to interpersonal similarity (Smith, 2008), one would predict
touch to facilitate mimicry of all facial expressions. Specifically, this predicts emotion-congruent facial muscle
activity to be amplified after touch: increased ZM activity in response to happiness, corrugator supercilii (CS)
activity in response to anger and fear, and levator labii superioris alaeque nasi (LN) activity in response to
disgust (Fridlund & Cacioppo, 1986). Finally, if touch increases visual attention to emotional stimuli (Schirmer
et al., 2010), or modulates the significance of stimuli (Spapé et al., 2015), one could expect generic effects of
touch on corrugator supercilii and the orienting response, resulting from enhanced attention (Bradley, Codispoti,
Cuthbert, & Lang, 2001). For the former, if the emotional salience is processed further, one should expect
emotion-specific interactions to occur at the facial EMG level, while for the latter, emotional and tactile effects
could be additive factors.
Note that the aim of the present study is not necessarily to falsify a null-hypothesis: The effects of
emotional expressions (and to some extent, of priming) are well known. The critical aim is to find out at what
time, if any, touch selectively modulates emotion, for which reason effect sizes could be more informative than
significance.
Methods and materials
Participants
Seventeen female and twenty-four male right-handed undergraduate students from Aalto University and the University of Helsinki volunteered to take part in the study in exchange for monetary compensation. They were 24.7 ±
3.9 years old on average. The experiment was conducted in accordance with the ethical guidelines laid out in the
Declaration of Helsinki, and approved by the Ethical Board of Aalto University. Participants received full
briefing regarding their rights – including their right to withdraw from the study at any time without fear of
negative consequences – and signed informed consent forms prior to the experiment. The data from one (female)
participant were removed from the analyses due to a technical problem.
Stimuli
Presentation of stimuli, recording of behavior, and synchronization with EEG equipment over the
parallel port was done using E-Prime 2.0.10.353 (Psychology Software Tools Inc, Sharpsburg, PA), running on
Windows XP. Images from the commonly used “Pictures of Facial Affect” (Ekman & Friesen, 1976) were used
as visual source material. Of these, we used only those happiness, fear, anger and disgust images that could be
combined with a neutral expression of the same face (16 on average). Images were 320x480 pixels, presented at
a distance of ca. 60 cm on a 48 x 30 cm LCD screen running at a 1024x768 resolution and a refresh rate of 60
Hz. The images were shown in the center of a gray screen, against a black window of 340x540 pixels which also
occluded the top 60 pixels of the source material, as they include identifying information. Primes always used
the same audio waveform as source: a 100 Hz square wave, 500 ms in duration, with a gradual fade-in of
10 ms and fade-out of 350 ms. A C2 tactor (http://www.atactech.com/PR_tactors.html) was used to present the
prime in the vibrotactile domain. This device operates via a moving magnet linear actuator, with a vibrating
contact area of 7 mm, surrounded by an additional, shielded area of 23 mm. It was strapped using elastic tape to
the volar surface, in the centre of the palm of the right hand. A computer speaker, positioned to the right of the
LCD screen, presented the same stimulus as audio at a level of ca. 50 dB (at the distance of the participant to the
speaker). The same stimulus was presented over both channels, but completely attenuated in order to provide a
silent control condition with comparable timing characteristics.
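For concreteness, the prime waveform can be reconstructed from the parameters above. The sketch below (Python/NumPy) is illustrative only: the output sample rate and the linear shape of the fade ramps are assumptions, as neither is specified in the text.

```python
import numpy as np
from scipy.signal import square

def make_prime(fs=44100, f=100.0, dur=0.500, fade_in=0.010, fade_out=0.350):
    """100 Hz square wave, 500 ms, with a 10 ms fade-in and 350 ms fade-out.

    fs (sample rate) and the linear ramp shape are assumptions; the paper
    does not specify either.
    """
    n = int(round(dur * fs))
    t = np.arange(n) / fs
    wave = square(2 * np.pi * f * t)               # values in {-1, +1}
    env = np.ones(n)
    n_in = int(round(fade_in * fs))
    n_out = int(round(fade_out * fs))
    env[:n_in] = np.linspace(0.0, 1.0, n_in)       # gradual fade-in
    env[-n_out:] = np.linspace(1.0, 0.0, n_out)    # gradual fade-out
    return wave * env
```

At any sample rate, this envelope leaves a 140 ms full-amplitude plateau between the 10 ms fade-in and the 350 ms fade-out.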
Procedure and design
Participants began the experiment immediately after reading experiment instructions and viewing a
demonstration of the experiment by the laboratory assistant. As schematically portrayed in Figure 1, trials started with a fixation cross, presented at the center of the screen for 400 ms, which participants were asked to fixate.
A face with a neutral expression then replaced the cross. After a randomized interval of 300-600 ms, the prime
was presented. Then, after a randomized stimulus onset asynchrony of 650-1000 ms, the neutral expression was
replaced by an image of the same person, now wearing an emotional expression. Participants were asked to not
respond until after the face disappeared, which occurred 1000 ms later. They were then instructed to press the Z and X keys with their left hand and the N and M keys with their right hand to indicate the emotion of the face.
[-- FIGURE 1 about here --]
Participants completed four blocks of 120 trials. The emotional expression-response mapping between
anger, disgust, fear, happiness on the one hand and the Z, X, N and M keys on the other, was randomized every
block with a Latin square design to avoid confounds with lateralized activity. Since adjusting to a new response mapping could cause confusion, we provided 4 training trials at the beginning of every
block, during which participants received full feedback as to their speed and accuracy. In the subsequent 120
testing trials per block, feedback was only provided if participants responded too early (before the face disappeared) or too late (>2000 ms). Trials that provoked feedback, as well as the training trials, were excluded from
analysis.
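A Latin-square rotation of the emotion-to-key mapping can be sketched as follows. The cyclic construction and the seeded shuffling are illustrative assumptions; the text states only that a Latin square design was used.

```python
import random

EMOTIONS = ["anger", "disgust", "fear", "happiness"]
KEYS = ["z", "x", "n", "m"]

def block_mappings(seed=None):
    """One emotion-to-key mapping per block (4 blocks), drawn from a cyclic
    4x4 Latin square so that, across blocks, every emotion is assigned to
    every key exactly once.
    """
    rng = random.Random(seed)
    # Cyclic Latin square: row r shifts the key list by r positions.
    rows = [[KEYS[(r + c) % 4] for c in range(4)] for r in range(4)]
    rng.shuffle(rows)  # randomize which row serves which block
    return [dict(zip(EMOTIONS, row)) for row in rows]
```

Each returned dictionary is a bijection from expressions to keys, and the column constraint of the square guarantees the counterbalancing across blocks.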
The study used a within-subject design with factors emotion (anger, disgust, fear and happiness) and
prime (touch, tone, silent) randomized within every 12 trials, with 10 repetitions for each of the 4 blocks,
resulting in a maximum of 40 trials per design cell for the analysis. Participants took ca. 40 minutes to complete
the entire series of 480 trials of the experiment.
Recording
A BrainProducts QuickAmp (BrainProducts GmbH, Gilching, Germany) recorded EEG at 1000 Hz
from 32 Ag/AgCl electrodes placed on sites over Fp1, Fp2, F7, F3, Fz, F4, F8, FT9, FC5, FC1, FC2, FC6,
FT10, T7, C3, Cz, C4, T8, TP9, CP5, CP1, CP2, CP6, TP10, P7, P3, Pz, P4, P8, O1, Oz and O2 using an elastic
cap for standardization (EasyCap). We used the average reference both during recording and all steps in the
analysis, as it is most commonly used in face and emotional expression perception studies (Joyce & Rossion,
2005). Two bipolar electrodes were placed 1 cm lateral to the outer canthi of the left and right eyes to capture
horizontal eye movement activity (HEOG), and two similar electrodes 1 cm superior and inferior to the left eye
for vertical eye movements and eye-blinks (VEOG). The electro-cardiogram (ECG) was recorded using a
bipolar pair of disposable adhesive electrodes, one placed near the manubrium and the other near the ninth left
rib. The facial electromyogram (fEMG) was recorded with bipolar electrodes on sites overlying the left
zygomaticus major (ZM), corrugator supercilii (CS) and levator labii superioris alaeque nasi (LN) muscle
regions. All data were recorded at a sample rate of 1000 Hz with a hardware high-pass filter at 0.01 Hz.
EEG preprocessing
EEG and EOG were pre-processed off-line with a high-pass filter at 0.2 Hz and a 50 Hz notch filter.
The artifact correction procedure made use of the infomax algorithm based independent component analysis
(ICA) as implemented in EEGLAB (Delorme & Makeig, 2004). To optimize ICA decomposition, we first
filtered the data between 1 and 80 Hz, and segmented data centering on the face stimulus, including 2.1 s before
and after onset. Segments with extreme amplitudes (>500 μV, ca. 4.4%) were removed to reduce the likelihood that, in the subsequent ICA decomposition, components would selectively account for rare, high-variance artefacts.
Following ICA, we inspected the activity, frequency spectrum and topography of components and visually
classified components as artefactual (related to eye-movements, eye-blinks, etcetera) or clean (EEG related).
The EEG was reconstructed by applying the weights thus obtained to the unfiltered, continuous data and
projecting only the artifact-free components to the electrode level. The artefact-corrected data was further
processed in Brain Vision Analyzer 2 (BrainProducts GmbH, Gilching, Germany). This included applying an
IIR filter to the data (0.2 – 40 Hz) and epoching the data, again centering on face stimulus onset, but now
including 200 ms of baseline and 800 ms of data following the presentation of the emotional expression. Epochs
were rejected for suspect activity if any of the channels considered in the analysis (see ERP measurements) had
amplitudes exceeding 35 µV or a difference between maximum and minimum value exceeding 55 µV. This
procedure removed ca. 10.6 ± 13.9 % of epochs, so that ERPs – averages of the epochs for each of the 12
combinations between prime and emotion – were computed over 35.8 ± 5.6 trials in total.
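The stated rejection criteria can be expressed as a simple mask over epoched data. This is a re-implementation sketch of the criteria, not the Brain Vision Analyzer routine itself, and the processing order within that software is an assumption.

```python
import numpy as np

def reject_epochs(epochs, abs_crit=35.0, p2p_crit=55.0):
    """Return a boolean keep-mask over epochs.

    epochs: array of shape (n_epochs, n_channels, n_samples), in microvolts,
    restricted to the channels entered into the ERP analysis. An epoch is
    rejected if any channel exceeds 35 uV in absolute amplitude or 55 uV
    in peak-to-peak (max minus min) amplitude.
    """
    peak = np.abs(epochs).max(axis=2)              # per-channel |amplitude|
    p2p = epochs.max(axis=2) - epochs.min(axis=2)  # per-channel peak-to-peak
    bad = (peak > abs_crit).any(axis=1) | (p2p > p2p_crit).any(axis=1)
    return ~bad
```

Note that the peak-to-peak criterion can reject an epoch whose absolute amplitudes are all within bounds, e.g. a slow drift from -30 to +30 µV.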
ERPs
In order to avoid “cherry picking”, we defined points of interest based on the grand average visual
evoked potential of the emotional expression stimuli regardless of the preceding prime. As early components
tend to be much shorter and more affected by particularities in laboratory setups, we defined the temporal
windows of interest for early components based on the data, and for later components based on the literature.
Windows for early components were defined by calculating the standardized (mean / SE) evoked potential for
F3, Fz, F4, C3, Cz, C4, P7, P3, Pz, P4, P8, O1, Oz, and O2 in every 10 ms between stimulus onset and 250 ms.
Then, local peaks were detected, and the surrounding window in which |T(40)| > 4 was
taken as the interval for N1/P1, VPP/N170 and P2. An N1 was found in frontal sites (strongest over Fz at 110-
120 ms, T = -8.21), in contrast with a P1 in lateral-temporal and occipital sites (at 110-120 ms, P8: T = 8.24,
O1: T = 6.26). We thus defined the N1/P1 as the local maximum in P8 between 90 and 130 ms. Subsequently, a VPP/N170 was found: maximal over frontal sites (F3: T = 10.23 at 170-180 ms, Fz: T = 9.82 at 170-180 ms)
and most negative over lateral posterior sites (P7: T = -8.69 at 170-180 ms, P8: T = -11.85 at 180-190 ms). The
window for the N170 was therefore defined as the minimal value between 150-210 ms over P8. A subsequent
P2 was observed slightly after the N170, characterized by positivity in central parietal channels (Cz: T = 14.94
between 210-220 ms, Pz: T = 9.26), for which reason the window between 180 and 250 ms with the maximal value in
Cz was used. Following detection of the component within the thus defined relevant channel and window of
interest, its latency was used to extract the amplitude for all electrodes. Windows for late components were
based on Eimer et al. (2003), who used mean amplitudes between 220-315, 320-495 and 500-700 ms post-
stimulus.
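The standardized evoked potential and the data-driven window selection described above can be sketched as follows. The 10 ms bin edges, the standard-error definition, and the rule for expanding the window around the peak are assumptions made for illustration.

```python
import numpy as np

def standardized_bins(subj_erp, fs=1000, bin_ms=10):
    """Standardized (mean / SE) evoked potential in consecutive 10 ms bins.

    subj_erp: (n_subjects, n_samples) single-channel subject averages.
    Returns one T value per bin, computed across subjects.
    """
    n_sub, n_samp = subj_erp.shape
    step = int(fs * bin_ms / 1000)
    n_bins = n_samp // step
    binned = subj_erp[:, :n_bins * step].reshape(n_sub, n_bins, step).mean(axis=2)
    se = binned.std(axis=0, ddof=1) / np.sqrt(n_sub)
    return binned.mean(axis=0) / se

def window_around_peak(t_vals, threshold=4.0):
    """Contiguous bins around the largest |T| peak that all exceed threshold."""
    peak = int(np.argmax(np.abs(t_vals)))
    lo = hi = peak
    while lo > 0 and abs(t_vals[lo - 1]) > threshold:
        lo -= 1
    while hi < len(t_vals) - 1 and abs(t_vals[hi + 1]) > threshold:
        hi += 1
    return lo, hi
```

Applied per channel, the returned bin range defines the component's window of interest, after which the component latency can be used to extract amplitudes at all electrodes.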
The analysis of ERPs extended the basic design of emotion and prime with a factor area, based on Eimer et al.
(2003), with area defined as the average across electrodes within predefined regions: frontal (F3, Fz, F4),
central (C3, Cz, C4), parietal (P3, Pz, P4), lateral posterior (P7, P8) and lateral occipital (O1, O2). For N1/P1
and N170/VPP – being components with opposing polarities – we used all areas, whereas for the P2 only central
(C3, Cz, C4) and parietal (P3, Pz, P4) areas were included. For late components of the ERP, the analysis was
extended to a four-way repeated-measures design with emotion, prime, area (all five regions), and time (220-315,
320-495 and 500-700 ms) as factors. Throughout all analyses, Greenhouse-Geisser corrections were applied
wherever sphericity assumptions were violated.
EMG, ECG
The continuous EMG data were filtered using a 7 Hz high-pass filter, rectified using the Hilbert
transform to obtain the envelope (Myers et al., 2003) and normalized with the log-transform to reduce outlier
effects. Continuous ECG signals were band-pass filtered 2-100 Hz, with a notch filter between 46-54 Hz.
Next, a peak-detection algorithm was used to detect the latencies of heartbeats as local, globally stable maxima (MATLAB code available at cognitology.eu/source/mig_ContECGToIBI.m), and rare intervals in which no heartbeats were detected were interpolated. The resulting heartbeat latencies were then spline-interpolated to an inter-beat interval (IBI) by time signal. Means for IBI and EMG over three areas were then calculated over four
1 s bins following the emotion onset, with the average activity in the 1 s interval preceding the emotion
subtracted. The repeated measures ANOVA on IBI included time, as the orienting response affects ECG in a
strongly time-locked manner (Graham & Clifton, 1966). Effects of emotion on facial EMG are less consistent in
time, but can be distinguishable in terms of their localization. As some expressions affect multiple muscle areas
simultaneously (Ravaja, Turpeinen, Saari, Puttonen, & Keltikangas-Järvinen, 2008), we added area as a factor
to the EMG analysis. One repeated measures ANOVA tested the effects of prime, emotion (anger vs. happiness
vs. disgust vs. fear) and time (1st vs. 2nd vs. 3rd vs. 4th s) on IBI. Another tested the effects of prime, emotion and
area (CS vs. LN vs. ZM) on EMG activity.
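The EMG and ECG conditioning steps can be sketched as follows (Python/SciPy). The Butterworth filter order, the +1 offset inside the log, and the use of cubic interpolation in place of the original spline routine are assumptions; the original pipeline used the linked MATLAB code.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.interpolate import interp1d

def emg_envelope(emg, fs=1000):
    """7 Hz high-pass, Hilbert-transform envelope, then log transform
    to reduce outlier effects (filter order and log offset assumed)."""
    b, a = butter(4, 7.0, btype="highpass", fs=fs)
    filtered = filtfilt(b, a, emg)
    envelope = np.abs(hilbert(filtered))   # amplitude envelope
    return np.log(envelope + 1.0)

def ibi_series(beat_times, t_grid):
    """Inter-beat intervals interpolated onto a regular time grid.

    beat_times: detected heartbeat latencies in seconds. Cubic
    interpolation stands in for the spline interpolation in the text.
    """
    ibi = np.diff(beat_times)              # interval ending at each beat
    f = interp1d(beat_times[1:], ibi, kind="cubic",
                 bounds_error=False, fill_value=(ibi[0], ibi[-1]))
    return f(t_grid)
```

The resulting IBI-by-time signal can then be averaged into the four 1 s post-stimulus bins, with the mean of the 1 s pre-stimulus interval subtracted as baseline.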
Results
Reactions occurring during the training period and slow reactions were excluded from all analyses.
Erroneous reactions were removed from the analysis of reaction times (RTs) only. Repeated measures ANOVAs
were conducted separately for behavior, EMG, ECG, early ERP amplitudes and late ERP area averages. All
analyses had emotional expression (anger vs. disgust vs. fear vs. happiness) and prime (silent vs. audio vs.
touch) as factors. Extensions of this basic design – for example, if an analysis also included a factor to estimate
differences across areas – are mentioned in their particular description. Our interest primarily concerned
interaction effects between touch and emotional expression, so for the sake of brevity, we omit a detailed report
of each individual effect or possible post-hoc comparison. However, an overview of contrasts of interests and
effect sizes, for emotional expression, touch and their interaction on behavior, EMG, ECG, early and late ERPs
is provided below (see table 1).
[ TABLE 1 about here ]
Behavior
Participants generally responded accurately (92.3 ± 0.6 % correct), and quickly (566 ± 20 ms). We
tested whether emotional expression and prime affected overt behavior, in terms of error rates (percentage
erroneous) and reaction time (RT). A cut-off value of <1250 ms in latency was used to remove outliers from the
RT analysis. Following, significant effects of emotional expression were observed for RT, F (2.67, 106.92) =
81.27, p < .0001, and error rates, F (2.19, 85.48) = 36.81, p < .0001. Happiness provoked fastest and least
erroneous responses (459 ms, 0.7 %) followed by fear (572 ms, 3.7 %), disgust (589 ms, 12.4%) and anger (642
ms, 14.0%). Prime did not significantly affect errors, p > .9, but did affect RT, F (1.94, 75.76) = 4.21, p = .02,
with audio (562 ± 20 ms) and touch (573 ± 20 ms), eliciting faster responses than silent (563 ± 20 ms)
conditions. No interactions were significant, Fs < 0.9, ps > .5.
EMG
A similar ANOVA on EMG activity, with area (CS vs. ZM vs. LN muscle group) added as a factor, gave
significant main effects of area, F (1.99, 77.42) = 12.77, p < .001, and prime, F (1.31, 51.20) = 20.04, p
< .0001, but not emotional expression, p > .3, with stronger facial muscle activity over ZM muscles, and after
audio primes. Emotional expression and area significantly interacted, F (1.64, 63.95) = 15.48, p < .0001, with
stereotypical effects of anger and fear (increasing CS), disgust (increasing CS and LN), and happiness
(increasing ZM while decreasing CS). Prime also interacted with area, F (1.36, 53.21) = 12.12, p < .001. This
effect mainly reflected a baseline difference in the LN muscle: the tactile primes elicited an immediate, short
burst of LN activity prior to the onset of the face, so that a stronger decrease in LN activity was already
observed in the first 1 s (see Figure 2). However, prime neither interacted with emotional expression, nor
modulated the emotional expression-by-area interaction, Fs < 1.3, ps > .3.
ECG
For IBI, the factor time (1st, 2nd, 3rd, 4th second following the emotional stimulus) was added to the
basic design. This showed significant effects of time, F (1.73, 67.52) = 37.53, p < .0001, and emotional
expression, F (2.72, 105.94) = 4.97, p = .004, but not prime, p > .1. The effect of time reflected a cardiac
deceleration (a 9.3 ms increase in IBI) in the first two seconds, suggesting a general orienting response. Time
also interacted with emotional expression, F (3.24, 126.51) = 4.83, p = .003, and with prime, F (3.84, 149.83) =
2.79, p = .03. The effect of emotional expression was mainly due to an amplified deceleration, occurring
particularly in the third bin, with anger (14.0 ± 1.8 ms). The prime-by-time interaction showed a more
pronounced deceleration after touch in the 2nd second (9.4 ms) than after silent (8.0 ms) or tone (8.0 ms)
conditions, followed by a stronger re-acceleration after tones in the 4th second (4.1 ms) than after silent (1.5
ms) and touch (1.4 ms) conditions. The latter part, being the stronger effect, is described in Table 1. Neither the
prime-by-emotional expression interaction nor the three-way interaction was significant, Fs < 0.8, ps > .5.
[ FIGURE 2 about here ]
N1/P1
Three repeated measures ANOVAs, one per early component, included area (frontal, central, parietal,
lateral-temporal and occipital), emotional expression (anger, disgust, fear and happiness) and prime (touch,
tone and silent) as factors. For the P1/N1, area had a significant effect, F (1.61, 64.25) = 41.63, p < .0001,
while neither of the other main effects was significant, ps > .07. However, emotional expression significantly
interacted with area, F (5.55, 216.26) = 4.04, p = .001, as did prime, F (3.19, 124.35) = 7.07, p < .001. The
effect of area showed negativity
over frontal areas (N1) and positivity over occipital ones (P1). The interaction effect could be described as
indicating an enhanced N1/P1 for faces showing disgust, with amplified negativity over central (-0.82 μV vs. -
0.61, -0.53, and -0.51 for respectively anger, fear and happiness) areas as well as amplified positivity in
occipital (2.52 μV vs. 2.01, 1.76 and 1.97) areas. Prime likewise affected both potentials, with touch amplifying
(by 0.20 μV), and auditory primes attenuating (by 0.22 μV) the N1 over fronto-central areas in contrast with
silent conditions. The P1, over lateral-temporal and occipital areas, was likewise attenuated after auditory
primes (by 0.30 μV) but not after touch (0.05 μV, ns). However, neither the prime-by-emotional expression
interaction, F (4.87, 189.72) = 0.90, p > .5, nor the three-way interaction, F (7.97, 310.87) = 0.51, p > .8, was significant.
N170 / VPP
Similar repeated measures ANOVAs on the N170 / VPP amplitude showed significant main effects of
area, F (1.31, 51.10) = 127.38, p < .0001, and emotional expression, F (2.83, 110.35) = 13.13, p < .0001, but not
prime, F (1.92, 74.95) = 2.35, p > .1. Faces expressing happiness showed strongest negativity (-1.41 μV) vs.
disgust, fear and anger (-1.22, -1.19 and -1.04 μV respectively). However, both prime, F (3.85, 150.07) = 9.44, p
< .0001, and emotional expression, F (4.56, 177.66) = 14.33, p < .0001, significantly interacted with area,
suggesting the effect was localized. Comparing the effects largely showed the same effects, but in inverse order,
consistent with the idea that the N170 and VPP components have similar functional meaning, but are reversed in
polarity (Joyce & Rossion, 2005). The N170, as measured over lateral-temporal areas, was amplified after fear
(-7.71 μV), followed by happiness (-7.33 μV), disgust (-6.97 μV), and anger (-6.53 μV). In terms of primes, the
N170 was enhanced after auditory primes (-7.42 μV), followed by touch (-6.92) and silent (-7.06). The same
pattern of the effect of emotional expression and prime could be observed on the VPP, strongest present over
frontal areas, with greatest positivity after fear (3.65), followed by happiness (3.50), disgust (3.24) and anger
(2.97); and after auditory (3.69), followed by silent (3.28 μV) and tactile (3.05 μV) primes. Again, neither the
prime-by-emotional expression-, F (5.52, 215.40) = 0.65, p > .6, nor the three-way-interaction, F (8.83, 344.22)
= 0.49, p > .8 was significant.
P2
A repeated measures ANOVA with area (central vs. parietal), prime and emotional expression, showed
significant main effects of area, F (1, 39) = 50.14, p < .0001, and emotional expression, F (2.63, 102.71) = 4.62,
p = .006, but not prime, F (1.86, 72.58) = 2.27, p > .1. Area and emotional expression significantly interacted, F
(2.78, 108.43) = 6.88, p = .0004, as did area and prime, F (1.96, 76.48) = 5.93, p = .004. For emotional
expression, the effect showed amplified P2s over central areas with fear (3.68 μV) and happiness (3.62 μV)
compared to anger (3.37 μV) and disgust (3.31 μV) while activity over parietal areas was stronger in anger (1.37
μV) and fear (1.30 μV) than in disgust (0.90 μV) and happiness (0.87 μV). Primes showed stronger effects in
central areas, with higher amplitudes after auditory (3.66 μV) and silent (3.58 μV) than tactile (3.24 μV) primes.
Again, the interaction between prime and emotional expression and the three-way interaction were both non-
significant, Fs < 0.8, ps > .5.
Later components
A repeated measures ANOVA, with time (220-315, 320-495 and 500-700 ms), area (frontal, central,
parietal, lateral-temporal and occipital), emotional expression (anger, disgust, fear and happiness) and prime
(touch, tone and silent) as factors, showed significant effects of time, F (1.53, 59.82) = 66.97, p < .0001, area, F
(1.61, 64.19) = 50.33, p < .0001, emotional expression, F (2.44, 95.24) = 12.48, p < .0001, and prime, F (2, 78)
= 5.84, p = .005. Prime interacted with area, F (3.57, 139.04) = 32.49, p < .0001, and this interaction was
significantly modulated by time, F (5.29, 206.24) = 3.80, p = .002. Emotional expression interacted with area, F
(4.11, 160.19) = 10.80, p < .0001, and time, F (3.64, 141.92) = 15.32, p < .0001. The three-way
interaction between time, area and emotional expression was also significant, F (6.41, 249.79) = 10.80, p < .0001. A
complete description of how the main effects of emotion and primes are distributed in time and topography is
provided in Figure 3. However, of main importance for the present study was that no evidence for an interaction
between emotional expression and prime could be found, Fs < 1.2, ps > .3.
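The window-averaging used for these late components can be sketched as follows. This is a minimal illustration under stated assumptions (the function name, a single baseline-corrected 1-D epoch, and a sampling rate with sample 0 at expression onset are hypothetical), not the authors' analysis code:

```python
import numpy as np

# the three analysis windows for the late components, in ms after expression onset
LATE_WINDOWS = [(220, 315), (320, 495), (500, 700)]

def window_means(epoch, fs, windows=LATE_WINDOWS):
    """Mean amplitude in each window for a 1-D epoch whose sample 0
    coincides with stimulus onset (fs in Hz)."""
    means = []
    for start_ms, end_ms in windows:
        lo = int(round(start_ms / 1000.0 * fs))
        hi = int(round(end_ms / 1000.0 * fs))
        means.append(epoch[lo:hi + 1].mean())  # inclusive window edges
    return np.array(means)
```

In contrast to the peak-amplitude measures used for the early components, these interval averages are robust to latency jitter across trials, which is the usual rationale for window averaging of late, broad potentials.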
[ FIGURE 3 about here ]
Discussion
This study investigated whether touch modulates emotional face processing. In order to control for
generic cognitive effects of cross-modal primes, such as might be expected from a forewarning cue or an
accessory stimulus effect (Hackley & Valle-Inclán, 1999; Swallow & Jiang, 2010), we contrasted tactile primes
with both no-cue and auditory cue conditions. That is, if a touch is merely an accessory stimulus, it should have
similar consequences to other task-irrelevant primes, while hypotheses considering touch a special, inherently
emotional form of stimulus predict distinct differences between tactile and auditory primes.
The affective responses to emotional faces were strong and in line with previous studies. The observed
effects on facial EMG activity reflected the well-established phenomenon of facial mimicry: the response in the
corrugator supercilii (CS) was most pronounced after negative expressions and that in the zygomaticus major
(ZM) after happy expressions (Hess & Blairy, 2001). Faces displaying expressions of disgust, however, did not
affect the levator labii superioris alaeque nasi (LN). Finally, an increased cardiac deceleration indicated a
parasympathetically mediated orienting response to threatening (angry) faces (Bradley, 2009).
The event-related potentials to the emotional faces likewise replicated some two decades of research on
EEG responses to facial emotions. In particular, the N1, implicated in early visual attention (Hillyard,
Hink, Schwent, & Picton, 1973; Näätänen & Michie, 1979), was amplified after disgust. The N170/VPP,
commonly seen as specific to face perception (Bentin et al., 1996), was stronger in response to expressions
showing fear or happiness. The P2, previously linked to anxiety (Holmes, Nielsen, & Green,
2008), was here also amplified after fearful faces. Finally, the P3/LPP, previously shown to be related to
positive affect (Gable & Harmon-Jones, 2010; Pastor et al., 2008), but sometimes also to negative affect (An et
al., 2003), was amplified with expressions showing happiness, relative to other emotional expressions.
Cross-modal cues also affected responses to the visual stimuli. Auditory and tactile stimuli facilitated
forced choice reaction times to subsequently shown faces, consistent with the accessory stimulus effect
(Hackley & Valle-Inclán, 1999). Touch furthermore reduced CS activity, possibly indicating a relaxation of
concentration (Barral et al., 2015), but did not affect ZM activity, suggesting it did not necessarily result in an
affective response. It did, however, elicit LN activity to a surprisingly strong degree. Since this effect was
shown to be in response to the touch itself, thus preceding the emotional expression, and since the effect was
much larger than the emotional effects on LN, we conclude that the effect is unlikely to have a cognitive or
affective cause. Instead, we speculate that the previously suggested cross-modal pathways between touch and
smell (Dematte, Sanabria, Sugarman, & Spence, 2006) give rise to a tactile-LN reflex. Be that as it may, in sum,
auditory and tactile primes seemed to affect stimulus evaluation in general – as a general facilitation of
cognitive processing – but in no measurement under consideration was an interaction between emotional
expression and prime observed. Thus, it does not appear that touch automatically changes emotional evaluation,
for example by increasing empathy – an old (Fisher et al., 1976) but still relevant (Gallace & Spence, 2010)
conceptualization – as this would predict that touch should elicit facial mimicry (Sims, Van Reekum, Johnstone,
& Chakrabarti, 2012).
Likewise, almost all ERP components that were affected by emotional expression were also found to be
affected by primes, though never in an interacting manner. Thus, touch increased the N1, replicating Schirmer et
al. (2010), and reduced the VPP (though not necessarily the N170) as well as the P2. A frontal part of the P3
response was reduced for touch and enhanced for sound, whereas a later (possibly P3b) response was found
similarly amplified after both types of primes. Finally, the late frontal negativity (possibly inversely
related to the late positive complex, Picton & Stuss, 1994; Sutton & Ruchkin, 1984) was found to be reduced
for touch compared to control and sound. The LPP in particular has recently become the focus of
psychophysiologists, some of whom explicitly consider it a motivational (Gable & Harmon-Jones, 2010;
Schupp et al., 2000) or emotional (Brown, van Steenbergen, Band, de Rover, & Nieuwenhuis, 2012; Liu,
Huang, McGinnis-Deweese, Keil, & Ding, 2012) correlate. However, given that no ERP component showed any
interaction with the emotional identity of faces, the evidence does not point toward the primes biasing emotional
face processing.
Although a lack of a significant effect is no proof of absence, it should be noted that our study had a
relatively large sample size, and consistently showed a clear lack of any effect of tactile primes modulating
responses to emotional expression across behavior, ERPs, EMG and cardiac activity. Indeed, looking at the
effect sizes of the interaction terms across the sixteen measures in the present study, we find that on average, only 1.8% of the
variance can be accounted for by the effect of priming on emotional face processing. To put that into
perspective, the effects of emotional expression and type of priming on their own accounted for respectively
27.1% and 19.5%. This has important implications for the sample size of replication studies, but more
importantly, this means that the explanatory value of the hypothesis that touch biases emotional evaluation is
limited. Indeed, it would be hard to align the size of any of the psychophysiological effects with the Midas
Touch literature. For example, Guéguen and Fischer-Lokou (2003) showed that the likelihood of compliance
with a request for a free ride increased by 20 percentage points after participants were “encouraged by a tactile
contact”, and in a meta-analysis, Haans and IJsselsteijn (2009) computed an average effect size across studies of 15.9%.
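The partial eta squared values summarized here can be recovered from a reported F ratio and its (corrected) degrees of freedom via the standard identity ηp² = F·df1 / (F·df1 + df2). A minimal sketch, using as an example the main effect of emotional expression on RT reported above:

```python
def partial_eta_sq(f_value, df_effect, df_error):
    """Partial eta squared recovered from an F ratio and its degrees of freedom."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

# For example, F(2.67, 106.92) = 81.27 for emotional expression on RT
# corresponds to a partial eta squared of about .67.
```

This is the conventional conversion for repeated measures designs; applied to Greenhouse-Geisser-corrected degrees of freedom, as here, it yields the same value as with uncorrected ones, since numerator and denominator scale together.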
However, one might argue that the “touch” employed in the present study does not sufficiently convey a
real, human touch. Schirmer et al. (2010), for example, used a pressure-based device to deliver tactile primes
that generally had a long (5 s) duration. On the other hand, upon contrasting real, human touch with mechanical
touch from a device either (believed to be) operated by a human, or starting randomly, they themselves noted
very little effect of the “humanity” of the touch. Furthermore, the duration of the touch presented in the current
study is not particularly short with regards to the Midas Touch literature, which favors a single (see footnote 1)
short type of touch: Fisher, Rytting & Heslin (1976), for example, specifically trained confederates to achieve
tactile stimulations of durations exactly equal to those in the present study. Thus, there is little a priori evidence
that long durations or human haptic parameters are required to cause emotional modulation, suggesting the
argument is raised only because no emotional modulation was observed.
A more serious argument concerns the lack of a neutral emotional display. One might argue that the absence
of neutral faces could cause all cross-modal effects to be equal across different emotions: perhaps a touch
does not make a threatening face more threatening, but merely makes all faces more emotional? This
interpretation would be possible, but only if emotions biased facial stimulus-related components in similar
ways. However, ERP studies have demonstrated that emotional face processing follows a more complex
timeline. For example, fearful faces were found to amplify early (N1) responses before the generic
emotional response (N170) (Luo et al., 2010; Zhang, Luo, & Luo, 2013). Here, in contrast, we found the
reverse pattern: touch enhanced the N1, but not the N170.
Another interpretation is that cross-modal primes influence visual attention, but not necessarily the
immediate emotional impact of the stimulus. This interpretation would be consistent with one suggested
by Schirmer et al. (2010), who, remarking upon the similarity between tactile and auditory effects on emotional
stimulus processing, suggested they might enhance “the sensory salience or emotional significance of
accompanying pictures”. As the quality or valence of the emotional picture, as presented in the present study,
did not interact with the prime, the conclusion must be that sensory salience rather than emotional
processing is affected.
Still, it is important to note that even though touch does not seem to immediately alter emotional
evaluation, it is possible that touch may still induce a late type of emotional effect on decision making. For
example, if a touch enhances sensory salience, it could improve encoding to memory, thus making it more likely
that any affective content is retrieved. Thus, in our previous work on the Midas Touch, we initially expected
tactile primes to modulate affective appraisal and, consequently, the feedback related negativity (Spapé et al.,
2015). This was not observed: Touch enhanced the late positive potential regardless of affective content.
Consistent with touch biasing stimulus significance, we related this effect to a person's relationship with the
sender of the touch and the degree of fairness of economic offers, and showed the touch could still bias decision
making at a subsequent phase of the experiment. In principle, this means that a touch neither necessarily acts
as an embodied cue for interpersonal similarity, nor automatically biases ongoing affect and thereby
improves emotional evaluation. It seems instead more likely that a touch acts firstly by focusing attention and
secondly via a complex network that uses cultural expectations in the interpretation of the event.
In other words, a touch does not immediately make everything seem better. It is, however, possible that
a touch may facilitate attention towards a stimulus, making it more likely we will remember the event. Or, in
terms of the Midas Touch: A waiter’s touch may not make their smile appear warmer, but it might help you
remember the event. It does not reduce the cost of your dinner, but you will still think twice before you fail to
tip.
References
An, S. K., Lee, S. J., Lee, C. H., Cho, H. S., Lee, P. G., Lee, C., … Namkoong, K. (2003). Reduced P3
amplitudes by negative facial emotional photographs in schizophrenia. Schizophrenia Research, 64(2),
125–135.
Balconi, M., & Pozzoli, U. (2003). Face-selective processing and the effect of pleasant and unpleasant
emotional expressions on ERP correlates. International Journal of Psychophysiology, 49(1), 67–74.
Barral, O., Eugster, M. J., Ruotsalo, T., Spapé, M. M., Kosunen, I., Ravaja, N., … Jacucci, G. (2015). Exploring
Peripheral Physiology as a Predictor of Perceived Relevance in Information Retrieval. In Proceedings
of the 20th International Conference on Intelligent User Interfaces (pp. 389–399). ACM.
Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face
perception in humans. Journal of Cognitive Neuroscience, 8(6), 551–565.
Blairy, S., Herrera, P., & Hess, U. (1999). Mimicry and the judgment of emotional facial expressions. Journal
of Nonverbal Behavior, 23(1), 5–41.
Bradley, M. M. (2009). Natural selective attention: Orienting and emotion. Psychophysiology, 46(1), 1–11.
Brauer, J., Xiao, Y., Poulain, T., Friederici, A. D., & Schirmer, A. (2016). Frequency of Maternal Touch
Predicts Resting Activity and Connectivity of the Developing Social Brain. Cerebral Cortex, bhw137.
Brown, S. B., van Steenbergen, H., Band, G. P., de Rover, M., & Nieuwenhuis, S. (2012). Functional
significance of the emotion-related late positive potential. Frontiers in Human Neuroscience, 6.
Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3287021/
Cauquil, A. S., Edmonds, G. E., & Taylor, M. J. (2000). Is the face-sensitive N170 the only ERP not affected by
selective attention? Neuroreport, 11(10), 2167–2171.
Cohen, M. X., Elger, C. E., & Ranganath, C. (2007). Reward expectation modulates feedback-related negativity
and EEG spectra. NeuroImage, 35(2), 968–978. https://doi.org/10.1016/j.neuroimage.2006.11.056
Crusco, A. H., & Wetzel, C. G. (1984). The Midas touch: The effects of interpersonal touch on restaurant
tipping. Personality and Social Psychology Bulletin, 10(4), 512–517.
https://doi.org/10.1177/0146167284104003
Delorme, A., & Makeig, S. (2004). EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics
including independent component analysis. Journal of Neuroscience Methods, 134(1), 9–21.
https://doi.org/10.1016/j.jneumeth.2003.10.009
Dematte, M. L., Sanabria, D., Sugarman, R., & Spence, C. (2006). Cross-modal interactions between olfaction
and touch. Chemical Senses, 31(4), 291–300.
Dolinski, D. (2010). Touch, compliance, and homophobia. Journal of Nonverbal Behavior, 34(3), 179–192.
Eimer, M., Holmes, A., & McGlone, F. P. (2003). The role of spatial attention in the processing of facial
expression: an ERP study of rapid brain responses to six basic emotions. Cognitive, Affective, &
Behavioral Neuroscience, 3(2), 97–110.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Fisher, J. D., Rytting, M., & Heslin, R. (1976). Hands touching hands: Affective and evaluative effects of an
interpersonal touch. Sociometry, 416–421.
Fridlund, A. J., & Cacioppo, J. T. (1986). Guidelines for human electromyographic research. Psychophysiology,
23(5), 567–589.
Gable, P. A., & Harmon-Jones, E. (2010). Late positive potential to appetitive stimuli and local attentional bias.
Emotion, 10(3), 441.
Gallace, A., & Spence, C. (2010). The science of interpersonal touch: an overview. Neuroscience &
Biobehavioral Reviews, 34(2), 246–259. https://doi.org/10.1016/j.neubiorev.2008.10.004
Graham, F. K., & Clifton, R. K. (1966). Heart-rate change as a component of the orienting response.
Psychological Bulletin, 65(5), 305.
Guéguen, N., & Fischer-Lokou, J. (2003). Another evaluation of touch and helping behavior. Psychological
Reports, 92(1), 62–64. https://doi.org/10.2466/pr0.2003.92.1.62
Haans, A., & IJsselsteijn, W. A. (2009). The virtual Midas touch: Helping behavior after a mediated social
touch. Haptics, IEEE Transactions on, 2(3), 136–140.
Hackley, S. A., & Valle-Inclán, F. (1999). Accessory stimulus effects on response selection: Does arousal speed
decision making? Journal of Cognitive Neuroscience, 11(3), 321–329.
Hajcak, G., MacNamara, A., & Olvet, D. M. (2010). Event-related potentials, emotion, and emotion regulation:
an integrative review. Developmental Neuropsychology, 35(2), 129–155.
Harlow, H. F. (1958). The nature of love. American Psychologist, 13(12), 673.
Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1993). Emotional contagion. Current Directions in
Psychological Science, 2(3), 96–99.
Henley, N. M. (1973). Status and sex: Some touching observations. Bulletin of the Psychonomic Society, 2(2),
91–93. https://doi.org/10.3758/BF03327726
Herrmann, M. J., Aranda, D., Ellgring, H., Mueller, T. J., Strik, W. K., Heidrich, A., & Fallgatter, A. J. (2002).
Face-specific event-related potential in humans is independent from facial expression. International
Journal of Psychophysiology, 45(3), 241–244.
Hess, U., & Blairy, S. (2001). Facial mimicry and emotional contagion to dynamic emotional facial expressions
and their influence on decoding accuracy. International Journal of Psychophysiology, 40(2), 129–141.
Hillyard, S. A., Hink, R. F., Schwent, V. L., & Picton, T. W. (1973). Electrical signs of selective attention in the
human brain. Science, 182(108), 177–180.
Holmes, A., Nielsen, M. K., & Green, S. (2008). Effects of anxiety on the processing of fearful and happy faces:
An event-related potential study. Biological Psychology, 77(2), 159–173.
https://doi.org/10.1016/j.biopsycho.2007.10.003
Hornik, J. (1992). Tactile stimulation and consumer response. Journal of Consumer Research, 449–458.
Itier, R. J., & Taylor, M. J. (2004a). N170 or N1? Spatiotemporal differences between object and face
processing using ERPs. Cerebral Cortex, 14(2), 132–142.
Itier, R. J., & Taylor, M. J. (2004b). Source analysis of the N170 to faces and objects. Neuroreport, 15(8),
1261–1265.
Jönsson, P., & Sonnby-Borgström, M. (2003). The effects of pictures of emotional faces on tonic and phasic
autonomic cardiac control in women and men. Biological Psychology, 62(2), 157–173.
Joyce, C., & Rossion, B. (2005). The face-sensitive N170 and VPP components manifest the same brain
processes: The effect of reference electrode site. Clinical Neurophysiology, 116(11), 2613–2631.
https://doi.org/10.1016/j.clinph.2005.07.005
Kleinke, C. L. (1977). Compliance to requests made by gazing and touching experimenters in field settings.
Journal of Experimental Social Psychology, 13(3), 218–223. https://doi.org/10.1016/0022-
1031(77)90044-0
Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (1997). International affective picture system (IAPS): Technical
manual and affective ratings. NIMH Center for the Study of Emotion and Attention, 39–58.
Levenson, R. W., & Gottman, J. M. (1983). Marital interaction: Physiological linkage and affective exchange.
Journal of Personality and Social Psychology, 45(3), 587–597.
Liu, Y., Huang, H., McGinnis-Deweese, M., Keil, A., & Ding, M. (2012). Neural Substrate of the Late Positive
Potential in Emotional Processing. The Journal of Neuroscience, 32(42), 14563–14572.
https://doi.org/10.1523/JNEUROSCI.3109-12.2012
Luo, W., Feng, W., He, W., Wang, N.-Y., & Luo, Y.-J. (2010). Three stages of facial expression processing:
ERP study with rapid serial visual presentation. NeuroImage, 49(2), 1857–1867.
https://doi.org/10.1016/j.neuroimage.2009.09.018
Milne, R. J., Kay, N. E., & Irwin, R. J. (1991). Habituation to repeated painful and non-painful cutaneous
stimuli: a quantitative psychophysical study. Experimental Brain Research, 87(2), 438–444.
Myers, L. J., Lowery, M., O’Malley, M., Vaughan, C. L., Heneghan, C., St Clair Gibson, A., … Sreenivasan, R.
(2003). Rectification and non-linear pre-processing of EMG signals for cortico-muscular analysis.
Journal of Neuroscience Methods, 124(2), 157–165.
Näätänen, R., & Michie, P. T. (1979). Early selective-attention effects on the evoked potential: a critical review
and reinterpretation. Biological Psychology, 8(2), 81–136.
Palermo, R., & Rhodes, G. (2007). Are you always on my mind? A review of how face perception and attention
interact. Neuropsychologia, 45(1), 75–92. https://doi.org/10.1016/j.neuropsychologia.2006.04.025
Pastor, M. C., Bradley, M. M., Löw, A., Versace, F., Moltó, J., & Lang, P. J. (2008). Affective picture
perception: emotion, context, and the late positive potential. Brain Research, 1189, 145–151.
Picton, T. W., & Stuss, D. T. (1994). Neurobiology of conscious experience. Current Opinion in Neurobiology,
4(2), 256–265.
Polich, J. (2007). Updating P300: an integrative theory of P3a and P3b. Clinical Neurophysiology, 118(10),
2128–2148. https://doi.org/10.1016/j.clinph.2007.04.019
Polich, J., & Kok, A. (1995). Cognitive and biological determinants of P300: an integrative review. Biological
Psychology, 41(2), 103–146.
Ravaja, N., Turpeinen, M., Saari, T., Puttonen, S., & Keltikangas-Järvinen, L. (2008). The psychophysiology of
James Bond: phasic emotional responses to violent video game events. Emotion, 8(1), 114.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual
processing of the face: ERP recording and its decomposition by independent component analysis.
Neuroreport, 12(4), 709–714.
Schirmer, A., Teh, K. S., Wang, S., Vijayakumar, R., Ching, A., Nithianantham, D., … Cheok, A. D. (2010).
Squeeze me, but don’t tease me: Human and mechanical touch enhance visual attention and emotion
discrimination. Social Neuroscience, 6(3), 219–230. https://doi.org/10.1080/17470919.2010.507958
Schlosberg, H. (1954). Three dimensions of emotion. Psychological Review, 61(2), 81–88.
https://doi.org/10.1037/h0054570
Schupp, H. T., Cuthbert, B. N., Bradley, M. M., Cacioppo, J. T., Ito, T., & Lang, P. J. (2000). Affective picture
processing: the late positive potential is modulated by motivational relevance. Psychophysiology, 37(2),
257–261.
Schupp, H. T., Öhman, A., Junghöfer, M., Weike, A. I., Stockburger, J., & Hamm, A. O. (2004). The facilitated
processing of threatening faces: an ERP analysis. Emotion, 4(2), 189.
Seeck, M., & Grüsser, O.-J. (1992). Category-related components in visual evoked potentials: photographs of
faces, persons, flowers and tools as stimuli. Experimental Brain Research, 92(2), 338–349.
Sims, T. B., Van Reekum, C. M., Johnstone, T., & Chakrabarti, B. (2012). How reward modulates mimicry:
EMG evidence of greater facial mimicry of more rewarding happy faces. Psychophysiology, 49(7), 998–
1004.
Smith, E. R. (2008). Social relationships and groups: New insights on embodied and distributed cognition.
Cognitive Systems Research, 9(1), 24–32.
Spapé, M. M., Hoggan, E. E., Jacucci, G., & Ravaja, N. (2015). The meaning of the virtual Midas touch: An
ERP study in economic decision making. Psychophysiology, 52(3), 378–398.
Spapé, M. M., Kivikangas, J. M., Järvelä, S., Kosunen, I., Jacucci, G., & Ravaja, N. (2013). Keep Your
Opponents Close: Social Context Affects EEG and fEMG Linkage in a Turn-Based Computer Game.
PLoS ONE, 8(11), e78795. https://doi.org/10.1371/journal.pone.0078795
Spitz, R. A. (1945). Hospitalism; an inquiry into the genesis of psychiatric conditions in early childhood. The
Psychoanalytic Study of the Child, 1, 53.
Sutton, S., Braren, M., Zubin, J., & John, E. R. (1965). Evoked-potential correlates of stimulus uncertainty.
Science, 150(3700), 1187–1188.
Sutton, S., & Ruchkin, D. S. (1984). The late positive complex. Annals of the New York Academy of Sciences,
425(1), 1–23.
Suvilehto, J. T., Glerean, E., Dunbar, R. I., Hari, R., & Nummenmaa, L. (2015). Topography of social touching
depends on emotional bonds between humans. Proceedings of the National Academy of Sciences,
112(45), 13811–13816.
Swallow, K. M., & Jiang, Y. V. (2010). The role of timing in the attentional boost effect. Attention, Perception,
& Psychophysics, 73(2), 389–404. https://doi.org/10.3758/s13414-010-0045-y
Thierry, G., Martin, C. D., Downing, P., & Pegna, A. J. (2007). Controlling for interstimulus perceptual
variance abolishes N170 face selectivity. Nature Neuroscience, 10(4), 505–511.
Vanderploeg, R. D., Brown, W. S., & Marsh, J. T. (1987). Judgements of emotion in words and faces: ERP correlates. International Journal of Psychophysiology, 5(3), 193–205. https://doi.org/10.1016/0167-8760(87)90006-7
Watanabe, S., Kakigi, R., & Puce, A. (2003). The spatiotemporal dynamics of the face inversion effect: a
magneto- and electro-encephalographic study. Neuroscience, 116(3), 879–895.
Zhang, D., Luo, W., & Luo, Y. (2013). Single-trial ERP analysis reveals facial expression category in a three-
stage scheme. Brain Research, 1512, 78–88. https://doi.org/10.1016/j.brainres.2013.03.044
Figure Captions
Figure 1: Trial procedure. A fixation was presented for 400 ms before a face stimulus was displayed. The face
always wore a neutral expression and was presented for 950-1600 ms. While this visual stimulus was still being
shown on the screen, a touch, a tone or a silent control prime was presented. Then, the face’s expression
changed to one of four emotions. The emotional expression was always presented for 1000 ms; only then were participants asked to indicate the displayed emotion with a keypress.
Figure 2: Effects of emotion (left panels) and primes (right panels) on facial EMG and ECG.
Shown is the mean evoked log-normalized EMG activity (per second) over the Zygomaticus major
(ZM, “smiling”), Corrugator supercilii (CS, “frowning”) and Levator labii superioris alaeque nasi (LN,
“nose wrinkling”) muscle groups. For ECG, the mean change in inter-beat interval (IBI) is shown in
ms (i.e. deceleration is plotted upwards). Error bars denote standard errors of the means.
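For readers unfamiliar with these measures, the two quantities in the caption can be illustrated with a minimal sketch. This is not the authors' analysis pipeline; it assumes rectified EMG samples and R-peak times in ms, and uses a common log-baseline normalization:

```python
import numpy as np

def log_normalized_emg(rectified, baseline):
    """Log-transform mean EMG activity and express it relative to a
    pre-stimulus baseline (a common normalization; the study's exact
    procedure may differ)."""
    return np.log(np.mean(rectified)) - np.log(np.mean(baseline))

def ibi_change(r_peaks, stimulus_onset):
    """Change in inter-beat interval (ms): the first post-stimulus IBI
    minus the last pre-stimulus IBI. Positive values indicate cardiac
    deceleration (plotted upwards in Figure 2). Assumes at least one
    full IBI on either side of stimulus onset."""
    ibis = np.diff(r_peaks)                         # IBIs in ms
    i = np.searchsorted(r_peaks[1:], stimulus_onset)
    return ibis[i] - ibis[i - 1]
```

For example, R-peaks at 0, 800, 1600 and 2500 ms with a stimulus at 1700 ms give an IBI change of +100 ms, i.e. deceleration.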
Figure 3: Effects of emotion (left panels) and primes (right panels) on ERPs. Portrayed are grand
averages, pooled over frontal (F3, Fz, F4), central (C3, Cz, C4), parietal (P3, Pz, P4), lateral-temporal
(P7, P8) and occipital (O1, Oz, O2) electrodes. The analysis of early potentials (N1/P1, N170/VPP and
P2) used peak amplitudes at the latencies indicated in the left panel. For late potentials, average
amplitudes were used over three distinct intervals (220-315, 320-495 and 500-700 ms).
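The two amplitude measures named in the caption (peak amplitude for the early components, mean amplitude over fixed bins for the late potentials) can be sketched as follows. The sampling rate is a placeholder, not the study's recording parameter:

```python
import numpy as np

FS = 500  # Hz; placeholder sampling rate, not the study's actual value

def peak_amplitude(erp, t_start_ms, t_end_ms, polarity=1):
    """Peak amplitude within a latency window: the most positive sample
    for polarity=+1 (e.g. P1, VPP, P2), the most negative for
    polarity=-1 (e.g. N1, N170)."""
    i0, i1 = int(t_start_ms * FS / 1000), int(t_end_ms * FS / 1000)
    window = erp[i0:i1]
    return window.max() if polarity > 0 else window.min()

def mean_amplitude(erp, t_start_ms, t_end_ms):
    """Mean amplitude over a fixed interval, e.g. the 320-495 ms bin
    used for the late potentials."""
    i0, i1 = int(t_start_ms * FS / 1000), int(t_end_ms * FS / 1000)
    return erp[i0:i1].mean()
```

In practice these would be applied to grand averages pooled over the electrode groups listed in the caption.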
Table 1
Effects of emotion and prime on behavior and physiology

                                 Effect sizes (ηp²)            Direction of effects
                                 Emotion  Prime   Interaction  Emotions (A D F H)  Primes (s a t)
Behavior
  RT                             .68***   .09*    .02          2 1 1
  Error                          .49***   .00     .01          3 3 2 1             ns
EMG
  CS                             .43***   .09*    .05          3 2 2 1             2 2 1
  ZM                             .13**    .01     .01          1 1 1 2             ns
  LN                             .03      .30***  .03          ns                  2 2 1
ECG
  IBI                            .11**    .07*    .02          2 1 1 1             1 2 1
Early ERPs (area with highest amplitude)
  N1 (fronto-central)            .09*     .26***  .01          1 2 1 1             2 1 3
  P1 (occipital)                 .12**    .06     .02          2 3 1 2             ns
  VPP (frontal)                  .25***   .33***  .01          1 2 3 2             2 3 1
  N170 (lateral temporal)        .39***   .18***  .02          1 2 4 3             1 2 1
  P2 (central)                   .12**    .19***  .02          1 1 2 2             2 2 1
Late ERPs (area with strongest effect)
  220-315 (frontal+)             .46***   .18***  .01          1 1 1 2             2 3 1
  220-315 (occipital-)           .30***   .38***  .01          2 2 3 1             2 1 2
  320-495 (central+)             .19***   .30***  .02          1 2 2 3             1 2 2
  500-700 (lateral temporal-)    .45***   .01     .01          1 1 1 2             ns
  500-700 (frontal-)             .10***   .15*    .02          2 2 1 1             2 2 1

Note. RT = reaction time, error = error percentage, CS = corrugator supercilii, ZM = zygomaticus major, LN = levator labii superioris alaeque nasi, IBI = inter-beat interval. Directions of effects refer to the main effects of emotion (capital letters denoting Anger, Disgust, Fear and Happiness) and prime (lower-case letters denoting silent, audio, touch). Early ERPs are relative to peak polarity; late ERPs are average amplitudes over the named time bins. Effects are described as post-hoc comparisons after sorting the values in ascending order: e.g., RT A3 should be interpreted as reaction time for Anger being significantly higher than for Disgust. ns: p > .05; *: p < .05; **: p < .01; ***: p < .001.
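The ηp² values reported above are partial eta squared effect sizes from repeated-measures ANOVA. As a quick reference (this is the standard textbook formula, not a re-analysis of the study's data), they are computed from the sums of squares as:

```python
def partial_eta_squared(ss_effect, ss_error):
    """Partial eta squared: the proportion of variance attributable to
    an effect after partialling out other effects,
    SS_effect / (SS_effect + SS_error)."""
    return ss_effect / (ss_effect + ss_error)
```

For instance, an effect sum of squares of 1.0 against an error sum of squares of 3.0 yields ηp² = .25.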
Acknowledgements
This work was supported by the Academy of Finland, grant #268999. The authors declare no conflicts of
interest. We would like to thank Tom Gallagher-Mitchell and Zania Sovijärvi-Spapé for proofreading and
helpful comments.
Footnotes
Footnote 1
A similar argument is that repeated exposure to tactile primes could have caused cutaneous habituation (Milne,
Kay, & Irwin, 1991); with a thus weakened sense of touch, a smaller interaction between prime and emotional
expression might be expected. To explore this possibility, we contrasted the behavioral and physiological
measures from the first half of trials with those from the second half in an additional analysis. As shown in the
Supplementary Information, a main effect of time was observed for reaction times, error percentages and late
ERPs, as well as interactions between time and emotional expression for error percentages and late ERPs. The
error percentages mainly showed a floor effect towards 0%, with the greatest gains in correctly classifying
angry expressions in the second half of trials. Conversely, the effects of emotional expression on late ERPs
were stronger in the latter part of the experiment. As no interaction between time and the effect of primes on
emotional expression was observed in any measure, we conclude that there is no evidence that habituation
reduced priming effects on emotional face processing.
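The half-split contrast described in the footnote can be sketched minimally; the function below is illustrative only (a simple median split over trial order), not the authors' analysis code:

```python
import numpy as np

def half_split(values):
    """Split a trial-ordered measure into first-half and second-half
    means, as in the habituation check where time (first vs second
    half) enters the ANOVA as an additional factor."""
    mid = len(values) // 2
    return np.mean(values[:mid]), np.mean(values[mid:])
```

Each half's mean would then be compared across conditions to test the time main effect and its interactions.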
Supplementary Information
SI 1: Auditory / Tactile prime waveform
SI 2: Effect sizes as a function of time (first half of trials vs last half of trials).
Figure 1
Figure 2
Figure 3
SI 2
Table 2
Effect sizes as a function of time (first half of trials vs last half of trials)

                                 Main effects                   Interactions
                                 1. Emotion  2. Prime  3. Time  1x2   1x3     2x3   1x2x3
Behavior
  Reaction time (ms)             .52***      .06       .26***   .00   .04     .00   .01
  Errors (%)                     .66***      .00       .31***   .03   .18***  .04   .02
EMG
  CS                             .42***      .10*      .00      .04   .01     .00   .03
  ZM                             .13**       .02       .00      .02   .01     .01   .05
  LN                             .03         .31***    .00      .03   .01     .07   .03
ECG
  IBI                            .12**       .06*      .00      .02   .02     .02   .03
Late ERPs (area with strongest effect)
  220-315 (frontal+)             .10***      .48***    .27***   .01   .04     .02   .02
  220-315 (occipital-)           .30***      .38***    .10*     .01   .05     .03   .02
  320-495 (central+)             .35***      .49***    .29***   .02   .04     .01   .03
  500-700 (lateral temporal-)    .26***      .41***    .04      .01   .12*    .02   .03
  500-700 (frontal-)             .10*        .55***    .11*     .02   .07*    .01   .02

Note. Values are effect sizes (ηp²). The effect of time concerns the contrast between the first half and second half of trials. CS = corrugator supercilii, ZM = zygomaticus major, LN = levator labii superioris alaeque nasi, IBI = inter-beat interval (ms). Numbers denote the factors in the interaction tested, e.g. 1x3 is the interaction between emotion and time. *: p < .05; **: p < .01; ***: p < .001.