
Neuroscience and Biobehavioral Reviews 55 (2015) 498–509

Contents lists available at ScienceDirect

journal homepage: www.elsevier.com/locate/neubiorev

Review

N170 sensitivity to facial expression: A meta-analysis


J.A. Hinojosa a,b,∗, F. Mercado c, L. Carretié d

a Instituto Pluridisciplinar, Universidad Complutense de Madrid, Madrid, Spain
b Facultad de Psicología, Universidad Complutense de Madrid, Madrid, Spain
c Facultad de Ciencias de la Salud, Universidad Rey Juan Carlos, Madrid, Spain
d Facultad de Psicología, Universidad Autónoma de Madrid, Madrid, Spain

Article info

Article history: Received 30 October 2014; Received in revised form 22 April 2015; Accepted 3 June 2015; Available online 9 June 2015

Keywords: N170; Facial expression; Meta-analysis; Happy; Fear; Angry; Sad; Disgust

Abstract

The N170 component is the most important electrophysiological index of face processing. Early studies concluded that it was insensitive to facial expression, thus supporting dual theories postulating separate mechanisms for identity and expression encoding. However, recent evidence contradicts this assumption. We conducted a meta-analysis to resolve inconsistencies and to derive theoretical implications. A systematic review of 128 studies analyzing the N170 in response to neutral and emotional expressions yielded 57 meta-analyzable experiments (involving 1645 healthy adults). First, the N170 was found to be sensitive to facial expressions, supporting proposals arguing for integrated rather than segregated mechanisms in the processing of identity and expression. Second, this sensitivity is heterogeneous, with angry, fearful and happy faces eliciting the largest N170 amplitudes. Third, we explored some modulatory factors, including the focus of attention (N170 amplitude was found to be sensitive to unattended expressions as well) and the reference electrode (a common average reference reinforced the effects). In sum, the N170 is a valuable tool for studying the neural processing of facial expressions and for developing current theories.

© 2015 Elsevier Ltd. All rights reserved.

Contents

1. Introduction ....... 498
2. Methods ....... 499
   2.1. Selection of studies ....... 499
   2.2. Search and data description methodologies ....... 499
3. Results ....... 500
4. Discussion ....... 500
5. Conclusions ....... 506
Acknowledgements ....... 507
References ....... 507

1. Introduction

Facial expressions, omnipresent in our daily life either in natural situations or through visual media, are the most important visual signal of emotion from others. The ability to extract emotional states from facial expressions and to regulate emotions in ourselves and others is critical for efficient behavior in interpersonal relationships and social functioning. In humans, as well as in other primates sharing complex social organization, this important communicative function of facial expressions is supported by a complex and specialized set of muscles and neural mechanisms that enable both the production of a rich variety of expressions and their efficient detection and fast processing (Burrows, 2008; Hadj-Bouziane et al., 2008). These conspicuous characteristics may explain the fact that facial expressions are probably the most recurrent focus of research in both the affective and social sciences. Two methodological factors may contribute to this preference. First, faces, as compared to other affective visual stimuli such as emotional scenes, lend themselves to easy experimental control across different emotional categories with respect to visual parameters such as color, contrast, or spatial frequency.

∗ Corresponding author at: Universidad Complutense de Madrid, Instituto Pluridisciplinar, Paseo Juan XXIII, 1, Madrid 28040, Spain. Tel.: +34 91 394 32 61; fax: +34 91 394 32 64. E-mail address: [email protected] (J.A. Hinojosa).

http://dx.doi.org/10.1016/j.neubiorev.2015.06.002
Second, their susceptibility to rapid processing allows agile behavioral and neural responses to be measured. Indeed, the neural signal with the highest temporal resolution, electroencephalography (EEG) – along with magnetoencephalography (MEG) – is often employed to explore the neural mechanisms underlying facial processing. When EEG is measured and event-related potentials (ERPs) are extracted (ERPs are the part of the EEG signal that reflects the response to the specific event being processed), the presentation of faces consistently elicits an N170 component. This waveform typically shows enhanced amplitudes between 130 and 200 ms for facial compared to non-facial stimuli, and is maximal at posterior lateral scalp areas (e.g., Bentin et al., 1996; Itier and Taylor, 2004a; Rousselet et al., 2004; see Fig. 1). Its neural origin has been located in face processing areas, such as the superior temporal sulcus or the fusiform gyrus (Itier and Taylor, 2004b; Sadeh et al., 2010). The N170 was first related to a face-specific processing mechanism (Carmel and Bentin, 2002), although the finding of N170 effects for non-face objects following visual expertise training supports a non-modular view (Rossion et al., 2002; Rossion and Jacques, 2011). Also, face inversion consistently delays the N170 latency (and sometimes increases its amplitude), which has been interpreted as a reflection of impaired holistic/configural processing of faces (e.g., Rossion et al., 2000; Rossion and Gauthier, 2002; but see Itier et al., 2007, Itier and Batty, 2009, and Nemrodov et al., 2014 for a key role of eye-sensitive neurons in the generation of the face inversion effect). The functional meaning of this component has been related to the construction of abstract structural representations of faces, following Bruce and Young's (1986) influential proposal (see Rossion, 2014, and Rossion and Jacques, 2008, 2011 for extended reviews on this component).

Fig. 1. Graphical schematic summary showing the latency, polarity and amplitude of the N170 component of event-related potentials in response to faces and to other non-facial visual stimuli.
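To make the ERP logic concrete, the sketch below shows how an N170 amplitude is typically obtained from single-trial EEG: baseline-corrected epochs time-locked to face onset are averaged, and the most negative value in the 130–200 ms window at a posterior lateral site is taken as the peak. This is a minimal illustration with simulated data, not tied to any specific study's pipeline:

```python
import numpy as np

def n170_peak(epochs: np.ndarray, times: np.ndarray) -> float:
    """Average single-trial epochs into an ERP and return the N170 peak.
    epochs: (n_trials, n_times) baseline-corrected EEG at a posterior
    lateral site (e.g., P8), in microvolts; times: seconds from face onset."""
    erp = epochs.mean(axis=0)                      # trial averaging yields the ERP
    win = (times >= 0.130) & (times <= 0.200)      # typical N170 window
    return float(erp[win].min())                   # most negative value = peak

# Toy usage: 40 trials sampled at 256 Hz, from -0.1 to 0.4 s (simulated noise)
times = np.arange(-0.1, 0.4, 1 / 256)
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 2.0, size=(40, times.size))
print(n170_peak(epochs, times))
```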
Of the greatest relevance for the scope of this study are those studies that compared N170 effects elicited by emotional and neutral facial expressions. Importantly, the data show some apparent inconsistencies in this respect, at several levels. First, early studies in this field failed to show any effect of facial expressions on the N170 (Eimer et al., 2003; Herrmann et al., 2002; Krolak-Salmon et al., 2001). These findings were taken as additional evidence supporting the interpretation of this component as a correlate of mechanisms involved in the structural encoding of faces (Eimer and Holmes, 2002, 2007). A core aspect of Bruce and Young's model is the existence of parallel but functionally independent processing routes devoted to the analysis of different face features, such as identity, emotional expressions and facial speech (see Atkinson and Adolphs, 2011, and Posamentier and Abdi, 2003, for reviews). Evidence supporting this claim came from patients who showed dissociations in task performance following brain injury (e.g., Parry et al., 1991; Humphreys et al., 1993), from behavioral studies that found no differences when making expression matching decisions for familiar and unfamiliar faces (e.g., Young et al., 1986), and from the finding of activity in differentiated brain regions during the processing of face identity and facial expression (e.g., Winston et al., 2004). Against this view, modulations in the recognition of famous and familiar faces by their emotional expressions (e.g., Kaufmann and Schweinberger, 2004; Lander and Metcalfe, 2007), or partial overlapping of neural activity during identity and facial expression recognition tasks (e.g., LaBar et al., 2003; Ganel et al., 2005), have been reported. Notably, although the idea of N170 insensitivity to emotional faces is still relatively solid (e.g., Rellecke et al., 2013), enhanced N170 amplitudes elicited by different facial expressions have also been reported in several studies, as will be described later.

A second inconsistency concerns the homogeneity of the N170 observed across discrete emotional categories. In other words, it is unclear whether there are uniform N170 effects for all types of emotional expressions or whether this component shows differential effects for some particular expressions. Finally, given the inconsistent findings regarding the sensitivity of the N170 to facial expressions, it is important to determine the contribution of potential moderators that may account for these divergent results. Although evidence on this issue is still scarce, some modulating factors have been proposed, mainly at the recording/technical and task design levels. In this sense, the reference electrode placement, or whether or not voluntary attention is directed to the expressions, has been proposed to influence N170 responses to facial expressions (e.g., Rellecke et al., 2013; Rossion, 2014; Wronka and Walentowska, 2011). Therefore, a meta-analysis seems necessary and justified in this field of research. In the current study, we explored these three issues through a systematic review of 128 studies and meta-analyses of 57 experiments reporting the N170 in response to facial expressions. Disentangling facial expression effects on the N170 may also be particularly useful for delineating a better understanding of the functional significance of this component, and for further exploring the assumption of its involvement in encapsulated structural encoding.

2. Methods

2.1. Selection of studies

Studies included in the current review and analyses satisfied the following criteria: (1) the study used repeated measures designs to statistically compare the N170 amplitude to neutral and to at least one discrete emotional facial expression; studies that collapsed different discrete expressions into a general valence dimension (i.e., positive/negative expressions) were excluded; (2) participants could not belong to a clinical population, although healthy control groups in clinical studies were included if their data were reported separately; (3) studies including schematic, robotic and photographic faces were considered; (4) the analysis of N170 amplitudes was an explicit objective of the study; (5) only data from adult participants were considered, since there is evidence suggesting that the N170 shows significant shifts in infant populations (Taylor et al., 2004); (6) findings were reported in an English-language, peer-reviewed journal article. The cutoff date for the literature search was December 31st, 2014.

2.2. Search and data description methodologies

The search for relevant studies was carried out through electronic searches of different databases (PsychInfo©, Google Scholar©, PubMed©, ISI WoK© and Scopus©, among other resources, also involving book search), using the following search terms: (N170) AND (fac* OR express*) AND (emotion* OR affect*).
Every article identified was downloaded – or requested from the authors – and read to ensure that it was appropriate for inclusion in the meta-analysis/review. The studies found to meet these search criteria and the selection criteria described above numbered 128. A table summarizing the main characteristics of the experimental design and the main pertinent results of these 128 reviewed studies is available at www.uam.es/CEACO/sup/N170 2014.htm, where any study not currently included but detected by readers will be added.

Cohen's effect sizes (ES), the parameter submitted to meta-analyses, consisted of standardized mean differences computed whenever one of the following numerical values for the relevant contrasts was reported in the paper: Fisher's F (obtained in one-way, two-level ANOVAs), means and dispersion measures, or Student's t-values. Calculation of ES from these three parameters required formulas for paired samples (e.g., Lakens, 2013), since all studies employed repeated measures designs to compare emotional vs. neutral facial expressions. This information was available in 55 out of the 128 studies reviewed, which described 57 experiments (two studies each described two meta-analyzable experiments) involving 1645 healthy adults; the rest of the studies provided insufficient information to compute ES. Details and summaries of all ESs and other parameters necessary for meta-analysis, corresponding to each experiment, are available at www.uam.es/CEACO/sup/N170 2014.htm. The main results and methodological characteristics of these 57 meta-analyzable experiments are summarized in Table 1.
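As a concrete illustration of these conversions, the paired-samples formulas in Lakens (2013) can be sketched as follows (a minimal sketch, not the authors' actual scripts; the example values are hypothetical):

```python
import math

def dz_from_t(t: float, n: int) -> float:
    """Cohen's d_z for a paired contrast from a paired-samples t-value
    and the number of participants n (Lakens, 2013): d_z = t / sqrt(n)."""
    return t / math.sqrt(n)

def dz_from_f(f: float, n: int) -> float:
    """The same conversion when a one-way, two-level repeated measures
    ANOVA is reported: with 1 numerator df, t = sqrt(F)."""
    return dz_from_t(math.sqrt(f), n)

def dz_from_means(mean_diff: float, sd_diff: float) -> float:
    """d_z from the mean and standard deviation of the paired differences."""
    return mean_diff / sd_diff

# Hypothetical study reporting F(1, 23) = 9.6 for emotional vs. neutral, n = 24
print(round(dz_from_f(9.6, 24), 3))  # 0.632
```

Under the paper's sign convention, Emotional > Neutral effects are then given a negative sign, since a larger N170 amplitude is a more negative voltage.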
For global statistics on ES (i.e., calculation of the mean ES and its statistical significance through a Z test: Lipsey and Wilson, 2001), the "MeanES" SPSS macro designed by Wilson (2010) was employed. To investigate potential moderators of ES, a Q statistic analog to analysis of variance (ANOVA) for categorical variables (Lipsey and Wilson, 2001) was computed, also through Wilson's SPSS macros ("metaF"; links to these macros are available at www.uam.es/CEACO/sup/N170 2014.htm). All analyses were conducted using maximum likelihood, random-effects models weighted by the inverse of the variance.
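The "MeanES" aggregation step (an inverse-variance weighted random-effects mean with a Z test) can be approximated by the following sketch; it assumes the per-study sampling variances v and the between-study variance tau2 are already available (the SPSS macros estimate tau2 themselves via maximum likelihood):

```python
import math

def random_effects_mean(es, v, tau2):
    """Inverse-variance weighted random-effects mean effect size.
    es: list of per-study effect sizes; v: per-study sampling variances;
    tau2: between-study variance (assumed precomputed here)."""
    w = [1.0 / (vi + tau2) for vi in v]              # random-effects weights
    mean_es = sum(wi * ei for wi, ei in zip(w, es)) / sum(w)
    se = math.sqrt(1.0 / sum(w))                     # SE of the weighted mean
    z = mean_es / se                                 # Z test of H0: mean ES = 0
    ci = (mean_es - 1.96 * se, mean_es + 1.96 * se)  # 95% confidence interval
    return mean_es, z, ci
```

With the 56 study-level ESs, this kind of computation yields figures of the sort reported below (e.g., mean ES = −0.3325, Z = −5.6855), provided the same tau2 estimator is used.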
To address the "file drawer problem" – that is, the bias for significant results to be more likely published and retrievable for a meta-analysis than non-significant results – the fail-safe N (Nfs) was computed. This Nfs represents the estimated number of unpublished studies reporting null results (here defined as ES = −0.2; note that Emotional > Neutral differences are reflected in negative effect sizes, since larger N170 amplitudes are more negative in this negative-going ERP component) that should exist to render the overall findings non-significant (Rosenthal, 1979). To this aim, the Orwin (1983) Nfs formula was applied.
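Orwin's formula can be written out explicitly. In its classic form (shown here under the standard assumption that the criterion d_c is the effect size below which the overall result would be considered negligible; the paper's exact parameterization, with its ES = −0.2 "null", may differ):

```latex
N_{fs} = N_0 \left( \frac{\bar{d}_0 - d_c}{d_c} \right),
```

where \(N_0\) is the number of meta-analyzed studies, \(\bar{d}_0\) is their mean effect size, and \(d_c\) is the criterion effect size.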
3. Results

The global effect of Emotional expressions (averaging the N170 ESs of all emotional expressions employed in each study) vs. Neutral expressions on N170 amplitudes was explored. Fig. 2 shows the ESs and 95% confidence intervals (CI) for the meta-analyzed experiments (n = 56 since, as may be appreciated in Fig. 2, one of the 57 meta-analyzable experiments was detected to be an outlier and was not included). This meta-analysis confirmed that the emotional vs. neutral ESs were significant. Global computations following a random-effects model showed that the mean ES for this sample of studies (mean ES = −0.3325, 95%CI = −0.4471 to −0.2179) was statistically significant (Z = −5.6855, p < 0.0001; Nfs = 46), clearly supporting an Emotional > Neutral effect on the N170.

Next, the effects of each specific emotional expression on N170 amplitude were explored. As can be appreciated in Table 1 and in the Review Table (www.uam.es/CEACO/sup/N170 2014.htm), a wide variety of emotions have been studied, producing mixed results. To further explore this issue, meta-analyses on N170 amplitudes were carried out separately for Happiness > Neutral ESs (n = 32; i.e., this comparison was reported in 32 out of the 56 meta-analyzable studies), Anger > Neutral ESs (n = 21), Fear > Neutral ESs (n = 28), Disgust > Neutral ESs (n = 7), and Sadness > Neutral ESs (n = 14). No meta-analyzable studies exploring surprise expressions exist at present. Statistical tests (Table 2) showed that the greatest effect sizes corresponded to the Anger > Neu and Fear > Neu contrasts, followed by Happiness > Neu. The other two contrasts (Sadness > Neu and Disgust > Neu) did not reach significance.

Finally, the capability of certain methodological and task-related factors to modulate N170 amplitude in response to facial expressions of emotion was meta-analyzed. First, we tested the effects of the reference electrode used to measure the ERPs. To this aim, a meta-analysis employing the Q statistic analog to ANOVA (see the previous section) was carried out on the emotional > neutral ESs for N170 amplitudes, contrasting the modulator role of Reference Electrode (2 levels: Common Average vs. Specific References, the latter including mastoids – n = 9, nosetip – n = 5, earlobes – n = 2, vertex – n = 1 and FCz – n = 1). The 56 meta-analyzable studies shown in Fig. 2 were included in this meta-analysis. Results showed non-significant differences (Q(1) = 2.4043, p = 0.1210). The mean ES for Common Average (n = 38) was −0.3472 (95%CI = −0.4259 to −0.2686, Z = −8.6522, p < 0.0001), and for Specific References (n = 18) it was −0.2569 (95%CI = −0.3397 to −0.1740, Z = −6.0767, p < 0.0001). Therefore, the ESs of emotional expression on N170 amplitudes were significant within both groups of electrode references, but no differences existed between them. An analysis comparing only studies using Common Average and Mastoids was also carried out, the difference being significant in this case (Q(1) = 4.1457, p = 0.0417), with ESs greater in the Common Average than in the Mastoids group.
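These moderator comparisons rest on the between-groups Q statistic (the analog to ANOVA of Lipsey and Wilson, 2001): the weighted sum of squared deviations of the group mean ESs from the grand mean, referred to a chi-square distribution with (number of groups − 1) degrees of freedom. A minimal sketch, under the same simplifying assumptions as the earlier one (scipy is assumed for the chi-square tail probability):

```python
from scipy.stats import chi2

def q_between(groups):
    """Analog-to-ANOVA moderator test (Lipsey and Wilson, 2001).
    groups: list of (effect_sizes, weights) pairs, one per moderator level."""
    means, totals = [], []
    for es, w in groups:
        sw = sum(w)
        means.append(sum(wi * ei for wi, ei in zip(w, es)) / sw)  # group mean ES
        totals.append(sw)
    grand = sum(m * t for m, t in zip(means, totals)) / sum(totals)
    q = sum(t * (m - grand) ** 2 for m, t in zip(means, totals))  # Q_between
    p = chi2.sf(q, df=len(groups) - 1)
    return q, p
```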
Second, a meta-analysis was carried out on whether or not the task involved direct instructions with respect to expressions (i.e., whether participants were asked to attend to the expression of the faces). All meta-analyzable studies shown in Fig. 2 were included in this meta-analysis (n = 56). Again, the Q statistic analog to ANOVA was computed on the emotional > neutral ESs for N170 amplitudes, contrasting the modulator role of type of task (2 levels: direct task vs. non-direct task, the latter including passive viewing – n = 10 – and indirect tasks, in which participants were asked to direct their attention away from the facial expressions – n = 20 – e.g., identifying objects such as houses or chairs, discriminating line orientations, or performing digit categorization, all these non-facial targets being presented beside, or superimposed on, the facial expressions). Results showed significant differences (Q(1) = 6.3708, p = 0.0116). The mean ES for Direct Task (n = 26) was −0.2025 (95%CI = −0.3000 to −0.1050, Z = −4.0696, p < 0.0001), and for Non-direct Task (n = 30) it was −0.3574 (95%CI = −0.4277 to −0.2870, Z = −9.9595, p < 0.0001). In other words, the ESs of emotional expression on N170 amplitudes were significant in both groups of tasks, but a between-group difference also existed, indicating significantly greater ESs in the non-direct group of tasks.

4. Discussion

The first question that we aimed to answer was whether the N170 amplitude in response to emotional faces differs from that in response to neutral faces. Whereas early experiments in this field failed to find any effect of facial expressions on the N170 (e.g., Eimer et al., 2003; Herrmann et al., 2002; Krolak-Salmon et al., 2001), and the idea of its insensitivity to emotional faces is still relatively solid, our study reveals that the N170 shows different amplitudes in response to neutral compared to emotional faces.
Table 1
Description and main results of meta-analyzable studies exploring N170 amplitudes in response to neutral and emotional facial expressions.

Columns: Authors; Year; Sample statistics of healthy adults: F/M (average age) (controls); Ongoing task; Facial expressions; Time window for N170 in ms (amplitude type); Electrodes where N170 was analyzed (total); Ref. electrode used for analyses; Any main Emo vs. Neu N170 amplitude difference?; Which Emo?; Significant LH vs. RH N170 amplitude differences (Emo)?
Aarts et al. 2012 51/9 (20) Go-no go task, faces Neutral, Happy, 150–200 ms (peak D30, D31, D32, A9, Common average Yes (Happy, Angry) > Neu Non-reported
provided feedback Angry amplitude) A10, A11, B6, B7, B8,
on task performance B10, B11, B12 (128)
Akbarfahimi et al. 2013 12/16 (32.61) Detection of houses among faces Neutral, Happy, Fear 130–250 ms (peak amplitudes) P7, P8 (32) Mastoids No No
Almeida et al. 2014 0/54 (23.24) Detection of chairs among faces Neutral, Happy, Angry, Fear, Disgust 130–220 ms (peak amplitude) P7, P8 (32) Common average Yes (Happy = Angry = Fear = Disgust) > Neu Yes (LH < RH); (Happy = Angry = Fear = Disgust) > Neu
Andreatta et al. 2012 25 (22.9) Passive viewing Neutral, Happy, 140–190 ms (peak P7, P8 (28) Common average No No
Angry amplitude)
Bediou et al. 2009 0/24 (27) Detection of Neutral, Angry, Sad 130–200 ms (mean PO9, PO10 (64) Mastoids Yes (Angry = Sad) > Neu Non-reported
immediate stimulus amplitude)
repetitions
Blau et al. 2007 19/15 (18–36) Learning Neutral, Fear 130–210 ms (mean 18 Common average Yes Fear > Neu No
associations amplitude) occipito-temporal
between line from each
drawings and words hemisphere (128)
Blechert et al. 2012 32/0 (21.9) Passive viewing Neutral, Angry 140–180 ms (peak P7, P8 (42) Common average Yes Angry > Neu Non-reported
amplitude)
Brennan et al. 2014 38/70 (20.5) Passive Neutral, Happy, 120–220 ms (peak T5, T6, O1, O2 (26) Mastoids No No
viewing + backward Angry, Fear, Sad, amplitude)
masking paradigm Disgust
Brenner et al. 2014 16/13 (20) Delayed matching of Neutral, Happy, Not specified (peak P7, TP7, O1, P8, TP8, Common average Yes Average of Angry, Fear, No
facial stimuli Angry, Fear, Sad amplitudes) O2 (31) Sad > (Neu = Happy)
Bublatzky et al. 2014 16/10 (23) Passive viewing Neutral, Happy, 150–200 ms (area P7, P8 (64) Common average Yes (Happy = Angry) > Neu No
Angry scores)
Campanella et al. 2006 7 (48), gender not Emotional oddball Neutral, Happy, Fear, 160–230 ms (peak T5, T6 (27) Common average No No
specified task Sad amplitude)
Carretié et al. 2013 28/6 (22.79) Digit categorization Neutral, Happy, 180 ms (temporal P7, P8, O1, O2, Oz Nosetip No No
Disgust PCA) (spatial PCA) (30)
Chai et al. 2012 21/19 (27.7) Emotional oddball Neutral, Happy, 140–200 ms (peak Fz, Cz, Pz Mastoids No No
task Angry, Sad amplitude)
Chammat et al. 2010 7/8 (20) Detection of Neutral, Sad 130–170 ms (peak PO9, P7, P5, PO7, P6, Common average No No
expressions amplitude) P8, PO8, PO10 (62)
Chen et al. 2014 24/22 (31.1) Emotional oddball Neutral, Happy, Sad 130–210 ms (peak P7, P8 (32) Earlobes Yes Happy > Neu > Sad Non-reported
task amplitude)
Conty et al. 2012 11/11 (25.0) Judgements of the Neutral, Angry Early N170: P5, P7, CP5, TP7, P6, Common average Yes Angry > Neu No
direction of gaze 160–184 ms (peak P8, CP6, TP8 (not
amplitude); Late specified)
N170: 176–200 ms
(peak amplitude)
Dan & Raz 2012 32/18 (23.58) Detection of gender Neutral, Angry 120–180 ms (mean Occipital (35, 37, 39) Common average No No
amplitude) and
posterior-parietal
electrodes (31, 33,
34, 36, 38, 40) (64)

Dubal et al. 2011 7/8 (22.1) Detection of Neutral, Happy 130–190 ms (mean P7, P8, P5, P6, P3, P4, Common average No No
expressions amplitude) P1, P2, Pz, PO3, PO4,
PO5, PO6, PO7, PO8,
POZ, O1, O2, Oz (62)
Frühholz et al. 2011 10/7 (23.7) Task 1: Color detection; Task 2: Detection of expressions Neutral, Fear 145–185 ms (mean amplitude) P7, P8, PO7, PO8, PO9, PO10 (64) Common average Yes Fear > Neu Non-reported
He et al. 2012 19/18 (27.7) Detection of Neutral, Happy, 140–200 ms (peak FZ, Cz, Pz (3) Mastoids No No
expressions Angry, Sad amplitude)
Hendriks et al. 2007 25 (19.7), gender not Detection of gender Neutral, Happy, 172–205 ms (peak P7, P8 (49) Common average No No
specified Angry, Fear, Sad amplitude)
Herbert et al. 2013 Exp. 1: 16/5 (22); Exp. 1: Passive Neutral, Happy, 140–180 ms (mean O1, O2, PO9, PO10, Common average Yes Exp. 1, Exp. 2: Non-reported
Exp. 2: 12/5 (22) viewing. Angry, Fear amplitude) P9, P10 (32) Fear > Neu
Exp. 2: Passive
viewing + emotional
regulation
Hirai et al. 2008 5/6 (28.4) Detection of Neutral, Fear 160–240 ms (mean T5, T6, T5 , T6 Nosetip No No
cartoons among amplitude) (located 2 cm below
faces T5 and T6) (18)
Holmes et al. 2006 8/4 (31) Task 1: Detect if the Neutral, Fear 160–220 ms (mean T5, T6 (26) Earlobes No No
face was presented amplitude)
in the previous trial;
Task 2: Detect if the
line arrangement of
two lines was
presented in the
previous trial
Jetha et al. 2013 19/22 (39.3) Passive viewing Neutral, Happy, 130–200 ms (peak Seven left and right Common average Yes (Fear = Angry) > Neu No
Angry, Fear amplitude) lateral
occipital–temporal
electrodes (128)
Jiang et al. 2009 11/7 (mean age not Detection of changes Neutral, Fear 160–180 ms (peak Bilateral temporal Common average Yes Fear > Neu Non-reported
specified) in the contrast of amplitude) electrodes (60)
fixation point
Jiang et al. 2014 10/8 (23.7) Detection of Neutral, Happy, 130–200 ms (mean 57, 58, 63, 64, 65, 68, Common average Yes Angry > Neu No
expressions Angry amplitude) 69, 89, 90, 94, 95, 96,
99, 100 (128)
Jung et al. 2012 12/12 (37.6/38.5) Detection of Neutral, Happy, Fear 120–220 ms (mean 60 (64) Common average No No
expressions amplitude around
individual peak
amplitudes)
Kerestes et al. 2009 0/12 (21.3) Detection of Neutral, Happy, Fear 120–230 ms (peak P7, P8 (21) Common average No No
expressions amplitude)
Labuschagne et al. 2010 0/14 (23.19) Detection of Neutral, Happy, Sad 120–220 ms (peak P7, P8 (64) Common average No No
expressions amplitude)
Lee et al. 2007 6/4 (26.90) Detection of Neutral, Happy, Fear 140–210 ms (peak PO8 (64) Vertex Yes (Happy = Fear) > Neu Yes (RH);
expressions amplitude) (Happy = Fear)
> Neu
Lynn & Salisbury 2008 3/5 (41.0) Detection of Neutral, Happy, 120–270 ms (peak P9, P10 (64) Common average Yes Sad > (Neu = Happy No
expressions Angry, Fear, Disgust, amplitude) = Fear)
Sad
MacNamara et al. 2012 7/9 (students, mean age non-reported) Memorizing letters in high- and low-load conditions Neutral, Fear 130–180 ms (mean amplitude) P7, P8 (34) Common average Yes Fear > Neu No
Maurage et al. 2007 1/9 (43.9) Emotional oddball Neutral, Happy, Fear, 172–205 ms (peak O1,Oz, O2, T5, T6 Common average No No
task Sad amplitude) (32)
Morel et al. 2009 6/8 (26.4) Working memory of Neutral, Happy, Fear Individual peak PO7, PO8, PO9, PO10, Nosetip Yes (Happy = Fear) > Neu Yes (LH > RH);
facial identity amplitudes P7, P8, P9, P10 (64) (Happy = Fear)
(1-back) > Neu
Morel et al. 2014 19/14 (20.5) Detection of Neutral, Happy, Fear 120–170 ms (peak P7, P8; PO7, PO8, FCz Yes Fear > Neu No
expressions amplitude) PO9, PO10, O1, O2
(62)
Mühlberger et al. 2009 18/18 (23.4) Passive viewing Neutral, Happy, 140–190 ms (peak P7, P8 (21) Common average Yes (Happy, Angry, No
Angry, Fear amplitude) Fear) > Neu
Rellecke et al. 2012 12/12 (23.8) Task 1: Passive Neutral, Happy, 100–200 ms (peak PO7, PO8, P09, PO10, Common average Yes (Happy = Angry) > Neu Non-reported
viewing; Task 2: Angry amplitude) P9, P10 (55)
Emotional passive
viewing; Task 3:
Detection of gender;
Task 4: Detection of
emotion; Task 5:
“Face or word”
decision
Righart & de Gelder 2006 10/2 (21.1) Exp. 1 Neutral, Fear 140–220 ms (peak P5, P6, P7, P8, PO7, Common average Yes Exp. 3: Fear > Neu Yes (LH < RH);
(faces + contexts): amplitude) PO8 (49) Angry > Neu
Line orientation
discrimination; Exp.
2 (contexts only):
Detection of targets;
Exp. 3 (faces only):
Line orientation
discrimination
Rigoulot et al. 2011 16/0 (19.1) Detection of Neutral, Fear 140–240 ms (peak P7, P8 (63) Common average Yes Fear > Neu No
expressions amplitude)
Rossignol et al. 2012 13/11 (19.8) Emotional oddball Neutral, Happy, 150–220 ms (mean P7, P8 (32) Common average Yes (Disgust = Happy) > No
task Angry, Fear, Disgust amplitude) (Fear = Neu) > Angry
Smith et al. 2012 10/5 (28.7) Detection of Neutral, Happy, Fear, 160–180 ms (mean P7, P8 (64) Common average Yes Aware condition: No
expression Disgust amplitude) (Happy = Fear =
Disgust) > Neu;
Unaware condition:
Fear > Neu

Stekelenburg & de 2004 12 (21.4) Judgements on Neutral, Fear 140–200 ms (peak P7, P8 (49) Common average Yes Fear > Neu No
Gelder orientation amplitude)
Tortosa et al. 2013 Exp. 1: 20/2 (20); Exp. 2: 14/11 (22) Detection of white or black faces Neutral, Happy, Angry 150–190 ms (mean amplitude) 47, 50, TP7, 56, TP9, P9, 63, P10, TP8, 100, TP10, 102, 103, 108 (128) Common average Exp. 1: No; Exp. 2: Yes Exp. 2: (Happy = Angry) > Neu Yes (LH < RH); (Happy = Angry) > Neu
Tsurusawa et al. 2008 10/10 (20–45) Oddball task Neutral, Angry Ranging from 141 to T5, T6 (16) Nosetip No No
144 ms depending
on the emotion
(peak amplitudes)
Turetsky et al. 2007 4/12 (28.1) Detection of Neutral, Happy, very 135–185 ms (mean Bilateral posterior Common average Yes Sad > Neu Non-reported
expressions Happy, Sad, very Sad GFP) temporal electrodes
(32)
Walentowska & 2012 33/3 (21) Detection of Neutral, Fear 150–195 ms (mean PO7, PO3, PO4, PO8 Mastoids No No
Wronka face-subsequent amplitude) (32)
mask stimulus
asymmetry
Wieser et al. 2010 27/23 (22.3) Passive viewing Neutral, Happy, 140–190 ms (peak P7, P8 (21) Common average No No
Angry amplitude)
Wieser et al. 2012a 38/0 (18–32) Rating valence and Neutral, Happy, Fear 140–190 ms (peak P7, P8 (28) Common average No No
arousal amplitude)
Wild-Wall et al. 2008 Exp. 2: 20/0 (25) Detection of familiar Neutral, Happy, 100–250 ms (peak P9, P10 (26) Left mastoid Yes (Happy = Disgust) > Neu Non-reported
faces Disgust amplitude)
Williams et al. 2006 107/112 (34.94) Passive Neutral, Happy, Fear 120–220 ms (peak T5, T6, O1, O2 (26) Mastoids Yes (Fear = Happy) > Neu Yes (LH < RH);
viewing + post- amplitude) Fear > Neu
testing
recognition
Wronka & 2011 12/10 (20.76) Task 1: Detection of Neutral, Emotional 140–185 ms (peak PO7, PO3, PO4, PO8 Mastoids No No
Walentowska expressions; Task 2: (Happy, Angry) amplitude) (32)
Detection of gender
Yuan et al. 2014 28/0 (20.24) Passive Neutral, Angry 130–190 ms (mean P8, PO8 (64) Common average Yes Angry > Neu Non-reported
viewing + emotional amplitude)
regula-
tion + emotional
rating
Zhang et al. 2012 24/18 (18–26) Detection of Neutral, Fear Not specified (peak P7, P8 (62) Common average Yes Fear > Neu No
expressions to peak analysis)
Zhao & Li 2006 5/9 (20–23) Passive Neutral, Happy, Sad 120–200 ms (peak P7, PO7, CB1, PO7, Nosetip Yes Sad > (Neu = Happy) Yes (RH);
viewing + detection amplitude) O1, P8, PO8, CB2, Happy > Neu
of tones PO8, O2 (64)
Meta-analyzable studies were those reporting one of the following numerical values for the relevant contrasts: Fisher's F (obtained in one-way, two-level ANOVAs), means and dispersion measures, or Student's t-values. A table including the description of the 128 studies reviewed prior to meta-analyses, as well as the effect sizes and other statistics necessary for meta-analysis, is available at www.uam.es/CEACO/sup/N170 2014.htm.
Table 2
Main parameters regarding the meta-analysis (random effects) of individual facial expressions, ordered as a function of the Z statistic. The facial expressions whose effect sizes were significant were Anger, Fear and Happiness. ES = effect size; CI = confidence interval; n = number of meta-analyzable studies.

Contrast          n    Mean ES   −95%CI    +95%CI    Z         p
Anger > Neu       21   −0.2062   −0.2977   −0.1147   −4.4153   <0.0001
Fear > Neu        28   −0.4219   −0.6209   −0.2229   −4.1546   <0.0001
Happiness > Neu   32   −0.2552   −0.4078   −0.1027   −3.2793   0.0010
Disgust > Neu     7    −0.4217   −1.1667   0.3233    −1.1093   0.2673
Sadness > Neu     14   −0.0769   −0.3225   0.1687    −0.6134   0.5396

Fig. 2. Experiments, from those reviewed and summarized at www.uam.es/CEACO/sup/N170 2014.htm, that could be included in the meta-analysis. Mean effect sizes (emotional minus neutral N170 amplitudes) and 95% confidence intervals are shown. An outlier test recommended leaving the study marked with an asterisk out of the meta-analyses.

These effects were not associated, however, with any consistent lateralization pattern. In this sense, twenty-five out of the forty studies that explicitly explored this issue failed to observe any asymmetry (see the Review Table at www.uam.es/CEACO/sup/N170 2014.htm). Among those showing asymmetries, right-lateralization was observed in twelve studies and left-lateralization in three. Thus, although a trend toward right-lateralization is observed, consistent laterality effects during the emotional processing of faces were not unequivocally found.

The sensitivity of the N170 to facial expression suggests that it does not strictly reflect an encapsulated encoding of the structural representation of faces, as traditionally posited by some theoretical proposals (Bruce and Young, 1986). Interestingly, a similar and parallel debate exists regarding the sensitivity of the inferotemporal (IT) cortex, and particularly the fusiform face area, which has been reported to be involved in the generation of the N170 to facial expressions (Deffke et al., 2007; Itier and Taylor, 2004b; Sadeh et al., 2010). In this sense, the influential model by Haxby et al. (2000) proposed that IT cortex is in charge of invariant aspects of faces, such as identity, but is not involved in the processing of changeable characteristics such as expression. However, in recent years modulations of activity in IT by emotional expressions have been reported (Fox et al., 2009; Ganel et al., 2005; Harry et al., 2013; Kawasaki et al., 2012; Tsuchiya et al., 2008; Xu and Biederman, 2010).

Alternatively, the functional significance of the N170 may be better understood in the light of some theoretical views arguing that face recognition is a flexible process that involves a complex interaction between mechanisms that extract different types of information (e.g., Ishai, 2008; Atkinson and Adolphs, 2011). In this sense, Calder and Young (2005) proposed a face processing model which postulates that visuoperceptual representations of both facial expression and identity are initially coded by not completely segregated mechanisms within a single representational system (see also Ellamil et al., 2008, for a similar account).
Similarly, the parallel-dependent model of face recognition assumes parallel and interactive processing of facial identity and expression information at early stages (Martens et al., 2010). According to these views, the N170 may be better understood as a correlate of a perceptual representation stage indexing the interplay between several sources of information, including both basic-level and high-level facial features (i.e., facial expression, identity, speech, etc.), which could reflect the activity of the multiple neural sources contributing to the generation of this waveform (George, 2013; Rossion and Jacques, 2011; Schyns et al., 2007). In this sense, the results of several studies suggest that the N170 may be modulated by the affective congruency between a facial expression and different sources of contextual affective information, including scenes, body expressions or the prosody of speech (e.g., Hietanen and Astikainen, 2013; Pourtois et al., 2000; Righart and de Gelder, 2006, 2008; see Wieser and Brosch, 2012, and de Gelder and Van den Stock, 2011, for detailed reviews on this issue).

The second issue explored in this meta-analysis was whether there is a differential or a homogeneous N170 effect for each particular emotional expression. The data show that there is a differential sensitivity of this component to each discrete emotion. Angry expressions cause by far the greatest increase in N170 amplitude, followed by fearful and happy ones, whose effect sizes were also significant. Those not reaching statistical significance were sadness and disgust. Some interesting implications derive from these results. First, the N170 is not sensitive to every facial gesture; such blanket sensitivity would have meant that this component merely registers any structural sign of facial change, whatever its functional meaning might be (a meaning that would then be interpreted at a later processing step). Rather, this component seems to be specifically sensitive to some particular expressions and not (or not so much) to others. This finding suggests that the processes underlying the N170 already reflect a hierarchization of expressions. Second, this hierarchy is probably based not only on structural criteria (e.g., mouth features over eyebrow features), but also on communicative, social criteria. Indeed, fearful, angry or happy expressions are usual in interchanges that require rapid social/emotional reactions in the receiver (as compared to sad or disgusted expressions). Therefore, their processing would involve to a greater extent those rapid facial decoding mechanisms reflected by the N170 (at least as quick as those subserved by other structures traditionally proposed to be involved in facial expression processing, such as the amygdala – Morris et al., 1998 – whose latency of response is similar, or even longer: Conty et al., 2012; Pessoa and Adolphs, 2010). However, some caution is needed, since the data on disgust and sadness were based on seven and fourteen studies, respectively.

Finally, it is important to highlight that, according to the results of our meta-analysis, some technical and task-related factors may influence N170 responses to facial expression. These modulating factors, as well as others that could be explored in the future, may partially explain why an important number of studies showed no N170 effects of facial expressions (see the Review Table at www.uam.es/CEACO/sup/N170 2014.htm). Specifically, two potentially modulating factors were taken into account in this study, one of a technical nature (reference electrode location), and the other regarding the characteristics of the task (its direct or non-direct nature). According to this meta-analysis, the N170 showed significant effects of facial expression whatever the placement of the reference electrode. However, this factor had a significant influence, since the effects of facial expression on N170 amplitude were even stronger when the common average reference was employed. This finding supports the claims made by Rellecke et al. (2013) about the importance of the reference electrode location for the N170 response to facial expressions, which showed greater amplitudes when the common average was employed.

We also found that task-related factors modulated N170 amplitude in response to facial expressions. In this sense, although N170 amplitude was significantly greater in response to emotional than to neutral expressions both in direct tasks (i.e., participants were explicitly instructed to attend to facial expressions) and in non-direct tasks (i.e., attention was directed elsewhere – participants were asked to attend to lines, digits, letters or other non-facial targets presented along with the faces – or passive viewing of faces was required), effect sizes were larger in studies that used non-direct tasks. Thus, although some data exist suggesting the insensitivity of N170 amplitude to facial expressions in non-direct tasks (Wronka and Walentowska, 2011), the results of the current meta-analysis indicate that the N170 elicited by facial expressions is modulated not only by goal-directed mechanisms but also by automatic processes (see also Schyns et al., 2007, for a similar claim). It is important to note that facial expressions are good capturers of exogenous (automatic) attention, and that visual cortices, including those involved in the generation of the N170, are part of the exogenous attention neural circuitry (Carretié, 2014). In this sense, evidence indicating that facial expressions are processed even under conditions in which attentional resources are not fully available could partially account for our findings. In particular, studies using the dot-probe task found preferential attention to angry faces with masked presentations (Fox et al., 2002; Mogg and Bradley, 1999; but see Koster et al., 2007). Moreover, emotional facial expressions have been found to reduce the size of the attentional blink in healthy participants (e.g., Bach et al., 2014; Vermeulen et al., 2009; but see Luo et al., 2010), and to be more resistant to extinction in neglect patients (e.g., Fox, 2002; Vuilleumier and Schwartz, 2001). Finally, the time needed to detect emotional facial expressions in visual search paradigms seems to be independent of the number of distracters (e.g., Hansen and Hansen, 1988; Fox et al., 2000; Frischen et al., 2008; but see Nummenmaa and Calvo, 2015). Thus, the N170 effects found during indirect tasks could be explained by the privileged processing of emotional expression in faces, which may partially rely on attentional mechanisms that do not require awareness.

5. Conclusions

The present meta-analyses clearly show that the N170 is sensitive to facial expressions, and that this sensitivity is heterogeneous: anger, fear and happiness, in this order, are the expressions eliciting the largest amplitudes with respect to neutral, non-expressive faces. The effects of sadness and disgust did not reach significance, at least with the current number of meta-analyzable studies. This heterogeneity, which apparently favors expressions requiring rapid social responses in the receiver, suggests that the mechanisms underlying the N170 go beyond detecting any facial change or gesture and seem to be functionally guided. The current results suggest that the N170 does not behave according to the predictions of dual models (identity vs. expression, or structure vs. change; Bruce and Young, 1986). Rather, it appears to fit better with models suggesting a more integrated and intermixed circuitry in charge of processing the identity and expression of faces (Calder and Young, 2005). Importantly, the results of the present study also indicate that this component is modulated by factors that may vary its amplitude in response to expressions, potentially explaining why several studies have not reported significant differences. However, further research is needed to address the impact of other variables that have been so scarcely explored in prior studies that they cannot yet be meta-analyzed. Among others, the capability of the N170 to reflect the processing of facial expressions presented in the periphery (Rigoulot et al., 2011), or how it responds to different expression intensities within each emotional category (Turetsky et al., 2007), seem of particular relevance.
In conclusion, the N170, probably along with other, less studied face-sensitive ERP components that have also shown sensitivity to facial expressions within the first 200 ms, such as the P1 and the visual vertex potential (e.g., Batty and Taylor, 2003; Li et al., 2008; Luo et al., 2010), must be considered an important tool for studying the neural mechanisms underlying facial expression processing, and may help to further develop current theoretical models.

Acknowledgements

This work was supported by grants PSI2012-37535 and PSI2014-54853-P from the Ministerio de Economía y Competitividad (MINECO) of Spain and grant PI13/01759 from the Institute of Health Carlos III (ISCIII) of Spain.

References
Aarts, K., Pourtois, G., 2012. Anxiety disrupts the evaluative component of performance monitoring: an ERP study. Neuropsychologia 50, 1286–1296.
Akbarfahimi, M., Tehrani-Doost, M., Ghassemi, F., 2013. Emotional face perception in patients with schizophrenia: an event-related potential study. Neurophysiology 45, 249–257.
Almeida, P.R., Ferreira-Santos, F., Vieira, J.B., Moreira, P.S., Barbosa, F., Marques-Teixeira, J., 2014. Dissociable effects of psychopathic traits on cortical and subcortical visual pathways during facial emotion processing: an ERP study on the N170. Psychophysiology 51, 645–657.
Andreatta, M., Puschmann, A.K., Sommer, C., Weyers, P., Pauli, P., Mühlberger, A., 2012. Altered processing of emotional stimuli in migraine: an event-related potential study. Cephalalgia 32, 1101–1108.
Atkinson, A.P., Adolphs, R., 2011. The neuropsychology of face perception: beyond simple dissociations and functional selectivity. Philos. Trans. R. Soc. Lond. B: Biol. Sci. 366, 1726–1738.
Bach, D.R., Schmidt-Daffy, M., Dolan, R.J., 2014. Facial expression influences face identity recognition during the attentional blink. Emotion 14, 1007–1013.
Batty, M., Taylor, M.J., 2003. Early processing of the six basic facial emotional expressions. Cogn. Brain Res. 17, 613–620.
Bediou, B., Eimer, M., d'Amato, T., Hauk, O., Calder, A.J., 2009. In the eye of the beholder: individual differences in reward-drive modulate early frontocentral ERPs to angry faces. Neuropsychologia 47, 825–834.
Bentin, S., Allison, T., Puce, A., Perez, E., McCarthy, G., 1996. Electrophysiological studies of face perception in humans. J. Cogn. Neurosci. 8, 551–565.
Blau, V.C., Maurer, U., Tottenham, N., McCandliss, B.D., 2007. The face-specific N170 component is modulated by emotional facial expression. Behav. Brain Funct. 3, 7, http://dx.doi.org/10.1186/1744-9081-3-7.
Blechert, J., Sheppes, G., Di Tella, C., Williams, H., Gross, J.J., 2012. See what you think: reappraisal modulates behavioral and neural responses to social stimuli. Psychol. Sci. 23, 346–353.
Brennan, A.N., Harris, A.W.F., Williams, L.M., 2014. Neural processing of facial expressions of emotion in first onset psychosis. Psychiatry Res. 219, 477–485.
Brenner, C.A., Rumak, S.P., Burns, A.M.N., Kieffaber, P.D., 2014. The role of encoding and attention in facial emotion memory: an EEG investigation. Int. J. Psychophysiol. 93, 398–410.
Bruce, V., Young, A., 1986. Understanding face recognition. Br. J. Psychol. 77, 305–327.
Bublatzky, F., Gerdes, A.B., White, A.J., Riemer, M., Alpers, G.W., 2014. Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential. Front. Hum. Neurosci. 8, 493.
Burrows, A.M., 2008. The facial expression musculature in primates and its evolutionary significance. Bioessays 30, 212–225.
Calder, A.J., Young, A.W., 2005. Understanding the recognition of facial identity and facial expression. Nat. Rev. Neurosci. 6, 641–651.
Campanella, S., Montedoro, C., Streel, E., Verbanck, P., Rosier, V., 2006. Early visual components (P100, N170) are disrupted in chronic schizophrenic patients: an event-related potentials study. Neurophysiol. Clin. 36, 71–78.
Carmel, D., Bentin, S., 2002. Domain specificity versus expertise: factors influencing distinct processing of faces. Cognition 83, 1–29.
Carretié, L., Kessel, D., Carboni, A., López-Martín, S., Albert, J., Tapia, M., Hinojosa, J.A., 2013. Exogenous attention to facial vs non-facial emotional visual stimuli. Soc. Cogn. Affect. Neurosci. 8, 764–773.
Carretié, L., 2014. Exogenous (automatic) attention to emotional stimuli: a review. Cogn. Affect. Behav. Neurosci. 14, 1228–1258.
Chai, H., Chen, W.Z., Zhu, J., Xu, Y., Lou, L., Yang, T., He, W., Wang, W., 2012. Processing of facial expressions of emotions in healthy volunteers: an exploration with event-related potentials and personality traits. Neurophysiol. Clin. 42, 369–375.
Chammat, M., Foucher, A., Nadel, J., Dubal, S., 2010. Reading sadness beyond human faces. Brain Res. 1348, 95–104.
Chen, J., Ma, W., Zhang, Y., Wu, X., Wei, D., Liu, G., Deng, Z., Yang, L., Zhang, Z., 2014. Distinct facial processing related negative cognitive bias in first-episode and recurrent major depression: evidence from the N170 ERP component. PLoS ONE 9, e109176.
Conty, L., Dezecache, G., Hugueville, L., Grezes, J., 2012. Early binding of gaze, gesture, and emotion: neural time course and correlates. J. Neurosci. 32, 4531–4539.
Dan, O., Raz, S., 2012. Adult attachment and emotional processing biases: an event-related potentials (ERPs) study. Biol. Psychol. 91, 212–220.
de Gelder, B., Van den Stock, J., 2011. Real faces, real emotions: perceiving facial expressions in naturalistic contexts of voices, bodies and scenes. In: Calder, A.J., Rhodes, G., Johnson, M.H., Haxby, J.V. (Eds.), The Oxford Handbook of Face Recognition. Oxford University Press, New York, pp. 535–550.
Deffke, I., Sander, T., Heidenreich, J., Sommer, W., Curio, G., Trahms, L., et al., 2007. MEG/EEG sources of the 170-ms response to faces are co-localized in the fusiform gyrus. Neuroimage 35, 1495–1501.
Dubal, S., Foucher, A., Jouvent, R., Nadel, J., 2011. Human brain spots emotion in non humanoid robots. Soc. Cogn. Affect. Neurosci. 6, 90–97.
Eimer, M., Holmes, A., 2002. An ERP study on the time course of emotional face processing. Neuroreport 13, 427–431.
Eimer, M., Holmes, A., 2007. Event-related brain potential correlates of emotional face processing. Neuropsychologia 45, 15–31.
Eimer, M., Holmes, A., McGlone, F.P., 2003. The role of spatial attention in the processing of facial expression: an ERP study of rapid brain responses to six basic emotions. Cogn. Affect. Behav. Neurosci. 3 (2), 97–110.
Ellamil, M., Susskind, J.M., Anderson, A.K., 2008. Examinations of identity invariance in facial expression adaptation. Cogn. Affect. Behav. Neurosci. 8, 273–281.
Fox, C.J., Moon, S.Y., Iaria, G., Barton, J.J.S., 2009. The correlates of subjective perception of identity and expression in the face network: an fMRI adaptation study. Neuroimage 44, 569–580.
Fox, E., 2002. Processing emotional facial expressions: the role of anxiety and awareness. Cogn. Affect. Behav. Neurosci. 2, 52–63.
Fox, E., Lester, V., Russo, R., Bowles, R.J., Pichler, A., Dutton, K., 2000. Facial expressions of emotion: are angry faces detected more efficiently? Cogn. Emot. 14, 61–92.
Fox, E., Russo, R., Dutton, K., 2002. Attentional bias for threat: evidence for delayed disengagement from emotional faces. Cogn. Emot. 16, 355–379.
Frischen, A., Eastwood, J.D., Smilek, D., 2008. Visual search for faces with emotional expressions. Psychol. Bull. 134, 662.
Frühholz, S., Jellinghaus, A., Herrmann, M., 2011. Time course of implicit processing and explicit processing of emotional faces and emotional words. Biol. Psychol. 87, 265–274.
Ganel, T., Valyear, K.F., Goshen-Gottstein, Y., Goodale, M.A., 2005. The involvement of the "fusiform face area" in processing facial expression. Neuropsychologia 43, 1645–1654.
George, N., 2013. The facial expression of emotions. In: Armony, J., Vuilleumier, P. (Eds.), The Cambridge Handbook of Human Affective Neuroscience. Cambridge University Press, New York, pp. 171–197.
Hadj-Bouziane, F., Bell, A.H., Knusten, T.A., Ungerleider, L.G., Tootell, R.B., 2008. Perception of emotional expressions is independent of face selectivity in monkey inferior temporal cortex. Proc. Natl. Acad. Sci. U. S. A. 105, 5591–5596.
Hansen, C.H., Hansen, R.D., 1988. Finding the face in the crowd: an anger superiority effect. J. Pers. Soc. Psychol. 54, 917.
Harry, B., Williams, M.A., Davis, C., Kim, J., 2013. Emotional expressions evoke a differential response in the fusiform face area. Front. Hum. Neurosci. 7.
Haxby, J.V., Hoffman, E.A., Gobbini, M.I., 2000. The distributed human neural system for face perception. Trends Cogn. Sci. 4, 223–233.
He, W., Chai, H., Chen, W., Zhang, J., Xu, Y., Zhu, J., Wang, W., 2012. Facial emotion triggered cerebral potentials in treatment-resistant depression and borderline personality disorder patients of both genders. Prog. Neuropsychopharmacol. Biol. Psychiatry 37, 121–127.
Hendriks, M.C., van Boxtel, G.J., Vingerhoets, A.J., 2007. An event-related potential study on the early processing of crying faces. Neuroreport 18, 631–634.
Herbert, C., Sfärlea, A., Blumenthal, T., 2013. Your emotion or mine: labeling feelings alters emotional face perception – an ERP study on automatic and intentional affect labeling. Front. Hum. Neurosci. 7, 378, http://dx.doi.org/10.3389/fnhum.2013.00378.
Herrmann, M.J., Aranda, D., Ellgring, H., Mueller, T.J., Strik, W.K., Heidrich, A., et al., 2002. Face-specific event-related potential in humans is independent from facial expression. Int. J. Psychophysiol. 45, 241–244.
Hietanen, J.K., Astikainen, P., 2013. N170 response to facial expressions is modulated by the affective congruency between the emotional expression and preceding affective picture. Biol. Psychol. 92, 114–124.
Hirai, M., Watanabe, S., Honda, Y., Miki, K., Kakigi, R., 2008. Emotional object and scene stimuli modulate subsequent face processing: an event-related potential study. Brain Res. Bull. 77, 264–273.
Holmes, A., Kiss, M., Eimer, M., 2006. Attention modulates the processing of emotional expression triggered by foveal faces. Neurosci. Lett. 394, 48–52.
Humphreys, G.W., Donnelly, N., Riddoch, M.J., 1993. Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia 31, 173–181.
Ishai, A., 2008. Let's face it: it's a cortical network. Neuroimage 40, 415–419.
Itier, R.J., Alain, C., Sedore, K., McIntosh, A.R., 2007. Early face processing specificity: it's in the eyes! J. Cogn. Neurosci. 19, 1815–1826.
Itier, R.J., Batty, M., 2009. Neural bases of eye and gaze processing: the core of social cognition. Neurosci. Biobehav. Rev. 33, 843–863.
Itier, R.J., Taylor, M.J., 2004a. N170 or N1? Spatiotemporal differences between object and face processing using ERPs. Cereb. Cortex 14, 132–142.
508 J.A. Hinojosa et al. / Neuroscience and Biobehavioral Reviews 55 (2015) 498–509

Itier, R.J., Taylor, M.J., 2004b. Source analysis of the N170 to faces and objects. Neuroreport 15, 1261–1265.
Jetha, M.K., Zheng, X., Goldberg, J.O., Segalowitz, S.J., Schmidt, L.A., 2013. Shyness and emotional face processing in schizophrenia: an ERP study. Biol. Psychol. 94, 562–574.
Jiang, Y.I., Shannon, R.W., Vizueta, N., Bernat, E.M., Patrick, C.J., He, S., 2009. Dynamics of processing invisible faces in the brain: automatic neural encoding of facial expression information. Neuroimage 44, 1171–1177.
Jiang, Y.I., Li, W., Recio, G., Liu, Y., Luo, W., Zhang, D., Sun, D., 2014. Time pressure inhibits dynamic advantage in the classification of facial expressions of emotion. PLoS ONE 9, e100162.
Jung, H.T., Kim, D.W., Kim, S., Im, C.H., Lee, S.H., 2012. Reduced source activity of event-related potentials for affective facial pictures in schizophrenia patients. Schizophr. Res. 136, 150–159.
Kaufmann, J.M., Schweinberger, S.R., 2004. Expression influences the recognition of familiar faces. Perception 33, 399–408.
Kawasaki, H., Tsuchiya, N., Kovach, C.K., Nourski, K.V., Oya, H., Howard, M.A., et al., 2012. Processing of facial emotion in the human fusiform gyrus. J. Cogn. Neurosci. 24, 1358–1370.
Kerestes, R., Labuschagne, I., Croft, R.J., O'Neill, B.V., Bhagwagar, Z., Phan, K.L., Nathan, P.J., 2009. Evidence for modulation of facial emotional processing bias during emotional expression decoding by serotonergic and noradrenergic antidepressants: an event-related potential (ERP) study. Psychopharmacology (Berl.) 202, 621–634.
Koster, E.H., Verschuere, B., Burssens, B., Custers, R., Crombez, G., 2007. Attention for emotional faces under restricted awareness revisited: do emotional faces automatically attract attention? Emotion 7, 285.
Krolak-Salmon, P., Fischer, C., Vighetto, A., Mauguière, F., 2001. Processing of facial emotional expression: spatio-temporal data as assessed by scalp event-related potentials. Eur. J. Neurosci. 13, 987–994.
LaBar, K.S., Crupain, M.J., Voyvodic, J.T., McCarthy, G., 2003. Dynamic perception of facial affect and identity in the human brain. Cereb. Cortex 13, 1023–1033.
Labuschagne, I., Croft, R.J., Phan, K.L., Nathan, P.J., 2010. Augmenting serotonin neurotransmission with citalopram modulates emotional expression decoding but not structural encoding of moderate intensity sad facial emotional stimuli: an event-related potential (ERP) investigation. J. Psychopharmacol. (Oxf.) 24, 1153–1164.
Lakens, D., 2013. Calculating and reporting effect sizes to facilitate cumulative science: a practical primer for t-tests and ANOVAs. Front. Psychol. 4, 863.
Lander, K., Metcalfe, S., 2007. The influence of positive and negative facial expressions on face familiarity. Memory 15, 63–69.
Lee, S., Kim, E., Kim, S., Im, W., Seo, H., Han, S., Kim, H., 2007. Facial affect perception and event-related potential N170 in schizophrenia: a preliminary study. Clin. Neuropsychopharmacol. Neurosci. 5, 76–80.
Li, W., Zinbarg, R.E., Boehm, S.G., Paller, K.A., 2008. Neural and behavioral evidence for affective priming from unconsciously perceived emotional facial expressions and the influence of trait anxiety. J. Cogn. Neurosci. 20, 95–107.
Lipsey, M.W., Wilson, D.B., 2001. Practical Meta-analysis: Applied Social Research Methods Series. Sage, Thousand Oaks.
Luo, W., Feng, W., He, W., Wang, N.Y., Luo, Y.J., 2010. Three stages of facial expression processing: ERP study with rapid serial visual presentation. Neuroimage 49, 1857–1867.
Lynn, S.K., Salisbury, D.F., 2008. Attenuated modulation of the N170 ERP by facial expressions in schizophrenia. Clin. EEG Neurosci. 39, 108–111.
MacNamara, A., Schmidt, J., Zelinsky, G.J., Hajcak, G., 2012. Electrocortical and ocular indices of attention to fearful and neutral faces presented under high and low working memory load. Biol. Psychol. 91, 349–356.
Martens, U., Leuthold, H., Schweinberger, S.R., 2010. Parallel processing in face perception. J. Exp. Psychol. Hum. Percept. Perform. 36, 103–121.
Maurage, P., Philippot, P., Verbanck, P., Noel, X., Kornreich, C., Hanak, C., Campanella, S., 2007. Is the P300 deficit in alcoholism associated with early visual impairments (P100, N170)? An oddball paradigm. Clin. Neurophysiol. 118, 633–644.
Mogg, K., Bradley, B.P., 1999. Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cogn. Emot. 13, 713–740.
Morel, S., Ponz, A., Mercier, M., Vuilleumier, P., George, N., 2009. EEG-MEG evidence for early differential repetition effects for fearful, happy and neutral faces. Brain Res. 1254, 84–98.
Morel, S., George, N., Foucher, A., Chammat, M., Dubal, S., 2014. ERP evidence for an early emotional bias towards happy faces in trait anxiety. Biol. Psychol. 99, 183–192.
Morris, J., Friston, K., Büchel, C., Frith, C., Young, A., Calder, A., et al., 1998. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121, 47–57.
Mühlberger, A., Wieser, M., Herrmann, M., Weyers, P., Tröger, C., Pauli, P., 2009. Early cortical processing of natural and artificial emotional faces differs between lower and higher socially anxious persons. J. Neural Transm. 116, 735–746.
Nemrodov, D., Anderson, T., Preston, F.F., Itier, R.J., 2014. Early sensitivity for eyes within faces: a new neuronal account of holistic and featural processing. Neuroimage 97, 81–94.
Nummenmaa, L., Calvo, M.G., 2015. Dissociation between recognition and detection advantage for facial expressions: a meta-analysis. Emotion 15, 243–256.
Orwin, R.G., 1983. A fail-safe N for effect size in meta-analysis. J. Educ. Stat. 8, 157–159.
Parry, F.M., Young, A.W., Shona, J., Saul, M., Moss, A., 1991. Dissociable face processing impairments after brain injury. J. Clin. Exp. Neuropsychol. 13, 545–558.
Pessoa, L., Adolphs, R., 2010. Emotion processing and the amygdala: from a 'low road' to 'many roads' of evaluating biological significance. Nat. Rev. Neurosci. 11, 773–783.
Posamentier, M.T., Abdi, H., 2003. Processing faces and facial expressions. Neuropsychol. Rev. 13, 113–143.
Pourtois, G., de Gelder, B., Vroomen, J., Rossion, B., Crommelinck, M., 2000. The time-course of intermodal binding between seeing and hearing affective information. Neuroreport 11, 1329–1333.
Rellecke, J., Sommer, W., Schacht, A., 2012. Does processing of emotional facial expressions depend on intention? Time-resolved evidence from event-related brain potentials. Biol. Psychol. 90, 23–32.
Rellecke, J., Sommer, W., Schacht, A., 2013. Emotion effects on the N170: a question of reference? Brain Topogr. 26, 62–71.
Righart, R., de Gelder, B., 2006. Context influences early perceptual analysis of faces – an electrophysiological study. Cereb. Cortex 16, 1249–1257.
Righart, R., de Gelder, B., 2008. Rapid influence of emotional scenes on encoding of facial expressions: an ERP study. Soc. Cogn. Affect. Neurosci. 3, 270–278.
Rigoulot, S., D'Hondt, F., Defoort-Dhellemmes, S., Despretz, P., Honoré, J., 2011. Fearful faces impact in peripheral vision: behavioral and neural evidence. Neuropsychologia 49, 2013–2021.
Rosenthal, R., 1979. The file drawer problem and tolerance for null results. Psychol. Bull. 86, 638.
Rossignol, M., Campanella, S., Maurage, P., Heeren, A., 2012. Enhanced perceptual responses during visual processing of facial stimuli in young socially anxious individuals. Neurosci. Lett. 526, 68–73.
Rossion, B., 2014. Understanding face perception by means of human electrophysiology. Trends Cogn. Sci. 18, 310–318.
Rossion, B., Dricot, L., Devolder, A., Bodart, J., Crommelinck, M., de Gelder, B., Zoontjes, R., 2000. Hemispheric asymmetries for whole-based and part-based face processing in the human fusiform gyrus. J. Cogn. Neurosci. 12, 793–802.
Rossion, B., Gauthier, I., 2002. How does the brain process upright and inverted faces? Behav. Cogn. Neurosci. Rev. 1, 63–75.
Rossion, B., Curran, T., Gauthier, I., 2002. A defense of the subordinate-level expertise account for the N170 component. Cognition 85, 189–196.
Rossion, B., Jacques, C., 2008. Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170. Neuroimage 39, 1959–1979.
Rossion, B., Jacques, C., 2011. The N170: understanding the time-course of face perception in the human brain. In: Luck, S., Kappenman, E. (Eds.), The Oxford Handbook of ERP Components. Oxford University Press, New York, pp. 115–142.
Rousselet, G.A., Macé, M.J., Fabre-Thorpe, M., 2004. Spatiotemporal analyses of the N170 for human faces, animal faces and objects in natural scenes. Neuroreport 15, 2607–2611.
Sadeh, B., Podlipsky, I., Zhdanov, A., Yovel, G., 2010. Event-related potential and functional MRI measures of face-selectivity are highly correlated: a simultaneous ERP-fMRI investigation. Hum. Brain Mapp. 31, 1490–1501.
Schyns, P.G., Petro, L.S., Smith, M.L., 2007. Dynamics of visual information integration in the brain for categorizing facial expressions. Curr. Biol. 17, 1580–1585.
Smith, M., 2012. Rapid processing of emotional expressions without conscious awareness. Cereb. Cortex 22, 1748–1760.
Stekelenburg, J., de Gelder, B., 2004. The neural correlates of perceiving human bodies: an ERP study on the body-inversion effect. Cogn. Neurosci. Neuropsychol. 15, 777–780.
Taylor, M.J., Batty, M., Itier, R.J., 2004. The faces of development: a review of early face processing over childhood. J. Cogn. Neurosci. 16, 1426–1442.
Tortosa, M., Lupiáñez, J., Ruz, M., 2013. Race, emotion and trust: an ERP study. Brain Res. 1494, 44–55.
Tsuchiya, N., Kawasaki, H., Oya, H., Howard III, M.A., Adolphs, R., 2008. Decoding face information in time, frequency and space from direct intracranial recordings of the human brain. PLoS ONE 3 (12), e3892.
Tsurusawa, R., Goto, Y., Mitsudome, A., Nakashima, T., Tobimatsu, S., 2008. Different perceptual sensitivities for Chernoff's face between children and adults. Neurosci. Res. 60, 176–183.
Turetsky, B.I., Kohler, C.G., Indersmitten, T., Bhati, M.T., Charbonnier, D., Gur, R.C., 2007. Facial emotion recognition in schizophrenia: when and why does it go awry? Schizophr. Res. 94, 253–263.
Vermeulen, N., Godefroid, J., Mermillod, M., 2009. Emotional modulation of attention: fear increases but disgust reduces the attentional blink. PLoS ONE 4 (11), e7924.
Vuilleumier, P., Schwartz, S., 2001. Beware and be aware: capture of spatial attention by fear-related stimuli in neglect. Neuroreport 12, 1119–1122.
Walentowska, W., Wronka, E., 2012. Trait anxiety and involuntary processing of facial emotions. Int. J. Psychophysiol. 85, 27–36.
Wieser, M.J., Brosch, T., 2012. Faces in context: a review and systematization of contextual influences on affective face processing. Front. Psychol. 3, 471, http://dx.doi.org/10.3389/fpsyg.2012.00471
Wieser, M., Pauli, P., Reicherts, P., Mühlberger, A., 2010. Don't look at me in anger! Enhanced processing of angry faces in anticipation of public speaking. Psychophysiology 47, 271–280.
Wieser, M., Gerdes, A., Greiner, R., Reicherts, P., Pauli, P., 2012. Tonic pain grabs attention, but leaves the processing of facial expressions intact – evidence from event-related brain potentials. Biol. Psychol. 90, 242–248.
Wild-Wall, N., Dimigen, O., Sommer, W., 2008. Interaction of facial expressions and familiarity: ERP evidence. Biol. Psychol. 77, 138–149.
Williams, L., Palmer, D., Liddell, B., Song, L., Gordon, E., 2006. The "when" and "where" of perceiving signals of threat versus non-threat. Neuroimage 31, 458–467.
Wilson, D.B., 2010. SPSS macros for meta-analysis (2010 actualization). http://mason.gmu.edu/~dwilsonb/ma.html
Winston, J.S., Henson, R.N.A., Fine-Goulden, M.R., Dolan, R.J., 2004. fMRI-adaptation reveals dissociable neural representations of identity and expression in face perception. J. Neurophysiol. 92, 1830–1839.
Wronka, E., Walentowska, W., 2011. Attention modulates emotional expression processing. Psychophysiology 48, 1047–1056.
Xu, X., Biederman, I., 2010. Loci of the release from fMRI adaptation for changes in facial expression, identity, and viewpoint. J. Vision 10, http://dx.doi.org/10.1167/10.14.36
Young, A.W., McWeeny, K.H., Hay, D.C., Ellis, A.W., 1986. Matching familiar and unfamiliar faces on identity and expression. Psychol. Res. 48, 63–68.
Yuan, L., Zhou, R., Hu, S., 2014. Cognitive reappraisal of facial expressions: electrophysiological evidence of social anxiety. Neurosci. Lett. 577, 45–50.
Zhang, D., Wang, L., Luo, Y., Luo, Y., 2012. Individual differences in detecting rapidly presented fearful faces. PLoS ONE 7, e49517.
Zhao, L., Li, J., 2006. Visual mismatch negativity elicited by facial expressions under non-attentional condition. Neurosci. Lett. 410, 126–131.