Hinojosa 2015
Review
Article history:
Received 30 October 2014
Received in revised form 22 April 2015
Accepted 3 June 2015
Available online 9 June 2015

Keywords: N170; Facial expression; Meta-analysis; Happy; Fear; Angry; Sad; Disgust

Abstract

The N170 component is the most important electrophysiological index of face processing. Early studies concluded that it was insensitive to facial expression, thus supporting dual theories that postulate separate mechanisms for identity and expression encoding. However, recent evidence contradicts this assumption. We conducted a meta-analysis to resolve these inconsistencies and to derive theoretical implications. A systematic review of 128 studies analyzing the N170 in response to neutral and emotional expressions yielded 57 meta-analyzable experiments (involving 1645 healthy adults). First, the N170 was found to be sensitive to facial expressions, supporting proposals arguing for integrated rather than segregated mechanisms in the processing of identity and expression. Second, this sensitivity is heterogeneous, with angry, fearful and happy faces eliciting the largest N170 amplitudes. Third, we explored several modulatory factors, including the focus of attention (N170 amplitude was also sensitive to unattended expressions) and the reference electrode (a common average reference reinforcing the effects). In sum, the N170 is a valuable tool for studying the neural processing of facial expressions and for developing current theories.

© 2015 Elsevier Ltd. All rights reserved.
Contents
1. Introduction 498
2. Methods 499
2.1. Selection of studies 499
2.2. Search and data description methodologies 499
3. Results 500
4. Discussion 500
5. Conclusions 506
Acknowledgements 507
References 507
http://dx.doi.org/10.1016/j.neubiorev.2015.06.002
0149-7634/© 2015 Elsevier Ltd. All rights reserved.
J.A. Hinojosa et al. / Neuroscience and Biobehavioral Reviews 55 (2015) 498–509 499
involving book search) using the following search terms: (N170) AND (fac* OR express*) AND (emotion* OR affect*). Every article identified was downloaded, or requested from its authors, and read to ensure that it was appropriate for inclusion in the meta-analysis/review. In total, 128 studies met these search criteria and the selection criteria described above. A table summarizing the main characteristics of the experimental design and the main pertinent results of these 128 reviewed studies is available at www.uam.es/CEACO/sup/N170 2014.htm, to which any study not currently included but detected by readers will be added.

Cohen's effect sizes (ES), the parameter submitted to meta-analyses, consisted of standardized mean differences computed whenever one of the following numerical values regarding relevant contrasts was reported in the paper: Fisher's F (obtained in one-way, two-level ANOVAs), means and dispersion measures, or Student's t-values. Calculation of ES from these three parameters required formulas for paired samples (e.g., Lakens, 2013), since all studies employed repeated-measures designs to compare emotional vs. neutral facial expressions. This information was available in 55 out of the 128 studies reviewed, which described 57 experiments (two studies described two meta-analyzable experiments) involving 1645 healthy adults; the remaining studies provided insufficient information to compute ES. Details and summaries of all ESs and other parameters necessary for meta-analysis corresponding to each experiment are available at www.uam.es/CEACO/sup/N170 2014.htm. Main results and methodological characteristics of these 57 meta-analyzable experiments are summarized in Table 1.

For global statistics on ES (i.e., calculation of the mean ES and its statistical significance through a Z test: Lipsey and Wilson, 2001), the "MeanES" SPSS macro designed by Wilson (2010) was employed. To investigate potential moderators of ES, a Q statistic analog to analysis of variance (ANOVA) for categorical variables (Lipsey and Wilson, 2001) was also computed through Wilson's SPSS macros ("metaF"; links to these macros are available at www.uam.es/CEACO/sup/N170 2014.htm). All analyses were conducted using maximum-likelihood, random-effects models weighted by the inverse of the variance.

To address the "file drawer problem" (that is, the bias for significant results to be more likely published and retrievable for a meta-analysis than non-significant results), the fail-safe N (Nfs) was computed. This Nfs represents the estimated number of unpublished studies reporting null results (here defined as ES = −0.2¹) that should exist to render the overall findings non-significant (Rosenthal, 1979). To this aim, the Orwin (1983) Nfs formula was applied.

3. Results

The global effect of Emotional expressions (averaging the N170 ESs to all emotional expressions employed in each study) vs. Neutral expressions on N170 amplitudes was explored. Fig. 2 shows ESs and 95% confidence intervals (CI) for the meta-analyzed experiments (n = 56 since, as may be appreciated in Fig. 2, one of the 57 meta-analyzable experiments was detected to be an outlier and was not included). This meta-analysis confirmed that emotional vs. neutral face ESs were significant. Global computations following a random-effects model showed that the mean ES for this sample of studies (mean ES = −0.3325, 95%CI = −0.4471 to −0.2179) was statistically significant (Z = −5.6855, p < 0.0001; Nfs = 46), clearly supporting an Emotional > Neutral effect on the N170.

Next, effects for each specific emotional expression on N170 amplitude were also explored. As may be appreciated in Table 1 and in the Review Table (www.uam.es/CEACO/sup/N170 2014.htm), a wide variety of emotions have been studied, producing mixed results. To further explore this issue, meta-analyses on N170 amplitudes were carried out separately for Happiness > Neutral ESs (n = 32; i.e., this comparison was reported in 32 out of the 56 meta-analyzable studies), Anger > Neutral ESs (n = 21), Fear > Neutral ESs (n = 28), Disgust > Neutral ESs (n = 7), and Sadness > Neutral ESs (n = 14). No meta-analyzable studies exploring surprise expressions exist to date. Statistical tests (Table 2) showed that the greatest effect sizes corresponded to the Anger > Neu and Fear > Neu contrasts, followed by Happiness > Neu. The other two contrasts (Sadness > Neu and Disgust > Neu) did not reach significance.

Finally, the capability of certain methodological and task-related factors to modulate N170 amplitude in response to facial expressions of emotion was meta-analyzed. First, we tested the effects of the reference electrode used to measure ERPs. To this aim, a meta-analysis employing the Q statistic analog to ANOVA (see previous section) was carried out on the emotional > neutral ESs regarding N170 amplitudes, contrasting the modulator role of Reference Electrode (2 levels: Common Average vs. Specific References, the latter including mastoids (n = 9), nosetip (n = 5), earlobes (n = 2), vertex (n = 1) and FCz (n = 1)). The 56 meta-analyzable studies shown in Fig. 2 were included in this meta-analysis. Results showed non-significant differences (Q(1) = 2.4043, p = 0.1210). The mean ES for Common Average (n = 38) was −0.3472 (95%CI = −0.4259 to −0.2686, Z = −8.6522, p < 0.0001), and for Specific References (n = 18) it was −0.2569 (95%CI = −0.3397 to −0.1740, Z = −6.0767, p < 0.0001). Therefore, ESs of emotional expression regarding N170 amplitudes were significant within both groups of electrode references, but no differences existed between them. An analysis comparing only studies using Common Average and Mastoids was also carried out, the differences being significant in this case (Q(1) = 4.1457, p = 0.0417), with ESs being greater in the Common Average than in the Mastoids group.

Second, a meta-analysis was carried out on whether or not the task involved direct instructions with respect to expressions (i.e., participants were asked to attend to the expression in the faces). All meta-analyzable studies shown in Fig. 2 were included in this meta-analysis (n = 56). Again, the Q statistic analog to ANOVA was computed on the emotional > neutral ESs of N170 amplitudes, contrasting the modulator role of type of task (2 levels: direct task vs. non-direct task, the latter including passive viewing (n = 10) and indirect tasks, in which participants were asked to direct their attention away from facial expressions (n = 20), e.g., identifying objects such as houses or chairs, discriminating line orientations, or performing digit categorization, all these non-facial targets presented beside, or superimposed on, the facial expressions). Results showed significant differences (Q(1) = 6.3708, p = 0.0116). The mean ES for the direct task (n = 26) was −0.2025 (95%CI = −0.3000 to −0.1050, Z = −4.0696, p < 0.0001), and for the non-direct task (n = 30) it was −0.3574 (95%CI = −0.4277 to −0.2870, Z = −9.9595, p < 0.0001). In other words, ESs of emotional expression regarding N170 amplitudes were significant in both groups of tasks, but a between-group difference also existed, indicating significantly greater ESs in the non-direct group of tasks.

4. Discussion

The first question that we aimed to answer was whether the N170 amplitude in response to emotional faces differs from that in response to neutral faces. Whereas early experiments in this field failed to find any effect of facial expressions on the N170 (e.g., Eimer et al., 2003; Herrmann et al., 2002; Krolak-Salmon et al.,

¹ Emotional > Neutral differences in N170 amplitudes are reflected in negative effect sizes since the greater the amplitude, the greater the negativity in this negative ERP component.
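The paired-samples effect-size computations described in Section 2.2 can be sketched as follows. This is an illustrative reconstruction rather than the authors' actual script: the function names are ours, Lakens (2013) describes more than one paired-samples variant (d_z from t, d_av from means and SDs) and the paper does not state which was used, and Orwin's fail-safe N is shown under the common simplifying assumption that missing studies average a zero effect.

```python
import math

def d_from_t(t: float, n: int) -> float:
    """Paired-samples Cohen's d_z from a t value: d_z = t / sqrt(n) (Lakens, 2013)."""
    return t / math.sqrt(n)

def d_from_f(f: float, n: int) -> float:
    """In a one-way, two-level repeated-measures ANOVA, F = t**2, so |t| = sqrt(F).
    F carries no sign, so the direction must come from the means; here the sign is
    forced negative, matching Emotional > Neutral N170 effects in this review."""
    return -math.sqrt(f) / math.sqrt(n)

def d_av(m1: float, m2: float, sd1: float, sd2: float) -> float:
    """Standardized mean difference from means and SDs, using the average SD
    of both conditions (the d_av variant in Lakens, 2013)."""
    return (m1 - m2) / ((sd1 + sd2) / 2.0)

def orwin_fsn(k: int, mean_es: float, criterion: float) -> float:
    """Orwin's (1983) fail-safe N: number of unpublished zero-effect studies needed
    to drag the mean ES of k studies down to the trivial-effect criterion.
    Magnitudes are used because the effects here are negative."""
    return k * (abs(mean_es) - abs(criterion)) / abs(criterion)
```

For instance, a study reporting t(15) = −2.0 (n = 16 participants) for the emotional vs. neutral contrast yields d_z = −0.5, and the same design reported as F(1, 15) = 4.0 yields the same magnitude.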
Table 1
Description and main results of meta-analyzable studies exploring N170 amplitudes in response to neutral and emotional facial expressions.

Authors | Year | Sample statistics of healthy adults: F/M (average age) (controls) | Ongoing task | Facial expressions | Time window for N170 in ms (amplitude type) | Electrodes where N170 was analyzed (total) | Ref. electrode used for analyses | Any main Emo vs Neu N170 amplitude difference? | Which Emo? | Significant LH vs. RH N170 amplitude differences (Emo)?
Aarts et al. | 2012 | 51/9 (20) | Go/no-go task, faces provided feedback on task performance | Neutral, Happy, Angry | 150–200 ms (peak amplitude) | D30, D31, D32, A9, A10, A11, B6, B7, B8, B10, B11, B12 (128) | Common average | Yes | (Happy, Angry) > Neu | Non-reported
Akbarfahimi et al. | 2013 | 12/16 (32.61) | Detection of houses among faces | Neutral, Happy, Fear | 130–250 ms (peak amplitudes) | P7, P8 (32) | Mastoids | No | | No
Dubal et al. | 2011 | 7/8 (22.1) | Detection of expressions | Neutral, Happy | 130–190 ms (mean amplitude) | P7, P8, P5, P6, P3, P4, P1, P2, Pz, PO3, PO4, PO5, PO6, PO7, PO8, POZ, O1, O2, Oz (62) | Common average | No | | No
Frühholz et al. | 2011 | 10/7 (23.7) | Task 1: Color … | Neutral, Fear | 145–185 ms (mean amplitude) | P7, P8, PO7, PO8, … | Common average | Yes | Fear > Neu | Non-reported
Stekelenburg & de Gelder | 2004 | 12 (21.4) | Judgements on orientation | Neutral, Fear | 140–200 ms (peak amplitude) | P7, P8 (49) | Common average | Yes | Fear > Neu | No
Tortosa et al. | 2013 | Exp. 1: 20/2 (20); Exp. 2: 14/11 (22) | Detection of white or black faces | Neutral, Happy, Angry | 150–190 ms (mean amplitude) | 47, 50, TP7, 56, TP9, P9, 63, P10, TP8, 100, … | Common average | Exp. 1: No; Exp. 2: Yes | Exp. 2: (Happy = Angry) > Neu | Exp. 2: Yes (LH < RH); (Happy …
Fig. 2. Experiments eligible for inclusion in the meta-analysis, drawn from those reviewed and summarized at www.uam.es/CEACO/sup/N170 2014.htm. Mean effect sizes (emotional minus neutral N170 amplitudes) and 95% confidence intervals are shown. Following an outlier test, the study marked with an asterisk was left out of the meta-analyses.
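The aggregation behind the Results (an inverse-variance weighted mean ES with a Z test, and the Q statistic analog to a one-way ANOVA for categorical moderators such as reference electrode or task type) can be illustrated with a minimal sketch. This is our own simplified implementation, not the Wilson SPSS macros used in the paper: for brevity, the between-studies variance tau² is assumed to be already folded into each study's variance, whereas the macros estimate it by maximum likelihood.

```python
import math

def mean_es(es, var):
    """Inverse-variance weighted mean effect size with its standard error and Z.
    Under a random-effects model, each entry of `var` would be v_i + tau^2."""
    w = [1.0 / v for v in var]
    m = sum(wi * ei for wi, ei in zip(w, es)) / sum(w)
    se = math.sqrt(1.0 / sum(w))       # SE of the weighted mean
    return m, se, m / se               # (mean ES, SE, Z statistic)

def q_between(groups):
    """Q-statistic analog to a one-way ANOVA for a categorical moderator:
    Q_between = sum_j W_j * (mean_j - grand_mean)^2, where W_j is the summed
    weight of group j; df = number of groups - 1. `groups` is a list of
    (effect_sizes, variances) pairs, one per moderator level."""
    group_w, group_m = [], []
    for es, var in groups:
        w = [1.0 / v for v in var]
        group_w.append(sum(w))
        group_m.append(sum(wi * ei for wi, ei in zip(w, es)) / sum(w))
    grand = sum(wj * mj for wj, mj in zip(group_w, group_m)) / sum(group_w)
    return sum(wj * (mj - grand) ** 2 for wj, mj in zip(group_w, group_m))
```

Comparing Q_between against a chi-square distribution with (groups − 1) degrees of freedom then gives the p-values reported for the reference-electrode and task-type contrasts.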
system (see also Ellamil et al., 2008, for a similar account). Similarly, the parallel-dependent model of face recognition assumes parallel and interactive processing of facial identity and expression information at early stages (Martens et al., 2010). According to these views, the N170 may be better understood as a correlate of a perceptual representation stage indexing the interplay between several sources of information, including both basic-level and high-level facial features (i.e., facial expression, identity, speech, etc.), which could reflect the activity of the multiple neural sources contributing to the generation of this waveform (George, 2013; Rossion and Jacques, 2011; Schyns et al., 2007). In this sense, the results of several studies suggest that the N170 may be modulated by the affective congruency between a facial expression and different sources of contextual affective information, including scenes, body expressions or the prosody of speech (e.g., Hietanen and Astikainen, 2013; Pourtois et al., 2000; Righart and de Gelder, 2006, 2008; see Wieser and Brosch, 2012 and de Gelder and Van den Stock, 2011, for detailed reviews on this issue).

The second issue explored in this meta-analysis was whether there is a differential or a homogeneous N170 effect for each particular emotional expression. The data show a differential sensitivity of this component to each discrete emotion. Angry expressions cause by far the greatest increase in N170 amplitude, followed by fear and happiness, whose effect sizes were also significant. Those not reaching statistical significance were sadness and disgust. Some interesting implications derive from these results. First, the N170 is not sensitive to every facial gesture; that is, it does not behave as a component responsive to every structural sign of facial change whatever its functional meaning might be (a meaning that would be interpreted at a later processing step). Rather, this component seems to be specifically sensitive to some particular expressions and not (or not so much) to others. This finding suggests that the processes underlying the N170 already reflect a hierarchization of expressions. Second, this hierarchy is probably based not only on structural criteria (e.g., mouth features over eyebrow features), but also on communicative, social criteria. Indeed, fearful, angry or happy expressions are usual in exchanges that require rapid social/emotional reactions in the receiver (as compared with sad or disgusted expressions). Therefore, their processing would involve to a greater extent the rapid facial decoding mechanisms reflected by the N170 (at least as quick as those subserved by other structures traditionally proposed to be involved in facial expression processing, such as the amygdala (Morris et al., 1998), whose latency of response is similar, or even longer: Conty et al., 2012; Pessoa and Adolphs, 2010). However, some caution is needed since the data on disgust and sadness were based on only seven and fourteen studies, respectively.

Finally, it is important to highlight that, according to the results of our meta-analysis, some task-related factors may influence N170 responses to facial expression. These modulating factors, as well as others that could be explored in the future, may partially explain why a considerable number of studies showed no N170 effects of facial expressions (see the Review Table at www.uam.es/CEACO/sup/N170 2014.htm). Specifically, two potentially modulating factors were taken into account in this study: one of a technical nature (reference electrode location), and the other regarding the characteristics of the task (its direct or non-direct nature). According to this meta-analysis, the N170 showed significant effects of facial expression whatever the placement of the reference electrode. However, this factor had a significant influence, since the effects of facial expression on N170 amplitude were even stronger when a common average reference was employed rather than the mastoids. This finding supports the claims made by Rellecke et al. (2013) about the importance of the reference electrode location for the N170 response to facial expressions, which showed greater amplitudes when a common average was employed.

We also found that task-related factors modulated N170 amplitude in response to facial expressions. Although N170 amplitude was significantly greater in response to emotional than to neutral expressions both in direct tasks (i.e., participants were explicitly instructed to attend to facial expressions) and in non-direct tasks (i.e., attention was directed elsewhere, with participants asked to attend to lines, digits, letters or other non-facial targets presented along with the faces, or passive viewing of faces was required), effect sizes were larger in studies that used non-direct tasks. Thus, although some data exist suggesting the insensitivity of N170 amplitude to facial expressions in non-direct tasks (Wronka and Walentowska, 2011), the results of the current meta-analysis indicate that the N170 elicited by facial expressions is modulated not only by goal-directed mechanisms but also by automatic processes (see also Schyns et al., 2007, for a similar claim). It is important to note that facial expressions are good capturers of exogenous (automatic) attention, and that visual cortices, including those involved in the generation of the N170, are part of the exogenous attention neural circuitry (Carretié, 2014). In this sense, evidence indicating that facial expressions are processed even under conditions in which attentional resources are not fully available could partially account for our findings. In particular, studies using the dot-probe task found preferential attention to angry faces with masked presentations (Fox et al., 2002; Mogg and Bradley, 1999; but see Koster et al., 2007). Moreover, emotional facial expressions have been found to reduce the size of the attentional blink in healthy participants (e.g., Bach et al., 2014; Vermeulen et al., 2009; but see Luo et al., 2010), and to be more resistant to extinction in neglect patients (e.g., Fox, 2002; Vuilleumier and Schwartz, 2001). Finally, the time needed to detect emotional facial expressions in visual search paradigms seems to be independent of the number of distracters (e.g., Hansen and Hansen, 1988; Fox et al., 2000; Frischen et al., 2008; but see Nummenmaa and Calvo, 2015). Thus, the N170 effects found during indirect tasks could be explained by the privileged processing of emotional expression in faces, which may partially rely on attentional mechanisms that do not require awareness.

5. Conclusions

The present meta-analyses clearly show that the N170 is sensitive to facial expressions, and that this sensitivity is heterogeneous: anger, fear and happiness, in this order, are the expressions eliciting the largest amplitudes with respect to neutral, non-expressive faces. The effects of sadness and disgust did not reach significance, at least with the current number of meta-analyzable studies. This heterogeneity, which apparently favors expressions requiring rapid social responses in the receiver, suggests that the mechanisms underlying the N170 go beyond detecting any facial change or gesture and seem to be functionally guided. The current results suggest that the N170 does not behave according to the predictions of dual models (identity vs. expression, or structure vs. change; Bruce and Young, 1986). Rather, it appears to fit better with models suggesting a more integrated and intermixed circuitry in charge of processing the identity and expression of faces (Calder and Young, 2005). Importantly, the results of the present study also indicate that this component is modulated by factors that may vary its amplitude in response to expressions, potentially explaining why several studies have not reported significant differences. However, further research is needed to address the impact of other variables that have been so scarcely explored in prior studies that they cannot be meta-analyzed at present. Among others, the capability of the N170 to reflect the processing of facial expressions presented in the periphery (Rigoulot et al., 2011), and how it responds to different expression intensities within each emotional category (Turetsky et al., 2007), seem to be of particular relevance. In conclusion, the
N170, probably along with other, less studied face-sensitive ERP components that have also shown sensitivity to facial expressions within the first 200 ms, such as the P1 and the visual vertex potential (e.g., Batty and Taylor, 2003; Li et al., 2008; Luo et al., 2010), must be considered an important tool for studying the neural mechanisms underlying facial expression processing, and may help to further develop current theoretical models.

Acknowledgements

This work was supported by grants PSI2012-37535 and PSI2014-54853-P from the Ministerio de Economía y Competitividad (MINECO) of Spain and grant PI13/01759 from the Institute of Health Carlos III (ISCIII) of Spain.

References

Aarts, K., Pourtois, G., 2012. Anxiety disrupts the evaluative component of performance monitoring: an ERP study. Neuropsychologia 50, 1286–1296.
Akbarfahimi, M., Tehrani-Doost, M., Ghassemi, F., 2013. Emotional face perception in patients with schizophrenia: an event-related potential study. Neurophysiology 45, 249–257.
Almeida, P.R., Ferreira-Santos, F., Vieira, J.B., Moreira, P.S., Barbosa, F., Marques-Teixeira, 2014. Dissociable effects of psychopathic traits on cortical and subcortical visual pathways during facial emotion processing: an ERP study on the N170. Psychophysiology 51, 645–657.
Andreatta, M., Puschmann, A.K., Sommer, C., Weyers, P., Pauli, P., Mühlberger, A., 2012. Altered processing of emotional stimuli in migraine: an event-related potential study. Cephalalgia 32, 1101–1108.
Atkinson, A.P., Adolphs, R., 2011. The neuropsychology of face perception: beyond simple dissociations and functional selectivity. Philos. Trans. R. Soc. Lond. B: Biol. Sci. 366, 1726–1738.
Bach, D.R., Schmidt-Daffy, M., Dolan, R.J., 2014. Facial expression influences face identity recognition during the attentional blink. Emotion 14, 1007–1013.
Batty, M., Taylor, M.J., 2003. Early processing of the six basic facial emotional expressions. Cogn. Brain Res. 17, 613–620.
Bediou, B., Eimer, M., d'Amato, T., Hauk, O., Calder, A.J., 2009. In the eye of the beholder: individual differences in reward-drive modulate early frontocentral ERPs to angry faces. Neuropsychologia 47, 825–834.
Bentin, S., Allison, T., Puce, A., Perez, E., McCarthy, G., 1996. Electrophysiological studies of face perception in humans. J. Cogn. Neurosci. 8, 551–565.
Blau, V.C., Maurer, U., Tottenham, N., McCandliss, B.D., 2007. The face-specific N170 component is modulated by emotional facial expression. Behav. Brain Funct. 3, 7, http://dx.doi.org/10.1186/1744-9081-3-7
Blechert, J., Sheppes, G., Di Tella, C., Williams, H., Gross, J.J., 2012. See what you think: reappraisal modulates behavioral and neural responses to social stimuli. Psychol. Sci. 23, 346–353.
Brennan, A.N., Harris, A.W.F., Williams, L.M., 2014. Neural processing of facial expressions of emotion in first onset psychosis. Psychiatry Res. 219, 477–485.
Brenner, C.A., Rumak, S.P., Burns, A.M.N., Kieffaber, P.D., 2014. The role of encoding and attention in facial emotion memory: an EEG investigation. Int. J. Psychophysiol. 93, 398–410.
Bruce, V., Young, A., 1986. Understanding face recognition. Br. J. Psychol. 77, 305–327.
Bublatzky, F., Gerdes, A.B., White, A.J., Riemer, M., Alpers, G.W., 2014. Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential. Front. Hum. Neurosci. 8, 493.
Burrows, A.M., 2008. The facial expression musculature in primates and its evolutionary significance. Bioessays 30, 212–225.
Calder, A.J., Young, A.W., 2005. Understanding the recognition of facial identity and facial expression. Nat. Rev. Neurosci. 6, 641–651.
Campanella, S., Montedoro, C., Streel, E., Verbanck, P., Rosier, V., 2006. Early visual components (P100, N170) are disrupted in chronic schizophrenic patients: an event-related potentials study. Neurophysiol. Clin. 36, 71–78.
Carmel, D., Bentin, S., 2002. Domain specificity versus expertise: factors influencing distinct processing of faces. Cognition 83, 1–29.
Carretié, L., Kessel, D., Carboni, A., López-Martín, S., Albert, J., Tapia, M., Hinojosa, J.A., 2013. Exogenous attention to facial vs non-facial emotional visual stimuli. Soc. Cogn. Affect. Neurosci. 8, 764–773.
Carretié, L., 2014. Exogenous (automatic) attention to emotional stimuli: a review. Cogn. Affect. Behav. Neurosci. 14, 1228–1258.
Chai, H., Chen, W.Z., Zhu, J., Xu, Y., Lou, L., Yang, T., He, W., Wang, W., 2012. Processing of facial expressions of emotions in healthy volunteers: an exploration with event-related potentials and personality traits. Neurophysiol. Clin. 42, 369–375.
Chammat, M., Foucher, A., Nadel, J., Dubal, S., 2010. Reading sadness beyond human faces. Brain Res. 1348, 95–104.
Chen, J., Ma, W., Zhang, Y., Wu, X., Wei, D., Liu, G., Deng, Z., Yang, L., Zhang, Z., 2014. Distinct facial processing related negative cognitive bias in first-episode and recurrent major depression: evidence from the N170 ERP component. PLoS
Conty, L., Dezecache, G., Hugueville, L., Grezes, J., 2012. Early binding of gaze, gesture, and emotion: neural time course and correlates. J. Neurosci. 32, 4531–4539.
Dan, O., Raz, S., 2012. Adult attachment and emotional processing biases: an event-related potentials (ERPs) study. Biol. Psychol. 91, 212–220.
de Gelder, B., Van den Stock, J., 2011. Real faces, real emotions: perceiving facial expressions in naturalistic contexts of voices, bodies and scenes. In: Calder, A.J., Rhodes, G., Johnson, M.H., Haxby, J.V. (Eds.), The Oxford Handbook of Face Recognition. Oxford University Press, New York, pp. 535–550.
Deffke, I., Sander, T., Heidenreich, J., Sommer, W., Curio, G., Trahms, L., et al., 2007. MEG/EEG sources of the 170-ms response to faces are co-localized in the fusiform gyrus. Neuroimage 35, 1495–1501.
Dubal, S., Foucher, A., Jouvent, R., Nadel, J., 2011. Human brain spots emotion in non humanoid robots. Soc. Cogn. Affect. Neurosci. 6, 90–97.
Eimer, M., Holmes, A., 2002. An ERP study on the time course of emotional face processing. Neuroreport 13, 427–431.
Eimer, M., Holmes, A., 2007. Event-related brain potential correlates of emotional face processing. Neuropsychologia 45, 15–31.
Eimer, M., Holmes, A., McGlone, F.P., 2003. The role of spatial attention in the processing of facial expression: an ERP study of rapid brain responses to six basic emotions. Cogn. Affect. Behav. Neurosci. 3 (2), 97–110.
Ellamil, M., Susskind, J.M., Anderson, A.K., 2008. Examinations of identity invariance in facial expression adaptation. Cogn. Affect. Behav. Neurosci. 8, 273–281.
Fox, C.J., Moon, S.Y., Iaria, G., Barton, J.J.S., 2009. The correlates of subjective perception of identity and expression in the face network: an fMRI adaptation study. Neuroimage 44, 569–580.
Fox, E., Lester, V., Russo, R., Bowles, R.J., Pichler, A., Dutton, K., 2000. Facial expressions of emotion: are angry faces detected more efficiently? Cogn. Emot. 14, 61–92.
Fox, E., 2002. Processing emotional facial expressions: the role of anxiety and awareness. Cogn. Affect. Behav. Neurosci. 2, 52–63.
Fox, E., Russo, R., Dutton, K., 2002. Attentional bias for threat: evidence for delayed disengagement from emotional faces. Cogn. Emot. 16, 355–379.
Frischen, A., Eastwood, J.D., Smilek, D., 2008. Visual search for faces with emotional expressions. Psychol. Bull. 134, 662.
Frühholz, S., Jellinghaus, A., Herrmann, M., 2011. Time course of implicit processing and explicit processing of emotional faces and emotional words. Biol. Psychol. 87, 265–274.
Ganel, T., Valyear, K.F., Goshen-Gottstein, Y., Goodale, M.A., 2005. The involvement of the "fusiform face area" in processing facial expression. Neuropsychologia 43, 1645–1654.
George, N., 2013. The facial expression of emotions. In: Armony, J., Vuilleumier, P. (Eds.), The Cambridge Handbook of Human Affective Neuroscience. Cambridge University Press, New York, pp. 171–197.
Hadj-Bouziane, F., Bell, A.H., Knusten, T.A., Ungerleider, L.G., Tootell, R.B., 2008. Perception of emotional expressions is independent of face selectivity in monkey inferior temporal cortex. Proc. Natl. Acad. Sci. U. S. A. 105, 5591–5596.
Hansen, C.H., Hansen, R.D., 1988. Finding the face in the crowd: an anger superiority effect. J. Pers. Soc. Psychol. 54, 917.
Harry, B., Williams, M.A., Davis, C., Kim, J., 2013. Emotional expressions evoke a differential response in the fusiform face area. Front. Hum. Neurosci., 7.
Haxby, J.V., Hoffman, E.A., Gobbini, M.I., 2000. The distributed human neural system for face perception. Trends Cogn. Sci. 4, 223–233.
He, W., Chai, H., Chen, W., Zhang, J., Xu, Y., Zhu, J., Wang, W., 2012. Facial emotion triggered cerebral potentials in treatment-resistant depression and borderline personality disorder patients of both genders. Prog. Neuropsychopharmacol. Biol. Psychiatry 37, 121–127.
Hendriks, M.C., van Boxtel, G.J., Vingerhoets, A.J., 2007. An event-related potential study on the early processing of crying faces. Neuroreport 18, 631–634.
Herbert, C., Sfärlea, A., Blumenthal, T., 2013. Your emotion or mine: labeling feelings alters emotional face perception – an ERP study on automatic and intentional affect labeling. Front. Hum. Neurosci. 7, 378, http://dx.doi.org/10.3389/fnhum.2013.00378
Herrmann, M.J., Aranda, D., Ellgring, H., Mueller, T.J., Strik, W.K., Heidrich, A., et al., 2002. Face-specific event-related potential in humans is independent from facial expression. Int. J. Psychophysiol. 45, 241–244.
Hietanen, J.K., Astikainen, P., 2013. N170 response to facial expressions is modulated by the affective congruency between the emotional expression and preceding affective picture. Biol. Psychol. 92, 114–124.
Hirai, M., Watanabe, S., Honda, Y., Miki, K., Kakigi, R., 2008. Emotional object and scene stimuli modulate subsequent face processing: an event-related potential study. Brain Res. Bull. 77, 264–273.
Holmes, A., Kiss, M., Eimer, M., 2006. Attention modulates the processing of emotional expression triggered by foveal faces. Neurosci. Lett. 394, 48–52.
Humphreys, G.W., Donnelly, N., Riddoch, M.J., 1993. Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia 31, 173–181.
Ishai, A., 2008. Let's face it: it's a cortical network. Neuroimage 40, 415–419.
Itier, R.J., Alain, C., Sedore, K., McIntosh, A.R., 2007. Early face processing specificity: it's in the eyes! J. Cogn. Neurosci. 19, 1815–1826.
Itier, R.J., Batty, M., 2009. Neural bases of eye and gaze processing: the core of social cognition. Neurosci. Biobehav. Rev. 33, 843–863.
Itier, R.J., Taylor, M.J., 2004a. N170 or N1? Spatiotemporal differences between
ONE 9, e109176. object and face processing using ERPs. Cereb. Cortex 14, 132–142.
508 J.A. Hinojosa et al. / Neuroscience and Biobehavioral Reviews 55 (2015) 498–509