2010, Journal of Vision
Research suggests that emotionally charged stimuli are registered in limbic areas [1,2] and can influence task performance [3,4] outside of awareness. Such stimuli can also reach awareness under conditions under which neutral stimuli cannot [5,6]. These findings suggest that emotional content can be detected and processed rapidly and preconsciously.
International Journal of Psychophysiology, 2008
Non-conscious processing of emotionally expressive faces has been found in patients with damage to visual brain areas and has been demonstrated experimentally in healthy controls using visual masking procedures. The time at which this subliminal processing occurs is not known. To address this question, a group of healthy participants performed a fearful face detection task in which backward masked fearful and non-fearful faces were presented at durations ranging from 16 to 266 ms. On the basis of the group's behavioural results, high-density event-related potentials were analysed for subliminal, intermediate and supraliminal presentations. Subliminally presented fearful faces were found to produce a stronger posterior negativity at 170 ms (N170) than non-fearful faces. This increase was also observed for intermediate and supraliminal conditions. A later component, the N2 occurring between 260 and 300 ms, was the earliest component related to stimulus detectability, increasing with target duration and differentiating fearful from non-fearful faces at longer durations of presentation. Source localisation performed on the N170 component showed that fear produced a greater activation of extrastriate visual areas, particularly on the right. Whether they are presented subliminally or supraliminally, fearful faces are processed at an early stage in the stream of visual processing, giving rise to enhanced activation of right extrastriate temporal cortex as early as 170 ms post-stimulus onset.
PloS one, 2012
It is part of basic emotions like fear or anger that they prepare the brain to act adaptively. Hence scenes representing emotional events are normally associated with characteristic adaptive behavior. Normally, face and body representation areas in the brain are modulated by these emotions when presented in the face or body. Here, we provide neuroimaging evidence (using functional magnetic resonance imaging) that the extrastriate body area (EBA) is highly responsive when subjects observe isolated faces presented in emotional scenes. This response of EBA to threatening scenes in which no body is present gives rise to speculation about its function. We discuss the possibility that the brain reacts proactively to the emotional meaning of the scene.
Emotion, 2005
A commonly held view is that emotional stimuli are processed independently of awareness. Here, the authors parametrically varied the duration of a fearful face target stimulus that was backward masked by a neutral face. The authors evaluated awareness by characterizing behavioral performance using receiver operating characteristic curves from signal detection theory. Their main finding was that no universal objective awareness threshold exists for fear perception. Although several subjects displayed a behavioral pattern consistent with previous reports (i.e., targets masked at 33 ms), a considerable percentage of their subjects (64%) were capable of reliably detecting 33-ms targets. Their findings suggest that considerable information is available even in briefly presented stimuli (possibly as short as 17 ms) to support masked fear detection.
Neuropsychologia, 2011
Many studies have provided evidence that the emotional content of visual stimuli modulates behavioral performance and neuronal activity. Surprisingly, these studies were carried out using stimuli presented in the center of the visual field, whereas the majority of visual events first appear in the peripheral visual field. In this study, we assessed the impact of the emotional facial expression of fear when projected in near and far periphery. Sixteen participants were asked to categorize fearful and neutral faces projected at four peripheral visual locations (15° and 30° of eccentricity in the right and left sides of the visual field) while reaction times and event-related potentials (ERPs) were recorded. ERPs were analyzed by means of spatio-temporal principal component and baseline-to-peak methods. Behavioral data confirmed the decrease of performance with eccentricity and showed that fearful faces induced shorter reaction times than neutral ones. Electrophysiological data revealed…
Frontiers in Psychology, 2015
In order to investigate the interactions between non-spatial selective attention, awareness, and emotion processing, we carried out an ERP study using a backward masking paradigm, in which angry, fearful, happy, and neutral facial expressions were presented, while participants attempted to detect the presence of one or the other category of facial expressions in the different experimental blocks. ERP results showed that negative emotions enhanced an early N170 response over temporal-occipital leads in both masked and unmasked conditions, independently of selective attention. A later effect arising at the P2 was linked to awareness. Finally, selective attention was found to affect the N2 and N3 components over occipito-parietal leads. Our findings reveal that (i) the initial processing of facial expressions arises prior to attention and awareness; (ii) attention and awareness give rise to temporally distinct periods of activation independently of the type of emotion, with only a partial degree of overlap; and (iii) selective attention appears to be influenced by the emotional nature of the stimuli, which in turn impinges on unconscious processing at a very early stage. This study confirms previous reports that negative facial expressions can be processed rapidly, in the absence of visual awareness and independently of selective attention. On the other hand, attention and awareness may operate in a synergistic way, depending on task demands.
Neuropsychologia, 2007
Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly with fearful expressions, but also with other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on fusiform; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with some increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces.
Other electrical components arise at later latencies and involve more sustained activity, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with distributed activity in time and space. Moreover, although traditional models in cognitive neuropsychology have often considered that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but they might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.
Brain Research, 2010
Several lines of evidence demonstrate that processing of facial expression can occur within the first 130 ms following a face presentation, but it remains unclear how this is modulated by attention. We presented neutral, fearful, and happy faces to subjects who attended either to repeated identity or to repeated emotions. Brain activity was recorded using magnetoencephalography (MEG) and analyzed with event-related beamforming, providing both temporal and spatial information about processing in the brain. The first MEG component, at 90 ms (M90), was sensitive to facial expression, but only when attention was not directed to expression; non-attended fearful faces increased activation in occipital and right middle frontal gyri. Around 150 ms, activity in several brain regions, regardless of the direction of attention, was larger to emotional compared to neutral faces; attention directed to facial expressions increased activity in the right fusiform gyrus and the anterior insula bilaterally. M220 was not modulated by individual facial expressions; however, attention directed to facial expressions enhanced activity in the right inferior parietal lobe and precuneus, while attention directed to identity enhanced posterior cingulate activity. These data demonstrate that facial expression processing involves frontal brain areas as early as 90 ms. Attention directed to emotional expressions obscured this early automatic processing but increased the M170 activity. The M220 sources varied with the direction of attention. Thus, the pattern of neural activation to faces varied with attention to emotions or to identity, demonstrating separate and only partially overlapping networks for these two facets of information contained in faces.