Papers by Anthony Atkinson
The intentional properties and subjective qualities of conscious states pose special problems for physicalism. Yet 'consciousness' is a term of the vernacular that picks out such a heterogeneous group of phenomena that it will not be a good explanandum for science. This thesis adopted the position that we are licensed to theorize about the phenomena of consciousness, provided we are careful to dump all excess folk-psychological baggage surrounding the term. It was argued that the purposes and goals of folk psychology differ considerably from those of scientific psychology, for folk psychology is first and foremost a craft. Cognitive psychology is bound to the analytical strategy by way of functionalism. Various forms of functionalism were investigated, and two versions, not mutually exclusive, were favoured: homuncular functionalism and microfunctionalism. This led to the view that nature is multi-levelled, and therefore that functionalism may be better known as structural-fu...
PLOS ONE, 2021
Certain facial features provide useful information for recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations which would ensure foveation of specific features. Foveating the mouth of fearful, surprised and disgusted expressions improved emotion recognition compared to foveating an eye or cheek or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combination of emotions used. There was no consistent evidence suggesting that reflexive first saccades targeted emotion-relevant features; instead, they targeted the closest feature to initial fixation. In a third experiment, angry, fearful, surprised and disgusted...
Journal of Experimental Psychology: Human Perception and Performance, 2020
At normal interpersonal distances, not all features of a face can fall within one's fovea simultaneously. Given that certain facial features are differentially informative of different emotions, does the ability to identify facially expressed emotions vary according to the feature fixated, and do saccades preferentially seek diagnostic features? Previous findings are equivocal. We presented faces for a brief time, insufficient for a saccade, at a spatial position that guaranteed that a given feature (an eye, cheek, the central brow, or mouth) fell at the fovea. Across two experiments, observers were more accurate and faster at discriminating angry expressions when the high spatial-frequency information of the brow was projected to their fovea than when one or other cheek or eye was. Performance in classifying fear and happiness (Experiment 1) was not influenced by whether the most informative features (eyes and mouth, respectively) were projected foveally or extrafoveally. Observers more accurately distinguished between fearful and surprised expressions (Experiment 2) when the mouth was projected to the fovea. Reflexive first saccades tended towards the left and center of the face rather than preferentially targeting emotion-distinguishing features. These results reflect the integration of task-relevant information across the face constrained by the differences between foveal and extrafoveal processing (Peterson & Eckstein, 2012).
Emotion Review, 2009
According to simulation or shared-substrates models of emotion recognition, our ability to recognize the emotions expressed by other individuals relies, at least in part, on processes that internally simulate the same emotional state in ourselves. The term "emotional expressions" is nearly synonymous, in many people's minds, with facial expressions of emotion. However, vocal prosody and whole-body cues also convey emotional information. What is the relationship between these various channels of emotional communication? We first briefly review simulation models of emotion recognition, and then discuss neuroscientific evidence related to these models, including studies using facial expressions, whole-body cues, and vocal prosody. We conclude by discussing these data in the context of simulation and shared-substrates models of emotion recognition.
Naturalism, Evolution and Mind, 2001
Developmental Science, 2014
• ERPs were measured in response to emotional body expressions in infants using point-light displays
• 8-month-old infants, but not 4-month-old infants, discriminated between the orientation (upright, inverted) and the emotion (fearful, happy) of bodies in motion
• neural evidence for the developmental emergence of emotion perception from body cues
Affective Computing, 2008
Frontiers in Human Neuroscience, 2014
Responding to others' emotional body expressions is an essential social skill in humans. Adults readily detect emotions from body postures, but it is unclear whether infants are sensitive to emotional body postures. We examined 8-month-old infants' brain responses to emotional body postures by measuring event-related potentials (ERPs) to happy and fearful bodies. Our results revealed two emotion-sensitive ERP components: body postures evoked an early N290 at occipital electrodes and a later Nc at fronto-central electrodes that were enhanced in response to fearful (relative to happy) expressions. These findings demonstrate that: (a) 8-month-old infants discriminate between static emotional body postures; and (b) similar to infant emotional face perception, the sensitivity to emotional body postures is reflected in early perceptual (N290) and later attentional (Nc) neural processes. This provides evidence for an early developmental emergence of the neural processes involved in the discrimination of emotional body postures.
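The condition-averaging logic behind ERP component measures such as the N290 can be sketched with purely synthetic data (all amplitudes, latencies and trial counts below are illustrative assumptions, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                              # hypothetical sampling rate (Hz)
t = np.arange(-0.1, 0.6, 1 / fs)      # epoch: -100 to 600 ms around stimulus onset

def epochs(amplitude, n_trials=30):
    """Simulated single-trial EEG: a negative deflection peaking near 290 ms
    whose size depends on the condition, buried in trial-by-trial noise."""
    component = -amplitude * np.exp(-((t - 0.29) ** 2) / (2 * 0.03 ** 2))
    return component + rng.normal(scale=2.0, size=(n_trials, t.size))

# Averaging across trials within each condition recovers the ERP from noise.
erp_fear = epochs(amplitude=4.0).mean(axis=0)
erp_happy = epochs(amplitude=2.0).mean(axis=0)

# Mean amplitude in a 250-330 ms window: an N290-like measure per condition.
win = (t >= 0.25) & (t <= 0.33)
fear_amp, happy_amp = erp_fear[win].mean(), erp_happy[win].mean()
print(f"fear: {fear_amp:.2f} a.u., happy: {happy_amp:.2f} a.u.")
assert fear_amp < happy_amp  # larger (more negative) deflection to fear
```

The enhancement "to fearful relative to happy expressions" then corresponds to a more negative mean amplitude in the fear condition's window.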
Journal of Neuroscience, 2010
Basic emotional states (such as anger, fear, and joy) can be similarly conveyed by the face, the body, and the voice. Are there human brain regions that represent these emotional mental states regardless of the sensory cues from which they are perceived? To address this question, in the present study participants evaluated the intensity of emotions perceived from face movements, body movements, or vocal intonations, while their brain activity was measured with functional magnetic resonance imaging (fMRI). Using multivoxel pattern analysis, we compared the similarity of response patterns across modalities to test for brain regions in which emotion-specific patterns in one modality (e.g., faces) could predict emotion-specific patterns in another modality (e.g., bodies). A whole-brain searchlight analysis revealed modality-independent but emotion category-specific activity patterns in medial prefrontal cortex (MPFC) and left superior temporal sulcus (STS). Multivoxel patterns in these regions contained information about the category of the perceived emotions (anger, disgust, fear, happiness, sadness) across all modality comparisons (face-body, face-voice, body-voice), and independently of the perceived intensity of the emotions. No systematic emotion-related differences were observed in the overall amplitude of activation in MPFC or STS. These results reveal supramodal representations of emotions in high-level brain areas previously implicated in affective processing, mental state attribution, and theory-of-mind. We suggest that MPFC and STS represent perceived emotions at an abstract, modality-independent level, and thus play a key role in the understanding and categorization of others' emotional mental states.
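The core cross-modal pattern-similarity idea can be illustrated with a toy numpy sketch: if a region carries modality-independent emotion information, the pattern evoked by an emotion in one modality should correlate more strongly with the same emotion's pattern in another modality than with a different emotion's pattern. Everything below is synthetic (voxel counts, noise levels, and "templates" are assumptions, not the study's data or its searchlight pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
emotions = ["anger", "disgust", "fear", "happiness", "sadness"]
n_vox = 50  # hypothetical number of voxels in one searchlight sphere

# Emotion-specific "templates" shared across modalities, plus
# modality-specific noise: a supramodal code, by construction.
templates = {e: rng.normal(size=n_vox) for e in emotions}

def pattern(emotion, noise=0.5):
    """Simulated multivoxel response pattern for one modality."""
    return templates[emotion] + noise * rng.normal(size=n_vox)

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

face = {e: pattern(e) for e in emotions}
body = {e: pattern(e) for e in emotions}

same = np.mean([corr(face[e], body[e]) for e in emotions])
diff = np.mean([corr(face[e1], body[e2])
                for e1 in emotions for e2 in emotions if e1 != e2])
print(f"same-emotion r = {same:.2f}, different-emotion r = {diff:.2f}")
assert same > diff  # emotion identity survives the change of modality
```

A region with no supramodal code would show no such same-versus-different advantage, which is the null outcome the searchlight test rules out in MPFC and STS.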
Social Cognitive and Affective Neuroscience, 2007
Emotionally expressive faces have been shown to modulate activation in visual cortex, including face-selective regions in ventral temporal lobe. Here, we tested whether emotionally expressive bodies similarly modulate activation in body-selective regions. We show that dynamic displays of bodies with various emotional expressions vs neutral bodies produce significant activation in two distinct body-selective visual areas, the extrastriate body area and the fusiform body area. Multi-voxel pattern analysis showed that the strength of this emotional modulation was related, on a voxel-by-voxel basis, to the degree of body selectivity, while there was no relation with the degree of selectivity for faces. Across subjects, amygdala responses to emotional bodies positively correlated with the modulation of body-selective areas. Together, these results suggest that emotional cues from body movements produce topographically selective influences on category-specific populations of neurons in visual cortex, and these increases may implicate discrete modulatory projections from the amygdala.
Psychological Medicine, 2010
Background: Previous behavioural and neuroimaging studies of emotion processing in autistic spectrum disorder (ASD) have focused on the use of facial stimuli. To date, however, no studies have examined emotion processing in autism across a broad range of social signals. Method: This study addressed this issue by investigating emotion processing in a group of 23 adults with ASD and 23 age- and gender-matched controls. Recognition of basic emotions ('happiness', 'sadness', 'anger', 'disgust' and 'fear') was assessed from facial, body movement and vocal stimuli. The ability to make social judgements (such as approachability) from facial stimuli was also investigated. Results: Significant deficits in emotion recognition were found in the ASD group relative to the control group across all stimulus domains (faces, body movements and voices). These deficits were seen across a range of emotions. The ASD group were also impaired in making social judgements compared to the control group and this c...
Philosophical Transactions of the Royal Society B: Biological Sciences, 2011
Face processing relies on a distributed, patchy network of cortical regions in the temporal and frontal lobes that respond disproportionately to face stimuli, other cortical regions that are not even primarily visual (such as somatosensory cortex), and subcortical structures such as the amygdala. Higher-level face perception abilities, such as judging identity, emotion and trustworthiness, appear to rely on an intact face-processing network that includes the occipital face area (OFA), whereas lower-level face categorization abilities, such as discriminating faces from objects, can be achieved without OFA, perhaps via the direct connections to the fusiform face area (FFA) from several extrastriate cortical areas. Some lesion, transcranial magnetic stimulation (TMS) and functional magnetic resonance imaging (fMRI) findings argue against a strict feed-forward hierarchical model of face perception, in which the OFA is the principal and common source of input for other visual and non-vis...
Neuropsychologia, 2009
Recent research has confirmed that individuals with Autism Spectrum Disorder (ASD) have difficulties in recognizing emotions from body movements. Difficulties in perceiving coherent motion are also common in ASD. Yet it is unknown whether these two impairments are related. Thirteen adults with ASD and 16 age- and IQ-matched typically developing (TD) adults classified basic emotions from point-light and full-light displays of body movements and discriminated the direction of coherent motion in random-dot kinematograms. The ASD group was reliably less accurate in classifying emotions regardless of stimulus display type, and in perceiving coherent motion. As predicted, ASD individuals with higher motion coherence thresholds were less accurate in classifying emotions from body movements, especially in the point-light displays; this relationship was not evident for the TD group. The results are discussed in relation to recent models of biological motion processing and known abnormalities in the neural substrates of motion and social perception in ASD.
Neuropsychologia, 2007
Bilateral amygdala lesions impair the ability to identify certain emotions, especially fear, from facial expressions, and neuroimaging studies have demonstrated differential amygdala activation as a function of the emotional expression of faces, even under conditions of subliminal presentation, and again especially for fear. Yet the amygdala's role in processing emotion from other classes of stimuli remains poorly understood. On the basis of its known connectivity as well as prior studies in humans and animals, we hypothesised that the amygdala would be important also for the recognition of fear from body expressions. To test this hypothesis, we assessed a patient (S.M.) with complete bilateral amygdala lesions who is known to be severely impaired at recognising fear from faces. S.M. completed a battery of tasks involving forced-choice labelling and rating of the emotions in two sets of dynamic body movement stimuli, as well as in a set of static body postures. Unexpectedly, S.M.'s performance was completely normal. We replicated the finding in a second rare subject with bilateral lesions entirely confined to the amygdala. Compared to healthy comparison subjects, neither of the amygdala lesion subjects was impaired in identifying fear from any of these displays. Thus, whatever the role of the amygdala in processing whole-body fear cues, it is apparently not necessary for the normal recognition of fear from either static or dynamic body expressions.
NeuroImage, 2012
NOTE: This is a preprint of the article that was accepted for publication. It therefore does not include minor changes made at the 'proofs' stage. Please reference the final version of the article: Atkinson, A. P., Vuong, Q. C., & Smithson, H. E. (2012). Modulation of the face- and body-selective visual regions by the motion and emotion of point-light face and body stimuli.
Journal of Intelligent Systems, 1999
In this paper we introduce two dimensions over which current and recent theories of consciousness vary. These dimensions are (1) vehicle versus process theories, and (2) non-specialized versus specialized theories. Vehicle theories of consciousness propose that consciousness arises from some inherent property of mental representations or neural systems, irrespective of the computations they perform. Process theories propose that consciousness arises as a result of a particular form of computation. Non-specialized theories view consciousness as potentially arising from any part of the mind/brain with the appropriate property or performing the appropriate computations. Specialized theories view consciousness as arising from dedicated machinery. We review and situate nine theories of consciousness in the 2x2 matrix generated by the two dimensions. We then examine some difficulties with the precise characterizations of these dimensions. With regard to the vehicle/process dimension, we conclude that this dimension is best thought of as a continuum, in which the position is marked by the relative emphasis on computation versus implementation. Vehicle theories are those that place greater emphasis on implementation. With regard to the specialized/non-specialized dimension, we conclude that this dimension is a dynamic one, such that transitions may be possible over an evolutionary or developmental time scale, whereby initially non-specialized components may...
Journal of Cognitive Neuroscience, 2011
Judging the sex of faces relies on cues related to facial morphology and spatial relations between features, whereas judging the trustworthiness of faces relies on both structural and expressive cues that signal affective valence. The right occipital face area (OFA) processes structural cues and has been associated with sex judgments, whereas the posterior STS processes changeable facial cues related to muscle movements and is activated when observers judge trustworthiness. It is commonly supposed that the STS receives inputs from the OFA, yet it is unknown whether these regions have functionally dissociable, critical roles in sex and trustworthiness judgments. We addressed this issue using event-related, fMRI-guided repetitive transcranial magnetic stimulation (rTMS). Twelve healthy volunteers judged the sex of individually presented faces and, in a separate session, whether those same faces were trustworthy or not. Relative to sham stimulation, RTs were significantly longer for se...
Cortex, 2010
Cognition, 2012
Personality trait attribution can underpin important social decisions and yet requires little effort; even a brief exposure to a photograph can generate lasting impressions. Body movement is a channel readily available to observers and allows judgments to be made when facial and body appearances are less visible; e.g., from great distances. Across three studies, we assessed the reliability of trait judgments of point-light walkers and identified motion-related visual cues driving observers' judgments. The findings confirm that observers make reliable, albeit inaccurate, trait judgments, and these were linked to a small number of motion components derived from a Principal Component Analysis of the motion data. Parametric manipulation of the motion components linearly affected trait ratings, providing strong evidence that the visual cues captured by these components drive observers' trait judgments. Subsequent analyses suggest that reliability of trait ratings was driven by impressions of emotion, attractiveness and masculinity.
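The two analysis steps described (extracting a few motion components by PCA, then manipulating a stimulus parametrically along one component) can be sketched in numpy with synthetic data. The walker counts, feature counts, and latent structure below are illustrative assumptions, not the study's actual motion-capture data or pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)
n_walkers, n_features = 40, 60  # hypothetical: 40 walkers, 60 motion features

# Synthetic motion data dominated by a few latent dimensions
# (think speed-like or sway-like components), plus measurement noise.
latent = rng.normal(size=(n_walkers, 3))
loadings = rng.normal(size=(3, n_features))
X = latent @ loadings + 0.2 * rng.normal(size=(n_walkers, n_features))

# PCA via SVD of the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)
print("variance explained by first 3 PCs:", explained[:3].round(2))

# Parametric manipulation: step the average walker along the first
# component to generate stimuli varying on that motion dimension only.
step = 2.0
manipulated = X.mean(axis=0) + step * Vt[0]
assert explained[:3].sum() > 0.9  # a few components capture most variance
```

Regressing observers' trait ratings on such component scores (and on steps of `manipulated`-style stimuli) is how a linear link between motion components and judgments would then be assessed.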