
Brain Struct Funct (2007) 212:121–132
DOI 10.1007/s00429-007-0154-0

REVIEW

Do early sensory cortices integrate cross-modal information?

Christoph Kayser · Nikos K. Logothetis

Received: 21 May 2007 / Accepted: 14 July 2007 / Published online: 28 July 2007
© Springer-Verlag 2007

Abstract Our different senses provide complementary evidence about the environment, and their interaction often aids behavioral performance or alters the quality of the sensory percept. A traditional view defers the merging of sensory information to higher association cortices and posits that a large part of the brain can be reduced to a collection of unisensory systems that can be studied in isolation. Recent studies, however, challenge this view and suggest that cross-modal interactions can already occur in areas hitherto regarded as unisensory. We review results from functional imaging and electrophysiology exemplifying cross-modal interactions that occur early during the evoked response, and at the earliest stages of sensory cortical processing. Although anatomical studies revealed several potential origins of these cross-modal influences, there is as yet no clear relation between particular functional observations and specific anatomical connections. In addition, our view on sensory integration at the neuronal level is shaped by many studies on subcortical model systems of sensory integration; yet, the patterns of cross-modal interaction in cortex deviate from these model systems in several ways. Consequently, future studies on cortical sensory integration need to leave the descriptive level and incorporate cross-modal influences into models of the organization of sensory processing. Only then will we be able to determine whether early cross-modal interactions truly merit the label sensory integration, and how they increase a sensory system's ability to scrutinize its environment and finally aid behavior.

Keywords Multisensory integration · Auditory cortex · Anatomy · Physiology · Functional imaging

C. Kayser (✉) · N. K. Logothetis
Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, 72076 Tübingen, Germany
e-mail: [email protected]

N. K. Logothetis
Division of Imaging Science and Biomedical Engineering, University of Manchester, Manchester, UK

Introduction

Imagine being at a party with loud music playing and people cheering and chatting here and there. An old friend is telling you about his latest experiments, and only by closely watching his lips do you just manage to understand the details. This well-known (and actually well-investigated) situation nicely illustrates how we often rely on combined sensory input to correctly perceive our environment. Indeed, it is the combination of sensory information that is important for an authentic and coherent perception (Adrian 1949; Stein and Meredith 1993). Many studies under controlled laboratory conditions nicely describe how multisensory input can facilitate behavior by speeding reaction times (Hershenson 1962; Gielen et al. 1983), improving detection of faint stimuli (Frens and Van Opstal 1995; Driver and Spence 1998; McDonald et al. 2000; Vroomen and de Gelder 2000), or can even change the quality of the sensory percept, as in illusions such as the ventriloquist, the McGurk or the parchment-skin illusion (Howard and Templeton 1966; McGurk and MacDonald 1976; Jousmaki and Hari 1998; Shams et al. 2000; Guest et al. 2002). And concerning the above example of the cocktail party, a classic study found that the visual input corresponds to a hearing improvement equivalent to about 15–20 dB of sound intensity (Sumby and Pollack 1954). Research on how our brain merges evidence from different modalities is key to an understanding of how

sensory percepts arise and, as recent findings suggest, might change our view on the organization of sensory processing. Given the manifold impact of sensory integration on perception and behavior, much work is devoted to the questions of where and how this occurs in the brain. Earlier studies found little evidence for cross-modal interactions at early stages of processing and promoted a hierarchical view, suggesting that sensory information converges only in higher association areas and specialized subcortical structures (Jones and Powell 1970; Felleman and Van Essen 1991; Stein and Meredith 1993). These association areas include the superior temporal sulcus, the intra-parietal sulcus and regions in the frontal lobe (Fig. 1), and abundant functional and anatomical studies support cross-modal interactions in these regions (Benevento et al. 1977; Hyvarinen and Shelepin 1979; Bruce et al. 1981; Rizzolatti et al. 1981; Hikosaka et al. 1988; Graziano et al. 1994, 1999; Cusick et al. 1995; Fogassi et al. 1996; Seltzer et al. 1996; Duhamel et al. 1998; Banati et al. 2000; Calvert et al. 2000; Fuster et al. 2000; Macaluso et al. 2000; Bremmer et al. 2002; Beauchamp et al. 2004; van Atteveldt et al. 2004; Barraclough et al. 2005; Saito et al. 2005; Schlack et al. 2005; Sugihara et al. 2006; Avillac et al. 2007). The observation that these association areas cover only a small portion of the cortical mantle (see Fig. 1) suggests that either most of sensory cortex is indeed devoted to the processing of a single modality, or that the hierarchical picture misses some of the areas involved in cross-modal processing. Indeed, accumulating evidence challenges this view and suggests that areas hitherto regarded as unisensory can be modulated by stimulation of several senses (Foxe and Schroeder 2005; Ghazanfar and Schroeder 2006). Here we review this evidence using the auditory cortex as a model system. However, before diving into the evidence, it is helpful to consider the criteria that are frequently employed to identify sensory integration.

Functional criteria for sensory integration

Multisensory integration is a frequently used term that is often left without exact definition. As a result, some researchers think of some higher-level cognitive combination that merges different sensory evidence into a coherent percept, while others refer to specific response properties of neuronal activity. Concerning the study of neuronal responses, the term sensory integration is heavily influenced by pioneering studies in the superior colliculus, a subcortical convergence zone for sensory information (Stein and Meredith 1993). This structure is involved in orienting the eyes towards salient points in the sensory environment, and a series of studies nicely scrutinized how neurons in the superior colliculus respond to auditory, visual and somatosensory cues. In this context, sensory (or cross-modal) convergence can be defined as occurring if (for a given neuron) a response can be elicited by stimuli from different sensory modalities presented in isolation, or if activity elicited by one stimulus can be modulated (enhanced or depressed) by a stimulus from another modality. Such a response modulation is also called cross-modal interaction, as the activities elicited by the different stimuli interact to collectively determine the neuron's response to the combined stimulus. Neurons that show cross-modal convergence or interactions are also defined as multisensory neurons, as their responses can be affected by several sensory modalities. Based on the study of such neurons' response properties, a number of principles for sensory integration were derived. A first principle pertains to the spatial arrangement of sensory stimuli. Neurons in the superior colliculus usually respond to stimulation only within a restricted spatial region. For example, visual responses are limited to stimuli within a restricted region of the visual field, and auditory responses are limited to sounds originating from a range of directions. For multisensory neurons, the receptive fields of the different modalities usually overlap, and only stimuli falling within this overlap lead to an enhanced response; stimuli falling outside the overlap often cause response depression (the principle of spatial coincidence) (Stein 1998). A second principle posits that the sensitivity of neurons to cross-modal enhancement depends on the relative timing of both stimuli (the principle of temporal coincidence) (Stein and Wallace 1996). Only stimuli that occur in close temporal proximity cause response enhancement, and stimuli that are well separated in time elicit their normal unisensory response.

Fig. 1 Association areas implicated in sensory integration: the prefrontal region, premotor cortex, intra-parietal sulcus and superior temporal sulcus. The (subcortical) superior colliculus (sc) is shown in light colors, and the dashed gray lines indicate regions where sulci were opened. See text for a list of references reporting sensory integration in these areas

Together with the principle of spatial coincidence, this posits that cross-modal interactions are specific to stimuli that could possibly originate from the same source. A third principle suggests that the strength of response modulation depends on the efficacies of the unisensory stimuli in driving the neuron (the principle of inverse effectiveness). Stimuli that by themselves elicit strong responses usually cause little cross-modal interaction, while stimuli that elicit weak responses can cause strong interactions when presented simultaneously (Stein and Wallace 1996; Perrault et al. 2003; Stanford et al. 2005; Stanford and Stein 2007). Importantly, this principle suggests a nice link between neuronal activity and behavioral benefits of sensory integration. At the level of behavior, the benefit of combining sensory evidence is highest when each sense alone provides only little information about the environment. Assuming that stronger responses also convey more information about the stimulus, this translates to a stronger response enhancement in the case of weak neuronal responses. These three principles have turned into a set of criteria that are often applied in order to decide whether a particular effect indeed reflects sensory integration. Although these principles were derived from the activity of individual neurons in the superior colliculus, they are often applied to other measures of neuronal activity, such as functional imaging (Calvert 2001; Beauchamp 2005; Laurienti et al. 2005), and in other regions of the brain.
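These criteria can be made concrete with the response-enhancement index commonly used in single-neuron studies of the superior colliculus (e.g., Stein and Meredith 1993) and with the additivity comparison discussed for imaging data (Laurienti et al. 2005). The sketch below is purely illustrative: the spike counts are invented and the function names are our own, not taken from any of the cited studies.

import numpy as np

def enhancement_index(best_unisensory, multisensory):
    """Percent change of the multisensory response relative to the
    best unisensory response (a common single-neuron criterion)."""
    return 100.0 * (multisensory - best_unisensory) / best_unisensory

def is_superadditive(uni_a, uni_b, multisensory):
    """True if the combined response exceeds the sum of the unisensory ones."""
    return multisensory > uni_a + uni_b

# Hypothetical mean spike counts per trial for weak and strong stimulus pairs.
weak = dict(aud=2.0, vis=1.5, av=6.0)       # weak stimuli, large relative gain
strong = dict(aud=20.0, vis=18.0, av=24.0)  # strong stimuli, small relative gain

for name, r in [("weak", weak), ("strong", strong)]:
    idx = enhancement_index(max(r["aud"], r["vis"]), r["av"])
    print(f"{name}: enhancement = {idx:.0f}%, "
          f"superadditive = {is_superadditive(r['aud'], r['vis'], r['av'])}")
# The much larger index for the weak pair illustrates inverse effectiveness.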

Sensory convergence in unisensory cortical areas: functional evidence

Contrasting the hierarchical picture of sensory processing, several studies suggest that areas classically regarded as unisensory also show patterns of sensory convergence and integration. This phenomenon is often described as early sensory integration, as it proposes that cross-modal effects occur early in time during the response and in areas that are generally regarded as lower (early) in the sensory hierarchy (Schroeder et al. 2004; Foxe and Schroeder 2005; Ghazanfar and Schroeder 2006). It is worth noting that already several decades ago some studies suggested cross-modal interactions in lower sensory areas, but they perished in the mass of studies suggesting otherwise (Lomo and Mollica 1959; Murata et al. 1965; Bental et al. 1968; Spinelli et al. 1968; Morrell 1972; Fishman and Michael 1973; Vaudano et al. 1991). Electrical functional imaging studies (EEG and MEG) reported changes of evoked potentials over sensory areas that occurred shortly after stimulus onset as a result of combining stimuli of different modalities. For example, one study reported enhancement of auditory evoked responses when an additional somatosensory stimulus was applied to a hand (Murray et al. 2005). This cross-modal enhancement reached significance after 50 ms, suggesting that this effect occurs already during the first feed-forward sweep of processing. Similar observations were made for a range of combinations of the different modalities (Giard and Peronnet 1999; Fort et al. 2002a, 2002b; Molholm et al. 2002), and in addition, EEG studies suggested neuronal correlates of well-known multisensory illusions over classical auditory areas, e.g., for the McGurk effect (Colin et al. 2002; Mottonen et al. 2002). However, the coarse nature of this method leaves doubts about the localization of these effects, calling for methodologies with better spatial resolution.

Functional imaging of the blood-oxygenation level-dependent response (fMRI-BOLD) provided good insight into which areas of the brain might be part of an early multisensory network (Calvert 2001). Prominent examples come from auditory cortex. For this system, several studies revealed that visual stimuli modulate (usually enhance) auditory activity (Pekkola et al. 2005a; Lehmann et al. 2006) and, to a certain degree, might also activate auditory cortex by themselves (Calvert et al. 1997; Bernstein et al. 2002). Many of these studies relied on audio-visual speech or communication signals (Calvert et al. 1999; van Atteveldt et al. 2004), suggesting that this class of stimuli might engage circuits that are particularly prone to cross-modal influences. However, similar cross-modal effects were also reported for combinations of the auditory and somatosensory modalities (Foxe et al. 2002; Schurmann et al. 2006). Several imaging studies proposed that cross-modal influences on auditory cortex occur at the earliest stages, possibly even in primary auditory cortex (Calvert et al. 1997). Yet, to fully support such claims, one needs to faithfully localize individual auditory fields in the same subjects that show cross-modal influences. For auditory cortex this can be a problem, as many of the auditory fields are rather small and have a variable position in different subjects (Hackett et al. 1998; Kaas and Hackett 2000); especially group-averaging techniques can easily blur over distinct functional areas (Rademacher et al. 1993; Crivello et al. 2002; Desai et al. 2005). To overcome these limitations, we employed high-resolution imaging of the macaque monkey (Logothetis et al. 1999), in combination with a recently developed approach to localize individual fields in auditory cortex (Petkov et al. 2006) (Fig. 2a, b). This technique allows obtaining a functional parcellation of auditory cortex into distinct functional fields, in a similar manner as retinotopic mapping is frequently used to obtain a map of the different visual areas (Engel et al. 1994).
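As an illustration of the parcellation idea, the following sketch assigns each voxel its best frequency from responses to a set of localizer tones, mimicking the frequency-preference maps used by Petkov et al. (2006). It is a minimal toy example: the response array, frequency set and threshold are made up, and the published procedure involves additional smoothing and statistics.

import numpy as np

# Hypothetical data: mean BOLD response of each voxel to six localizer tones.
# responses[i, j] = response of voxel i to tone frequency freqs[j].
freqs = np.array([0.5, 1, 2, 4, 8, 16])   # kHz, assumed localizer set
rng = np.random.default_rng(0)
responses = rng.gamma(shape=2.0, scale=1.0, size=(1000, len(freqs)))

def best_frequency_map(responses, freqs, min_response=1.0):
    """Assign each voxel the frequency that drives it most strongly;
    voxels below a response threshold are left unlabeled (NaN)."""
    best = freqs[np.argmax(responses, axis=1)].astype(float)
    best[responses.max(axis=1) < min_response] = np.nan
    return best

bf = best_frequency_map(responses, freqs)
print("labeled voxels:", np.sum(~np.isnan(bf)))
# Reversals of the low-high frequency gradient along the cortical surface would
# then mark the borders between neighboring fields, such as A1 and its rostral
# neighbor, yielding the parcellation sketched in Fig. 2b.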

Fig. 2 Functional imaging of sensory convergence in monkey auditory cortex. a Functional images were acquired parallel to the temporal plane in order to maximize resolution and signal-to-noise over the auditory regions. b To functionally localize many of the known auditory fields, a functional parcellation was obtained using different localizer sounds such as pure tones and band-passed noise (Petkov et al. 2006). Left panel: voxels significantly preferring low or high sound frequencies. Middle panel: a smoothed frequency-preference map obtained using six different sound frequencies. Right panel: sketch of the functional parcellation of auditory cortex. Prominent regions are the core, which receives strong driving projections from the thalamus, the belt, which encompasses many secondary regions, and the parabelt, which mostly comprises auditory association cortex. A1: primary auditory cortex; CM: caudo-medial field; CL: caudo-lateral field; MM: medio-medial field in the medial belt. c Visual stimulation enhances auditory activations in the caudal field (Kayser et al. 2007). Middle panel: activation map for naturalistic sounds, with a superimposed functional parcellation from this animal. Left panel: time courses for two regions in auditory cortex (see arrows). The upper example shows similar responses to auditory and audio-visual stimulation, the lower example shows stronger (enhanced) responses in the audio-visual condition. Right panel: summary across many experiments with alert and anaesthetized animals; shaded fields consistently exhibited audio-visual enhancement. d Touch stimulation enhances auditory activations in the caudal belt (Kayser et al. 2005). Left panel: example map showing a discrete region with auditory-somatosensory response enhancement (blue), and activation to tone stimuli (red) used for functional localization of primary fields. Right panel: summary across experiments with anaesthetized animals; shaded fields consistently exhibited audio-tactile enhancement

Combining visual and somatosensory stimuli with various sounds, we were able to reproduce previous findings that visual and somatosensory stimulation can enhance auditory activations within restricted regions of auditory cortex (Kayser et al. 2005, 2007) (Fig. 2c, d). Our results clearly demonstrate that these cross-modal influences occur only at the caudal end, and mostly in the auditory belt and parabelt (secondary and association cortex). The functional parcellation of auditory cortex allowed us to localize these cross-modal interactions to the caudo-medial and caudo-lateral fields (CM, CL), portions of the medial belt (MM) and the caudal parabelt. To test the functional criteria of sensory integration, we manipulated the temporal alignment of acoustic and touch stimulation, and altered the effectiveness of the auditory stimulus. The results demonstrated that both the principle of temporal coincidence and the principle of inverse effectiveness are met in this paradigm, supporting the notion that these effects show the typical patterns of sensory integration. Combined with the human imaging studies, our findings provide strong support for the notion that early auditory cortical areas are indeed susceptible to cross-modal influences. To answer the questions of where this influence originates from, and how it manifests at the level of individual neurons, complementary studies using anatomical and electrophysiological methods are required.
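A simple way to express such a test on imaging data is to compare, per condition, the bimodal response against the sum of the unimodal responses. The numbers and the intensity conditions below are hypothetical and only illustrate the logic of the analysis; they are not the data of Kayser et al. (2005).

import numpy as np

# Hypothetical mean BOLD amplitudes (% signal change) per sound intensity.
intensity = np.array([50, 65, 80])            # dB SPL, assumed conditions
auditory = np.array([0.30, 0.60, 0.90])
touch = np.array([0.10, 0.10, 0.10])
audiotactile = np.array([0.55, 0.80, 1.00])

# Cross-modal interaction relative to the additive prediction (A + T).
interaction = audiotactile - (auditory + touch)
for db, ia in zip(intensity, interaction):
    print(f"{db} dB: interaction = {ia:+.2f} % signal change")
# A positive interaction that shrinks as sound intensity grows is the pattern
# expected under the principle of inverse effectiveness.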

Anatomical evidence for early convergence

The functional evidence for early cross-modal interactions is paralleled by increasing anatomical evidence for multiple sources of these effects. In short, all types of anatomical connections, regardless of whether they are of feed-forward, lateral or

feed-back type, have the potential to provide cross-modal inputs into early sensory cortices. Best studied are feedback projections from classical association areas, which reach down to primary and secondary auditory cortex (Barnes and Pandya 1992; Hackett et al. 1998; Romanski et al. 1999; Smiley et al. 2007), to primary and secondary visual areas (Falchier et al. 2002) and to somatosensory cortex (Cappe and Barone 2005). Most noteworthy, recent studies demonstrated cross-connections between different sensory streams and found projections from auditory areas to primary and secondary visual cortex in the macaque monkey (Falchier et al. 2002; Rockland and Ojima 2003). While some of these projections arise from the auditory core (primary auditory cortex), demonstrating direct cross-connections between primary sensory cortical areas, most of them originate in the auditory parabelt (auditory association areas). Overall these projections are sparse, with sometimes only a dozen neurons labeled per slice, and not much is known about their specific targets. Yet, they show signs of functional specificity and prominently target the peripheral representation and the lower visual hemifield. Although such specificity seems unexpected, it might be related to species-specific habits, like the manipulation of objects in the hands for primates. Similar projections from visual to auditory cortex were recently demonstrated in the ferret, where primary auditory cortex receives considerable projections from higher visual areas and also weaker projections from primary visual cortex (Bizley et al. 2006) (and see Budinger et al. 2006 for similar results in the Mongolian gerbil). Along the same lines, a recent study nicely described possible routes for somatosensory input to auditory cortex (Smiley et al. 2007). Smiley and colleagues found that the caudal auditory belt receives input from the granular insula, the retroinsula, as well as higher somatosensory areas in the parietal lobe, suggesting that lateral input from higher somatosensory processing stages is surprisingly prominent in the auditory belt. In addition to these cortico-cortical connections, there is a range of subcortical nuclei that could relay cross-modal signals to sensory cortices. Many of the intralaminar nuclei (e.g., the suprageniculate or the limitans nucleus), the koniocellular matrix neurons and forebrain structures project diffusely to the sensory cortices (Morel and Kaas 1992; Pandya et al. 1994; Jones 1998; Zaborszky and Duque 2000; Zaborszky 2002; Budinger et al. 2006; de la Mothe et al. 2006). Again the caudal auditory cortex can serve as a good model system, and a recent study delineated how several multisensory thalamic structures could send somatosensory signals to the auditory belt (Hackett et al. 2007). As if this were not enough, the thalamus also provides more complex means for the interaction of different sensory streams. For example, different sensory streams can cross-talk via thalamic nuclei such as the reticular nucleus, and recent studies provided interesting insights into how this structure might facilitate interactions between sensory streams and with association areas in the prefrontal cortex (Crabtree et al. 1998; Crabtree and Isaac 2002; Zikopoulos and Barbas 2006): both somatosensory and motor-related thalamic nuclei were found to send and receive projections from overlapping regions in the thalamic reticular nucleus (TRN), allowing these to modulate each other. Although not directly shown so far, such intra-thalamic pathways could also link different sensory modalities, allowing different sensory streams to interact via thalamo-cortical loops.

Although anatomical studies revealed a number of candidate routes for cross-modal input to early sensory cortices, there is no clear relationship between a specific connection and a specific functional finding. At present, this lack of understanding makes it hard to incorporate the early cross-modal interactions into current frameworks of sensory processing, and each of the functional studies reviewed above points to several of these connections as a presumptive source. It could well be the case that different types of cross-modal interactions in a given sensory area are mediated by distinct connections; this, for example, seems to be the case for visual and somatosensory inputs to auditory areas (Schroeder and Foxe 2002). However, the present knowledge about the anatomical projections that mediate early cross-modal interactions is not conclusive enough to advance our understanding of their function.

Electrophysiological studies of early cross-modal interactions

The evidence from functional imaging studies is supported by a growing body of electrophysiological data. For example, local field potential responses to audio-visual communication signals are enhanced when a sound is complemented by a visual stimulus (Ghazanfar et al. 2005). In this study, conspecific vocalizations were paired with a video showing the animal producing this vocalization, in analogy to human studies using audio-visual speech (Fig. 3a). The cross-modal interaction was very prominent, including more than 70% of the recording sites in the auditory core (primary auditory cortex) and nearly 90% in the auditory belt (secondary auditory cortex). In addition, the interaction was found to be sensitive to the temporal alignment of the auditory and visual components, in agreement with the principle of temporal coincidence. To probe in more detail whether and how individual auditory neurons can be modulated by visual stimuli, we adapted our fMRI paradigm to electrophysiological experiments.

Recording in different caudal fields, we could not find a clear impact of a purely visual stimulus on the neuronal responses. However, the visual stimulus clearly modulated the auditory responses for many of the neurons, although often in a subtle way. Remarkably, for most neurons this audio-visual interaction resulted in a decrease of the response (Fig. 3b). Such a depression of auditory responses by visual stimulation was also observed in many fields of the ferret auditory cortex (Bizley et al. 2006). Using simplistic stimuli such as light flashes and noise bursts, and recording in anaesthetized animals, the authors found that even in A1 more than 15% of the neurons showed cross-modal effects, and this fraction further increased for higher auditory fields. The audio-visual interaction depended on the relative timing of both stimuli, and some neurons were responsive to restricted areas of the visual field only, providing evidence for the principles of spatial and temporal coincidence.

Physiological studies also convincingly revealed somatosensory input to auditory cortex. While a subset of neurons in the caudo-medial field responds to cutaneous stimulation of the head and neck (Fu et al. 2003), the influence of the somatosensory stimulus is even more compelling at the level of subthreshold activity (Schroeder et al. 2001, 2003; Schroeder and Foxe 2002). Recording current source densities and multi-unit activity, Lakatos and colleagues nicely delineated the mechanism by which the somatosensory stimulus enhances auditory responses (Lakatos et al. 2007) (Fig. 4). While the somatosensory stimulus by itself does not increase neuronal firing rates, it resets the phase of the ongoing slow-wave activity. This phase resetting ensures that a simultaneous auditory stimulus arrives at the phase of optimal excitability. In this way, an auditory stimulus that is paired with a simultaneous somatosensory stimulus will elicit stronger neuronal responses than an auditory stimulus presented in isolation. In addition, this effect is spatially specific with respect to the hand receiving the somatosensory stimulus, and depends on the efficacy of the auditory stimulus, in agreement with the principles of inverse effectiveness and spatial coincidence. Most impressively, however, the authors found a strong relationship between the timing of auditory and somatosensory stimulation: only when the auditory stimulus was synchronous with, or followed the somatosensory stimulus after a multiple of a certain oscillation cycle, was the response enhanced (Fig. 4). It should be noted that these data were obtained from primary auditory cortex, the first stage of auditory processing in the cortex. All in all, electrophysiological studies are discovering a growing number of stimulation paradigms in which early cortical sensory areas are modulated by cross-modal input. The electrophysiological studies thereby provide detailed means to understand the neuronal basis of the cross-modal interactions frequently observed in human imaging studies.

Fig. 3 Visual stimuli modulate neuronal activity in auditory cortex. a Combining auditory and visual conspecific vocalization signals enhances auditory population responses in the auditory belt. Top panel: example of an audio-visual movie showing a vocalizing macaque monkey (producing a coo vocalization). Lower panels: local field potentials recorded from two sites. The upper example shows response enhancement, the lower response depression; neither of them showed a significant response to just visual stimulation. Adapted with permission from Ghazanfar et al. (2005). b Recordings of multi-unit (AMUA) and single-unit responses to different naturalistic audio-visual movies in the caudal belt. No example shows a clear visual response, but all show multisensory response depression (Kayser and Logothetis, unpublished data)

Fig. 4 Somatosensory stimulation enhances activity in auditory cortex. a Time course of the multi-unit response. The response to the combined auditory-somatosensory stimulus is stronger than the arithmetic sum of the auditory and somatosensory responses, indicating multisensory response enhancement. b Demonstration of the principle of inverse effectiveness. For auditory stimuli that are little effective in eliciting a response (low sound intensities), there is a significant enhancement of the response when the somatosensory stimulus is added. c Spatial specificity of the audio-somatosensory interaction. Combining the (binaural) auditory stimulus with a somatosensory stimulus on the ipsilateral side of the recording location leads to response depression, while a somatosensory stimulus on the contralateral side leads to response enhancement. d Temporal profile of response enhancement. While a simultaneous presentation of auditory and somatosensory stimuli leads to response enhancement, there is a variable effect of altering the stimulus onset asynchrony (SOA). Response enhancement occurs when the SOA reaches multiples of a gamma (30 ms) or theta (140 ms) oscillation cycle, while response depression occurs for intermediate values. Adapted with permission from Lakatos et al. (2007)
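The phase-resetting account lends itself to a compact simulation. The sketch below implements only the qualitative logic: an ongoing oscillation whose phase is reset by the somatosensory input, and an auditory response that is scaled by the momentary excitability at sound onset. The oscillation frequency, gain and response values are arbitrary choices, not parameters from Lakatos et al. (2007).

import numpy as np

OSC_FREQ_HZ = 33.0   # assumed ongoing (gamma-band) oscillation, ~30 ms cycle
BASE_RATE = 10.0     # assumed auditory response without somatosensory input
GAIN = 0.5           # how strongly momentary excitability scales the response

def auditory_response(soa_ms):
    """Response to a sound presented soa_ms after a touch stimulus that has
    reset the oscillation to its high-excitability phase at time zero."""
    excitability = np.cos(2 * np.pi * OSC_FREQ_HZ * soa_ms / 1000.0)
    return BASE_RATE * (1.0 + GAIN * excitability)

for soa in [0, 15, 30, 45, 60]:
    print(f"SOA {soa:3d} ms -> response {auditory_response(soa):5.1f}")
# Sounds arriving at multiples of the oscillation cycle (0, ~30, ~60 ms) land
# on the high-excitability phase and are enhanced; intermediate SOAs fall on
# the opposite phase and are suppressed, qualitatively as reported by Lakatos
# and colleagues.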

Complementary evidence from imaging and electrophysiology

While functional imaging and electrophysiological studies nicely complement each other in terms of spatial and temporal resolution, they sometimes provide conflicting results when applied to cross-modal paradigms. Most functional imaging studies report cross-modal enhancement, quite in line with the classical thinking about sensory integration.

For example, visual modulation of primary auditory cortex was found to enhance auditory BOLD responses (see Fig. 2c or Calvert et al. 1997; Calvert 2001; Pekkola et al. 2005b; Lehmann et al. 2006). At the level of neuronal activity, however, the evidence is more diverse: while local field potentials and current source densities seem to be enhanced (Schroeder and Foxe 2002; Ghazanfar et al. 2005), the firing rates of many individual neurons show suppression (see Fig. 3b and Bizley et al. 2006). This conflicting evidence might result from slight differences in the individual paradigms, but it might also result from the different sources of the respective signals. The fMRI-BOLD signal reflects neuronal activity only indirectly, via neurovascular coupling and neurotransmission-triggered changes in blood flow and blood oxygenation. Given our current knowledge, the BOLD signal reflects the metabolic correlate of the aggregate synaptic activity in a local patch of cortex (Logothetis et al. 2001; Logothetis and Wandell 2004; Lauritzen 2005). As a result, the BOLD signal could reflect the sum of both excitatory and inhibitory synaptic activity, while individual neurons reflect the imbalance between their excitatory and inhibitory afferents. For example, cross-modal interactions could involve inhibitory interneurons, which would result in an enhancement of local synaptic activity but might decrease the activity of the pyramidal neurons towards which prototypical neurophysiological experiments are biased (Towe and Harding 1970; Logothetis and Wandell 2004). As a result, fMRI-BOLD studies, and also measurements of local field potentials or current source densities, might detect cross-modal enhancement, while single-unit recordings would find response depression. While this scenario is speculation, recent data provide compelling evidence that inhibitory circuits might be important for mediating cortical cross-modal interactions (Dehner et al. 2004; Meredith et al. 2006). The cat ectosylvian cortex contains separate regions dominated by either the auditory (region FAES) or the somatosensory modality (region SIV) that are connected via direct anatomical projections. Yet, electrical stimulation of the auditory field (FAES) by itself does not elicit responses in the somatosensory field (SIV). Only when neurons in SIV are driven by stimulating their somatosensory receptive fields does electrical stimulation of the FAES lead to a reduction of the firing rates in SIV. Hence, this type of cross-modal effect is only visible when neurons are driven by their dominant modality, and proper controls confirmed the inhibitory (GABAergic) nature of these cross-modal interactions (Dehner et al. 2004; Meredith et al. 2006). Notably, the interaction observed in these studies increased monotonically with the strength of electrical FAES stimulation, quite in contrast to what might be expected from the principle of inverse effectiveness (Dehner et al. 2004).
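The argument about opposite signs in BOLD and spiking can be stated with a toy calculation: if a cross-modal input recruits inhibitory interneurons, the total synaptic activity (which the BOLD signal is thought to track) goes up while the net drive to pyramidal output neurons goes down. The numbers below are invented solely to make that arithmetic explicit.

# Hypothetical synaptic inputs to a local cortical patch (arbitrary units).
excitation_auditory = 10.0     # excitatory drive from the auditory stimulus
excitation_crossmodal = 2.0    # additional excitatory drive from the visual input
inhibition_crossmodal = 4.0    # inhibition recruited by the same visual input

# Aggregate synaptic activity: excitation and inhibition both consume energy,
# so they are assumed to add in the metabolic/BOLD-like signal.
bold_like_a = excitation_auditory
bold_like_av = excitation_auditory + excitation_crossmodal + inhibition_crossmodal

# Spiking output: driven by the balance of excitation and inhibition.
spiking_a = excitation_auditory
spiking_av = excitation_auditory + excitation_crossmodal - inhibition_crossmodal

print(f"BOLD-like signal: A = {bold_like_a}, AV = {bold_like_av}  (enhanced)")
print(f"Spiking output:   A = {spiking_a}, AV = {spiking_av}  (suppressed)")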

Cross-modal interactions in cortex and superior colliculus

Given the growing number of studies providing evidence for cross-modal interactions in early sensory cortices, there remains little doubt as to their existence. Yet, current results clearly show that, when considering individual neurons, these effects can be rather subtle and are often only detectable when the right stimulus is chosen.

As a result, such cross-modal interactions are best detectable at the level of population analysis, or in direct population responses such as multi-unit activity and local field potentials. In addition, cross-modal interactions can be restricted to particular areas within a sensory system (e.g., in Kayser et al. 2005, 2007), which makes them hard to detect unless the right region is sampled or spatially resolved imaging techniques are used. However, within those cortical regions exhibiting cross-modal interactions, their frequency can be very high. For example, Lakatos and coworkers reported auditory-somatosensory interactions at nearly every recording site in A1 (Lakatos et al. 2007), and Ghazanfar and colleagues found audio-visual interactions at 70% of the sites in the same area (Ghazanfar et al. 2005). These numbers are much larger compared to the 30% of recording sites showing multisensory responses in the (monkey) superior colliculus (Stein 1998). Altogether, this suggests a number of differences between the patterns of sensory integration found in cortex and in the superior colliculus.

Though much of our understanding about how individual neurons merge sensory information is derived from studies on the superior colliculus, it is a rather specialized subcortical structure involved in motor planning and detecting salient events (Krauzlis et al. 2004). Neurons in this structure accumulate evidence across space, time and modalities, and co-localized multi-modal features belonging to the same object reinforce each other to attract attention. This is reflected in the typical response enhancement seen for stimuli originating from the same source (Alvarado et al. 2007). Yet, the function of the superior colliculus differs from that typically attributed to sensory cortices. Sensory cortical areas are engaged in analyzing and integrating multiple features of the same modality in order to form a representation of the sensory environment. For this, they synthesize information about sensory objects from the different features that lie within their receptive fields. Such intra-modal combinations often result in responses that reach about the average of the responses to both features presented in isolation (Riesenhuber and Poggio 2000). Hence, cortical neurons rarely show supra-linear effects when several features of the dominant modality are presented simultaneously within their receptive fields.

From these observations we infer two differences, which might be critical for a better understanding of the cross-modal interactions in cortex and superior colliculus. First, virtually all neurons in sensory cortex are dominated by the respective modality, suggesting that cross-modal influences in cortex are more of a modulatory kind, i.e., they impose small modulations on the sensory evoked activity. This is in contrast to the classical multisensory structures, where many multisensory neurons can be driven by several modalities and multisensory interactions sometimes change firing rates by more than an order of magnitude. And second, while multisensory neurons form only a subset of the neurons in the superior colliculus (Stein 1998) or the higher temporal and frontal association cortices (Benevento et al. 1977; Hyvarinen and Shelepin 1979; Bruce et al. 1981; Rizzolatti et al. 1981; Hikosaka et al. 1988; Graziano et al. 1994), cross-modal interactions seem to be rather widespread within the sensory cortices. Hence, while in the superior colliculus only a subset of the neurons shows strong cross-modal interactions, many neurons in sensory cortex seem to be weakly modulated by cross-modal input. This difference might not only hint at the neuronal mechanisms that mediate the respective cross-modal interactions but should also allow further insights into their function.
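The distinction between driving and modulatory cross-modal input can be caricatured in a two-line rate model; the weights below are invented for illustration and are not fits to any of the cited data.

import numpy as np

def rate(aud_input, vis_input, w_aud, w_vis, baseline=1.0):
    """Toy firing rate driven by a weighted sum of the two modalities."""
    return max(0.0, baseline + w_aud * aud_input + w_vis * vis_input)

aud, vis = 1.0, 1.0  # unit-strength stimuli in each modality

# Superior-colliculus-like neuron: both modalities can drive it strongly.
sc_a, sc_v, sc_av = rate(aud, 0, 10, 10), rate(0, vis, 10, 10), rate(aud, vis, 10, 10)

# Cortex-like neuron: dominated by audition, the visual weight only modulates.
cx_a, cx_v, cx_av = rate(aud, 0, 10, 1), rate(0, vis, 10, 1), rate(aud, vis, 10, 1)

print("SC-like:     A =", sc_a, " V =", sc_v, " AV =", sc_av)
print("Cortex-like: A =", cx_a, " V =", cx_v, " AV =", cx_av)
# The cortex-like unit barely responds to vision alone, but its auditory
# response is weakly modulated by the added visual input.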

Conclusion

Many studies reporting cross-modal interactions in early sensory cortices propose that the observed effects constitute a correlate of sensory integration. This claim is largely supported by the finding that many of these effects obey the traditional criteria imposed on sensory integration and show specificity to the temporal or spatial alignment of the stimuli. Yet, it is hard to believe that such a simple characterization merits the term sensory integration, especially given the lack of evidence that any of these effects aid behavior and increase the sensory system's ability to scrutinize its environment. Behavioral studies of sensory integration, for example those described in the introduction, usually differentiate the two stages of sensory combination and integration (Ernst and Bulthoff 2004): first, sensory combination leads to increased information about the environment as a result of merging non-redundant information provided by different senses; second, sensory integration reduces the uncertainty in the internal representation of the stimulus, which then improves the behavioral reaction. With regard to the early integration effects, none of these points has been tested so far. To merit the term early integration, it needs to be verified that sensory representations indeed gain information about the environment or become more reliable by receiving cross-modal inputs. Granted, such studies are not easy, but they are of great importance for understanding the relevance of these cross-modal interactions for sensory perception and behavior.
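The two stages distinguished by Ernst and Bulthoff (2004) have a standard quantitative reading: if two cues provide independent, roughly Gaussian estimates of the same property, weighting each by its reliability gives a combined estimate whose variance is lower than that of either cue alone. The sketch below simply evaluates those textbook formulas with made-up noise levels; it is not an analysis of any data discussed here.

import numpy as np

def combine_cues(est_a, var_a, est_b, var_b):
    """Reliability-weighted (maximum-likelihood) combination of two cues."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    w_b = 1 - w_a
    combined_est = w_a * est_a + w_b * est_b
    combined_var = 1 / (1 / var_a + 1 / var_b)
    return combined_est, combined_var

# Hypothetical auditory and visual estimates of a sound source's azimuth.
auditory_deg, auditory_var = 12.0, 16.0   # noisy auditory localization
visual_deg, visual_var = 10.0, 4.0        # more reliable visual localization

est, var = combine_cues(auditory_deg, auditory_var, visual_deg, visual_var)
print(f"combined estimate: {est:.1f} deg, variance {var:.1f} "
      f"(vs. {min(auditory_var, visual_var):.1f} for the best single cue)")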

Despite this skepticism, we believe that it makes sense for the brain to provide early sensory pathways with information about stimuli impinging on the other senses. One way to think about this is to imagine engineering a complex sensory device. Of course, one could split the system into different and separate modules, each handling and analyzing the input provided by one of the sensors; only a final stage would try merging the sensory pictures provided by the different modules. In many cases this might work and result in a uniform sensory whole. In some cases, however, each module might come up with a distinct solution and the final merging might be impossible, e.g., when the visual system sees a house but the auditory system hears a dog. In trying to prevent such mismatching sensory percepts, one could introduce a mechanism that ensures consistent processing in the different sensory modules, and that selectively enhances the signal-to-noise ratio for objects common to the different sensory modalities; of course, some (or the same) mechanism first needs to decide whether a given object provided common input to more than a single sense. Such a mechanism might use predictions generated by higher stages to bias the ongoing processing in each sensory module using some sort of contextual or feedback signal.

The idea of feedback facilitating the processing in lower sensory areas by using predictions generated by higher areas is not novel (Ullman 1995; Rao and Ballard 1999). Especially the visual system, with its rather well-known connectivity, has served as a model for studying the contribution of feedback to sensory processing (Lamme and Roelfsema 2000; Martin 2002). Given the larger receptive fields at higher stages, feedback is thought to reflect global signals that are integrated into the local processing in lower areas, i.e., it allows combining local fine-scale analysis with large-scale scene integration (Bullier 2001; Sillito et al. 2006). In addition, lateral projections from neurons at the same processing stage also provide contextual modulation to the ongoing processing (Levitt and Lund 1997; Somers et al. 1998; Moore et al. 1999). In an analogous way, a consistency signal from multisensory areas could modulate the processing in each sensory stream to ensure a consistent multi-modal picture of the environment. Interestingly, studies of contextual modulation in primary sensory areas suggest that the modulatory influence depends on the efficacy of the primary stimulus driving the neurons; contextual modulation often enhances responses to weak stimuli but suppresses responses to highly salient inputs (Levitt and Lund 1997; Somers et al. 1998; Moore et al. 1999), quite reminiscent of the principle of inverse effectiveness. Clearly, these ideas are still vague, but speculations like this will be necessary to promote our understanding of what cross-modal modulations actually do in early sensory cortices.

Acknowledgments The authors acknowledge support from the Max Planck Society and the German Research Foundation (DFG).

References

Adrian ED (1949) The Sherrington lectures: 1. Sensory integration. University of Liverpool Press
Alvarado JC, Vaughan JW, Stanford TR, Stein BE (2007) Multisensory versus unisensory integration: contrasting modes in the superior colliculus. J Neurophysiol 97:3193–3205
Avillac M, Ben Hamed S, Duhamel JR (2007) Multisensory integration in the ventral intraparietal area of the macaque monkey. J Neurosci 27:1922–1932
Banati RB, Goerres GW, Tjoa C, Aggleton JP, Grasby P (2000) The functional anatomy of visual-tactile integration in man: a study using positron emission tomography. Neuropsychologia 38:115–124
Barnes CL, Pandya DN (1992) Efferent cortical connections of multimodal cortex of the superior temporal sulcus in the rhesus monkey. J Comp Neurol 318:222–244
Barraclough NE, Xiao D, Baker CI, Oram MW, Perrett DI (2005) Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. J Cogn Neurosci 17:377–391
Beauchamp MS (2005) Statistical criteria in FMRI studies of multisensory integration. Neuroinformatics 3:93–114
Beauchamp MS, Lee KE, Argall BD, Martin A (2004) Integration of auditory and visual information about objects in superior temporal sulcus. Neuron 41:809–823
Benevento LA, Fallon J, Davis BJ, Rezak M (1977) Auditory-visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey. Exp Neurol 57:849–872
Bental E, Dafny N, Feldman S (1968) Convergence of auditory and visual stimuli on single cells in the primary visual cortex of unanesthetized unrestrained cats. Exp Neurol 20:341–351
Bernstein LE, Auer ET Jr, Moore JK, Ponton CW, Don M, Singh M (2002) Visual speech perception without primary auditory cortex activation. Neuroreport 13:311–315
Bizley JK, Nodal FR, Bajo VM, Nelken I, King AJ (2006) Physiological and anatomical evidence for multisensory interactions in auditory cortex. Cereb Cortex Epub doi:10.1093/cercor/bhl128
Bremmer F, Klam F, Duhamel JR, Ben Hamed S, Graf W (2002) Visual-vestibular interactive responses in the macaque ventral intraparietal area (VIP). Eur J Neurosci 16:1569–1586
Bruce C, Desimone R, Gross CG (1981) Visual properties of neurons in a polysensory area in superior temporal sulcus of the macaque. J Neurophysiol 46:369–384
Budinger E, Heil P, Hess A, Scheich H (2006) Multisensory processing via early cortical stages: connections of the primary auditory cortical field with other sensory systems. Neuroscience 143:1065–1083
Bullier J (2001) Integrated model of visual processing. Brain Res Brain Res Rev 36:96–107
Calvert GA (2001) Crossmodal processing in the human brain: insights from functional neuroimaging studies. Cereb Cortex 11:1110–1123
Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SC, McGuire PK, Woodruff PW, Iversen SD, David AS (1997) Activation of auditory cortex during silent lipreading. Science 276:593–596
Calvert GA, Brammer MJ, Bullmore ET, Campbell R, Iversen SD, David AS (1999) Response amplification in sensory-specific cortices during crossmodal binding. Neuroreport 10:2619–2623
Calvert GA, Campbell R, Brammer MJ (2000) Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr Biol 10:649–657
Cappe C, Barone P (2005) Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey. Eur J Neurosci 22:2886–2902
Colin C, Radeau M, Soquet A, Demolin D, Colin F, Deltenre P (2002) Mismatch negativity evoked by the McGurk-MacDonald effect: a phonetic representation within short-term memory. Clin Neurophysiol 113:495–506

Crabtree JW, Isaac JT (2002) New intrathalamic pathways allowing modality-related and cross-modality switching in the dorsal thalamus. J Neurosci 22:8754–8761
Crabtree JW, Collingridge GL, Isaac JT (1998) A new intrathalamic pathway linking modality-related nuclei in the dorsal thalamus. Nat Neurosci 1:389–394
Crivello F, Schormann T, Tzourio-Mazoyer N, Roland PE, Zilles K, Mazoyer BM (2002) Comparison of spatial normalization procedures and their impact on functional maps. Hum Brain Mapp 16:228–250
Cusick CG, Seltzer B, Cola M, Griggs E (1995) Chemoarchitectonics and corticocortical terminations within the superior temporal sulcus of the rhesus monkey: evidence for subdivisions of superior temporal polysensory cortex. J Comp Neurol 360:513–535
de la Mothe LA, Blumell S, Kajikawa Y, Hackett TA (2006) Thalamic connections of the auditory cortex in marmoset monkeys: core and medial belt regions. J Comp Neurol 496:72–96
Dehner LR, Keniston LP, Clemo HR, Meredith MA (2004) Cross-modal circuitry between auditory and somatosensory areas of the cat anterior ectosylvian sulcal cortex: a new inhibitory form of multisensory convergence. Cereb Cortex 14:387–403
Desai R, Liebenthal E, Possing ET, Waldron E, Binder JR (2005) Volumetric vs. surface-based alignment for localization of auditory cortex activation. Neuroimage 26:1019–1029
Driver J, Spence C (1998) Crossmodal attention. Curr Opin Neurobiol 8:245–253
Duhamel JR, Colby CL, Goldberg ME (1998) Ventral intraparietal area of the macaque: congruent visual and somatic response properties. J Neurophysiol 79:126–136
Engel SA, Rumelhart DE, Wandell BA, Lee AT, Glover GH, Chichilnisky EJ, Shadlen MN (1994) fMRI of human visual cortex. Nature 369:525
Ernst MO, Bulthoff HH (2004) Merging the senses into a robust percept. Trends Cogn Sci 8:162–169
Falchier A, Clavagnier S, Barone P, Kennedy H (2002) Anatomical evidence of multimodal integration in primate striate cortex. J Neurosci 22:5749–5759
Felleman DJ, Van Essen DC (1991) Distributed hierarchical processing in the primate cerebral cortex. Cereb Cortex 1:1–47
Fishman MC, Michael P (1973) Integration of auditory information in the cat's visual cortex. Vision Res 13:1415–1419
Fogassi L, Gallese V, Fadiga L, Luppino G, Matelli M, Rizzolatti G (1996) Coding of peripersonal space in inferior premotor cortex (area F4). J Neurophysiol 76:141–157
Fort A, Delpuech C, Pernier J, Giard MH (2002a) Dynamics of cortico-subcortical cross-modal operations involved in audio-visual object detection in humans. Cereb Cortex 12:1031–1039
Fort A, Delpuech C, Pernier J, Giard MH (2002b) Early auditory-visual interactions in human cortex during nonredundant target identification. Brain Res Cogn Brain Res 14:20–30
Foxe JJ, Schroeder CE (2005) The case for feedforward multisensory convergence during early cortical processing. Neuroreport 16:419–423
Foxe JJ, Wylie GR, Martinez A, Schroeder CE, Javitt DC, Guilfoyle D, Ritter W, Murray MM (2002) Auditory-somatosensory multisensory processing in auditory association cortex: an fMRI study. J Neurophysiol 88:540–543
Frens MA, Van Opstal AJ (1995) A quantitative study of auditory-evoked saccadic eye movements in two dimensions. Exp Brain Res 107:103–117
Fu KM, Johnston TA, Shah AS, Arnold L, Smiley J, Hackett TA, Garraghty PE, Schroeder CE (2003) Auditory cortical neurons respond to somatosensory stimulation. J Neurosci 23:7510–7515

Fuster JM, Bodner M, Kroger JK (2000) Cross-modal and cross-temporal association in neurons of frontal cortex. Nature 405:347–351
Ghazanfar AA, Schroeder CE (2006) Is neocortex essentially multisensory? Trends Cogn Sci 10:278–285
Ghazanfar AA, Maier JX, Hoffman KL, Logothetis NK (2005) Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. J Neurosci 25:5004–5012
Giard MH, Peronnet F (1999) Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study. J Cogn Neurosci 11:473–490
Gielen SC, Schmidt RA, Van den Heuvel PJ (1983) On the nature of intersensory facilitation of reaction time. Percept Psychophys 34:161–168
Graziano MS, Yap GS, Gross CG (1994) Coding of visual space by premotor neurons. Science 266:1054–1057
Graziano MS, Reiss LA, Gross CG (1999) A neuronal representation of the location of nearby sounds. Nature 397:428–430
Guest S, Catmur C, Lloyd D, Spence C (2002) Audiotactile interactions in roughness perception. Exp Brain Res 146:161–171
Hackett TA, Stepniewska I, Kaas JH (1998) Subdivisions of auditory cortex and ipsilateral cortical connections of the parabelt auditory cortex in macaque monkeys. J Comp Neurol 394:475–495
Hackett TA, De La Mothe LA, Ulbert I, Karmos G, Smiley J, Schroeder CE (2007) Multisensory convergence in auditory cortex, II. Thalamocortical connections of the caudal superior temporal plane. J Comp Neurol 502:924–952
Hershenson M (1962) Reaction time as a measure of intersensory facilitation. J Exp Psychol 63:289–293
Hikosaka K, Iwai E, Saito H, Tanaka K (1988) Polysensory properties of neurons in the anterior bank of the caudal superior temporal sulcus of the macaque monkey. J Neurophysiol 60:1615–1637
Howard IP, Templeton WB (1966) Human spatial orientation. Wiley, London
Hyvarinen J, Shelepin Y (1979) Distribution of visual and somatic functions in the parietal associative area 7 of the monkey. Brain Res 169:561–564
Jones EG (1998) Viewpoint: the core and matrix of thalamic organization. Neuroscience 85:331–345
Jones EG, Powell TP (1970) An anatomical study of converging sensory pathways within the cerebral cortex of the monkey. Brain 93:793–820
Jousmaki V, Hari R (1998) Parchment-skin illusion: sound-biased touch. Curr Biol 8:R190
Kaas JH, Hackett TA (2000) Subdivisions of auditory cortex and processing streams in primates. Proc Natl Acad Sci USA 97:11793–11799
Kayser C, Petkov CI, Augath M, Logothetis NK (2005) Integration of touch and sound in auditory cortex. Neuron 48:373–384
Kayser C, Petkov CI, Augath M, Logothetis NK (2007) Functional imaging reveals visual modulation of specific fields in auditory cortex. J Neurosci 27:1824–1835
Krauzlis RJ, Liston D, Carello CD (2004) Target selection and the superior colliculus: goals, choices and hypotheses. Vision Res 44:1445–1451
Lakatos P, Chen CM, O'Connell MN, Mills A, Schroeder CE (2007) Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron 53:279–292
Lamme VA, Roelfsema PR (2000) The distinct modes of vision offered by feedforward and recurrent processing. Trends Neurosci 23:571–579
Lauritzen M (2005) Reading vascular changes in brain imaging: is dendritic calcium the key? Nat Rev Neurosci 6:77–85

Laurienti PJ, Perrault TJ, Stanford TR, Wallace MT, Stein BE (2005) On the use of superadditivity as a metric for characterizing multisensory integration in functional neuroimaging studies. Exp Brain Res 166:289–297
Lehmann C, Herdener M, Esposito F, Hubl D, di Salle F, Scheffler K, Bach DR, Federspiel A, Kretz R, Dierks T, Seifritz E (2006) Differential patterns of multisensory interactions in core and belt areas of human auditory cortex. Neuroimage 31:294–300
Levitt JB, Lund JS (1997) Contrast dependence of contextual effects in primate visual cortex. Nature 387:73–76
Logothetis NK, Wandell BA (2004) Interpreting the BOLD signal. Annu Rev Physiol 66:735–769
Logothetis NK, Guggenberger H, Peled S, Pauls J (1999) Functional imaging of the monkey brain. Nat Neurosci 2:555–562
Logothetis NK, Pauls J, Augath M, Trinath T, Oeltermann A (2001) Neurophysiological investigation of the basis of the fMRI signal. Nature 412:150–157
Lomo T, Mollica A (1959) Activity of single units of the primary optic cortex during stimulation by light, sound, smell and pain, in unanesthetized rabbits. Boll Soc Ital Biol Sper 35:1879–1882
Macaluso E, Frith CD, Driver J (2000) Modulation of human visual cortex by crossmodal spatial attention. Science 289:1206–1208
Martin KA (2002) Microcircuits in visual cortex. Curr Opin Neurobiol 12:418–425
McDonald JJ, Teder-Salejarvi WA, Hillyard SA (2000) Involuntary orienting to sound improves visual perception. Nature 407:906–908
McGurk H, MacDonald J (1976) Hearing lips and seeing voices. Nature 264:746–748
Meredith MA, Keniston LR, Dehner LR, Clemo HR (2006) Crossmodal projections from somatosensory area SIV to the auditory field of the anterior ectosylvian sulcus (FAES) in cat: further evidence for subthreshold forms of multisensory processing. Exp Brain Res Epub (ahead of publication)
Molholm S, Ritter W, Murray MM, Javitt DC, Schroeder CE, Foxe JJ (2002) Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study. Brain Res Cogn Brain Res 14:115–128
Moore CI, Nelson SB, Sur M (1999) Dynamics of neuronal processing in rat somatosensory cortex. Trends Neurosci 22:513–520
Morel A, Kaas JH (1992) Subdivisions and connections of auditory cortex in owl monkeys. J Comp Neurol 318:27–63
Morrell F (1972) Visual system's view of acoustic space. Nature 238:44–46
Mottonen R, Krause CM, Tiippana K, Sams M (2002) Processing of changes in visual speech in the human auditory cortex. Brain Res Cogn Brain Res 13:417–425
Murata K, Cramer H, Bach-y-Rita P (1965) Neuronal convergence of noxious, acoustic, and visual stimuli in the visual cortex of the cat. J Neurophysiol 28:1223–1239
Murray MM, Molholm S, Michel CM, Heslenfeld DJ, Ritter W, Javitt DC, Schroeder CE, Foxe JJ (2005) Grabbing your ear: rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb Cortex 15:963–974
Pandya DN, Rosene DL, Doolittle AM (1994) Corticothalamic connections of auditory-related areas of the temporal lobe in the rhesus monkey. J Comp Neurol 345:447–471
Pekkola J, Ojanen V, Autti T, Jaaskelainen IP, Mottonen R, Sams M (2005a) Attention to visual speech gestures enhances hemodynamic activity in the left planum temporale. Hum Brain Mapp 27:471–477
Pekkola J, Ojanen V, Autti T, Jaaskelainen IP, Mottonen R, Tarkiainen A, Sams M (2005b) Primary auditory cortex activation by visual speech: an fMRI study at 3 T. Neuroreport 16:125–128

Perrault TJ Jr, Vaughan JW, Stein BE, Wallace MT (2003) Neuron-specific response characteristics predict the magnitude of multisensory integration. J Neurophysiol 90:4022–4026
Petkov CI, Kayser C, Augath M, Logothetis NK (2006) Functional imaging reveals numerous fields in the monkey auditory cortex. PLoS Biology 4:e215
Rademacher J, Caviness VS Jr, Steinmetz H, Galaburda AM (1993) Topographical variation of the human primary cortices: implications for neuroimaging, brain mapping, and neurobiology. Cereb Cortex 3:313–329
Rao RP, Ballard DH (1999) Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects. Nat Neurosci 2:79–87
Riesenhuber M, Poggio T (2000) Models of object recognition. Nat Neurosci 3(Suppl):1199–1204
Rizzolatti G, Scandolara C, Gentilucci M, Camarda R (1981) Response properties and behavioral modulation of mouth neurons of the postarcuate cortex (area 6) in macaque monkeys. Brain Res 225:421–424
Rockland KS, Ojima H (2003) Multisensory convergence in calcarine visual areas in macaque monkey. Int J Psychophysiol 50:19–26
Romanski LM, Bates JF, Goldman-Rakic PS (1999) Auditory belt and parabelt projections to the prefrontal cortex in the rhesus monkey. J Comp Neurol 403:141–157
Saito DN, Yoshimura K, Kochiyama T, Okada T, Honda M, Sadato N (2005) Cross-modal binding and activated attentional networks during audiovisual speech integration: a functional MRI study. Cereb Cortex 15:1750–1760
Schlack A, Sterbing-D'Angelo SJ, Hartung K, Hoffmann KP, Bremmer F (2005) Multisensory space representations in the macaque ventral intraparietal area. J Neurosci 25:4616–4625
Schroeder CE, Foxe JJ (2002) The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Brain Res Cogn Brain Res 14:187–198
Schroeder CE, Lindsley RW, Specht C, Marcovici A, Smiley JF, Javitt DC (2001) Somatosensory input to auditory association cortex in the macaque monkey. J Neurophysiol 85:1322–1327
Schroeder CE, Smiley J, Fu KG, McGinnis T, O'Connell MN, Hackett TA (2003) Anatomical mechanisms and functional implications of multisensory convergence in early cortical processing. Int J Psychophysiol 50:5–17
Schroeder CE, Molholm S, Lakatos P, Ritter W, Foxe JJ (2004) Human-simian correspondence in the early cortical processing of multisensory cues. Cogn Process 5:140–151
Schurmann M, Caetano G, Hlushchuk Y, Jousmaki V, Hari R (2006) Touch activates human auditory cortex. Neuroimage 30:1325–1331
Seltzer B, Cola MG, Gutierrez C, Massee M, Weldon C, Cusick CG (1996) Overlapping and nonoverlapping cortical projections to cortex of the superior temporal sulcus in the rhesus monkey: double anterograde tracer studies. J Comp Neurol 370:173–190
Shams L, Kamitani Y, Shimojo S (2000) Illusions. What you see is what you hear. Nature 408:788
Sillito AM, Cudeiro J, Jones HE (2006) Always returning: feedback and sensory processing in visual cortex and thalamus. Trends Neurosci 29:307–316
Smiley JF, Hackett TA, Ulbert I, Karmos G, Lakatos P, Javitt DC, Schroeder CE (2007) Multisensory convergence in auditory cortex, I. Cortical connections of the caudal superior temporal plane in macaque monkeys. J Comp Neurol 502:894–923
Somers DC, Todorov EV, Siapas AG, Toth LJ, Kim DS, Sur M (1998) A local circuit approach to understanding integration of long-range inputs in primary visual cortex. Cereb Cortex 8:204–217

Spinelli DN, Starr A, Barrett TW (1968) Auditory specificity in unit recordings from cat's visual cortex. Exp Neurol 22:75–84
Stanford TR, Stein BE (2007) Superadditivity in multisensory integration: putting the computation in context. Neuroreport 18:787–792
Stanford TR, Quessy S, Stein BE (2005) Evaluating the operations underlying multisensory integration in the cat superior colliculus. J Neurosci 25:6499–6508
Stein BE (1998) Neural mechanisms for synthesizing sensory information and producing adaptive behaviors. Exp Brain Res 123:124–135
Stein BE, Meredith MA (1993) Merging of the Senses. MIT Press, Cambridge
Stein BE, Wallace MT (1996) Comparisons of cross-modality integration in midbrain and cortex. Prog Brain Res 112:289–299
Sugihara T, Diltz MD, Averbeck BB, Romanski LM (2006) Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex. J Neurosci 26:11138–11147
Sumby WH, Pollack I (1954) Visual contribution to speech intelligibility in noise. J Acoust Soc Am 26:212–215

Towe AL, Harding GW (1970) Extracellular microelectrode sampling bias. Exp Neurol 29:366–381
Ullman S (1995) Sequence seeking and counter streams: a computational model for bidirectional information flow in the visual cortex. Cereb Cortex 5:1–11
van Atteveldt N, Formisano E, Goebel R, Blomert L (2004) Integration of letters and speech sounds in the human brain. Neuron 43:271–282
Vaudano E, Legg CR, Glickstein M (1991) Afferent and efferent connections of temporal association cortex in the rat: a horseradish peroxidase study. Eur J Neurosci 3:317–330
Vroomen J, de Gelder B (2000) Sound enhances visual perception: cross-modal effects of auditory organization on vision. J Exp Psychol Hum Percept Perform 26:1583–1590
Zaborszky L (2002) The modular organization of brain systems. Basal forebrain: the last frontier. Prog Brain Res 136:359–372
Zaborszky L, Duque A (2000) Local synaptic connections of basal forebrain neurons. Behav Brain Res 115:143–158
Zikopoulos B, Barbas H (2006) Prefrontal projections to the thalamic reticular nucleus form a unique circuit for attentional mechanisms. J Neurosci 26:7348–7361
