Thaler 11 PlosOne
Abstract
Background: A small number of blind people are adept at echolocating silent objects simply by producing mouth clicks and
listening to the returning echoes. Yet the neural architecture underlying this type of aid-free human echolocation has not
been investigated. To tackle this question, we recruited echolocation experts, one early- and one late-blind, and measured
functional brain activity in each of them while they listened to their own echolocation sounds.
Results: When we compared brain activity for sounds that contained both clicks and the returning echoes with brain
activity for control sounds that did not contain the echoes, but were otherwise acoustically matched, we found activity in
calcarine cortex in both individuals. Importantly, for the same comparison, we did not observe a difference in activity in
auditory cortex. In the early-blind, but not the late-blind participant, we also found that the calcarine activity was greater for
echoes reflected from surfaces located in contralateral space. Finally, in both individuals, we found activation in middle
temporal and nearby cortical regions when they listened to echoes reflected from moving targets.
Conclusions: These findings suggest that processing of click-echoes recruits brain regions typically devoted to vision rather
than audition in both early and late blind echolocation experts.
Citation: Thaler L, Arnott SR, Goodale MA (2011) Neural Correlates of Natural Human Echolocation in Early and Late Blind Echolocation Experts. PLoS ONE 6(5):
e20162. doi:10.1371/journal.pone.0020162
Editor: David C. Burr, Istituto di Neuroscienze, Italy
Received March 1, 2011; Accepted April 13, 2011; Published May 25, 2011
Copyright: © 2011 Thaler et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits
unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: This work was supported by the Canadian Institutes of Health Research (MAG) and the Ontario Ministry of Research and Innovation (LT). The funders
had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing Interests: The authors have declared that no competing interests exist.
* E-mail: [email protected]
Figure 1. Illustration of click sounds, click echoes and experimental materials, and summary of behavioural results. A: Waveplots and
spectrograms of the sound of a click (highlighted with black arrows) and its echo (highlighted with green arrows) recorded in the left (L) and right (R)
ears of EB and LB (sampling rate 44.1 kHz) (Sound S1 and Sound S2). Both EB and LB made the clicks in the presence of a position marker (shown in
1B) located straight ahead. Spectrograms were obtained using an FFT window of 256 samples, corresponding to approximately 5.8 ms in our
recordings. Waveform plots and spectrograms are for illustration. While the exact properties of the click and its echo (e.g. loudness, timbre) are
specific to the person generating the click as well as the sound reflecting surface, prominent characteristics of clicks are short duration (approximately
10 ms) and broad frequency spectra, both of which are evident in the plots. B: Position marker used for angular position discrimination experiments
during active echolocation, and to make recordings for the passive listening paradigm. The marker was an aluminium foil covered foam half-tube
(diameter 6 cm, height 180 cm), placed vertically, at a distance of 150 cm, with the concave side facing the subject. Note the 125-Hz cutoff wedge
system on the walls of the anechoic chamber. C: Results of angular position discrimination experiments (for examples of sound stimuli used during
passive listening listen to Sounds S5 and S6). Plotted on the ordinate is the probability that the participant judges the position marker to be located
to the right of its straight ahead reference position. Plotted on the abscissa is the position of the test marker with respect to the straight ahead in
degrees. Negative numbers indicate a position shift in the counter-clockwise direction. Psychometric functions were obtained by fitting a 3-parameter sigmoid to the data. 25% and 75% thresholds and bias (denoted in red) were estimated from fitted curves. The zero-bias line (dashed line)
is drawn for comparison. D: Stimuli were recorded with microphones placed in the echolocator’s ears, directly in front of the ear canal. E: During
passive listening, stimuli were delivered using fMRI compatible in-ear headphones, which imposed a 10 kHz cutoff (marked with a dashed line in
spectrograms in A). F–G: Behavioral results from the various passive-listening classification tasks (for examples of sound stimuli used during the
various classification tasks listen to Sound S7, Sound S8, Sound S9, Sound S10, Sound S11, Sound S12, Sound S13). Shown is percentage correct.
Asterisks indicate that performance is significantly different from chance (p < .05). Unless otherwise indicated, chance performance is 50%. Sample
sizes (reported in Table S1 and Table S2) fulfil minimum requirement for confidence intervals for a proportion based on the normal approximation
[48]. ¹ = less than chance, because of bias to classify as 'tree'.
doi:10.1371/journal.pone.0020162.g001
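The psychometric procedure described in the caption (a 3-parameter sigmoid fitted to the response proportions, with the 25% and 75% thresholds and the bias read off the fitted curve) can be sketched with standard tools. The response proportions below are invented for illustration; they are not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, x0, k, a):
    """3-parameter logistic: midpoint x0, slope k, amplitude a."""
    return a / (1.0 + np.exp(-k * (x - x0)))

# Angular test positions (deg; negative = counter-clockwise shift) and
# hypothetical proportions of "right of straight ahead" judgements.
x = np.array([-12.0, -8.0, -4.0, 0.0, 4.0, 8.0, 12.0])
p = np.array([0.02, 0.08, 0.25, 0.50, 0.78, 0.93, 0.99])

(x0, k, a), _ = curve_fit(sigmoid, x, p, p0=[0.0, 0.5, 1.0])

def inverse(prob):
    """Angle at which the fitted curve predicts the given proportion."""
    return x0 - np.log(a / prob - 1.0) / k

bias = inverse(0.50)                               # shift of subjective straight ahead
threshold = (inverse(0.75) - inverse(0.25)) / 2.0  # half the 25%-75% spread, in deg
```

For an unbiased observer the 50% point sits at 0°; a rightward shift of the 50% point corresponds to the kind of bias reported for LB in the text.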
auditory stimuli that allowed us to identify those brain areas that responded only to the echoes within a train of echolocation sounds.

Two blind skilled echolocators participated in the current study. Participant EB (43 years at time of testing) had partial vision up to 13 months of age. At 13 months, his eyes were removed due to retinoblastoma (early onset blindness). Participant LB (27 years at time of testing) lost vision at age 14 years due to optic nerve atrophy (late onset blindness). Both were right-handed, had normal hearing and normal auditory source localization abilities (Figure S1; Audiology Report S1; for samples of sounds used during source localization listen to Sound S3 and Sound S4). Both EB and LB use echolocation on a daily basis, enabling them to explore cities during travelling and to hike, mountain bike or play basketball. Two non-echolocating, right-handed sighted males, C1 and C2, were run as sex- and age-matched fMRI controls for EB and LB, respectively. There is evidence that blind people, even when they do not consciously echolocate, are more sensitive to echoes than sighted people [12]. This might pose a challenge when comparing the brain activation of blind echolocators with the brain activation of blind self-proclaimed non-echolocators. For this reason, we decided to use sighted self-proclaimed non-echolocators as control participants.

The data show that the presence of echoes within a train of complex sounds increases BOLD signal in calcarine cortex in both EB and LB. This increase in activity in calcarine cortex is absent in C1 and C2. Importantly, the presence of echoes within a train of complex sounds does not lead to an increase in BOLD signal in auditory cortex in any of the four participants. This finding suggests that brain structures that process visual information in sighted people process echo information in blind echolocation experts.

Results

Validation of the Echolocation Stimuli

To overcome the difficulties posed by studying echolocation in an MRI environment (i.e., hearing protection must be worn, head and mouth movements must be minimized, etc.), a passive listening paradigm was adopted whereby the echolocation clicks and their echoes were pre-recorded in the listener's ears (Figure 1D) and then presented via fMRI compatible insert earphones (Figure 1E). To test the validity of this paradigm, a direct behavioral comparison between active echolocation and passive listening was conducted using an angular position discrimination task, in which EB and LB discriminated the angular position of a test pole with respect to straight ahead (Figure 1B). The results of this test are illustrated in Figure 1C. It is evident from the data that EB and LB can determine the angular position of the pole in both active and passive echolocation tasks (for samples of sounds used during angular position discrimination through passive listening listen to Sound S5 and Sound S6). For EB, thresholds are very low (approx. 3°) and performance in active and passive tasks is the same. Thus, EB can reliably distinguish a 3° difference in the position of the test pole away from straight ahead, even when listening only to recordings of echolocation sounds. For LB, thresholds are generally higher than for EB and performance in the active task (threshold approx. 9°) is better than in the passive task (threshold approx. 22°). With regard to bias, EB is unbiased (red line at zero), but LB tends to judge test locations to be to the left of the straight ahead (red line shifted to the right). This means that LB's subjective straight ahead is shifted to the right. In summary, the data show that during active echolocation, both EB and LB resolved the angular position of a sound reflecting surface with high precision. This was expected based on what EB and LB do in everyday life. In addition, the data show that during passive listening, LB's precision was somewhat reduced, but EB's performance was unaffected, reflecting perhaps his greater experience with echolocation and/or the fact that he was blinded early in life. In any case, we felt confident that passive listening was a feasible paradigm to probe the neural substrates of echolocation in the scanner.

To obtain stimuli that would elicit strong echolocation percepts, we recorded echolocation clicks and echoes from EB and LB outside of the MRI under three scenarios: i) as they sat in an anechoic chamber in front of a concave or flat surface that was placed 40 cm in front of them and 20° to the left or right (for examples of sounds used during the experiment listen to Sound S7 and Sound S8); ii) as they sat in an anechoic chamber in front of a concave surface placed 40 cm in front with either the head held stationary or the head moving (when recordings of the latter were played back to EB and LB, they described a percept of a surface in motion; for examples of sounds used during the experiment listen to Sound S9, Sound S10 and Sound S11); and iii) as they stood outdoors in front of a tree, or a car, or a lamp post. We also created control sounds for the outdoor recordings, which contained the same background sounds and clicks, but no click echoes. Thus, outdoor control sounds were yoked to the outdoor echolocation sounds, but they did not contain the click's echoes (for examples of sounds used during the experiment listen to Sound S12 and Sound S13). Behavioral testing demonstrated that, when presented with the recordings from the anechoic chamber, EB was able to determine the shape, movement and location of surfaces with near perfect accuracy, whereas LB was less accurate at the shape and movement task and in fact performed at chance levels on the localization task (Figure 1F). Finally, when presented with the outdoor echolocation recordings, both EB and LB readily distinguished control sounds from echolocation sounds and they identified objects well above chance levels. In addition, both echolocators performed equally well when listening to outdoor recordings of the other person as compared to their own (Figure 1G). Control participants C1 and C2 had trained with the echolocation stimuli of EB and LB prior to testing. Both control participants performed at chance levels for shape and location classification, but well above chance for movement classification (Figure 1F). Upon questioning, both C1 and C2 stated that clicks in 'moving' stimuli had a slightly more regular rhythm (compare Sound S9 and Sound S10 to Sound S11). However, both C1 and C2 maintained that they had not perceived any kind of movement in those recordings. When C1 and C2 were presented with outdoor recordings they could distinguish echolocation sounds from control sounds, but they were unable to identify objects (Figure 1G). Upon questioning, C1 and C2 reported that echolocation and control stimuli sounded 'somehow different', but they could not pinpoint the nature of this difference (compare Sound S12 and Sound S13). Both C1 and C2 said that they had not perceived any objects in the recordings. For more detailed results, including sample sizes, see Table S1 and Table S2.

Brain activation

Cerebral Cortex. Functional MRI revealed reliable blood-oxygen-level dependent (BOLD) activity in auditory cortex as well as in the calcarine sulcus and surrounding regions of 'visual' cortex in EB and LB when they listened to recordings of their echolocation clicks and echoes, as compared to silence (Figure 2, top). EB showed stronger activity in the calcarine cortex than did LB, which could reflect EB's much longer use of echolocation and/or his more reliable performance in passive echolocation tasks. Activity in calcarine cortex was entirely absent in C1 and C2 when they listened to the echolocation recordings of EB and LB, although both control subjects showed robust activity in auditory cortex (Figure 2, bottom). This pattern of results was expected based on previous experiments that have measured brain activation in blind and sighted people in response to auditory stimulation as compared to silence [13–15].

Remarkably, however, when we compared BOLD activation to outdoor recordings that contained click echoes with activation to outdoor recordings without echoes, activity disappeared in EB and LB's auditory cortex, but remained in calcarine cortex (Figure 3, top). Again, the activation in the calcarine cortex was more evident in EB than it was in LB. The results were quite different for the
Figure 2. BOLD activity projected on participants' reconstructed and partially inflated cortical surfaces. Concavities and convexities are
colored dark and light, respectively. CS-central sulcus, CaS-calcarine sulcus, LS- lateral sulcus, MFS – middle frontal sulcus. Top panel: BOLD activity
while EB and LB listened to recordings of their own echolocation sounds that had been made in an anechoic chamber and judged the location (left
vs. right), shape (concave vs. flat) or stability (moving vs. stationary) of the sound reflecting surface (see Figure 1F for behavioral results). Bottom
Panel: BOLD activity while C1 and C2 listened to recordings they had trained with, i.e. EB and LB's echolocation sounds, respectively. Just as EB and LB did, C1 and C2 judged the location (left vs. right), shape (concave vs. flat) or stability (moving vs. stationary) of the sound reflecting surface (see
Figure 1F for behavioral results). Both EB and LB, but not C1 or C2, show reliable BOLD activity in calcarine sulcus, typically associated with the
processing of visual stimuli. EB shows more BOLD activity in calcarine sulcus than LB. All subjects (except C2) also show BOLD activity along the
central sulcus (i.e. Motor Cortex) of the left hemisphere, most likely due to the response-related right-hand button press. All subjects also show BOLD activity in the lateral sulcus (i.e. Auditory Complex) of the left and right hemispheres and adjacent and inferior to the right middle frontal sulcus. The
former likely reflects the auditory nature of the stimuli. The latter most likely reflects the involvement of higher order cognitive and executive control
processes during task performance.
doi:10.1371/journal.pone.0020162.g002
control participants. When we contrasted BOLD activity related to outdoor recordings that contained click echoes with those that did not, neither C1 nor C2 showed any differential activation in any region of their brains (Figure 3, bottom). The results also hold at a more liberal statistical threshold (Figure S2).

The lack of any difference in activity in auditory cortex in all the participants for the contrast between outdoor recordings with and without echoes was not unexpected, because we had created echolocation and control stimuli so that the acoustic differences were minimal and the only difference was the presence or absence
Figure 3. BOLD activity projected on participants' reconstructed and partially inflated cortical surfaces. Marking of cortical surfaces and
abbreviations as in Figure 2. Top panel: Contrast between activations for outdoor recordings containing echoes from objects and recordings that did
not contain such echoes for EB and LB. During the experiment EB and LB listened to outdoor scene recordings and judged whether the recording
contained echoes reflected from a car, tree or pole or no object echoes at all. Each participant listened to recordings of his own clicks and echoes as
well as to recordings of the other person (see Figure 1G for behavioral results; for example sounds listen to Sound S12 and Sound S13). Bottom panel:
Contrast between activations for outdoor recordings containing echoes from objects and recordings that did not contain such echoes for C1 and C2.
The task was the same as for EB and LB and each participant listened to recordings they had trained with as well as to the recordings of the other
person, e.g. C1 listened to both EB’s and LB’s recordings (see Figure 1G for behavioral results). It is evident that both EB and LB, but not C1 or C2,
show increased BOLD activity in the calcarine sulcus for recordings that contain echoes (highlighted in white). EB mainly shows increased activity in
the calcarine sulcus of the right hemisphere, whereas LB shows activity at the apex of the occipital lobes of the right and left hemisphere, as well as in
the calcarine sulcus of the left hemisphere. In addition, both EB and LB, but not C1 or C2, show an increase in BOLD activity along the middle frontal sulcus. This result most likely reflects the involvement of higher order cognitive and executive control processes during echolocation. There is no difference in BOLD activity along the lateral sulcus (i.e. Auditory Complex; highlighted in magenta) for any participant. This result was expected because the Echo stimuli and the Control stimuli had been designed in a way that minimized any spectral, temporal or intensity differences. No BOLD
activity differences were found when activations for EB’s recordings were contrasted with activations for LB’s recordings.
doi:10.1371/journal.pone.0020162.g003
Figure 4. Results of the analysis of contralateral preference for EB and LB. Regions of interest (ROI) were defined based on anatomical and
functional criteria. For illustration purposes, we show projections of ROI on the partially inflated cortical surfaces. However, all statistical analyses were
performed in volume space. Bar graphs indicate beta values for the various ROIs. Gray and white bars indicate beta weights for ‘echo from surface on
left’ and ‘echo from surface on right’, respectively, averaged across voxels within each ROI. Colored bars denote the difference between beta weights
within each brain side (red bars indicate higher beta values for ‘echo from surface on right’; blue bars the reverse). Error bars denote SEM. To
determine if activity during echolocation exhibits a contralateral preference, we applied independent measures ANOVA to the beta weights with
‘echo side’ (i.e. ‘echo from surface on left’ vs. ‘echo from surface on right’) and ‘brain side’ (e.g. ‘left calcarine’ vs. ‘right calcarine’) as factors to each
ROI. ANOVA results are summarized below each bar graph. Results show that activity in calcarine cortex exhibits contralateral preference for EB
(significant interaction effect), but not LB. Activity in auditory cortex shows neither contra- nor ipsilateral preference in either subject. For both EB and
LB, beta values in the right calcarine exceed those in the left calcarine (main effect of ‘brain side’).
doi:10.1371/journal.pone.0020162.g004
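The test summarized in the caption is a balanced two-way independent-measures ANOVA on per-voxel beta weights, in which a significant 'echo side' × 'brain side' interaction is the signature of contralateral preference. A sketch of that computation on simulated beta weights (the ROI sizes and values are invented, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 50  # voxels per ROI per condition (illustrative)

# Simulated betas with a built-in contralateral preference: each calcarine
# ROI responds more strongly to echoes from the opposite side of space.
betas = {
    ("left",  "echo_left"):  rng.normal(0.2, 0.1, n),
    ("left",  "echo_right"): rng.normal(0.5, 0.1, n),
    ("right", "echo_left"):  rng.normal(0.5, 0.1, n),
    ("right", "echo_right"): rng.normal(0.2, 0.1, n),
}

def interaction_test(cells):
    """F test for the 2x2 interaction in a balanced independent-measures ANOVA."""
    data = np.array([[cells[(b, e)] for e in ("echo_left", "echo_right")]
                     for b in ("left", "right")])      # shape (2, 2, n)
    grand = data.mean()
    brain = data.mean(axis=(1, 2), keepdims=True)      # 'brain side' marginal means
    echo = data.mean(axis=(0, 2), keepdims=True)       # 'echo side' marginal means
    cell = data.mean(axis=2, keepdims=True)            # cell means
    ss_inter = n * ((cell - brain - echo + grand) ** 2).sum()
    ss_within = ((data - cell) ** 2).sum()
    df_within = data.size - 4
    F = ss_inter / (ss_within / df_within)
    return F, stats.f.sf(F, 1, df_within)

F, p_value = interaction_test(betas)  # large F, small p: contralateral preference
```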
of very faint echoes (Sound S12 vs. Sound S13). In addition, the environmental background sounds that were contained in both outdoor echolocation and outdoor control recordings made both kinds of stimuli meaningful and interesting to all participants. This, however, makes the increased BOLD activity in the calcarine cortex and other occipital cortical regions in EB and LB during echolocation all the more remarkable. It implies that the presence of the low-amplitude echoes activates 'visual' cortex in the blind participants (particularly in EB), without any detectable activation in auditory cortex. Of course, when we compared activation associated with both the outdoor echolocation and control recordings as compared to silence, there was robust activation in auditory cortex in both the blind and the sighted participants (Figure S3).

Given the echo related activation of calcarine cortex in both EB and LB, the question arises as to whether the echo related activity in calcarine cortex shows a contralateral preference – as is the case for light related activity in calcarine cortex in the sighted brain. To test this, we performed a region of interest analysis that compared BOLD activity in left and right calcarine in response to echolocation stimuli that contained echoes from surfaces located on the left or right side of space. For comparison, we also applied this analysis to the left and right auditory cortex. Previous fMRI research has shown a contralateral bias in auditory cortex for monaural stimulation [16–18]. But to date, fMRI research has not been able to detect a contralateral bias with binaural stimulation, even though subjects may report hearing the sound source to be lateralized to the left or right, e.g. [18]. In short, we would not expect our ROI analysis to reveal a contralateral bias in auditory cortex. The results of our ROI analyses are shown in Figure 4. As can be seen, activity in calcarine cortex exhibited a contralateral bias in EB, but not LB (Figure 4, bottom). In other words, EB's calcarine cortex showed the same kind of contralateral bias for echoes as the calcarine cortex in sighted people shows for light. As expected, there was no evidence for contralateral bias in auditory cortex in either EB or LB (Figure 4, bottom).

Finally, we also compared BOLD activity related to echolocation stimuli that conveyed object movement with activity related to stimuli that did not convey such movement in both the blind and the sighted participants. Both EB and LB showed activity in areas of the temporal lobe commonly associated with motion processing (Figure 5, top). This activity was absent in the control participants (Figure 5, bottom), who also did not perceive any sense of movement. The results also hold at a more liberal statistical threshold (see Figure S4). Also, a more powerful region of interest analysis for C1 and C2, in which we analyzed the response to echolocation motion stimuli within functionally defined visual motion areas MT+, did not reveal any significant activation (Figure 5, bottom; Table S3).

The comparison between concave vs. flat conditions, as well as the comparison between tree vs. car vs. pole, did not reveal significant differences. It is evident from the behavioural data that EB and LB certainly perceived these conditions as different; so at some level, there must be a difference in neural activity. It is likely that the temporal and spatial resolution of our paradigm was not able to detect these differences.

Cerebellum. It is well established that the cerebellum is involved in the control and coordination of movement, and there is also mounting evidence that the cerebellum may be involved in higher order cognitive function (for reviews see [19–24]). Recently, it has also been suggested that the cerebellum is involved in purely sensory tasks, such as visual and auditory motion perception [25].
Figure 5. BOLD activity projected on participants' reconstructed and partially inflated cortical surfaces. Concavities and convexities are
colored dark and light, respectively. STS-superior temporal sulcus, ITS -inferior temporal sulcus, LOS – lateral occipital sulcus. Top Panel: BOLD
activity related to recordings of echolocation sounds conveying movement to EB and LB. Both EB and LB show significant activity in regions adjacent and inferior to the ITS/LOS junction that are typically involved in motion processing. Bottom Panel: BOLD activity in C1 and C2's brains related to recordings of echolocation sounds that convey movement to EB and LB. Even though C1 and C2 could reliably classify echolocation sounds as 'moving' or 'stationary', they reported not perceiving any sense of movement. Also shown are areas sensitive to visual motion (area MT+) functionally defined at different significance levels (p < .05: light green; p < .05 Bonferroni corrected: dark green). Bar graphs show beta weights (± SEM) obtained from a region of interest analysis applied to areas MT+ (contrast: EchoMoving > EchoStationary). Bar color denotes the MT+ used for the ROI analysis (i.e. MT+ defined at p < .05: light green, or p < .05 Bonferroni corrected: dark green). In contrast to EB and LB, neither C1 nor C2 show increased BOLD activity in regions adjacent and inferior to the ITS/LOS junction for the contrast between 'moving' and 'stationary' echolocation stimuli, even at more liberal statistical thresholds (see Figure S4). The statistically more powerful region of interest analysis applied to area MT+ was not significant either, i.e. the SEM error bars (and therefore any confidence interval) include zero (see also Table S3).
doi:10.1371/journal.pone.0020162.g005
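The ROI statistic mentioned at the end of the caption (whether the mean EchoMoving − EchoStationary contrast of beta weights differs from zero, judged by whether the SEM-based interval includes zero) can be sketched as follows; the helper and voxel values are illustrative, not the study's actual analysis:

```python
import numpy as np

def roi_contrast(betas_moving, betas_stationary):
    """Mean per-voxel contrast with its SEM, and whether the ~95%
    normal-approximation interval (mean +/- 1.96 SEM) includes zero."""
    diff = np.asarray(betas_moving) - np.asarray(betas_stationary)
    mean = diff.mean()
    sem = diff.std(ddof=1) / np.sqrt(diff.size)
    lo, hi = mean - 1.96 * sem, mean + 1.96 * sem
    return mean, sem, bool(lo <= 0.0 <= hi)

# Hypothetical MT+ voxels with no motion effect, as reported for C1 and C2:
rng = np.random.default_rng(1)
moving = rng.normal(0.30, 0.05, 40)
stationary = rng.normal(0.30, 0.05, 40)
mean, sem, includes_zero = roi_contrast(moving, stationary)
```

When the interval includes zero, the ROI shows no reliable contrast, which is the pattern described for the control participants.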
Consistent with the idea that the cerebellum might be involved in non-motor functions in general, and sensory processing in particular, we also observed significant BOLD activity in the cerebellum in both the blind and the sighted participants in our experiments. We identified and labeled cerebellar structures based on anatomical landmarks and the nomenclature developed by [26]. When EB and LB listened to recordings of their echolocation clicks and echoes, as compared to silence, they both showed significant BOLD activity in lobules VI and VIII (Figure 6, left). A similar pattern was observed in the two sighted participants (Figure 6, left). In other words, lobules VI and VIII appeared to be more active when all our participants listened to auditory stimuli as compared to silence. This pattern of activity is generally consistent with results that link activity in lobules VI and VIII to auditory sensory processing [25]. We also found robust activation in left lobule VIIAt/Crus II in all participants (Figure 6, left). To date, lobule VIIAt/Crus II has not been implicated in sensory processing, but it has been suggested that it is part of a non-motor loop involving Brodmann area 46 in prefrontal cortex [24]. Consistent with this idea, the activation in left lobule VIIAt/Crus II coincides with activity adjacent and inferior to the right middle frontal sulcus in all participants (compare Figure 2). Finally, both EB and LB showed robust activation in vermal lobule VI and lobule X, both of which have been linked to visual sensory processing [25]. Interestingly, however, C2 also shows activity in vermal lobule VI and close to lobule X. In summary, for the comparison of echolocation to silence, we found reliable activation in the cerebellum, but this activation did not clearly distinguish between EB and LB on the one hand, and C1 and C2 on the other.

The result was different, however, when we compared BOLD activation to outdoor recordings that contained click echoes with activation to outdoor recordings that did not contain echoes. Specifically, this analysis did not reveal any differential activity anywhere in the cerebellum for the two sighted control subjects C1 and C2. In contrast, for both EB and LB, this analysis revealed differential activity in lobule X and lobule VIIAt/Crus II (Figure 6, right). Again, activity in left lobule VIIAt/Crus II coincides with activity adjacent and inferior to the right middle frontal sulcus in both EB and LB (compare Figure 3). In addition, for LB only, this analysis also revealed differential activity in vermal lobule VI and lobules VI and VIII.

Of course, when we compared activation associated with both the outdoor echolocation and control recordings as compared to
Figure 6. BOLD activity in the cerebellum. Data are shown in neurological convention, i.e. left is left. Activity in the cerebellum was analyzed in
stereotaxic space [49]. To evaluate significance of activity we used the same voxelwise significance thresholds as for cortical surface analyses for each
participant. However, because the number of voxels in volume space differed from the number of vertices in surface space for each participant, the
Bonferroni corrected significance level differs between cortex and cerebellum (compare Figure 2). To increase accuracy, cerebellar structures for each
participant were identified based on anatomical landmarks. Structures were labeled according to the nomenclature developed by [26]. Left panel:
BOLD activity while participants listened to recordings of echolocation sounds that had been made in an anechoic chamber and judged the location
(left vs. right), shape (concave vs. flat) or stability (moving vs. stationary) of the sound reflecting surface (see Figure 1F for behavioral results). Right
Panel: Contrast between BOLD activations for recordings containing echoes from objects and recordings that did not contain such echoes. Data are
not shown if no significant activity was found (empty cells in table).
doi:10.1371/journal.pone.0020162.g006
silence, the pattern of activity in the cerebellum was very similar to However, support for an interpretation of the activation in terms
when we compared activation associated with echolocation sounds of echolocation, but not blindness per se, is provided by the
to activation associated with silence (Figure S5). outdoor scenes experiment, in which we see differential activation
The comparison between concave vs. flat conditions, as well as the comparison between tree vs. car vs. pole, did not reveal significant differences.

Discussion

Here we show that two blind individuals can use echolocation to determine the shape, motion and location of objects with great accuracy, even when only listening passively to echolocation sounds that were recorded earlier. When these recordings were presented during fMRI scanning, we found that 'visual' cortex was strongly activated in one early blind participant (EB) and to a lesser degree in one late blind participant (LB). Most remarkably, the comparison of brain activity during sounds that contained echoes with brain activity during control sounds that did not contain echoes revealed echo-related activity in calcarine cortex, but not in auditory cortex.

The question arises if the activity that we observe in calcarine cortex is truly related to echolocation, or if it is simply due to the fact that EB and LB are blind. Blindness can result in re-organization of many brain areas, including but not limited to visual, auditory and somatosensory cortex and subcortical structures, even though the underlying mechanism and exact nature of the changes are still unclear [13–15,27–32]. Based on the existing literature, therefore, it is not surprising to see activity in visual cortex in response to auditory stimuli in EB and LB. What is striking, however, is the differential activity we found in calcarine cortex in EB and LB, but not in auditory cortex, when echoes are present (or not) in the outdoor sounds (Figure 3). In this regard our data go beyond 'classical' cross-modal results that show co-activation of visual cortex and areas primarily sensitive to the stimulus (i.e. primary auditory or somatosensory cortex). In a related point, we want to emphasize that the differences in the level of activation in the visual areas of EB's and LB's brains could have arisen for a number of reasons. First, there might be differences in cortical development in the two individuals; after all, EB lost his sight much earlier than LB. Second, EB started using echolocation as a small child and has used it longer than LB. A consequence of this might be that EB creates a more vivid representation of the spatial scene from click-echoes. Third, EB performed better in the passive-listening paradigm than LB, even though this difference was reduced for 'outdoor' sound recordings. But of course, any combination of all these factors could account for the differences in the activity in visual areas we observed in these two individuals.

It would be useful in future neuroimaging studies of echolocation to include sighted people who have been trained to echolocate, or blind people who have a 'regular' sensitivity to echoes. With respect to the latter, there is evidence that blind people, even when they do not consciously echolocate, are more sensitive to echoes than sighted people [12], and this might pose a challenge when comparing the brain activation of self-proclaimed echolocators to the brain activation of self-proclaimed non-
echolocators who are also blind. In any case, the comparison we draw here (i.e. between blind echolocators and sighted non-echolocators) is insightful, because it highlights the involvement of visual rather than auditory cortex in the processing of echoes.

The patterns of activation observed in their brains might shed some light on the possible role that sensory deprivation plays in the recruitment of visual cortex during echolocation in the blind. On the behavioural level, of course, sighted people's echolocation abilities have been repeatedly shown to be inferior to those of blind people (for reviews see [1–3]). There are various reasons why this might be the case. One possibility is that blind people use echolocation on a daily basis and therefore acquire a higher skill level through practice. Another possibility is that blind people have better hearing abilities, which may also make them better at echolocation, e.g. [33,34]. Our current data suggest that hearing ability is not a variable, because both EB and LB performed within the normal range on standard hearing and source localization tests (Figure S1; Audiology Report S1). Furthermore, we also saw no obvious differences in activation in auditory cortex between EB and LB or between these two individuals and the control participants (Figure 2, Figure S3). It cannot be ruled out, however, that the tests and comparisons we used are not suitable for detecting the auditory abilities that may underlie superior echolocation performance. Finally, it is also possible that sighted individuals might simply be at a disadvantage in acquiring echolocation skills, because echolocation and vision compete for neural resources. Clearly, more investigations of human echolocation are needed at the behavioural, computational, and neural level to uncover how echolocation works, how it is acquired, and which neural processes are involved.

It is important to emphasize that the use of echolocation in the blind goes well beyond localizing objects in the environment. The experts we studied were also able to use echolocation to perceive object shape and motion – and even object identity. In addition, they were able to use passive listening with a 10-kHz cut-off to do these kinds of tasks – which made it possible for us to probe the neural substrates of their abilities. Clearly, more work is needed comparing performance with active and passive echolocation across a range of different tasks – where the available frequency ranges in both conditions are systematically varied.

It could be argued that the contralateral bias that we observed in EB's calcarine cortex reflects differences in spatial attention between the two conditions. Effects of attention on brain activity have been shown for visual [35] as well as other cortical areas, including auditory cortices, e.g. [17,36]. Thus, although we cannot rule out this explanation, it would still be remarkable that EB, who lost his eyes when he was 13 months of age, would show attentional modulation of the calcarine cortex, but not the auditory cortex – and would do this in a contralateral fashion.

Both EB and LB show BOLD activity in temporal cortical regions typically devoted to motion processing, but this activity is absent in C1 and C2. In a similar fashion, both EB and LB reported perceiving motion, but this percept was absent in C1 and C2. Thus, we see good correspondence between brain activity and perception. The question remains, however, as to what the 'preferred modality' is of the neurons that are active in EB and LB when they perceive motion using echolocation. Neurons adjacent and inferior to the ITS/LOS junction are sensitive to both visual and auditory motion, as determined with functional localization techniques [37]. Sighted individuals typically show a modality-specific cortical organization, such that neurons that are sensitive to visual motion (i.e. area MT+) are located adjacent but posterior to neurons that are sensitive to auditory motion [37]. In contrast, individuals who regained vision at a later point in their life (i.e. late-onset sight recovery) show cortical organization that is not modality specific, such that visual and auditory motion areas largely overlap [37]. Finally, neurons in and around visual motion area MT+ may also respond to tactile motion, even though it remains to be determined to what degree this activity is potentially mediated by visual imagery [38–40]. Future research is needed to investigate how neurons that are active during echolocation motion correspond to visual motion area MT+ in sighted people.

An obvious question that arises from our findings is what function calcarine cortex might serve during echolocation. One possibility is that it is involved in the comparison between the outgoing source sound (e.g. mouth click) and the incoming echo. This explanation seems unlikely, however, because if the calcarine computed a comparison between outgoing source sound and incoming echo, it would also compute that comparison in the absence of echoes. If that were the case, however, we would expect the calcarine to be equally active in the presence and the absence of echoes – provided the corresponding clicks were present. The pattern of activity we found in EB and LB does not support this interpretation (Figure 3). An alternative, and perhaps more plausible, explanation is that calcarine cortex performs some sort of spatial computation that uses input from the processing of echolocation sounds that was carried out elsewhere, most likely in brain areas devoted to auditory processing. In this case, one would expect calcarine cortex to be more active in the presence than in the absence of echoes, because the trains of sounds with echoes contain more spatial information than those without echoes. The activity patterns we found in EB and LB would certainly support this interpretation (Figure 3). We are not the first to propose that visual cortex could potentially subserve 'supra-modal' spatial functions after loss of visual sensory input [41]. Recently, a similar supra-modal spatial function has also been suggested for certain parts of auditory cortex after loss of auditory sensory input [42]. Again, future research is needed to determine exactly how activity in calcarine cortex mediates echolocation.

The cerebellar structures linked to visual sensory processing [25] also appear to play a role in echolocation in the blind. In particular, we found that lobule X is more active in both EB and LB during echolocation than during control sounds. Thus, the arguments discussed above for the potential function of calcarine cortex during echolocation also apply to lobule X.

In addition to lobule X, we also found activity in left lobule VIIAt/Crus II during echolocation. Since this part of the cerebellum is involved in a non-motor loop involving Brodmann area 46 in prefrontal cortex [24], the co-activation that we see in this part of the cerebellum and in cortex adjacent and inferior to the right middle frontal sulcus makes sense. As a caveat, we want to note, however, that we cannot be certain that the activity we found adjacent and inferior to the middle frontal sulcus actually corresponds to activity in Brodmann area 46, because there is natural variability in the anatomical location of Brodmann area 46 in the human brain [43]. In any event, we suggest that the activation of right middle prefrontal cortex and left cerebellar lobule VIIAt/Crus II most likely reflects the involvement of cognitive and executive control processes that are not echolocation-specific. This hypothesis is supported by the fact that we also saw activity in these brain areas in C1 and C2. It is unlikely that this activity reflects motor imagery or the activation of a 'click motor-scheme' during the passive listening paradigm, because the click sound was the same between the outdoor echo and outdoor control stimuli, where only the echo was missing.

Conclusion

The current study is the first to investigate which brain areas potentially underlie natural echolocation in early- and late-blind
people (EB and LB). In EB, we found robust echolocation-specific activity in calcarine cortex – but not in auditory cortex. A similar pattern was observed in LB, but the activity in the calcarine cortex was not as extensive. We also found that the calcarine activity was greater for echoes reflected from surfaces located in contralateral space in EB but not LB. Our findings also shed new light on how the cerebellum might be involved in sensory processing. In addition, our study introduced novel methodology that can be used in future experiments on echolocation.

From a more applied point of view, our data clearly show that EB and LB use echolocation in a way that seems uncannily similar to vision. In this way, our study shows that echolocation can provide blind people with a high degree of independence and self-reliance in their daily life. This has broad practical implications in that echolocation is a trainable skill that can potentially offer powerful and liberating opportunities for blind and vision-impaired people.

Materials and Methods

All testing procedures were approved by the ethics board at the University of Western Ontario, and participants gave written informed consent prior to testing. The consent form was read to participants, and the location to sign was indicated manually. Software used to conduct testing was programmed using Psychophysics Toolbox 2.54 [44], Matlab 7 (R14, The MathWorks) and C/C++. fMRI data were analyzed using BrainVoyager QX version 2.1 (Brain Innovation, Maastricht, The Netherlands) and Matlab R14 (The MathWorks, Natick, MA, USA). Sound editing was performed with Adobe Audition version 1.5 software (Adobe Systems, San Jose, CA, USA). Sound equalization was performed with filters provided by the headphone manufacturer (Sensimetrics, Malden, MA, USA).

fMRI Data Acquisition

All imaging was performed at the Robarts Research Institute (London, Ontario, Canada) on a 3-Tesla whole-body MRI system (Magnetom Tim Trio; Siemens, Erlangen, Germany) using a 32-channel head coil.

Setup and Scanning Parameters. fMRI Echolocation: Audio stimuli were delivered over MRI-compatible insert earphones (Sensimetrics, Malden, MA, USA, Model S-14). Earphones were encased in replaceable foam tips that provided a 20–40 dB attenuation level (information provided by the manufacturer). Further sound attenuation was attained by placing foam inserts between the head rest and the listener's ears. To minimize background noise, the MRI bore's circulatory air fan was turned off during experimental runs. A single-shot gradient echo-planar pulse sequence in combination with a sparse-sampling design [45] was used for functional image acquisition. Repetition time [TR] was 14 s (12 s silent gap + 2 s slice acquisition). We used a FOV of 211 mm and a 64×64 matrix size, which led to an in-slice resolution of 3.3×3.3 mm. Slice thickness was 3.5 mm and we acquired 38 contiguous axial slices covering the whole brain (including cerebellum) in ascending interleaved order. Echo time [TE] was 30 ms and flip angle [FA] was 78°.

fMRI MT+ Localizer (C1 and C2 Only): Visual stimuli were viewed through a front-surface mirror mounted on top of the head coil and were projected with an LCD projector (AVOTEC Silent Vision Model 6011, Avotec, FL, USA) on a rear-projection screen located behind the head coil in the bore. fMRI scanning parameters were the same as in the echolocation experiments, with the exception of a 2 s TR related to the continuous scanning procedure.

Anatomical Images: Anatomical images of the whole brain were acquired at a resolution of 1×1×1 mm using an optimized sequence (MPRAGE).

Functional Paradigms. Shape/Location: Each run contained silent baseline and experimental trials. Experimental trials began with a pre-recorded spoken instruction (i.e., "shape" or "location") indicating which attribute the listener should attend to from the echo. Total time, including the brief silent gap that followed the instruction, was 1 s. Next, 10 s of echolocation stimuli were presented. Since stimuli were shorter than 10 s (see experimental stimuli), the sound was played in a loop. This was followed by a 200 ms 1000 Hz tone. The participant was instructed to indicate his response with a key press after he heard the tone (see behavioral paradigm below). Functional scans started 12 s after the run had started and lasted 2 s. The next trial started after scanning had ended. Silent baseline trials differed from experimental trials in that the 2 s functional scan occurred after 12 s of silence. No cues were provided and no key presses were produced. Trials were counterbalanced such that a silent trial always preceded two experimental trials and such that experimental trials occurred in alternating order (i.e. shape-location followed location-shape and vice versa). Each run began and ended with a silent baseline trial. The total number of trials in each run was 25 (8 shape, 8 location and 9 silent) and each run lasted 25×14 s. Each participant performed 5 runs.

Motion: Motion experiment runs were the same as in the Shape/Location experiments, with the exception that no cue was presented prior to the echolocation sounds, thus making the echolocation stimulus duration 11 s. Trials were counterbalanced such that a silent trial always preceded two experimental trials and such that experimental trials occurred in alternating order (i.e. stationary-moving followed moving-stationary and vice versa). Each participant performed 5 runs.

Outdoor Scenes: Outdoor Scene runs were similar to those in the motion experiment. Stimuli were played for 11 s. Participants listened to scene echolocation recordings from both persons (thus, four different experimental conditions, i.e. EB-Echo, EB-Control, LB-Echo, LB-Control). Stimulus presentation order was balanced using a clustered Latin square design, such that each run contained four clusters, each cluster contained all 4 experimental conditions, and the order of conditions within each cluster was chosen such that every condition was preceded by every other condition in a run. A cluster was always preceded by a silent baseline trial and each run began and ended with a silent baseline trial. Thus, there were 21 trials per run (5 silent + 4×4 experimental) and the duration of each run was 21×14 s. Each participant performed 6 runs.

MT+ Localizer (C1 and C2 Only): We employed a standard MT+ localizer paradigm that displayed white dots that were either stationary or moved in smooth linear motion in front of a black background. See Methods S1 for more details.

Behavioral Paradigms. Shape/Location: The basic paradigm was a 1-interval-2-alternative forced choice (AFC) paradigm. The participant listened to the echolocation sound and, depending on the cue, judged the shape (concave vs. flat) or location (right vs. left) of the sound-reflecting surface. The participant indicated his response on an MR-compatible keypad by pressing the key located under his right index or middle finger, respectively.

Motion: The basic paradigm was a 1-interval-2-AFC paradigm. The participant listened to the echolocation sound and judged the motion (moving vs. stationary) of the sound-reflecting surface as conveyed by the echo. As in the shape/location experiment, responses were collected with the same keypad and the participant
indicated his response by pressing the key located under his right index or middle finger, respectively.

Outdoor Scenes: The basic paradigm was a 1-interval-4-AFC paradigm. The participant listened to the echolocation sound and judged whether the scene contained a car, a tree or a pole, or no sound-reflecting object at all (Control Sounds). The response in the Scenes experiment was obtained with the same keypad as in the other experiments and the participant pressed the key located under his right index, middle, ring and little finger to report 'tree', 'pole', 'car' and 'nothing', respectively.

Order of experiments. (see Methods S1).

fMRI Data Analysis

Standard routines were employed for fMRI data pre-processing, coregistration and cortical surface reconstruction (see Methods S1).

Functional Analysis – Voxelwise. BOLD activity related to echolocation as compared to silence: To obtain activity related to echolocation processing as compared to a silent baseline for each participant, we applied a fixed-effect GLM with the stick predictor "Echo" to the z-transformed time courses of runs obtained in the shape/location and motion experiments (10 runs per participant). To determine where BOLD activity during echolocation trials exceeded that during silent baseline trials, we isolated voxels where the beta value of the 'Echo' predictor was significantly larger than zero. The significance threshold for evaluation of results in volume space was set to 0.1 (Bonferroni corrected (BC), taking into account all voxels in the functional volume) in order to remove obvious false positives (e.g., activations outside of the brain) while still showing positive activation in expected areas (i.e. in auditory cortex) (see Methods S1 for more details). As it turned out, a .1 (BC) threshold in volume space corresponded very closely to a .05 (BC) threshold in surface space for each participant. Hence, we applied a .05 threshold (BC) to the cortical data in surface space and a threshold of .1 (BC) to the cerebellum data in volume space.

BOLD activity related to moving echoes: To obtain activity related to processing of moving echoes as compared to stationary echoes for each participant, we applied a fixed-effect GLM with stick predictors "moving" and "stationary" to the z-transformed time courses of runs obtained in the motion experiments (5 runs per participant). The GLM results were then subjected to a conjunction analysis, i.e. (moving>0) AND (moving>stationary), the significance threshold for which was set to 0.001 (voxelwise) for both surface and volume data. To increase power for our control participants we also used a threshold of p<.01.

BOLD activity related to outdoor sounds: To obtain activity related to processing of outdoor sounds, regardless of the presence of echoes (i.e. echolocation vs. control sounds) or participant (i.e. EB or LB), we applied, for each participant, a fixed-effect GLM with four stick predictors, i.e. "EB-Echo", "EB-Control", "LB-Echo" and "LB-Control", to the z-transformed time courses of runs obtained in the scenes experiments (6 runs per participant). The GLM results were then subjected to a contrast (i.e., "EB-Echo"+"EB-Control"+"LB-Echo"+"LB-Control") against zero. The significance threshold for this contrast was chosen as in "echolocation as compared to silence".

BOLD activity related to outdoor echolocation sounds as compared to outdoor control sounds: To obtain activity related to processing of outdoor echolocation sounds as compared to outdoor control sounds, regardless of the participant (i.e. EB or LB), the results of the GLM described in the previous paragraph were subjected to a conjunction analysis, i.e. (EB-Echo+LB-Echo)>0 AND (EB-Echo+LB-Echo)>(EB-Control+LB-Control). The significance threshold for this was set to 0.001 (voxelwise). To increase power for our control participants we also used a threshold of p<.01.

Functional Analysis – ROI. ROI Selection for analysis of contralateral preference (EB and LB only): ROIs were defined anatomically and functionally. Anatomically, we considered voxels only within and in close proximity to the left and right calcarine sulcus (ROI: left and right calcarine) and the left and right Heschl's gyrus (ROI: left and right Heschl's gyrus). To avoid 'bleeding in' of activity from the right to the left hemisphere, and vice versa, we defined a 6 mm voxel selection gap between left and right hemispheres for the ROI definition for the calcarine. Functionally, we considered only those voxels for which the contrast (EchoMotion+EchoStationary)>0 was significant. The minimum threshold for statistical significance to select voxels in any ROI was p<.001 with a combined cluster-size threshold of 10 voxels. For various ROIs, however, we adopted more stringent levels of significance, either to shrink a large area of activity to a more localized cluster (e.g. for the right calcarine in EB) or in order to uniquely determine the source of activity. More details are provided in Methods S1. Importantly, in all cases we confirmed with additional statistical analyses that the results of our ROI analysis held regardless of ROI selection criteria.

ROI Analysis of contralateral preference (EB and LB only): To determine activity for echoes from objects located to the right or left side of space, regardless of task (i.e. shape or location) or surface shape (i.e. concave or flat), we applied a GLM with stick predictors "left" and "right" to the time courses of runs obtained in the shape/location experiments (5 runs per participant). Thus, the data for the functional ROI analysis were independent from the data used for ROI selection. Predictors as well as the time course for each voxel were z-transformed before the analysis. It follows that beta values obtained from the GLM are equivalent to correlation coefficients. The GLM was run as a fixed-effect model for each voxel inside each ROI and participant.

From this analysis we obtained a separate beta value for the 'left' and 'right' predictors for each voxel. To determine if there was a right or left echo preference in the left or right portion of the calcarine sulcus or Heschl's gyrus, we subjected those beta values to an ANOVA with 'brain side' and 'echo side' as independent factors, separately for the calcarine sulcus and Heschl's gyrus. Technically, we could have used the number of beta values to determine error degrees of freedom (df) for each ANOVA, but this would have resulted in different df for the error terms (and thus differences in statistical power) between participants and ROIs. To avoid this, we determined df based on the number of times an event occurred. For example, in the calcarine, 'left' and 'right' events each occurred 40 times in the left and 40 times in the right hemisphere, resulting in 160 independent events and 156 df for the error term to compute the ANOVA for the calcarine sulcus. The same applies to the ANOVA applied to Heschl's gyrus.

In this way we could use data obtained from all voxels inside each ROI to determine interaction effects between 'brain side' and 'echo side' for each participant. In contrast, a traditional ROI analysis averages across voxels before applying the GLM, such that interaction effects can only be computed when data from multiple participants are available.

MT+ ROI Selection (C1 and C2 only): First, we applied a fixed-effect GLM to determine which voxels showed activity during a 'moving' visual stimulus. MT+ was then defined by selecting voxels posterior to the ITS/LOS junction for which the activity was significant. For selection we used both a liberal voxelwise p<.05 threshold and a more conservative Bonferroni-corrected p<.05 threshold, where the correction was computed based on all
voxels in the functional volume. For more details and ROI MT+ coordinates see Methods S1 and Table S4.

Experimental Stimuli

Setup and Recording Procedure - Anechoic Chamber. With the exception of the outdoor recordings, all auditory stimuli were recorded in the Beltone Anechoic Chamber at the National Centre for Audiology in London, Ontario, Canada, which was equipped with a 125 Hz cut-off wedge system on the walls and ceiling, and a vinyl-covered concrete floor. Ambient noise recordings indicated a background noise (i.e., 'noise floor') of 18.6 dBA. The participant was seated in the center of the room. For each recording trial, the experimenters placed an object at a desired position, and then retreated to the back of the chamber (approximately 1.5 m behind the participant) before instructing the participant to start producing echolocation clicks. High-quality stereo recordings of each entire session's audio were acquired with the in-ear microphones and saved for off-line editing. EB and LB participated in separate recording sessions, i.e. during any recording session three people were in the room (two experimenters and one participant).

Shape/Location: Two surfaces were used to generate recordings for the shape and location classification experiments. The first was a standard-sized safety helmet, made from plastic, and positioned such that the helmet's inside faced the participant (concave surface). The second surface was a wooden 12 cm cube with a smooth paint finish, positioned such that one of the cube's flat sides faced the participant (flat surface). Objects were positioned at a distance of 40 cm from the seated listener, either 20° to the left or right of straight ahead. The height of the object was adjusted on a 0.5 cm diameter telescopic steel pole so as to create optimal echolocation conditions as indicated by each participant (i.e., typically at the participant's mouth level, or approximately 1.3 m above the floor). For each of the four conditions (concave or flat surface, positioned to the left or right), recordings were made as follows: First the surface was placed. Then, the participant (either EB or LB) produced at least 20 echolocation clicks with his head held stationary and straight ahead.

Motion: It is possible to mimic the echolocator's perception of a moving object by recording echolocation clicks from a head in different positions relative to a stationary object, and then playing these recordings back to an echolocator whose head is stationary. To create the perception of moving objects, we made audio recordings with a concave surface positioned to the left or right (as described for the shape/location experiment), but this time the participant made echolocation clicks with his head in different positions during clicking, rather than held stationary straight ahead. Several examples of these echolocation sequences were recorded for each object position (i.e., 20° left or right). Each sequence contained 6–9 clicks. The participant started and ended each sequence with his head held straight ahead.

Angular Position Discrimination (Passive Listening): To create stimuli for the angular position discrimination via passive listening, a position marker (described in main text) was placed at a radial distance of 150 cm at various angular intervals around the participant (i.e. straight ahead and 36°, 27°, 18°, 16°, 14°, 12°, 10°, 8°, 6°, 4°, 2°, 1° to the left and right of straight ahead). Then, the participant (either EB or LB) produced at least 20 echolocation clicks with his head held stationary and aimed straight ahead.

Setup and Recording Procedure - Outdoor Scenes. Stimulus recording for the Scenes experiments took place in a garden-style courtyard, approximately 40 m long by 20 m wide and surrounded by an elliptical driveway. Two thirds of the driveway was bordered by two-storey buildings (see Figure S6). Echolocation recordings were made while the participant made clicks in front of a sound-reflecting object (i.e. a tree, lamp-post or car; see Figure S7). Recordings were made separately for each object and participant. Echolocation clicks were self-paced (SOA roughly 500 ms), with the participant sampling the object at slightly different head positions. Non-clicking baseline audio recordings (approximately 15 s in duration) were made while the participant stood silently in front of each sound-reflecting object. Again, recordings were made separately for each object and participant.

Sound Editing. Shape/Location: For the Shape/Location experiment, two unique click sequences were extracted from each sequence of 20 clicks that was produced in the anechoic chamber by each echolocator during each of the conditions (i.e., concave left, concave right, flat left and flat right). Each of these click sequences was approximately 5 s in duration, which, depending on the participant's clicking rate, resulted in sequences containing anywhere from 6–9 clicks. The total number of click sequences used in the Shape/Location experiment was 16 (4 conditions × 2 echolocators × 2 exemplars), 8 for each participant.

Motion: Four unique click sequences were produced for each condition in the Motion recording sessions (object left or right). All 'moving' head stimuli contained between 6–9 clicks and had a duration of approximately 5–6 s. 'Stationary' head stimuli (object left and object right) were taken from the Shape/Location experiment, in which the echolocators had made clicking sounds at the same concave object located in the same left and right positions, but always with their head fixed and oriented straight ahead. The total number of click sequences for the motion experiment was 32 (2 object positions × 2 types of head motion (moving, stationary) × 4 exemplars × 2 echolocators), 16 for each participant. To match the number of stationary exemplars to the number of moving ones, each stationary exemplar had been duplicated once.

Angular Position Discrimination (Passive Listening): With respect to the Angular Position Discrimination recording sessions, two unique click sequences of exactly 6 clicks each were extracted for each of the 25 pole locations (see Angular Position Discrimination), summing to a total of 50 stimuli (25 pole locations × 2 exemplars) for each echolocator.

Outdoor Scenes: Two unique 5 s exemplars were extracted from each of the 'scenes' recordings (i.e., the sequence of 20 clicks made in front of a car, tree, or pole by each echolocator). This provided 12 sound files (3 object scenes × 2 echolocators × 2 exemplars). Depending on the participant's clicking rate, each of these sound files contained anywhere between 6 and 12 clicks in those 5 s. To create the control stimuli, we took the non-clicking baseline audio recordings that were made as each echolocator silently stood in front of the three objects (car, tree and pole), and we extracted two unique 5 s recordings from each. This provided us with 12 sound files (3 object scenes × 2 echolocators × 2 exemplars) containing only background noises (i.e., distant traffic, wind, birds, etc.), but no clicks or click echoes. Next, the click sequences, but not the echoes associated with them, were copied from each of the corresponding echolocation sound files, and then overlaid onto the respective sound files containing just the background noise. More specifically, with the aid of a spectral waveform display (see for example Figure 1A), the initial 10–20 ms burst of energy associated with the onset of each mouth-generated click was selected by hand from the left channel, being careful to avoid including any energy associated with click echoes. Each copy of these click waveforms was then overlaid in both left and right channels of the corresponding background noise file, at the precise time point that it had been
copied from. This was carried out for every click in each of the 12 echolocation sound files. In the end, for every one of the 12 echolocation sound files there existed a control sound file that contained essentially the same click sounds, occurring at the same temporal points, but devoid of any click echoes.

Behavioral Testing Procedure for Angular Position Discrimination (EB and LB)

Active Echolocation. To determine angular position discrimination thresholds we employed a 2-interval, 2-AFC adaptive staircase method, with step sizes in the first two trials computed based on [46], and in subsequent trials based on [47]. The participant's task on every trial was to actively echolocate and determine whether a position marker (described in main text) at a test position was located to the left or right of a position marker at a straight ahead reference position. Presentation was sequential. See Methods S1 for more details.

Passive Listening. During passive listening we used the same procedure as during active echolocation, with the exception that participants did not actively echolocate, but instead listened to recordings of their own clicks and echoes. See Methods S1 for more details.

Supporting Information

Figure S1 Results of the source localization experiment. Plotted on the ordinate is the probability that the participant judges the source to be located to the right of its straight ahead reference position. Plotted on the abscissa is the test position with respect to straight ahead, in degrees. Negative numbers indicate a position shift in the counterclockwise direction. Psychometric functions were obtained by fitting a 3-parameter sigmoid to the data. The 25% and 75% thresholds and the bias (denoted in red) were estimated from the fitted curves. The zero-bias line is drawn for comparison (dashed line). It is evident from the data that EB and LB can determine the angular position of a source with high accuracy, i.e., thresholds for EB and LB are 2° and 2.5°, respectively. The localization thresholds for both EB and LB are within the range of what has been reported for source localization thresholds of sighted participants with respect to a centrally located reference source (Blauert, 1998; page 39, table 2.1). For both EB and LB, performance is slightly better during source localization than during active or passive echolocation (compare Figure 1 in main text). With regard to bias, the data show that EB is unbiased (red line at zero), but that LB tends to judge test locations to be to the left of straight ahead (red line shifted to the right). This means that LB's subjective straight ahead is shifted to the right. Thus, bias in source localization is similar to bias during active and passive echolocation for both participants (compare Figure 1 in main text).
(TIF)

Figure S2 BOLD activity projected on participants' reconstructed and partially inflated cortical surfaces. Shown is the contrast between activations for outdoor recordings containing echoes from objects and outdoor recordings that did not contain such echoes, evaluated at a more liberal statistical threshold than in the main text, i.e. p<.01 instead of p<.001 (compare Figure 3 in main text). Even at this more liberal statistical threshold, neither C1 nor C2 shows any difference in BOLD activity in visual cortex between echo and control conditions.
(TIF)

Figure S3 BOLD activity projected on participants' reconstructed and partially inflated cortical surfaces. Marking of cortical surfaces and abbreviations as in Figure 2, main text. Top panel: BOLD activity in EB's and LB's brains while they listened to outdoor scene recordings (both echo and control sounds) and judged whether the recording contained echoes reflected from a car, tree or pole, or no object echoes at all. Each participant listened to recordings of his own clicks and echoes as well as to recordings of the other person (see Figure 1G for behavioral results). EB shows highly reliable BOLD activity in the calcarine sulcus of the right hemisphere. LB shows activity at the apex of the occipital lobes of the right and left hemispheres, typically considered the 'foveal part' of visual cortex. Both participants also show BOLD activity in the lateral sulcus (i.e. Auditory Complex) of the left and right hemispheres, most likely due to the auditory nature of the stimuli. Bottom panel: BOLD activity in C1's and C2's brains while they listened to outdoor scene recordings (both echo and control sounds). The task was the same as for EB and LB, and each participant listened to the recordings he had trained with as well as to the recordings of the other person, e.g. C1 listened to both EB's and LB's recordings (see Figure 1G for behavioral results). In contrast to EB and LB, neither C1 nor C2 shows BOLD activity in the calcarine sulcus. However, just like EB and LB, both C1 and C2 show robust BOLD activity in the lateral sulcus (i.e. Auditory Complex) of the left and right hemispheres.
(TIF)

Figure S4 BOLD activity in C1's and C2's brains that is related to recordings of echolocation sounds that convey movement to EB and LB, evaluated at a more liberal statistical threshold than reported in the main text, i.e. p<.01 instead of p<.001 (compare Figure 5 in main text). Also shown are areas sensitive to visual motion (area MT+), functionally defined at different significance levels (p<.05 (light green) or p<.05 Bonferroni corrected (dark green)). Even at this more liberal statistical threshold, neither C1 nor C2 shows increased BOLD activity in regions posterior to the ITS/LOS junction for the contrast between 'moving' and 'stationary' echolocation stimuli.
(TIF)

Figure S5 BOLD activity in the cerebellum while participants listened to outdoor scene recordings (both echo and control sounds) and judged whether the recording contained echoes reflected from a car, tree or pole, or no object echoes at all. EB and LB each listened to recordings of his own clicks and echoes as well as to recordings of the other person. Similarly, C1 and C2 each listened to the recordings he had trained with as well as to the recordings of the other person, e.g. C1 listened to both EB's and LB's recordings (see Figure 1G for behavioral results). Data are shown in neurological convention, i.e. left is left. Activity in the cerebellum was analyzed in stereotaxic space [49]. To evaluate the significance of activity we used the same voxelwise significance thresholds as for the cortical surface analyses for each participant. However, because the number of voxels in volume space differed from the number of vertices in surface space for each participant, the Bonferroni-corrected significance level differs between cortex and cerebellum (compare Figure S3). To increase accuracy, cerebellar structures for each participant were identified based on anatomical landmarks. Structures were labeled according to the nomenclature developed by [26]. Data are not shown if no significant activity was found (empty cells in table).
(TIF)

Figure S6 Bird's eye view of the courtyard (highlighted in red) that was used to make outdoor scene recordings.
(TIF)

Figure S7 Illustrations of outdoor scenes used to make echolocation recordings (the participant stood in front of each object and made clicks) and background recordings used to make
outdoor control sounds (the participant stood silently in front of each object).
(TIF)

Table S1 Expanded classification results (incl. sample size) for the location, shape, motion and outdoor scenes experiments for EB and LB. Asterisks indicate that performance is significantly different from chance (p<.05). Unless otherwise indicated, chance performance is 50%. Tests of significance were computed only for entries in black (also contained in the main text). Sample sizes (shown in parentheses) fulfill the minimum requirement for confidence intervals for a proportion based on the normal approximation [48].
(DOC)

Table S2 Expanded classification results (incl. sample size) for the location, shape, motion and outdoor scenes experiments for C1 and C2. Asterisks indicate that performance is significantly different from chance (p<.05). Unless otherwise indicated, chance performance is 50%. Tests of significance were computed only for entries in black (also contained in the main text). Sample sizes (shown in parentheses) fulfill the minimum requirement for confidence intervals for a proportion based on the normal approximation [48]. 1 = less than chance, because of a bias to classify as 'tree'.
(DOC)

Table S3 Statistical results of the ROI analysis (contrast: EchoMoving − EchoStationary) applied to area MT+ in C1 and C2. We applied a region-of-interest analysis to the MT+ ROIs of both control participants to determine whether the contrast EchoMoving − EchoStationary was significant (contrast values and SEM are shown in Figure 5, main text). It is evident that the contrast was not significant in any condition.
(DOC)

Table S4 Center-of-gravity Talairach coordinates for MT+ ROIs. For ROI selection methods see Methods S1.
(DOC)

Sound S1 Binaural recording of a click and click echoes made in EB's ears in the anechoic chamber, while he made a click in the presence of a position marker located 150 cm straight ahead. This sound accompanies Figure 1A, main text. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S2 Binaural recording of a click and click echoes made in LB's ears in the anechoic chamber, while he made a click in the presence of a position marker located 150 cm straight ahead. This sound accompanies Figure 1A, main text. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S3 Illustrations of sounds used during angular position discrimination – source localization. Binaural recording of a click and click echoes made in SRA's ears in the anechoic chamber, while he listened to pseudo-clicks (derived from EB's original clicks) from a loudspeaker located 150 cm, 10° to the right of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S4 Illustrations of sounds used during angular position discrimination – source localization. Binaural recording of a click and click echoes made in SRA's ears in the anechoic chamber, while he listened to pseudo-clicks (derived from EB's original clicks) from a loudspeaker located 150 cm, 10° to the left of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S5 Illustrations of sounds used during angular position discrimination – passive listening. Binaural recording of a click and click echoes made in EB's ears in the anechoic chamber, while he made clicks in the presence of a position marker located 150 cm, 10° to the right of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S6 Illustrations of sounds used during angular position discrimination – passive listening. Binaural recording of a click and click echoes made in EB's ears in the anechoic chamber, while he made clicks in the presence of a position marker located 150 cm, 10° to the left of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S7 Illustrations of sounds used during Shape/Location Classification. Binaural recording of clicks and click echoes made in LB's ears in the anechoic chamber, while he held his head stationary and made clicks in the presence of a concave surface located 40 cm and 20° to the left of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S8 Illustrations of sounds used during Shape/Location Classification. Binaural recording of clicks and click echoes made in LB's ears in the anechoic chamber, while he held his head stationary and made clicks in the presence of a flat surface located 40 cm and 20° to the left of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S9 Illustrations of sounds used during Motion Classification. Binaural recording of clicks and click echoes made in LB's ears in the anechoic chamber, while he moved his head randomly and made clicks in the presence of a concave surface located 40 cm and 20° to
the left of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S10 Illustrations of sounds used during Motion Classification. Binaural recording of clicks and click echoes made in LB's ears in the anechoic chamber, while he moved his head in a sweeping motion from left to right and made clicks in the presence of a concave surface located 40 cm and 20° to the left of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S11 Illustrations of sounds used during Motion Classification. Binaural recording of clicks and click echoes made in LB's ears in the anechoic chamber, while he held his head stationary and made clicks in the presence of a concave surface located 40 cm and 20° to the left of straight ahead. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S12 Illustrations of sounds used during Outdoor Scenes Classification. Binaural recording of clicks and click echoes made in EB's ears in an outdoor setting, while he made clicks in the presence of a lamp post located in front of him (background sounds contain birds, leaves, etc.). In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Sound S13 Illustrations of sounds used during Outdoor Scenes Classification. Control sound for Sound S12. This sound contains background sounds very similar to those in Sound S12, as the recording was also made in EB's ears in an outdoor setting while he stood in front of the lamp post. However, during the recording EB was silent. The click-like sounds in the audio file are pseudo-clicks derived from EB's own clicks but placed at the same positions as the original clicks in Sound S12 (see Methods S1). Thus, the control sound is yoked to Sound S12, but does not contain click echoes. In the experiment sounds were presented via MR compatible headphones (Sensimetrics, Malden, MA, USA, Model S-14). To illustrate the sounds that participants heard through these headphones during the experiments, sample sounds have been passed through a 10 kHz low-pass filter. NOTE: We advise using in-ear stereo headphones to listen to the sound sample.
(WAV)

Audiology Report S1 Summary of audiological test results for EB and LB (Air Conduction Thresholds, Tympanograms, Acoustic Reflex Thresholds, Distortion Product Otoacoustic Emissions).
(PDF)

Methods S1 Additional information about the experimental methods.
(DOC)

Acknowledgments

We thank Daniel Kish and Brian Bushway from World Access for the Blind, who acted as consultants throughout the experiments, providing invaluable technical and practical advice about echolocation and the nature of the testing materials we used. We thank D. Purcell for conducting the audiological testing. We thank J. Ween, G. Dutton, L. van Eimeren, and H. Yang for technical support and logistics.

Author Contributions

Conceived and designed the experiments: LT SRA MAG. Performed the experiments: LT SRA. Analyzed the data: LT SRA. Contributed reagents/materials/analysis tools: LT. Wrote the paper: LT SRA MAG.
References
1. Schenkman BN, Nilsson ME (2010) Human echolocation: blind and sighted persons' ability to detect sounds recorded in the presence of a reflecting object. Perception 39: 483–501.
2. Stoffregen TA, Pittenger JB (1995) Human echolocation as a basic form of perception and action. Ecol Psychol 7: 181–216.
3. Teng S, Whitney D (2011) The acuity of echolocation: spatial resolution in the sighted compared to expert performance. J Visual Impairment Blindness 105(1): 20–32.
4. Ciselet V, Pequet E, Richard I, Veraart C, Meulders M (1982) Substitution sensorielle de la vision par l'audition au moyen de capteurs d'information spatial. Arch Int Physiol Biochem 90: P47.
5. Heyes AD (1984) Sonic Pathfinder: a programmable guidance aid for the blind. Electronics and Wireless World 90: 26–29.
6. Hughes B (2001) Active artificial echolocation and the nonvisual perception of aperture passability. Hum Mov Sci 20: 371–400.
7. Kay L (1964) An ultrasonic sensing probe as a mobility aid for the blind. Ultrasonics 2: 53.
8. Kish DC (1995) Evaluation of an echo-mobility program for young blind people [Master's thesis]. San Bernardino (California): Department of Psychology, California State University. 277 p.
9. Rojas JAM, Hermosilla JA, Montero RS, Espi PLL (2009) Physical analysis of several organic signals for human echolocation: oral vacuum pulses. Acta Acust United Acust 95: 325–330.
10. De Volder AG, Catalan-Ahumada M, Robert A, Bol A, Labar D, et al. (1999) Changes in occipital cortex activity in early blind humans using a sensory substitution device. Brain Res 826(1): 128–134.
11. Jones G (2005) Echolocation. Curr Biol 15(13): 484–488.
12. Dufour A, Després O, Candas V (2005) Enhanced sensitivity to echo cues in blind subjects. Exp Brain Res 165: 515–519.
13. Bavelier D, Neville H (2002) Cross-modal plasticity: where and how? Nat Rev Neurosci 3(6): 443–452.
14. Burton H (2003) Visual cortex activity in early and late blind people. J Neurosci 23(10): 4005–4011.
15. Merabet LB, Pascual-Leone A (2010) Neural reorganization following sensory loss: the opportunity of change. Nat Rev Neurosci 11(1): 44–52.
16. Langers DRM, van Dijk P, Backes WH (2005) Lateralization, connectivity and plasticity in the human central auditory system. NeuroImage 28: 490–499.
17. Petkov CI, Kan X, Alho K, Bertrand O, Yund EW, et al. (2004) Attentional modulation of human auditory cortex. Nat Neurosci 7: 658–663.
18. Woldorff MG, Tempelmann C, Fell J, Tegeler C, Gaschler-Markefski B, et al. (1999) Lateralized auditory spatial perception and the contralaterality of cortical processing as studied with functional magnetic resonance imaging and magnetoencephalography. Hum Brain Mapp 7: 49–66.
19. Glickstein M, Doron K (2008) Cerebellum: connections and functions. Cerebellum 7: 589–594.
20. Glickstein M, Strata P, Voogd J (2009) Cerebellum: history. Neuroscience 162: 549–559.
21. Glickstein M, Sultan F, Voogd J (2011) Functional localization in the cerebellum. Cortex 47(1): 59–80.
22. Haarmeier T, Thier P (2007) The attentive cerebellum – myth or reality? Cerebellum 6: 177–183.
23. Stoodley CJ, Schmahmann JD (2009) Functional topography in the human cerebellum: a meta-analysis of neuroimaging studies. NeuroImage 44: 489–501.
24. Strick PL, Dum RP, Fiez JA (2009) Cerebellum and nonmotor function. Annu Rev Neurosci 32: 413–434.
25. Baumann O, Mattingley JB (2010) Scaling of neural responses to visual and auditory motion in the human cerebellum. J Neurosci 30: 4489–4495.
26. Schmahmann JD, Doyon J, McDonald D, Holmes C, Lavoie K, et al. (1999) Three-dimensional MRI atlas of the human cerebellum in proportional stereotaxic space. NeuroImage 10: 233–260.
27. Bridge H, Cowey A, Ragge N, Watkins K (2009) Imaging studies in congenital anophthalmia reveal preservation of brain architecture in 'visual' cortex. Brain 132: 3467–3480.
28. Jiang J, Zhu W, Shi F, Liu Y, Li J, et al. (2009) Thick visual cortex in the early blind. J Neurosci 29: 2205–2211.
29. Lepore N, Voss P, Lepore F, Chou Y, Fortin M, et al. (2010) Brain structure changes visualized in early- and late-onset blind subjects. NeuroImage 49(1): 134–140.
30. Noppeney U, Friston KJ, Ashburner J, Frackowiak R, Price CJ (2005) Early visual deprivation induces structural plasticity in gray and white matter. Curr Biol 15: R488–R490.
31. Rauschecker JP (1995) Compensatory plasticity and sensory substitution in the cerebral cortex. Trends Neurosci 18: 36–43.
32. Rauschecker JP (1999) Auditory cortical plasticity: a comparison with other sensory systems. Trends Neurosci 22: 74–80.
33. Gougoux F, Lepore F, Lassonde M, Voss P, Zatorre RJ, et al. (2004) Pitch discrimination in the early blind. Nature 430(6997): 309.
34. Roeder B, Teder-Sälejärvi W, Sterr A, Rösler F, Hillyard SA, et al. (1999) Improved auditory spatial tuning in blind humans. Nature 400: 162–166.
35. Gandhi SP, Heeger DJ, Boynton GM (1999) Spatial attention affects brain activity in human primary visual cortex. Proc Natl Acad Sci USA 96: 3314–3319.
36. Lipschutz B, Kolinsky R, Damhaut P, Wikler D, Goldman S (2002) Attention-dependent changes of activation and connectivity in dichotic listening. NeuroImage 17: 643–656.
37. Saenz M, Lewis LB, Huth AG, Fine I, Koch C (2008) Visual motion area MT+/V5 responds to auditory motion in human sight-recovery subjects. J Neurosci 28: 5141–5148.
38. Blake R, Sobel KV, James TW (2004) Neural synergy between kinetic vision and touch. Psychol Sci 15: 397–402.
39. Beauchamp MS, Yasar NE, Kishan N, Ro T (2007) Human MST but not MT responds to tactile stimulation. J Neurosci 27(31): 8261–8267.
40. Hagen MC, Franzen O, McGlone F, Essick G, Dancer C, et al. (2002) Tactile motion activates the human middle temporal/V5 (MT/V5) complex. Eur J Neurosci 16: 957–964.
41. Pascual-Leone A, Hamilton R (2001) The metamodal organization of the brain. Prog Brain Res 134: 427–445.
42. Lomber SG, Meredith MA, Kral A (2010) Crossmodal plasticity in specific auditory cortices underlies compensatory visual functions in the deaf. Nat Neurosci 13: 1421–1427.
43. Rajkowska G, Goldman-Rakic PS (1995) Cytoarchitectonic definition of prefrontal areas in the normal human cortex: II. Variability in locations of areas 9 and 46 and relationship to the Talairach coordinate system. Cereb Cortex 5(4): 323–337.
44. Brainard DH (1997) The Psychophysics Toolbox. Spat Vis 10: 433–436.
45. Hall DA, Haggard MP, Akeroyd MA, Palmer AR, Summerfield AQ, et al. (1999) "Sparse" temporal sampling in auditory fMRI. Hum Brain Mapp 7: 213–223.
46. Robbins H, Monro S (1951) A stochastic approximation method. Ann Math Stat 22: 400–407.
47. Kesten H (1958) Accelerated stochastic approximation. Ann Math Stat 29: 41–59.
48. Samuels ML, Lu TFC (1992) Sample size requirements for back-of-the-envelope binomial confidence interval. Am Stat 46: 228–231.
49. Talairach J, Tournoux P (1988) Co-planar stereotaxic atlas of the human brain. New York: Thieme. 122 p.