Proc Natl Acad Sci U S A. Oct 9, 2001; 98(21): 12301–12306.
Published online Sep 25, 2001. doi:  10.1073/pnas.211209098
PMCID: PMC59809
Psychology

“What” and “where” in the human auditory system

Abstract

The extent to which sound identification and sound localization depend on specialized auditory pathways was examined by using functional magnetic resonance imaging and event-related brain potentials. Participants performed an S1–S2 match-to-sample task in which S1 differed from S2 in its pitch and/or location. In the pitch task, participants indicated whether S2 was lower, identical, or higher in pitch than S1. In the location task, participants were asked to localize S2 relative to S1 (i.e., leftward, same, or rightward). Relative to location, pitch processing generated greater activation in auditory cortex and the inferior frontal gyrus. Conversely, identifying the location of S2 relative to S1 generated greater activation in posterior temporal cortex, parietal cortex, and the superior frontal sulcus. Differential task-related effects on event-related brain potentials (ERPs) were seen in anterior and posterior brain regions beginning at 300 ms poststimulus and lasting for several hundred milliseconds. The converging evidence from two independent measurements of dissociable brain activity during identification and localization of identical stimuli provides strong support for specialized auditory streams in the human brain. These findings are analogous to the “what” and “where” segregation of visual information processing, and suggest that a similar functional organization exists for processing information from the auditory modality.

Auditory scene analysis involves identifying the content (“what”) and the location (“where”) of sounds in the environment. Evidence from anatomical and neurophysiological studies in non-human primates (1–5) suggests that identification and localization of auditory events may be functionally segregated in specialized auditory streams. Combining anatomical and electrophysiological recording methods in non-human primates, Romanski et al. (5) have recently identified two separate auditory streams that originate in caudal and rostral auditory cortex, respectively, and project to different regions within the frontal lobe. The functional significance of these separate pathways has not been determined, although they suggest functional dissociations for auditory processes analogous to the “what” and “where” or ventral and dorsal cortical information streams for identifying and localizing visual (6, 7) and somatosensory (8) stimuli.

Auditory neuroimaging studies employing positron emission tomography or functional magnetic resonance imaging (fMRI) have revealed enhanced blood flow in parietal areas during sound localization (9–11). In comparison, tasks requiring individuals to make tone discriminations (12) or identify auditory stimuli (e.g., words or environmental sounds) show enhanced activation in inferior frontal cortex (13, 14). Although these results suggest that the processing of sound identity and sound location is functionally separable, the segregation in auditory information processing has yet to be demonstrated within the same individuals when using the same set of stimuli.

The present study was designed to directly test, by using fMRI and event-related brain potentials (ERPs), the hypothesis that specialized streams exist in humans for processing sound identity and sound location. Young adults were presented with the same stimuli but were required to perform two different tasks: a pitch (what) and a location (where) judgment task. Because free-field auditory stimulation is not possible within the scanner, a virtual three-dimensional (3D) auditory environment was created by using synthesized sounds with appropriate free-field acoustic cues (15, 16). Such a design has proven to be effective for imaging the neural substrates involved in localizing auditory events (9, 17).

Methods

Fifteen young adults (aged 21–31 years; four males) participated in the study. All were right-handed and reported normal hearing. Each participant signed an informed consent form approved by the University of Toronto Human Subject Review Committee. Data from three participants were excluded because of head motion greater than 1 mm during the experiment. ERPs were recorded in a separate session from the 12 participants who were included in the fMRI analysis.

Stimuli and Tasks.

Stimuli consisted of five synthesized two-octave band noise bursts starting with a center frequency of 2000 Hz and stepping up four times. Stimulus duration was 500 ms including 5-ms rise/fall time. Stimuli were generated digitally with 16-bit resolution and a sampling rate of 50 kHz, passed through a digital-to-analogue converter, and then low-pass filtered at 10 kHz by using an anti-aliasing filter (Tucker-Davis Technology, Gainesville, FL). Stimuli were presented at 85 dB sound pressure level (SPL) by means of circumaural, fMRI-compatible headphones (Avotec, Jensen Beach, FL), acoustically padded to suppress scanner noise by 25 dB. Stimuli were presented at five possible azimuth locations relative to straight ahead (−90°, −45°, 0°, +45°, +90°). Virtual 3D sources were synthesized by using a head-related transfer function that replicated the acoustic effects of the head and ears of an average listener (18).
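As a rough illustration of the stimulus synthesis, the sketch below generates one two-octave band noise burst with 5-ms rise/fall ramps at the paper's 50-kHz sampling rate. The FFT brick-wall band filter and raised-cosine ramp shape are assumptions (the paper does not specify the synthesis method), and the head-related transfer function stage used for virtual 3D placement is omitted.

```python
import numpy as np

FS = 50_000   # sampling rate (Hz), as in the paper
DUR = 0.5     # stimulus duration (s)
RAMP = 0.005  # rise/fall time (s)

def band_noise_burst(center_hz, octaves=2.0, fs=FS, dur=DUR, ramp=RAMP, rng=None):
    """Band-limited noise burst: white noise restricted to a two-octave band
    around center_hz (brick-wall FFT filter; assumed, not from the paper),
    shaped with raised-cosine onset/offset ramps."""
    rng = np.random.default_rng(rng)
    n = int(fs * dur)
    noise = rng.standard_normal(n)
    # Band edges one octave below and one octave above the center frequency
    lo = center_hz / 2 ** (octaves / 2)
    hi = center_hz * 2 ** (octaves / 2)
    spec = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n, 1 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    burst = np.fft.irfft(spec, n)
    # 5-ms raised-cosine rise and fall
    k = int(fs * ramp)
    env = np.ones(n)
    env[:k] = 0.5 * (1 - np.cos(np.pi * np.arange(k) / k))
    env[-k:] = env[:k][::-1]
    return burst * env / np.abs(burst).max()

stim = band_noise_burst(2000.0)  # lowest of the five center frequencies
```

In the actual experiment each burst would then be convolved with a left- and right-ear head-related impulse response for the desired azimuth before presentation.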

Participants performed a delayed match-to-sample task in which the first acoustic stimulus (S1) was held in memory (for 500 ms) for comparison with the second (S2) stimulus. Each trial began with a binaural warning tone (1000 Hz, 500-ms duration, 5-ms rise/fall time). After a delay of 1000 ms, a noise burst (S1) was presented at one of three possible locations and one of three possible pitches that excluded the two extreme locations and the two extreme pitches. A second noise burst (S2) was presented 500 ms after the offset of S1. S2 was presented at one of three possible and equally probable locations that included the same location as S1 and the two adjacent locations (immediately rightward and leftward). For example, if S1 was presented at 45° to the left of center, S2 could have been presented along the azimuth at 90° left of center, 45° left of center, or 0°. Similarly, the frequency of S2 was either identical to or adjacent to (one step lower or higher than) that of S1. The five possible outcomes (lower, leftward, identical, higher, or rightward) were equally probable. The intertrial interval was controlled by the participant, whose response initiated the presentation of the next trial 1000 ms later. In the pitch task, participants indicated by pressing one of three buttons whether S2 was lower, identical, or higher in pitch than S1, regardless of its location. Similarly, in the spatial discrimination task, participants indicated by pressing one of three buttons whether S2 was at a leftward, an identical, or a rightward position relative to S1, regardless of changes in pitch. The stimuli were identical in both conditions; only the task instructions differed. Participants performed each designated task (e.g., location comparisons) for 30 s followed by a 30-s rest period in which no stimuli were presented. This on/off sequence was repeated five times for a total duration of 5 min.
Each participant performed three to four pitch and location sequences, which alternated over the course of the experiment. The order of conditions (pitch or location) was counterbalanced across participants.

To ensure that changes in brain activation were not due to differences in task difficulty, the pitch separation was adjusted for each individual such that they performed equally well in both tasks. This separation varied between 3 and 10% (mean 6.6 ± 2.4%). Responses and latencies were obtained by using two fMRI-compatible response pads (Lightwave Technologies, Surrey, BC, Canada), each containing two buttons side-by-side. On each trial, participants indicated whether the sound was leftward or lower in pitch by pressing the extreme left button of the pad in their left hand with their left middle finger (button 1). They pressed the rightmost button (button 4) on the pad in their right hand with their right middle finger for sounds that were rightward or higher in pitch. Lastly, they simultaneously pressed the remaining buttons (2 and 3) on each response pad with their left and right index fingers when the sounds were at the same location or of the same pitch. Participants kept their eyes closed during the scanning.

The tasks carried out during the ERP measurements were similar to those used during the fMRI procedure. For each condition, participants were given five blocks of 60 trials. The stimuli were the same as those used for the fMRI experiment and the trials themselves were identical (i.e., warning tone, then S1, and then S2 500 ms later). The order of conditions was counterbalanced across participants.

fMRI Procedure.

Participants' regional cerebral activity was assessed by using a 1.5-T Signa MR scanner with a standard head coil (CV/i hardware, LX8.3 software; General Electric Medical Systems, Waukesha, WI). Each scan sequence consisted of five 30-s task blocks alternating with 30-s blocks in which no sound was presented. Functional imaging was performed to measure brain activation by means of the blood oxygenation level-dependent (BOLD) effect (19) with optimal signal contrast. Eighteen axial slices 7 mm thick were obtained. Functional scans were obtained by using a single shot T2*-weighted pulse sequence with spiral readout, offline gridding, and reconstruction (ref. 20; TR = 2000 ms, TE = 40 ms, flip angle 80°, 90 × 90 effective acquisition matrix). For each participant, standard volumetric anatomical MRI was performed before functional scanning by using a standard 3D T1-weighted pulse sequence (TR = 12.4 ms, TE = 5.4 ms, flip angle 35°, 22 × 16.5 field of view, 256 × 192 acquisition matrix, 124 axial slices 1.4 mm thick).

Data processing and analyses were performed by using Analysis of Functional NeuroImages (AFNI) software (21, 22). Time series data were spatially coregistered to correct for head motion by using a 3D Fourier transform interpolation, and detrended to a constant reference scan by using a fifth-order polynomial. Percent changes in signal intensity with respect to rest were analyzed by using voxel-wise correlations of the location and pitch time series with square-wave reference vectors (23) shifted to account for the delay in hemodynamic response. The statistical cut-off for activation was set at P < 0.001 or lower, uncorrected. The minimum cluster size was 10 mm3 with a radius of 2 mm. This produced two activation images per participant, one for location vs. rest and one for pitch vs. rest. These activation images were then transformed into Talairach coordinates (21, 22, 24) and smoothed with a Gaussian filter of 6 mm full width at half maximum (FWHM) to increase the signal-to-noise ratio. The latter step was performed to facilitate the subsequent group analysis, which consisted of a random-effects, voxel-wise two-factor ANOVA with task (location and pitch) as a within-subject factor. Because the ANOVA was performed on the task vs. rest contrast images, the degrees of freedom were based on the number of subjects rather than on the number of scans. For the comparison between pitch and location conditions, the statistical cut-off was set at P < 0.01 and the minimum cluster size was 10 mm3 with a radius of 2 mm.
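The block-design correlation analysis can be sketched as follows: a square-wave (boxcar) reference vector matching the 30-s on/off blocks at TR = 2 s is shifted to account for hemodynamic lag, and each voxel's time series is correlated with it. The 4-s shift and the toy data are assumptions for illustration; the paper does not state the exact lag used.

```python
import numpy as np

TR = 2.0        # repetition time (s)
BLOCK = 30.0    # task/rest block duration (s)
N_BLOCKS = 5    # five on/off cycles per run
HRF_DELAY = 4.0 # assumed hemodynamic lag (s); illustrative value

def reference_vector(tr=TR, block=BLOCK, n_blocks=N_BLOCKS, delay=HRF_DELAY):
    """Square-wave reference (task = 1, rest = 0) shifted by the assumed lag."""
    per_block = int(block / tr)
    box = np.tile(np.r_[np.ones(per_block), np.zeros(per_block)], n_blocks)
    return np.roll(box, int(delay / tr))

def voxelwise_r(data, ref):
    """Pearson r between each voxel's time series (rows of data) and ref."""
    d = data - data.mean(axis=1, keepdims=True)
    r0 = ref - ref.mean()
    return (d @ r0) / (np.linalg.norm(d, axis=1) * np.linalg.norm(r0))

ref = reference_vector()
# Toy data: one "active" voxel tracking the boxcar plus noise, one pure noise
rng = np.random.default_rng(0)
data = np.vstack([ref + 0.3 * rng.standard_normal(ref.size),
                  rng.standard_normal(ref.size)])
r = voxelwise_r(data, ref)  # high r for the first voxel, near zero for the second
```

Voxels whose correlation survives the chosen threshold (here, the paper's P < 0.001 cut-off would be converted to an r threshold given the number of time points) would then form the activation image.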

Recording and Analysis of ERPs.

The electroencephalogram (EEG) was recorded from an array of 64 electrodes including those from the standard 10–20 placement. Vertical and horizontal eye movements were recorded with electrodes at the outer canthi and at the superior and inferior orbit. Electrophysiological signals were digitized continuously (bandpass 0.05–50 Hz; 250 Hz sampling rate) by means of NeuroScan SynAmps and stored for offline analysis. During the recording, all electrodes were referenced to the midline central electrode (Cz); for data analysis, they were re-referenced to an average reference and the electrode Cz was reinstated.

The analysis epoch included 200 ms of prestimulus activity and 3000 ms of poststimulus activity. Trials contaminated by eye blink or excessive peak-to-peak deflection (±150 μV) at the electrodes not adjacent to the eyes were automatically rejected before averaging. The ERPs were then averaged separately for each site, stimulus type, and listening condition. ERPs were digitally lowpass filtered to attenuate frequencies above 12 Hz. For each individual average, the ocular artifacts (e.g., blinks and lateral movements) were removed by means of ocular source components, using BRAIN ELECTRICAL SOURCE ANALYSIS (BESA) software (25). ERP waveforms were quantified by computing mean values in selected latency regions, relative to the mean amplitude of the 200-ms prestimulus activity. All measurements were subjected to repeated measures ANOVA with task (pitch and location) and electrodes (CP1, CP2, FT9, FT10) as within-subject factors. Scalp topographies using the 61 electrodes (omitting the periocular electrodes) were statistically analyzed after scaling the amplitudes to eliminate amplitude differences between conditions (26). The original degrees of freedom for all analyses are reported throughout the paper. Type I errors associated with inhomogeneity of variance were controlled by decreasing the degrees of freedom using the Greenhouse-Geisser epsilon (ε), and the probability estimates are based on these reduced degrees of freedom.
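A minimal sketch of the ERP quantification described above: mean amplitude in a poststimulus latency window is measured relative to the 200-ms prestimulus baseline, and epochs exceeding the ±150-μV peak-to-peak criterion are rejected. The synthetic epoch in the usage is illustrative; window boundaries would follow the latency regions reported in Results.

```python
import numpy as np

FS_EEG = 250     # EEG sampling rate (Hz), as in the paper
PRE, POST = 0.2, 3.0  # epoch: 200 ms prestimulus, 3000 ms poststimulus
REJECT_UV = 150.0     # peak-to-peak rejection threshold (microvolts)

def epoch_samples(pre=PRE, post=POST, fs=FS_EEG):
    """Number of pre- and poststimulus samples in one analysis epoch."""
    return int(pre * fs), int(post * fs)

def reject(epoch, thresh=REJECT_UV):
    """True if the peak-to-peak deflection exceeds the rejection criterion."""
    return epoch.max() - epoch.min() > thresh

def mean_amplitude(epoch, win_ms, fs=FS_EEG, pre=PRE):
    """Mean voltage in a poststimulus window (ms), relative to the mean of
    the 200-ms prestimulus baseline."""
    n_pre = int(pre * fs)
    baseline = epoch[:n_pre].mean()
    i0 = n_pre + int(win_ms[0] / 1000 * fs)
    i1 = n_pre + int(win_ms[1] / 1000 * fs)
    return epoch[i0:i1].mean() - baseline
```

For example, a synthetic epoch sitting at 2 μV with a 5-μV deflection between 300 and 500 ms yields a mean amplitude of 5 μV in that window after baseline correction.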

Results

fMRI Experiment.

There were no differences in accuracy between the two tasks: participants correctly judged the pitch or the location of S2 in 72% and 70% of the trials, respectively. Similarly, response latency did not significantly differ between pitch (1121 ± 308 ms) and location (1069 ± 204 ms) judgments.

For the group as a whole, we calculated the mean percent increase in BOLD signal intensity from rest for the pitch and location conditions separately (Fig. 1). In both location and pitch tasks, there was bilateral activity enhancement in primary and secondary auditory cortices, inferior and superior parietal cortices, and the superior and inferior frontal gyri (Table 1). Both auditory discrimination tasks also were associated with a decrease in BOLD signal in visual cortex.

Figure 1
The first two columns (Left) show the group mean activation during the pitch and location tasks vs. rest, respectively. The color scale below is based on t values ranging from 6 to 15 (P < 0.00001). The rightmost column shows the difference in ...
Table 1
Brain regions where there were significant differences between pitch and location

Specificity of brain activity for localizing and identifying auditory events was determined by directly comparing the changes in hemodynamic response obtained during the pitch task vs. rest with changes in activity measured during the location task vs. rest. Relative to the location task, pitch judgment was associated with greater activation in primary auditory cortices, extending anteriorly to auditory association cortices on the supratemporal plane. We also found concurrent increases in BOLD signal in the right inferior frontal gyrus. These differences in brain activation were seen primarily in the right hemisphere (see Fig. 2 and Table 1). Conversely, selectively processing sound locations was associated with greater bilateral activation in posterior temporal areas, and in inferior and superior parietal cortices compared with the pitch judgment (Fig. 2, Table 1). Importantly, there was a parallel increase in BOLD signal in the right superior frontal sulcus, an area very similar to that observed during a visual spatial location task (27).

Figure 2
Three-dimensional pattern of cortical activation to highlight differences between pitch and location discrimination tasks. The right hemisphere is foremost in the figure and part of the temporal lobe has been removed to show activation in temporal and ...

Another way of examining task-specific changes is to see whether areas with greater increases of activity in a given task also show larger correlations among their activity measures during that task. To explore these functional correlations, we compared the time course of changes in BOLD signal in the peak voxels of four brain areas: primary auditory cortex, superior parietal cortex, and the inferior and superior prefrontal gyri. These regions of interest were chosen because (i) they were proposed to be part of a dual pathway model (5, 27) and (ii) they were differentially active during the pitch and location tasks. Across participants, the time series for each condition were averaged into a 60-s sequence containing 30 data points (15 ON and 15 OFF). Pair-wise correlation coefficients were then computed among these group mean time courses of BOLD signal changes for the four regions.
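The folding-and-correlation procedure might look like this in outline: each region's time series is folded into one mean 30-point on/off cycle (15 ON + 15 OFF samples at TR = 2 s), and Pearson correlations are computed among the regional group-mean cycles. The region labels in the usage are placeholders, not coordinates from the paper.

```python
import numpy as np

def cycle_average(ts, points_per_cycle=30):
    """Fold a run's BOLD time series into one mean on/off cycle
    (30 data points = 15 task + 15 rest at TR = 2 s)."""
    n_cycles = ts.size // points_per_cycle
    folded = ts[: n_cycles * points_per_cycle].reshape(n_cycles, points_per_cycle)
    return folded.mean(axis=0)

def region_correlations(region_ts):
    """Pairwise Pearson correlations among regional mean time courses.
    region_ts: dict mapping region name -> 1-D cycle-averaged series."""
    names = list(region_ts)
    data = np.vstack([region_ts[name] for name in names])
    return names, np.corrcoef(data)
```

Usage with two hypothetical regions sharing the same task-locked modulation:

```python
t = np.arange(150)                      # one 5-min run, 150 volumes
a1 = np.sin(2 * np.pi * t / 30)        # toy "auditory cortex" series
spl = a1 + 0.01                         # toy "parietal" series, offset only
names, R = region_correlations({"A1": cycle_average(a1),
                                "SPL": cycle_average(spl)})
# R[0, 1] is the interregional correlation entered into Table 2's matrix
```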

Table 2 shows the correlation matrices of interregional correlations for the location and the pitch task. During the location task, the enhanced BOLD signal in auditory cortex was correlated with that of parietal cortex. Furthermore, there was a significant correlation between the BOLD signal in parietal cortex and that of the superior frontal gyrus. Importantly, there was no significant correlation between BOLD signal in auditory cortices and inferior frontal cortex during the location task. In contrast, we found a significant correlation between the observed changes in BOLD signal in auditory cortices and that of the inferior prefrontal gyrus during the pitch task. There was no significant correlation between activity in the parietal cortex and superior frontal gyrus during the pitch task. This pattern of functional correlations therefore supports distinct ventral and dorsal networks active during nonspatial and spatial auditory tasks, respectively.

Table 2
Correlations between BOLD signal changes in regions of interest

ERP Experiment.

As in the fMRI experiment, there were no differences in accuracy between the two tasks. Accuracy in both the pitch and the location tasks was 82%. However, participants were slower in the pitch (1026 ± 162 ms) than in the location task [954 ± 134 ms; F(1,11) = 5.52, P < 0.05].

Fig. 3 shows event-related brain potentials elicited during the two tasks. There were two ERP modulations that distinguished neural activity associated with processing pitch and location. The first occurred between 300 and 500 ms after the presentation of S1, and consisted of greater positivity of the waveform over inferior frontotemporal regions during the pitch task and greater positivity over centroparietal regions during the location task [F(1,11) = 6.86, P < 0.05]. This task effect on the ERPs may reflect the online processing and maintenance of S1 in working memory for an eventual comparison with S2. The second modulation occurred 300–400 ms after S2 was presented [F(1,11) = 26.05, P < 0.001], and showed a similar difference in the waveforms as was seen for S1 (Fig. 3). This modulation preceded the P3b wave at parietal sites and may reflect the comparison between the representations of S1 and S2.

Figure 3
Group mean event-related waveforms elicited during the pitch (solid line) and the location (dashed line) judgment tasks. FT9/FT10, left and right frontal-temporal electrodes; CP1/CP2, left and right central-parietal electrodes.

Discussion

The present study was designed to examine the extent to which processing sound identity and sound location depends on specialized auditory pathways within the same individuals and using similar stimuli. Relative to location processing, pitch processing was associated with greater activity in auditory cortices and inferior prefrontal gyrus. The inferior prefrontal gyrus has consistently been shown to be active during the processing of pitch changes (28), auditory word and tone working memory (29), semantic processing of auditory materials (13), and phoneme discrimination (14). Conversely, spatial judgment was associated with greater bilateral activation in posterior temporal areas, and in inferior and superior parietal cortices compared with the pitch judgment. This finding provides further evidence that the parietal cortex plays an important role in processing the spatial relation between consecutive auditory events, consistent with findings from electrophysiological studies in non-human primates (30), lesion studies in humans (31–33), and positron emission tomography in humans by using a virtual auditory display (9, 10).

The novel finding of this study is the dissociation of the effects of task instruction on both fMRI and ERP measures. Processing pitch information recruited brain areas that were primarily distributed in the ventral part of the brain, whereas sound localization recruited areas that were primarily distributed in dorsal regions. The dissociation in fMRI signals was paralleled by task-related changes in ERPs. Differences in both BOLD signal change and in ERP amplitude during the pitch and location tasks suggest that auditory information processing may be specialized into nonspatial and spatial domains. The present study supports and extends previous neuroimaging studies by providing the first direct evidence that the neural systems involved in identifying and localizing auditory objects are functionally and neuroanatomically segregated based on task demands even when stimuli are identical across tasks. Our findings are consistent with animal models of auditory processing (4, 5, 34–37) in which object identification recruits activation in anterior temporal and inferior frontal areas, whereas object location is mediated by posterior temporal areas, parietal cortex, and dorsal frontal regions. The observed pattern of BOLD signal correlations between auditory and inferior prefrontal cortices during the pitch task and between parietal and dorsal prefrontal cortices during the location task is also consistent with the proposal that sound identification and localization are functionally distinct along ventral and dorsal pathways, respectively.

Here, the functional segregation is due to the degree to which the areas are active in the two tasks (Figs. 1 and 2, Table 1), similar to the differential activation of the ventral and dorsal visual streams by objects and locations (27, 38). There was also a significant correlation between temporal and parietal cortex during both pitch and location judgment tasks. This finding may reflect “cross talk” between ventral and dorsal streams as previously suggested for visual stimuli (39) and emphasizes that, although functional segregation may be an important property of the brain, higher perceptual and cognitive functions involve interaction among many brain areas.

Lastly, there is a remarkable similarity between the prefrontal areas recruited during sound identity and sound location in the current experiment and those observed in previous anatomical studies (4, 5) and reported during a comparable visual task (27). This result suggests that these ventral and dorsal prefrontal areas are involved in representing two distinct types of information about the environment, “what” and “where,” regardless of which stimulus modality provides that information. Together, these findings suggest that the segregation of information processing into “what” and “where” pathways may be a fundamental property of cortical organization.

Acknowledgments

We thank Rhonda Walcarius and Cathy Nangini for technical support, and Drs. Helen Mayberg, Randy McIntosh, Terry Picton, and Don Stuss for helpful comments on an earlier version of the manuscript. We also thank Dr. Gary Glover for the use of his spiral imaging pulse sequence. This research was supported by grants from the Natural Sciences and Engineering Research Council of Canada, the Canadian Institutes for Health Research, and the Canadian Foundation for Innovation.

Abbreviations

BOLD
blood oxygenation level-dependent
3D
three-dimensional
ERP
event-related brain potentials
fMRI
functional magnetic resonance imaging

Footnotes

This paper was submitted directly (Track II) to the PNAS office.

References

1. Rauschecker J P, Tian B, Hauser M. Science. 1995;268:111–114. [PubMed]
2. Rauschecker J P, Tian B, Pons T, Mishkin M. J Comp Neurol. 1997;382:89–103. [PubMed]
3. Rauschecker J P. Audiol Neurootol. 1998;3:86–103. [PubMed]
4. Hackett T A, Stepniewska I, Kaas J H. Brain Res. 1999;817:45–58. [PubMed]
5. Romanski L M, Tian B, Fritz J, Mishkin M, Goldman-Rakic P S, Rauschecker J P. Nat Neurosci. 1999;2:1131–1136. [PMC free article] [PubMed]
6. Ungerleider L G, Mishkin M. In: Analysis of Visual Behavior. Ingle D J, Goodale M A, Mansfield R J W, editors. Cambridge, MA: MIT press; 1982. pp. 549–586.
7. Haxby J V, Grady C L, Horwitz B, Ungerleider L G, Mishkin M, Carson R E, Herscovitch P, Schapiro M B, Rapoport S I. Proc Natl Acad Sci USA. 1991;88:1621–1625. [PMC free article] [PubMed]
8. Pons T P, Garraghty P E, Mishkin M. J Neurophysiol. 1992;68:518–527. [PubMed]
9. Bushara K O, Weeks R A, Ishii K, Catalan M J, Tian B, Rauschecker J P, Hallett M. Nat Neurosci. 1999;2:759–766. [PubMed]
10. Weeks R A, Aziz-Sultan A, Bushara K O, Tian B, Wessinger C M, Dang N, Rauschecker J P, Hallett M. Neurosci Lett. 1999;262:155–158. [PubMed]
11. Zatorre R J, Mondor T A, Evans A C. Neuroimage. 1999;10:544–554. [PubMed]
12. Holcomb H H, Medoff D R, Caudill P J, Zhao Z, Lahti A C, Dannals R F, Tamminga C A. Cereb Cortex. 1998;8:534–542. [PubMed]
13. Gabrieli J D, Poldrack R A, Desmond J E. Proc Natl Acad Sci USA. 1998;95:906–913. [PMC free article] [PubMed]
14. Pugh K R, Shaywitz B A, Shaywitz S E, Fulbright R K, Byrd D, Skudlarski P, Shankweiler D P, Katz L, Constable R T, Fletcher J, et al. Neuroimage. 1996;4:159–173. [PubMed]
15. Wightman F L, Kistler D J. J Acoust Soc Am. 1989;85:868–878. [PubMed]
16. Langendijk E H, Bronkhorst A W. J Acoust Soc Am. 2000;107:528–537. [PubMed]
17. Weeks R, Horwitz B, Aziz-Sultan A, Tian B, Wessinger C M, Cohen L G, Hallett M, Rauschecker J P. J Neurosci. 2000;20:2664–2672. [PubMed]
18. Wenzel E M, Arruda M, Kistler D J, Wightman F L. J Acoust Soc Am. 1993;94:111–123. [PubMed]
19. Ogawa S, Lee T M, Kay A R, Tank D W. Proc Natl Acad Sci USA. 1990;87:9868–9872. [PMC free article] [PubMed]
20. Glover G H, Lai S. Magn Reson Med. 1998;39:361–368. [PubMed]
21. Cox R W. Comput Biomed Res. 1996;29:162–173. [PubMed]
22. Cox R W, Hyde J S. NMR Biomed. 1997;10:171–178. [PubMed]
23. Bandettini P A, Jesmanowicz A, Wong E C, Hyde J S. Magn Reson Med. 1993;30:161–173. [PubMed]
24. Talairach J, Tournoux P. Co-Planar Stereotaxic Atlas of the Human Brain. New York: Thieme Medical Publishers; 1988.
25. Picton T W, van Roon P, Armilio M L, Berg P, Ille N, Scherg M. Clin Neurophysiol. 2000;111:53–65. [PubMed]
26. McCarthy G, Wood C C. Electroencephalogr Clin Neurophysiol. 1985;62:203–208. [PubMed]
27. Courtney S M, Ungerleider L G, Keil K, Haxby J V. Cereb Cortex. 1996;6:39–49. [PubMed]
28. Zatorre R J, Evans A C, Meyer E, Gjedde A. Science. 1992;256:846–849. [PubMed]
29. Stevens A A, Goldman-Rakic P S, Gore J C, Fulbright R K, Wexler B E. Arch Gen Psychiatry. 1998;55:1097–1103. [PubMed]
30. Stricanne B, Andersen R A, Mazzoni P. J Neurophysiol. 1996;76:2071–2076. [PubMed]
31. Pinek B, Duhamel J R, Cave C, Brouchon M. Cortex. 1989;25:175–186. [PubMed]
32. Pinek B, Brouchon M. Brain Cognit. 1992;18:1–11. [PubMed]
33. Clarke S, Bellmann A, Meuli R A, Assal G, Steck A J. Neuropsychologia. 2000;38:797–807. [PubMed]
34. Rauschecker J P. Curr Opin Neurobiol. 1998;8:516–521. [PubMed]
35. Romanski L M, Bates J F, Goldman-Rakic P S. J Comp Neurol. 1999;403:141–157. [PubMed]
36. Kaas J H, Hackett T A, Tramo M J. Curr Opin Neurobiol. 1999;9:164–170. [PubMed]
37. Rauschecker J P, Tian B. Proc Natl Acad Sci USA. 2000;97:11800–11806. [PMC free article] [PubMed]
38. Haxby J V, Horwitz B, Ungerleider L G, Maisog J M, Pietrini P, Grady C L. J Neurosci. 1994;14:6336–6353. [PubMed]
39. Kohler S, Moscovitch M, Winocur G, Houle S, McIntosh A R. Neuropsychologia. 1998;36:129–142. [PubMed]
