Murray MM, Wallace MT, editors. The Neural Bases of Multisensory Processes. Boca Raton (FL): CRC Press/Taylor & Francis; 2012.

Chapter 6. Multisensory Influences on Auditory Processing

Perspectives from fMRI and Electrophysiology

Christoph Kayser, Christopher I. Petkov, Ryan Remedios, and Nikos K. Logothetis.

6.1. INTRODUCTION

Traditionally, perception has been described as a modular function, with the different sensory modalities operating as independent and separate processes. On this view, sensory integration supposedly occurs only after sufficient unisensory processing and only in higher association cortices (Jones and Powell 1970; Ghazanfar and Schroeder 2006). Studies in the past decade, however, promote a different view and demonstrate that the different modalities interact at early stages of processing (Kayser and Logothetis 2007; Schroeder and Foxe 2005; Foxe and Schroeder 2005). A good model for this early integration hypothesis has been the auditory cortex, where multisensory influences from vision and touch have been reported using a number of methods and experimental paradigms (Kayser et al. 2009c; Schroeder et al. 2003; Foxe and Schroeder 2005). In fact, anatomical afferents are available to provide information about nonacoustic stimuli (Rockland and Ojima 2003; Cappe and Barone 2005; Falchier et al. 2002), and neuronal responses showing cross-modal influences have been described in detail (Lakatos et al. 2007; Kayser et al. 2008, 2009a; Bizley et al. 2006). These novel insights, together with the traditional notion that multisensory processes are more prominent in higher association regions, suggest that sensory integration is a rather distributed process that emerges over several stages.

Of particular interest in the context of sensory integration are stimuli with particular behavioral significance, such as sights and sounds related to communication (Campanella and Belin 2007; Petrini et al. 2009; Ghazanfar and Logothetis 2003; von Kriegstein and Giraud 2006; von Kriegstein et al. 2006). Indeed, a famous scenario used to exemplify sensory integration—the cocktail party—concerns exactly this: when in a loud and noisy environment, we can better understand a person talking to us when we observe the movements of his/her lips at the same time (Sumby and Pollack 1954; Ross et al. 2007). In this situation, the visual information about lip movements enhances the (perceived) speech signal, hence providing an example of how visual information can enhance auditory perception. However, as for many psychophysical phenomena, the exact neural substrate mediating the sensory integration underlying this behavioral benefit remains elusive.

In this review, we discuss some of the results on early multisensory influences on auditory processing, and provide evidence that sensory integration occurs in a distributed manner across several processing stages. In particular, we discuss some of the methodological aspects relevant for studies seeking to localize and characterize multisensory influences, and emphasize some of the recent results pertaining to speech and voice integration.

6.2. THE WHERE AND HOW OF SENSORY INTEGRATION

To understand how the processing of acoustic information benefits from the stimulation of other modalities, we need to investigate “where” along auditory pathways influences from other modalities occur, and “how” they affect the neural representation of the sensory environment. Notably, the questions of “where” and “how” address different scales and levels of organization. Probing the “where” question requires the observation of sensory responses at many stages of processing, and hence a large spatial field of view. This is, for example, provided by functional imaging, which can assess signals related to neural activity in multiple brain regions at the same time. Probing the “how” question, in contrast, requires an investigation of the detailed neural representation of sensory information in localized regions of the brain. Given our current understanding of neural information processing, this level is best addressed by electrophysiological recordings that assess the responses of individual neurons, or small populations thereof, at the same time (Donoghue 2008; Kayser et al. 2009b; Quian Quiroga 2009).

These two approaches, functional imaging (especially functional magnetic resonance imaging (fMRI)-blood oxygenation level-dependent (BOLD) signal) and electrophysiology, complement each other not only with regard to the sampled spatiotemporal dimensions, but also with regard to the kind of neural activity that is seen by the method. Although electrophysiological methods sample neural responses at the timescale of individual action potentials (millisecond precision) and the spatial scale of micrometers, functional imaging reports an aggregate signal derived from (subthreshold) responses of millions of neurons sampled over several hundreds of micrometers and hundreds of milliseconds (Logothetis 2002, 2008; Lauritzen 2005). In fact, because the fMRI-BOLD signal is only indirectly related to neuronal activity, it is difficult, at least at the moment, to make detailed inferences about neuronal responses from imaging data (Leopold 2009). As a result, both methods provide complementary evidence on sensory integration.

In addition to defining methods needed to localize and describe sensory interactions, operational criteria are required to define what kind of response properties are considered multisensory influences. At the level of neurons, many criteria have been derived from seminal work on the superior colliculus by Stein and Meredith (1993). Considering an auditory neuron, as an example, visual influences would be assumed if the response to a bimodal (audiovisual) stimulus differs significantly from the unimodal (auditory) response. Although this criterion can be easily implemented as a statistical test to search for multisensory influences, it is, by itself, not sufficient to conclude that an observed process merits the label “sensory integration.” At the level of behavior, sensory integration is usually assumed if the bimodal sensory stimulus leads to a behavioral gain compared with the unimodal stimulus (Ernst and Bülthoff 2004). Typical behavioral gains are faster responses, higher detection rates, or improved stimulus discriminability. Often, these behavioral gains are highest when individual unimodal stimuli are least effective in eliciting responses, a phenomenon known as the principle of inverse effectiveness. In addition, different unimodal stimuli are only integrated when they are perceived to originate from the same source, i.e., when they occur coincident in space and time. Together, these two principles provide additional criteria to decide whether a particular neuronal process might be related to sensory integration (Stein 1998; Stein and Stanford 2008). This statistical criterion, in conjunction with the verification of these principles, has become the standard approach to detect neural processes related to sensory integration. In addition, recent work has introduced more elaborate concepts derived from information theory and stimulus decoding. Such methods can be used to investigate whether neurons indeed become more informative about the sensory stimuli, and whether they allow better stimulus discrimination in multisensory compared to unisensory conditions (Bizley et al. 2006; Bizley and King 2008; Kayser et al. 2009a).
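
As a concrete illustration of these operational criteria, the sketch below flags a visual influence on an auditory neuron and summarizes it with the classical enhancement and additivity measures. This is a hypothetical Python analysis: the trial structure, the t-test, and the significance threshold are our illustrative assumptions, not the exact procedures of the studies cited above.

```python
import numpy as np
from scipy import stats

def multisensory_effect(spk_a, spk_v, spk_av, alpha=0.05):
    """spk_*: 1-D arrays of spike counts per trial for auditory (A),
    visual (V), and audiovisual (AV) stimulation."""
    # Operational criterion: the bimodal response differs significantly
    # from the strongest unimodal response.
    best_uni = spk_a if spk_a.mean() >= spk_v.mean() else spk_v
    p = stats.ttest_ind(spk_av, best_uni).pvalue

    # Enhancement index: percent change of the bimodal response relative
    # to the best unimodal response.
    enhancement = 100.0 * (spk_av.mean() - best_uni.mean()) / best_uni.mean()

    # Additivity: does the bimodal response exceed the sum of the two
    # unimodal responses (superadditive) or not?
    additive = spk_a.mean() + spk_v.mean()
    regime = "superadditive" if spk_av.mean() > additive else "additive/subadditive"
    return p < alpha, enhancement, regime
```

Repeating such a test across stimulus efficacies and onset asynchronies is then one way to probe the principles of inverse effectiveness and spatiotemporal coincidence mentioned above.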

6.3. USING FUNCTIONAL IMAGING TO LOCALIZE MULTISENSORY INFLUENCES IN AUDITORY CORTEX

Functional imaging is by far the most popular method to study the cortical basis of sensory integration, and many studies report multisensory interactions between auditory, visual, and somatosensory stimulation in association cortices of the temporal and frontal lobes (Calvert 2001). In addition, a number of studies reported that visual or somatosensory stimuli activate regions in close proximity to the auditory cortex or enhance responses to acoustic stimuli in these regions (Calvert and Campbell 2003; Calvert et al. 1997, 1999; Pekkola et al. 2005; Lehmann et al. 2006; van Atteveldt et al. 2004; Schurmann et al. 2006; Bernstein et al. 2002; Foxe et al. 2002; Martuzzi et al. 2006; van Wassenhove et al. 2005). Together, these studies promoted the notion of early multisensory interactions in the auditory cortex.

However, the localization of multisensory influences is only as good as the localization of those structures relative to which the multisensory influences are defined. To localize multisensory effects to the auditory core (primary) or belt (secondary) fields, one needs to be confident about the location of these auditory structures in the respective subjects. Yet, this can be a problem given the small scale and variable position of auditory fields in individual subjects (Kaas and Hackett 2000; Hackett et al. 1998; Fullerton and Pandya 2007; Clarke and Rivier 1998; Chiry et al. 2003). One way to overcome this would be to first localize individual areas in each subject and to analyze functional data within these regions of interest. Visual studies often follow this strategy by mapping visual areas using retinotopically organized stimuli, which exploit the well-known functional organization of the visual cortex (Engel et al. 1994; Warnking et al. 2002). Auditory studies, in principle, could exploit a similar organization of auditory cortex, known as tonotopy, to define individual auditory fields (Rauschecker 1998; Rauschecker et al. 1995; Merzenich and Brugge 1973). In fact, electrophysiological studies have demonstrated that several auditory fields contain an ordered representation of sound frequency, with neurons preferring similar sound frequencies appearing in clusters and forming continuous bands encompassing the entire range from low to high frequencies (Merzenich and Brugge 1973; Morel et al. 1993; Kosaki et al. 1997; Recanzone et al. 2000). In addition, neurons in the auditory core and belt show differences in their preferences to narrow and broadband sounds, providing a second feature to distinguish several auditory fields (Rauschecker 1998; Rauschecker et al. 1997) (Figure 6.1a). Yet, although these properties in principle provide characteristics to differentiate individual auditory fields, this has proven surprisingly challenging in human fMRI studies (Wessinger et al. 2001; Formisano et al. 2003; Talavage et al. 2004).

FIGURE 6.1 (See color insert.) Mapping individual auditory fields using fMRI. (a) Schematic of the organization of monkey auditory cortex: three primary auditory fields (core region) are surrounded by secondary fields (belt region) as well as higher association areas (parabelt).

To sidestep these difficulties, we exploited high-resolution imaging in combination with a model system for which considerably more prior knowledge about the organization of the auditory cortex exists: the macaque monkey. This model system allows imaging voxel sizes on the order of 0.5 × 0.5 mm, whereas conventional human fMRI studies operate at a resolution of 3 × 3 mm (Logothetis et al. 1999). Much of the evidence about the anatomical and functional structure of the auditory cortex originates from this model system, providing important a priori information about the expected organization (Kaas and Hackett 2000; Hackett et al. 1998; Rauschecker and Tian 2004; Recanzone et al. 2000). Combining this a priori knowledge with high-resolution imaging systems as well as data acquisition optimized for auditory paradigms, we were able to obtain a tonotopic functional parcellation in individual animals (Petkov et al. 2006, 2009). By comparing the activation to stimulation with sounds of different frequency compositions, we obtained a smoothed frequency preference map, which allowed us to determine the anterior–posterior borders of putative fields. In addition, the preference to sounds of different bandwidths often allowed a segregation of core and belt fields, hence providing borders in medial–lateral directions. When combined with the known organization of auditory cortex, the evidence from these activation patterns allowed a more complete parcellation into distinct core and belt fields, and provided constraints for the localization of the parabelt regions (Figure 6.1b). This functional localization procedure for auditory fields now serves as a routine tool to delineate auditory structures in experiments involving auditory cortex.
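
The mapping logic itself is compact, as the following sketch shows: assign each voxel its preferred frequency band and smooth the result, so that low-to-high frequency gradients and their reversals, which mark the anterior–posterior field borders, become visible. Array shapes, stimulus frequencies, and the smoothing width are illustrative assumptions, not the acquisition parameters of the studies above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# One activation map per band-passed stimulus: a (n_freqs, nx, ny) array.
# Random placeholder data stands in for per-condition response estimates.
freqs_khz = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
responses = np.random.rand(len(freqs_khz), 64, 64)

best_idx = responses.argmax(axis=0)           # preferred band per voxel
best_freq_map = np.log2(freqs_khz)[best_idx]  # preference on an octave scale

# Smoothing yields the continuous frequency-preference map from which
# gradient reversals (candidate field borders) are read off.
smoothed_map = gaussian_filter(best_freq_map, sigma=2.0)
```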

6.4. MULTISENSORY INFLUENCES ALONG THE AUDITORY PROCESSING STREAM

In search of a better localization of the multisensory influences in the auditory cortex reported by human imaging studies, we combined the above localization technique with audiovisual and audio-tactile stimulation paradigms (Kayser et al. 2005, 2007). To localize multisensory influences, we searched for regions (voxels) in which responses to acoustic stimuli were significantly enhanced when a visual stimulus was presented at the same time. Because functional imaging poses particular constraints on statistical contrasts (Laurienti et al. 2005), we used a conservative formulation of this criterion in which multisensory influences are defined as significant superadditive effects, i.e., the response in the bimodal condition is required to be significantly stronger than the sum of the two unisensory responses: AV > (A + V). In our experiments, we employed naturalistic stimuli in order to activate those regions especially involved in the processing of everyday scenarios. These stimuli included scenes of conspecific animals vocalizing as well as scenes showing other animals in their natural settings.
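
In code, this conservative criterion might look as follows. The sketch assumes per-run voxelwise response estimates (e.g., GLM betas); the paired one-sided t-test and the threshold are our illustrative choices, not necessarily those of the original analyses.

```python
import numpy as np
from scipy import stats

def superadditive_voxels(beta_a, beta_v, beta_av, alpha=0.001):
    """beta_*: (n_runs, n_voxels) arrays of response amplitudes for the
    auditory, visual, and audiovisual conditions."""
    # Superadditivity contrast per voxel: AV - (A + V) > 0.
    diff = beta_av - (beta_a + beta_v)
    t, p = stats.ttest_1samp(diff, popmean=0.0, axis=0)
    # One-sided test: keep voxels where AV significantly exceeds the
    # additive prediction.
    return (t > 0) & (p / 2 < alpha)
```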

In concordance with previous reports, we found that visual stimuli indeed influence fMRI responses to acoustic stimuli within the classical auditory cortex. These visual influences were strongest in the caudal portions of the auditory cortex, especially in the caudomedial and caudolateral belt, portions of the medial belt, and the caudal parabelt (Figure 6.2a and b). These multisensory interactions in secondary and higher auditory regions occurred reliably in both anesthetized and alert animals. In addition, we found multisensory interactions in the core region A1, but only in the alert animal, indicating that these early interactions could be dependent on the vigilance of the animal, perhaps involving cognitive or top-down influences. To rule out nonspecific modulatory projections as the source of these effects, we tested two functional criteria of sensory integration: the principles of temporal coincidence and inverse effectiveness. We found both criteria to be obeyed, and multisensory influences were stronger when sensory stimuli were in temporal coincidence and when unisensory stimuli were less effective in eliciting BOLD responses. Overall, these findings not only confirm previous results from human imaging, but also localize multisensory influences mostly to secondary fields and demonstrate a clear spatial organization, with caudal regions being most susceptible to multisensory inputs (Kayser et al. 2009c).
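
Inverse effectiveness, for example, can be quantified by relating the multisensory gain to the strength of the corresponding unisensory response, as in this sketch. The correlation-based formalization is one plausible choice of ours, not necessarily the analysis used in the study.

```python
import numpy as np
from scipy import stats

def inverse_effectiveness(resp_a, resp_av):
    """resp_a, resp_av: arrays of auditory and audiovisual responses
    across voxels, recording sites, or stimulus efficacies."""
    gain = (resp_av - resp_a) / resp_a
    # A negative correlation between unisensory efficacy and multisensory
    # gain is consistent with the principle of inverse effectiveness.
    r, p = stats.pearsonr(resp_a, gain)
    return r, p
```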

FIGURE 6.2 (See color insert.) Imaging multisensory influences in monkey auditory cortex. (a) Data from an experiment with audiovisual stimulation: sensory activation to auditory (left) and visual (right) stimuli shown on single image slices (red to yellow voxels).

In addition to providing a good localization of cross-modal influences (the “where” question), functional imaging can also shed light on the relative influence of visual stimuli on auditory processing at several processing stages. Because fMRI allows measuring responses at many locations at the same time, we were able to quantify visual influences along multiple stages in the caudal auditory network (Figure 6.2c). Using the above-mentioned localization technique in conjunction with anatomical landmarks, we defined several regions of interest outside the classical auditory cortex: these comprised the caudal parabelt, the superior temporal gyrus (STG), as well as the upper bank of the superior temporal sulcus (uSTS). The uSTS is a well-known multisensory area where neuronal responses as well as fMRI activations to stimulation of several modalities have been described (Benevento et al. 1977; Bruce et al. 1981; Beauchamp et al. 2004, 2008; Dahl et al. 2009). Accordingly, one should expect a corresponding increase in visual influence when proceeding from the auditory core to the uSTS. This was indeed the case, as shown in Figure 6.2d: visual influences were relatively small in auditory core and belt fields, as described above. In the parabelt/STG region, an auditory association cortex, visual influences already contributed a considerable proportion of the total activation, and were stronger still in the uSTS. As a rule of thumb, the contribution of visual stimuli to the total measured activation roughly doubled from stage to stage along this hierarchy.
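
The regionwise comparison in Figure 6.2d rests on a simple index: the fraction of the total bimodal activation attributable to the visual stimulus. A hypothetical implementation follows; the function name and inputs are placeholders, not the study's measured values.

```python
import numpy as np

def visual_contribution(beta_av, beta_a):
    """Fraction of the bimodal (AV) activation not accounted for by the
    auditory-alone (A) response, averaged over an ROI's voxels."""
    return float(np.mean((beta_av - beta_a) / beta_av))

# Applied to each region in turn (core, belt, parabelt/STG, uSTS), this
# index would be expected to roughly double from stage to stage.
```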

Although human functional imaging has described multisensory influences at different stages of auditory processing, and in a number of behavioral contexts, imaging studies in the animal model localized these influences to identified auditory fields. These results promote a model in which multisensory influences already exist at early processing stages and progressively increase in higher areas. This suggests that sensory integration is a distributed process involving several processing stages to varying degrees, in opposition to the traditional idea of a modular organization of sensory processing into independent unisensory modules.

6.5. MULTISENSORY INFLUENCES AND INDIVIDUAL NEURONS

Having localized multisensory influences to particular auditory fields, the obvious question arises of whether and how nonauditory inputs improve the processing of acoustic information. As noted above, this “how” question is ideally investigated using electrophysiological methods, for two reasons. First, the imaging signal reflects neuronal activity only indirectly and does not permit definite conclusions about the underlying neuronal processes (Logothetis 2008; Kayser et al. 2009c; Laurienti et al. 2005). And second, electrophysiology can directly address those parameters that are believed to be relevant for neural information processing, such as the spike count of individual neurons, temporal patterns of action potentials, or the synchronous firing of several neurons (Kayser et al. 2009b).

Several electrophysiological studies have characterized multisensory influences in the auditory cortex. Especially at the level of subthreshold activity, as defined by field potentials and current source densities, strong visual or somatosensory influences were reported (Ghazanfar et al. 2005, 2008; Lakatos et al. 2007; Schroeder and Foxe 2002; Schroeder et al. 2001, 2003). These multisensory influences were widespread, in that they occurred at the vast majority of recording sites in each of these studies. In addition, these multisensory influences were not restricted to secondary areas but also occurred in regions functionally and anatomically characterized as primary auditory cortex (Kayser et al. 2008; Lakatos et al. 2007). Given that field potentials are especially sensitive to synaptic activity in the vicinity of the electrode (Mitzdorf 1985; Juergens et al. 1999; Logothetis 2002), these observations demonstrate that multisensory input to the auditory cortex occurs at the synaptic level. These results provide a direct neural basis for the multisensory influences seen in imaging studies, but do not yet reveal whether the neural information representation benefits from the multisensory input.

Other studies provide evidence for multisensory influences on the firing of individual neurons in the auditory cortex. For example, measurements in ferret auditory cortex revealed that 15% of the neurons in core fields are sensitive to nonauditory inputs such as flashes of light (Bizley et al. 2006; and see Cappe et al. 2007 for similar results in monkeys). We investigated such visual influences in the macaque and found that a similar proportion (12%) of neurons in the auditory core revealed multisensory interactions in their firing rates. Of these, nearly 4% responded to both acoustic and visual stimuli when presented individually, and hence constitute bimodal neurons. The remaining 8% responded to unimodal sounds but did not respond to unimodal visual stimuli; however, their responses were enhanced (or reduced) by the simultaneous presentation of both stimuli. This response pattern does not conform to the traditional notion of bimodal neurons but represents a kind of multisensory influence typically called subthreshold response modulation (Dehner et al. 2004). Similar subthreshold response modulations have been observed in a number of cortical areas (Allman et al. 2008a, 2008b; Allman and Meredith 2007; Meredith and Allman 2009), and suggest that multisensory influences can fall along a continuum, ranging from true unimodal neurons to the classical bimodal neuron that exhibits suprathreshold responses to stimuli in several modalities.
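
The distinction between bimodal and subthreshold-modulated units can be cast as a simple decision rule, sketched below. The baseline comparison, the test, and the threshold are assumptions for illustration, not the exact criteria of the studies cited.

```python
import numpy as np
from scipy import stats

def classify_unit(base, resp_a, resp_v, resp_av, alpha=0.05):
    """base, resp_*: per-trial spike counts for a baseline period and for
    the auditory, visual, and audiovisual conditions."""
    drives_a = stats.ttest_ind(resp_a, base).pvalue < alpha
    drives_v = stats.ttest_ind(resp_v, base).pvalue < alpha
    av_modulated = stats.ttest_ind(resp_av, resp_a).pvalue < alpha

    if drives_a and drives_v:
        return "bimodal"                 # suprathreshold in both modalities
    if drives_a and av_modulated:
        return "subthreshold-modulated"  # auditory only, but AV differs from A
    if drives_a:
        return "unimodal auditory"
    return "other/unresponsive"
```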

Notably, the fraction of neurons with significant multisensory influences in the auditory cortex was considerably smaller than the fraction of sites showing similar response properties in the local field potential (LFP), or the spatial area covered by the voxels showing multisensory responses in the imaging data. Hence, although visual input seems to be widely present at the subthreshold level, only a minority of neurons actually exhibit significant changes in their firing rates. This suggests that the effect of visual stimulation on auditory information coding in early auditory cortex is weaker than one would estimate from the strong multisensory influences reported in imaging studies.

When testing the principles of temporal coincidence and inverse effectiveness for these auditory cortex neurons, we found both to be obeyed: the relative timing of auditory and visual stimuli was as important in shaping the multisensory influence as was the efficacy of the acoustic stimulus (Kayser et al. 2008). Similar constraints of spatiotemporal stimulus alignment on audiovisual response modulations in the auditory cortex have been observed in other studies as well (Bizley et al. 2006). Additional experiments using either semantically congruent or incongruent audiovisual stimuli revealed that visual influences in the auditory cortex also show specificity to more complex stimulus attributes. For example, neurons integrating information about audiovisual communication signals revealed reduced visual modulation when the acoustic communication call was paired with a moving disk instead of the movie displaying the conspecific animal (Ghazanfar et al. 2008). A recent study also revealed that pairing a natural sound with a mismatching movie abolishes multisensory benefits for acoustic information representations (Kayser et al. 2009a). Altogether, this suggests that visual influences in the primary and secondary auditory fields indeed provide functionally specific visual information.

Given that imaging studies reveal an increase of multisensory influence in higher auditory regions, one should expect a concomitant increase in the proportion of multisensory neurons. Indeed, when probing neurons in a classical association cortex, such as the STS, much stronger multisensory influences are visible in the neurons' firing. Using the same stimuli and statistical criteria, a recent study revealed a rather homogeneous population of unimodal and bimodal neurons in the upper bank of the STS (Dahl et al. 2009): about half the neurons responded significantly to both sensory modalities, whereas 28% of the neurons preferred the visual and 19% preferred the auditory modality. Importantly, this study not only revealed a more complex interplay of auditory and visual information representations in this region, but detailed electrophysiological mappings also demonstrated that a spatial organization of neurons according to their modality preferences exists in the STS: neurons preferring the same modality (auditory or visual) co-occurred in close spatial proximity or occurred intermingled with bimodal neurons, whereas neurons preferring different modalities occurred only spatially separated. This organization at the scale of individual neurons led to extended patches of same modality preference when analyzed at the scale of millimeters, revealing large-scale regions that preferentially respond to the same modality. These results lend support to the notion that topographical organizations might serve as a general principle of integrating information within and across the sensory modalities (Beauchamp et al. 2004; Wallace et al. 2004).
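
The spatial analysis behind this observation can be outlined as follows: assign each site a modality preference and ask whether neighboring sites share that preference more often than expected by chance. This is a toy sketch on random placeholder data; the grid layout and cutoffs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Modality preference index per site, (A - V) / (A + V), on a 10 x 10 grid.
pref = rng.uniform(-1.0, 1.0, size=(10, 10))
labels = np.where(pref > 0.2, "A", np.where(pref < -0.2, "V", "bimodal"))

# Agreement between horizontally and vertically adjacent sites; on real
# data this would be compared against a label-shuffled null distribution.
agree_h = (labels[:, :-1] == labels[:, 1:]).mean()
agree_v = (labels[:-1, :] == labels[1:, :]).mean()
print(f"adjacent-pair agreement: {(agree_h + agree_v) / 2:.2f}")
```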

These insights from studies of multisensory integration at the neuronal level are in concordance with the notion that sensory integration is a distributed hierarchical process that extends over several processing stages. Given the difficulty in characterizing and interpreting the detailed effect of multisensory influences at a single processing stage, a comparative approach might prove useful: comparing multisensory influences at different stages using the same stimuli might help not only in understanding the contribution of individual stages to the process of sensory integration, but also facilitate the understanding of the exact benefit for a particular region to receive multisensory input.

6.6. MULTISENSORY INFLUENCES AND PROCESSING OF COMMUNICATION SIGNALS

The above findings clearly reveal that the processing of auditory information is modulated by visual (or somatosensory) information already at processing stages in or close to the primary auditory cortex. Notably, these cross-modal influences were seen not only in the context of naturalistic stimuli, but also for very simple and artificial stimuli. For example, visual influences on neuronal firing rates occurred when using flashes of light, short noise bursts, or very rapid somatosensory stimulation (Bizley et al. 2006; Lakatos et al. 2007; Kayser et al. 2008; Cappe et al. 2007; Bizley and King 2008). This suggests that multisensory influences in early auditory fields are not specialized for natural stimuli such as communication sounds, but rather reflect a more general process that is sensitive to basic stimulus attributes such as relative timing and relative position rather than to semantic attributes.

To those especially interested in the neural basis of communication or speech, this poses the immediate question of where in the brain multisensory influences are specialized for such stimuli and mediate the well-known behavioral benefits of integration. As seen above, the cocktail party effect—the integration of face and voice information—serves as one of the key examples illustrating the importance of audiovisual integration; the underlying neural substrate, however, remains elusive. One approach to elucidate this could be to focus on those cortical regions in which neural processes directly related to the processing of communication sounds have been reported. Besides the classical speech areas of the human brain, a number of areas have been implicated in the nonhuman primate: response preferences to conspecific vocalizations have been reported in the lateral belt (Tian et al. 2001), the insular cortex (Remedios et al. 2009), in a voice area on the anterior temporal plane (Petkov et al. 2008), and in the ventrolateral prefrontal cortex (Cohen et al. 2007; Romanski et al. 2005). Notably, several of these stages have not only been investigated in the context of purely auditory processing but have also been assessed for audiovisual integration.

The lateral belt is one of the regions classically implicated in an auditory “what” pathway concerned with the processing of acoustic object information (Romanski et al. 1999; Rauschecker and Tian 2000). The process of object segmentation or identification could well benefit from input from other modalities. Indeed, studies have reported that audiovisual interactions in the lateral belt are widespread at the level of LFPs and include about 40% of the recorded units (Ghazanfar et al. 2005). In fact, the multisensory influences in this region were found to depend on stimulus parameters such as the face–voice onset asynchrony or the match of visual and acoustic vocalizations, suggesting a good degree of specificity of the visual input. At the other end of this pathway, in the ventrolateral prefrontal cortex, 46% of the neurons were found to reflect audiovisual components of vocalization signals (Sugihara et al. 2006). Although the existence of a dedicated “what” pathway is still debated (Bizley and Walker 2009; Hall 2003; Wang 2000), these results highlight the prominence of multisensory influences in the implicated areas.

In addition to these stages of the presumed “what” pathway, two other regions have recently been highlighted in the context of vocal communication sounds. Recording in the primate insula, we recently found a large cluster of neurons that respond preferentially to conspecific vocalizations when contrasted with a large set of other natural sounds (Remedios et al. 2009) (Figure 6.3a). Many of these neurons not only responded more strongly to conspecific vocalizations, but also responded selectively to only a few examples, and their responses allowed the decoding of the identity of individual vocalizations. This suggests that the insular cortex might play an important role in the representation of vocal communication sounds. Notably, this response preference to conspecific vocalizations is also supported by functional imaging studies in animals (Figure 6.3b) and humans (Griffiths et al. 1997; Rumsey et al. 1997; Kotz et al. 2003; Meyer et al. 2002; Zatorre et al. 1994). In addition, lesions of the insula often manifest as deficits in sound or speech recognition (auditory agnosia) and speech production, confirming a central function of this structure in communication-related processes (Habib et al. 1995; Cancelliere and Kertesz 1990; Engelien et al. 1995). Notably, some of the neurons in this auditory responsive region of the insula also show sensitivity to visual stimuli or response interactions during audiovisual stimulation (R. Remedios and C. Kayser, unpublished data). However, the vast majority of units in this structure are not affected by visual stimuli, suggesting that this region is likely not concerned with the sensory integration of information related to communication calls, but mostly processes acoustic input.

FIGURE 6.3 (See color insert.) Response preferences to (vocal) communication sounds. Preferences to conspecific communication sounds have been found in the insula (panels a and b) and in the anterior temporal lobe (panel c).

Another region that has recently been implicated in the processing of communication sounds is the so-called voice region in the anterior temporal lobe. A preference for the human voice, in particular the identity of a human speaker, has been found in the human anterior temporal lobe (Belin and Zatorre 2003; Belin et al. 2000; von Kriegstein et al. 2003), and a similar preference for conspecific vocalizations and the identity of a monkey caller has been observed in the anterior temporal lobe of the nonhuman primate (Petkov et al. 2008). For example, high-resolution functional imaging revealed several regions in the superior temporal lobe responding preferentially to the presentation of conspecific macaque vocalizations over other vocalizations and natural sounds (see the red clusters in the middle panel of Figure 6.3c), as has been seen in humans (Belin et al. 2000; von Kriegstein et al. 2003). These results can be interpreted as evidence for sensitivity to the acoustic features that distinguish the vocalizations of members of the species from other sounds. Further experiments have shown that one of these regions, located in the anterior temporal lobe, responds more vigorously to sounds from different speakers whose meaning is constant than to sounds from the same speaker whose meaning and acoustics vary (Belin and Zatorre 2003; von Kriegstein et al. 2003; Petkov et al. 2008). These observations support the conclusion of a high-level correspondence in the processing of species-specific vocal features and a common cross-species substrate in the brains of human and nonhuman primates.

Notably, this human voice region can also be influenced by multisensory input. For instance, von Kriegstein and colleagues (2006) used face and voice stimuli to first localize the human “face” and “voice” selective regions. They then showed that the activity of each of these regions was modulated by multisensory input. Comparable evidence from the animal model is still unavailable; ongoing work in our laboratory is pursuing this question (Perrodin et al. 2009a, 2009b).

6.7. CONCLUSIONS

During everyday actions, we benefit tremendously from the combined input provided by our different sensory modalities. Although seldom experienced explicitly, only this combined sensory input makes an authentic and coherent percept of our environment possible (Adrian 1928; Stein and Meredith 1993). In fact, multisensory integration helps us to react faster or with higher precision (Calvert et al. 2004; Hershenson 1962), improves our learning capacities (Montessori 1967; Oakland et al. 1998), and sometimes even completely alters our percept (McGurk and MacDonald 1976). As a result, understanding sensory integration and its neural basis not only provides insights into brain function and perception, but could also yield improved strategies for learning and rehabilitation programs (Shams and Seitz 2008).

Evidence from functional imaging and electrophysiology demonstrates that this process of sensory integration is likely distributed across multiple processing stages. Multisensory influences are already present at early stages, such as in the primary auditory cortex, but increase along the processing hierarchy and are ubiquitous in higher association cortices. Existing data suggest that multisensory influences at early stages are specific to basic stimulus characteristics such as spatial and temporal localization, but are not specialized toward particular kinds of stimuli, such as communication signals. Whether, where, and how multisensory influences become more specialized remains to be investigated by future work. In this search, a comparative approach comparing the multisensory influences at multiple processing stages during the same stimulation paradigm might prove especially useful. As highlighted here, such comparisons are ideally performed using a combination of methods that probe neural responses at different spatiotemporal scales, such as electrophysiology and functional imaging. Certainly, much remains to be learned before we fully understand the neural basis underlying the behavioral gains provided by multisensory stimuli.

REFERENCES

  1. Adrian E.D. The Basis of Sensations. New York: Norton; 1928.
  2. Allman B.L, Meredith M.A. Multisensory processing in “unimodal” neurons: Cross-modal subthreshold auditory effects in cat extrastriate visual cortex. Journal of Neurophysiology. 2007;98:545–9. [PubMed: 17475717]
  3. Allman B.L, Bittencourt-Navarrete R.E, Keniston L.P, et al. Do cross-modal projections always result in multisensory integration? Cerebral Cortex. 2008a;18:2066–76. [PMC free article: PMC2517110] [PubMed: 18203695]
  4. Allman B.L, Keniston L.P, Meredith M.A. Subthreshold auditory inputs to extrastriate visual neurons are responsive to parametric changes in stimulus quality: Sensory-specific versus non-specific coding. Brain Research. 2008b;1242:95–101. [PMC free article: PMC2645081] [PubMed: 18479671]
  5. Beauchamp M.S, Argall B.D, Bodurka J, Duyn J.H, Martin A. Unraveling multisensory integration: Patchy organization within human STS multisensory cortex. Nature Neuroscience. 2004;7:1190–2. [PubMed: 15475952]
  6. Beauchamp M.S, Yasar N.E, Frye R.E, Ro T. Touch, sound and vision in human superior temporal sulcus. NeuroImage. 2008;41:1011–20. [PMC free article: PMC2409200] [PubMed: 18440831]
  7. Belin P, Zatorre R.J. Adaptation to speaker’s voice in right anterior temporal lobe. Neuroreport. 2003;14:2105–9. [PubMed: 14600506]
  8. Belin P, Zatorre R.J, Lafaille P, Ahad P, Pike B. Voice-selective areas in human auditory cortex. Nature. 2000;403:309–12. [PubMed: 10659849]
  9. Benevento L.A, Fallon J, Davis B.J, Rezak M. Auditory–visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey. Experimental Neurology. 1977;57:849–72. [PubMed: 411682]
  10. Bernstein L.E, Auer E.T Jr, Moore J.K, et al. Visual speech perception without primary auditory cortex activation. Neuroreport. 2002;13:311–5. [PubMed: 11930129]
  11. Bizley J.K, King A.J. Visual–auditory spatial processing in auditory cortical neurons. Brain Research. 2008;1242:24–36. [PMC free article: PMC4340571] [PubMed: 18407249]
  12. Bizley J.K, Walker K.M. Distributed sensitivity to conspecific vocalizations and implications for the auditory dual stream hypothesis. Journal of Neuroscience. 2009;29:3011–3. [PMC free article: PMC6666462] [PubMed: 19279236]
  13. Bizley J.K, Nodal F.R, Bajo V.M, Nelken I, King A.J. Physiological and anatomical evidence for multisensory interactions in auditory cortex. Cerebral Cortex. 2006;17:2172–89. [PubMed: 17135481]
  14. Bruce C, Desimone R, Gross C.G. Visual properties of neurons in a polysensory area in superior temporal sulcus of the macaque. Journal of Neurophysiology. 1981;46:369–84. [PubMed: 6267219]
  15. Calvert G.A. Crossmodal processing in the human brain: Insights from functional neuroimaging studies. Cerebral Cortex. 2001;11:1110–23. [PubMed: 11709482]
  16. Calvert G.A, Campbell R. Reading speech from still and moving faces: The neural substrates of visible speech. Journal of Cognitive Neuroscience. 2003;15:57–70. [PubMed: 12590843]
  17. Calvert G.A, Brammer M.J, Bullmore E.T, et al. Response amplification in sensory-specific cortices during crossmodal binding. Neuroreport. 1999;10:2619–23. [PubMed: 10574380]
  18. Calvert G, Spence C, Stein B.E. The Handbook of Multisensory Processes. Cambridge: MIT Press; 2004.
  19. Calvert G.A, Bullmore E.T, Brammer M.J, et al. Activation of auditory cortex during silent lipreading. Science. 1997;276:593–6. [PubMed: 9110978]
  20. Campanella S, Belin P. Integrating face and voice in person perception. Trends in Cognitive Sciences. 2007;11:535–43. [PubMed: 17997124]
  21. Cancelliere A.E, Kertesz A. Lesion localization in acquired deficits of emotional expression and comprehension. Brain and Cognition. 1990;13:133–47. [PubMed: 1697174]
  22. Cappe C, Barone P. Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey. European Journal of Neuroscience. 2005;22:2886–902. [PubMed: 16324124]
  23. Cappe C, Loquet G, Barone P, Rouiller E.M. Neuronal responses to visual stimuli in auditory cortical areas of monkeys performing an audio-visual detection task. European Brain and Behaviour Society Meeting, Trieste; 2007.
  24. Chiry O, Tardif E, Magistretti P.J, Clarke S. Patterns of calcium-binding proteins support parallel and hierarchical organization of human auditory areas. European Journal of Neuroscience. 2003;17:397–410. [PubMed: 12542677]
  25. Clarke S, Rivier F. Compartments within human primary auditory cortex: Evidence from cytochrome oxidase and acetylcholinesterase staining. European Journal of Neuroscience. 1998;10:741–5. [PubMed: 9749735]
  26. Cohen Y.E, Theunissen F, Russ B.E, Gill P. Acoustic features of rhesus vocalizations and their representation in the ventrolateral prefrontal cortex. Journal of Neurophysiology. 2007;97:1470–84. [PubMed: 17135477]
  27. Dahl C, Logothetis N, Kayser C. Spatial organization of multisensory responses in temporal association cortex. Journal of Neuroscience. 2009;29:11924–32. [PMC free article: PMC6666661] [PubMed: 19776278]
  28. Dehner L.R, Keniston L.P, Clemo H.R, Meredith M.A. Cross-modal circuitry between auditory and somatosensory areas of the cat anterior ectosylvian sulcal cortex: A ‘new’ inhibitory form of multisensory convergence. Cerebral Cortex. 2004;14:387–403. [PubMed: 15028643]
  29. Engel S.A, Rumelhart D.E, Wandell B.A, et al. fMRI of human visual cortex. Nature. 1994;369:525. [PubMed: 8031403]
  30. Engelien A, Silbersweig D, Stern E, et al. The functional anatomy of recovery from auditory agnosia. A PET study of sound categorization in a neurological patient and normal controls. Brain. 1995;118(Pt 6):1395–409. [PubMed: 8595472]
  31. Ernst M.O, Bülthoff H.H. Merging the senses into a robust percept. Trends in Cognitive Science. 2004;8:162–9. [PubMed: 15050512]
  32. Falchier A, Clavagnier S, Barone P, Kennedy H. Anatomical evidence of multimodal integration in primate striate cortex. Journal of Neuroscience. 2002;22:5749–59. [PubMed: 12097528]
  33. Formisano E, Kim D.S, Di Salle F, et al. Mirror-symmetric tonotopic maps in human primary auditory cortex. Neuron. 2003;40:859–69. [PubMed: 14622588]
  34. Foxe J.J, Schroeder C.E. The case for feedforward multisensory convergence during early cortical processing. Neuroreport. 2005;16:419–23. [PubMed: 15770144]
  35. Foxe J.J, Wylie G.R, Martinez A, et al. Auditory–somatosensory multisensory processing in auditory association cortex: An fMRI study. Journal of Neurophysiology. 2002;88:540–3. [PubMed: 12091578]
  36. Fullerton B.C, Pandya D.N. Architectonic analysis of the auditory-related areas of the superior temporal region in human brain. Journal of Comparative Neurology. 2007;504:470–98. [PubMed: 17701981]
  37. Ghazanfar A.A, Chandrasekaran C, Logothetis N.K. Interactions between the superior temporal sulcus and auditory cortex mediate dynamic face/voice integration in rhesus monkeys. Journal of Neuroscience. 2008;28:4457–69. [PMC free article: PMC2663804] [PubMed: 18434524]
  38. Ghazanfar A.A, Logothetis N.K. Neuroperception: Facial expressions linked to monkey calls. Nature. 2003;423:937–8. [PubMed: 12827188]
  39. Ghazanfar A.A, Schroeder C.E. Is neocortex essentially multisensory? Trends in Cognitive Sciences. 2006;10:278–85. [PubMed: 16713325]
  40. Ghazanfar A.A, Maier J.X, Hoffman K.L, Logothetis N.K. Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. Journal of Neuroscience. 2005;25:5004–12. [PMC free article: PMC6724848] [PubMed: 15901781]
  41. Griffiths T.D, Rees A, Witton C, et al. Spatial and temporal auditory processing deficits following right hemisphere infarction. A psychophysical study. Brain. 1997;120(Pt 5):785–94. [PubMed: 9183249]
  42. Habib M, Daquin G, Milandre L, et al. Mutism and auditory agnosia due to bilateral insular damage—role of the insula in human communication. Neuropsychologia. 1995;33:327–39. [PubMed: 7791999]
  43. Hackett T.A, Stepniewska I, Kaas J.H. Subdivisions of auditory cortex and ipsilateral cortical connections of the parabelt auditory cortex in macaque monkeys. Journal of Comparative Neurology. 1998;394:475–95. [PubMed: 9590556]
  44. Hall D.A. Auditory pathways: Are ‘what’ and ‘where’ appropriate? Current Biology. 2003;13:R406–8. [PubMed: 12747854]
  45. Hershenson M. Reaction time as a measure of intersensory facilitation. Journal of Experimental Psychology. 1962;63:289–93. [PubMed: 13906889]
  46. Jones E.G, Powell T.P. An anatomical study of converging sensory pathways within the cerebral cortex of the monkey. Brain. 1970;93:793–820. [PubMed: 4992433]
  47. Juergens E, Guettler A, Eckhorn R. Visual stimulation elicits locked and induced gamma oscillations in monkey intracortical- and EEG-potentials, but not in human EEG. Experimental Brain Research. 1999;129:247–59. [PubMed: 10591899]
  48. Kaas J.H, Hackett T.A. Subdivisions of auditory cortex and processing streams in primates. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:11793–9. [PMC free article: PMC34351] [PubMed: 11050211]
  49. Kayser C, Logothetis N.K. Do early sensory cortices integrate cross-modal information? Brain Structure and Function. 2007;212:121–32. [PubMed: 17717687]
  50. Kayser C, Petkov C.I, Augath M, Logothetis N.K. Integration of touch and sound in auditory cortex. Neuron. 2005;48:373–84. [PubMed: 16242415]
  51. Kayser C, Petkov C.I, Augath M, Logothetis N.K. Functional imaging reveals visual modulation of specific fields in auditory cortex. Journal of Neuroscience. 2007;27:1824–35. [PMC free article: PMC6673538] [PubMed: 17314280]
  52. Kayser C, Petkov C.I, Logothetis N.K. Visual modulation of neurons in auditory cortex. Cerebral Cortex. 2008;18:1560–74. [PubMed: 18180245]
  53. Kayser C, Logothetis N, Panzeri S. Visual enhancement of the information representation in auditory cortex. Current Biology (in press). 2009a [PubMed: 20036538]
  54. Kayser C, Montemurro M.A, Logothetis N, Panzeri S. Spike-phase coding boosts and stabilizes the information carried by spatial and temporal spike patterns. Neuron. 2009b;61:597–608. [PubMed: 19249279]
  55. Kayser C, Petkov C.I, Logothetis N.K. Multisensory interactions in primate auditory cortex: fMRI and electrophysiology. Hearing Research (in press). 2009c. doi:10.1016/j.heares.2009.02.011. [PubMed: 19269312]
  56. Kosaki H, Hashikawa T, He J, Jones E.G. Tonotopic organization of auditory cortical fields delineated by parvalbumin immunoreactivity in macaque monkeys. Journal of Comparative Neurology. 1997;386:304–16. [PubMed: 9295154]
  57. Kotz S.A, Meyer M, Alter K, et al. On the lateralization of emotional prosody: An event-related functional MR investigation. Brain and Language. 2003;86:366–76. [PubMed: 12972367]
  58. Lakatos P, Chen C.M, O’Connell M.N, Mills A, Schroeder C.E. Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron. 2007;53:279–92. [PMC free article: PMC3717319] [PubMed: 17224408]
  59. Laurienti P.J, Perrault T.J, Stanford T.R, Wallace M.T, Stein B.E. On the use of superadditivity as a metric for characterizing multisensory integration in functional neuroimaging studies. Experimental Brain Research. 2005;166:289–97. [PubMed: 15988597]
  60. Lauritzen M. Reading vascular changes in brain imaging: Is dendritic calcium the key? Nature Neuroscience Reviews. 2005;6(1):77–85. [PubMed: 15611729]
  61. Lehmann C, Herdener M, Esposito F, et al. Differential patterns of multisensory interactions in core and belt areas of human auditory cortex. NeuroImage. 2006;31:294–300. [PubMed: 16473022]
  62. Leopold D.A. Neuroscience: Pre-emptive blood flow. Nature. 2009;457:387–8. [PMC free article: PMC2715363] [PubMed: 19158777]
  63. Logothetis N.K. The neural basis of the blood-oxygen-level-dependent functional magnetic resonance imaging signal. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. 2002;357:1003–37. [PMC free article: PMC1693017] [PubMed: 12217171]
  64. Logothetis N.K. What we can do and what we cannot do with fMRI. Nature. 2008;453:869–78. [PubMed: 18548064]
  65. Logothetis N.K, Guggenberger H, Peled S, Pauls J. Functional imaging of the monkey brain. Nature Neuroscience. 1999;2:555–62. [PubMed: 10448221]
  66. Martuzzi R, Murray M.M, Michel C.M, et al. Multisensory interactions within human primary cortices revealed by BOLD dynamics. Cerebral Cortex. 2006;17:1672–9. [PubMed: 16968869]
  67. McGurk H, MacDonald J. Hearing lips and seeing voices. Nature. 1976;264:746–8. [PubMed: 1012311]
  68. Meredith M.A, Allman B.L. Subthreshold multisensory processing in cat auditory cortex. Neuroreport. 2009;20:126–31. [PMC free article: PMC2839368] [PubMed: 19057421]
  69. Merzenich M.M, Brugge J.F. Representation of the cochlear partition of the superior temporal plane of the macaque monkey. Brain Research. 1973;50:275–96. [PubMed: 4196192]
  70. Meyer M, Alter K, Friederici A.D, Lohmann G, Von Cramon D.Y. FMRI reveals brain regions mediating slow prosodic modulations in spoken sentences. Human Brain Mapping. 2002;17:73–88. [PubMed: 12353242]
  71. Mitzdorf U. Current source-density method and application in cat cerebral cortex: Investigation of evoked potentials and EEG phenomena. Physiological Reviews. 1985;65:37–100. [PubMed: 3880898]
  72. Montessori M. The Absorbent Mind. New York: Henry Holt & Co; 1967.
  73. Morel A, Garraghty P.E, Kaas J.H. Tonotopic organization, architectonic fields, and connections of auditory cortex in macaque monkeys. Journal of Comparative Neurology. 1993;335:437–59. [PubMed: 7693772]
  74. Oakland T, Black J.L, Stanford G, Nussbaum N.L, Balise R.R. An evaluation of the dyslexia training program: A multisensory method for promoting reading in students with reading disabilities. Journal of Learning Disabilities. 1998;31:140–7. [PubMed: 9529784]
  75. Pekkola J, Ojanen V, Autti T, et al. Attention to visual speech gestures enhances hemodynamic activity in the left planum temporale. Human Brain Mapping. 2005;27:471–7. [PubMed: 16161166]
  76. Perrodin C, Kayser C, Logothetis N, Petkov C. Visual influences on voice-selective neurons in the anterior superior-temporal plane. International Conference on Auditory Cortex, Magdeburg, Germany; 2009a.
  77. Perrodin C, Veit L, Kayser C, Logothetis N.K, Petkov C.I. Encoding properties of neurons sensitive to species-specific vocalizations in the anterior temporal lobe of primates. International Conference on Auditory Cortex, Magdeburg, Germany; 2009b.
  78. Petkov C.I, Kayser C, Augath M, Logothetis N.K. Functional imaging reveals numerous fields in the monkey auditory cortex. PLoS Biology. 2006;4:e215. [PMC free article: PMC1479693] [PubMed: 16774452]
  79. Petkov C.I, Kayser C, Steudel T, et al. A voice region in the monkey brain. Nature Neuroscience. 2008;11:367–74. [PubMed: 18264095]
  80. Petkov C.I, Kayser C, Augath M, Logothetis N.K. Optimizing the imaging of the monkey auditory cortex: Sparse vs. continuous fMRI. Magnetic Resonance Imaging. 2009;27:1065–73. [PubMed: 19269764]
  81. Petrini K, Russell M, Pollick F. When knowing can replace seeing in audiovisual integration of actions. Cognition. 2009;110:432–9. [PubMed: 19121519]
  82. Rauschecker J.P. Cortical processing of complex sounds. Current Opinion in Neurobiology. 1998;8:516–21. [PubMed: 9751652]
  83. Rauschecker J.P, Tian B. Mechanisms and streams for processing of what and where in auditory cortex. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:11800–6. [PMC free article: PMC34352] [PubMed: 11050212]
  84. Rauschecker J.P, Tian B. Processing of band-passed noise in the lateral auditory belt cortex of the rhesus monkey. Journal of Neurophysiology. 2004;91:2578–89. [PubMed: 15136602]
  85. Rauschecker J.P, Tian B, Hauser M. Processing of complex sounds in the macaque nonprimary auditory cortex. Science. 1995;268:111–4. [PubMed: 7701330]
  86. Rauschecker J.P, Tian B, Pons T, Mishkin M. Serial and parallel processing in rhesus monkey auditory cortex. Journal of Comparative Neurology. 1997;382:89–103. [PubMed: 9136813]
  87. Recanzone G.H, Guard D.C, Phan M.L. Frequency and intensity response properties of single neurons in the auditory cortex of the behaving macaque monkey. Journal of Neurophysiology. 2000;83:2315–31. [PubMed: 10758136]
  88. Remedios R, Logothetis N.K, Kayser C. An auditory region in the primate insular cortex responding preferentially to vocal communication sounds. Journal of Neuroscience. 2009;29:1034–45. [PMC free article: PMC6665141] [PubMed: 19176812]
  89. Rockland K.S, Ojima H. Multisensory convergence in calcarine visual areas in macaque monkey. International Journal of Psychophysiology. 2003;50:19–26. [PubMed: 14511833]
  90. Romanski L.M, Tian B, Fritz J, et al. Dual streams of auditory afferents target multiple domains in the primate prefrontal cortex. Nature Neuroscience. 1999;2:1131–6. [PMC free article: PMC2778291] [PubMed: 10570492]
  91. Romanski L.M, Averbeck B.B, Diltz M. Neural representation of vocalizations in the primate ventrolateral prefrontal cortex. Journal of Neurophysiology. 2005;93:734–47. [PubMed: 15371495]
  92. Ross L.A, Saint-Amour D, Leavitt V.M, Javitt D.C, Foxe J.J. Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments. Cerebral Cortex. 2007;17:1147–53. [PubMed: 16785256]
  93. Rumsey J.M, Horwitz B, Donohue B.C, et al. Phonological and orthographic components of word recognition. A PET-rCBF study. Brain. 1997;120(Pt 5):739–59. [PubMed: 9183247]
  94. Schroeder C.E, Foxe J.J. The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Brain Research. Cognitive Brain Research. 2002;14:187–98. [PubMed: 12063142]
  95. Schroeder C.E, Foxe J. Multisensory contributions to low-level, ‘unisensory’ processing. Current Opinion in Neurobiology. 2005;15:454–8. [PubMed: 16019202]
  96. Schroeder C.E, Lindsley R.W, Specht C, et al. Somatosensory input to auditory association cortex in the macaque monkey. Journal of Neurophysiology. 2001;85:1322–7. [PubMed: 11248001]
  97. Schroeder C.E, Smiley J, Fu K.G, et al. Anatomical mechanisms and functional implications of multisensory convergence in early cortical processing. International Journal of Psychophysiology. 2003;50:5–17. [PubMed: 14511832]
  98. Schurmann M, Caetano G, Hlushchuk Y, Jousmaki V, Hari R. Touch activates human auditory cortex. NeuroImage. 2006;30:1325–31. [PubMed: 16488157]
  99. Shams L, Seitz A.R. Benefits of multisensory learning. Trends in Cognitive Sciences. 2008;12:411–7. [PubMed: 18805039]
  100. Stein B.E. Neural mechanisms for synthesizing sensory information and producing adaptive behaviors. Experimental Brain Research. 1998;123:124–35. [PubMed: 9835401]
  101. Stein B.E, Stanford T.R. Multisensory integration: Current issues from the perspective of the single neuron. Nature Reviews Neuroscience. 2008;9:255–66. [PubMed: 18354398]
  102. Stein B.E, Meredith M.A. Merging of the Senses. Cambridge: MIT Press; 1993.
  103. Sugihara T, Diltz M.D, Averbeck B.B, Romanski L.M. Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex. Journal of Neuroscience. 2006;26:11138–47. [PMC free article: PMC2767253] [PubMed: 17065454]
  104. Sumby W.H, Pollack I. Visual contribution to speech intelligibility in noise. Journal of the Acoustical Society of America. 1954;26:212–5.
  105. Talavage T.M, Sereno M.I, Melcher J.R, et al. Tonotopic organization in human auditory cortex revealed by progressions of frequency sensitivity. Journal of Neurophysiology. 2004;91:1282–96. [PubMed: 14614108]
  106. Tian B, Reser D, Durham A, Kustov A, Rauschecker J.P. Functional specialization in rhesus monkey auditory cortex. Science. 2001;292:290–3. [PubMed: 11303104]
  107. van Atteveldt N, Formisano E, Goebel R, Blomert L. Integration of letters and speech sounds in the human brain. Neuron. 2004;43:271–82. [PubMed: 15260962]
  108. van Wassenhove V, Grant K.W, Poeppel D. Visual speech speeds up the neural processing of auditory speech. Proceedings of the National Academy of Sciences of the United States of America. 2005;102:1181–6. [PMC free article: PMC545853] [PubMed: 15647358]
  109. von Kriegstein K, Giraud A.L. Implicit multisensory associations influence voice recognition. PLoS Biol. 2006;4:e326. [PMC free article: PMC1570760] [PubMed: 17002519]
  110. von Kriegstein K, Eger E, Kleinschmidt A, Giraud A.L. Modulation of neural responses to speech by directing attention to voices or verbal content. Brain Research. Cognitive Brain Research. 2003;17:48–55. [PubMed: 12763191]
  111. von Kriegstein K, Kleinschmidt A, Giraud A.L. Voice recognition and cross-modal responses to familiar speakers’ voices in prosopagnosia. Cerebral Cortex. 2006;16:1314–22. [PubMed: 16280461]
  112. Wallace M.T, Ramachandran R, Stein B.E. A revised view of sensory cortical parcellation. Proceedings of the National Academy of Sciences of the United States of America. 2004;101:2167–72. [PMC free article: PMC357070] [PubMed: 14766982]
  113. Wang X. On cortical coding of vocal communication sounds in primates. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:11843–9. [PMC free article: PMC34358] [PubMed: 11050218]
  114. Warnking J, Dojat M, Guerin-Dugue A, et al. fMRI retinotopic mapping—step by step. NeuroImage. 2002;17:1665–83. [PubMed: 12498741]
  115. Wessinger C.M, Vanmeter J, Tian B, et al. Hierarchical organization of the human auditory cortex revealed by functional magnetic resonance imaging. Journal of Cognitive Neuroscience. 2001;13:1–7. [PubMed: 11224904]
  116. Zatorre R.J, Evans A.C, Meyer E. Neural mechanisms underlying melodic perception and memory for pitch. Journal of Neuroscience. 1994;14:1908–19. [PMC free article: PMC6577137] [PubMed: 8158246]
Copyright © 2012 by Taylor & Francis Group, LLC.