Chapter 5

Audiovisual Integration in Nonhuman Primates

A Window into the Anatomy and Physiology of Cognition

Kajikawa Y, Falchier A, Musacchia G, et al.

5.1. BEHAVIORAL CAPACITIES

Humans can associate a sound with its visual source, where it comes from, how it is produced, and what it means. This association, or audiovisual (AV) integration, also occurs in many nonhuman primate species, and may be used in kin recognition, localization, and social interaction, among other things (Cheney and Seyfarth 1990; Ghazanfar and Santos 2004). These abilities suggest that nonhuman primates integrate sight and sound as humans do: through recognition of AV vocalizations and enhanced perception of audiovisual stimuli.

5.1.1. Recognition

One of the most ubiquitous AV functions in everyday human life is recognizing and matching the sight and sounds of other familiar humans. Nonhuman primates can also recognize the sight and sound of a familiar object and can express this association behaviorally. Primates reliably associate coincident auditory and visual signals of conspecific vocalizations (Evans et al. 2005; Ghazanfar and Logothetis 2003; Jordan et al. 2005; Sliwa et al. 2009) and can match pictures to vocal sounds of both conspecifics and familiar humans (Izumi and Kojima 2004; Kojima et al. 2003; Martinez and Matsuzawa 2009). Monkeys can also identify a picture in which the number of individuals matches the number of vocal sounds (Jordan et al. 2005). Although it appears that primates recognize the AV components of a talking face much better when the individual is socially familiar, familiarity does not appear to be a critical component of audiovisual recognition; many of the studies cited above showed that primates can correctly match AV vocalizations from other primate species (Martinez and Matsuzawa 2009; Zangenehpour et al. 2009). Facial movement, on the other hand, appears to be a key component for nonhuman primates in recognizing the vocal behavior of others. When matching a visual stimulus to a vocalization, primates correctly categorized a still face as a mismatch (Izumi and Kojima 2004; Evans et al. 2005; Ghazanfar and Logothetis 2003) and performed poorly when only the back view was presented (Martinez and Matsuzawa 2009).

AV matching by monkeys is not limited to facial recognition. Ghazanfar et al. (2002) showed that a rising-intensity sound attracted a monkey's attention to a similar degree as a looming visual object (Schiff et al. 1962). These auditory and visual signals are signatures of an approaching object. When presented with a looming sound, monkeys preferentially looked at a corresponding looming, rather than receding, visual signal. This was not the case when the monkey was presented with either a receding sound or a white noise control stimulus with an amplitude envelope matching that of the looming sound (Maier et al. 2004). Monkeys therefore presumably bind sound and visual attributes into single events, at least for signals of approaching objects.

Taken together, these data indicate that the dynamic structure of the visual stimulus and the compatibility between the two modalities are vital for AV recognition in primates, and they suggest a common mechanistic nature across primate species.

5.1.2. Fusion and Illusions

For humans, one of the most striking aspects of AV integration is that synchronous auditory and visual speech stimuli seem fused together, and illusions relating to this phenomenon may arise. The McGurk illusion is a case of this sort. When a mismatch between certain auditory and visual syllables occurs (e.g., an auditory “ba” with a visual “ga”), humans often perceive a synthesis of those syllables, most often “da” (McGurk and MacDonald 1976). The illusion persists even when the listener is aware of the mismatch, indicating that visual articulations are automatically integrated into speech perception (Green et al. 1991; Soto-Faraco and Alsius 2009).

Vatakis et al. (2008) examined whether the auditory and visual components of monkey vocalizations elicit a fused percept in humans. It is well known that people are less sensitive to temporal asynchrony when the auditory and visual components of speech are matched than when they are mismatched (the “unity effect”). Capitalizing on this phenomenon, Vatakis and colleagues used a temporal order judgment task with matched and mismatched sounds and movies of monkey vocalizations across a range of stimulus onset asynchronies (SOAs). The unity effect was observed for human speech but not for monkey vocalizations. The authors also obtained negative results for human vocalizations mimicking monkey vocalizations, suggesting that, for human observers, the fusion of face and voice components is limited to human speech. This may be because monkey vocal repertoires are much more limited than those of humans and show a large dissimilarity between facial expressive components and sound (Chakladar et al. 2008; Partan 2002).
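In such temporal order judgment studies, sensitivity to asynchrony is typically quantified by fitting a psychometric function to the proportion of “vision first” responses across SOAs and extracting the just-noticeable difference (JND); the unity effect appears as a larger JND (lower sensitivity) for matched stimuli. The following is a minimal sketch of that analysis, not code from Vatakis et al.; the data values and function names are our own illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, pse, sigma):
    """Cumulative Gaussian psychometric function over SOA."""
    return norm.cdf(soa, loc=pse, scale=sigma)

# Hypothetical TOJ data: SOA in ms (negative = audio first),
# proportion of "vision first" responses at each SOA.
soas = np.array([-300, -200, -100, 0, 100, 200, 300])
p_vision_first = np.array([0.05, 0.10, 0.30, 0.55, 0.80, 0.93, 0.97])

(pse, sigma), _ = curve_fit(cum_gauss, soas, p_vision_first, p0=(0.0, 100.0))

# JND: SOA change needed to move from 50% to 75% "vision first".
jnd = sigma * norm.ppf(0.75)
print(f"PSE = {pse:.1f} ms, JND = {jnd:.1f} ms")
# The unity effect predicts a larger JND for matched (perceptually bound)
# than for mismatched stimulus pairs.
```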

Another famous AV illusion, the “ventriloquist effect,” also appears to have a counterpart in nonhuman primate perception. Under the right conditions, a sound is perceived as originating from the location of a visual stimulus despite a spatial disparity between the two. After training a monkey to identify the location of a sound source, Recanzone’s group introduced a 20 to 60 min period of spatially disparate auditory (tones) and visual (dots) stimuli (Woods and Recanzone 2004). The consequence of this manipulation appeared in the subsequent sound lateralization task as a shift of the “auditory center spot” toward the location the sound had occupied, relative to the visual fixation spot, during the prior task. The underlying neural mechanism of this effect may be similar to the realignment of visual and auditory spatial maps after adaptation to an optical prism that displaces visual space (Cui et al. 2008; Knudsen and Knudsen 1989).

What about the perception of multisensory moving objects? Preferential looking at congruent looming sound and visual signals suggests that monkeys associate the sound and visual attributes of approaching objects (Maier et al. 2004). However, longer looking does not necessarily imply fused perception; it may instead reflect attentional attraction to moving stimuli after an assessment of their congruency. Fused perception of looming AV signals is supported by human studies showing the redundant signal effect (see Section 5.1.3 for more details) in reaction time (shorter reaction time to congruent looming AV signals) under conditions of bimodal attention (Cappe et al. 2010; see also Romei et al. 2009 for data suggesting preattentive effects of looming auditory signals). Interestingly, for such an AV looming effect to occur, the spectrum of the sound has to be dynamically structured along with sound intensity. It is not known which attributes of a visual stimulus, other than motion, could contribute to this effect. It is likely that auditory and visual stimuli must be related not only in spatial and temporal terms, but also in their dynamic spectral dimensions, for an attentional bias or performance enhancement to appear.

5.1.3. Perception

Visual influences on auditory perception, and vice versa, are well established in humans (Sumby and Pollack 1954; Raab 1962; Miller 1982; Welch and Warren 1986; Sams et al. 1991; Giard and Peronnet 1999; for review, see Calvert 2001; Stein and Meredith 1993) and have been examined in several studies on nonhuman primates (described below). Using simple auditory and visual stimuli, such as tones and dots, the following studies show that auditory and visual information interact to modulate perception in monkeys.

Barone’s group trained monkeys to make a saccade to a visual target that started to flash at the moment the fixation point disappeared (Wang et al. 2008). In half of the trials, the visual target was presented with a brief, task-irrelevant noise. The result was faster saccadic reaction times when the visual target was accompanied by a sound than when it was not. Frens and Van Opstal (1998) also studied the influence of auditory stimulation on saccadic responses in monkeys performing tasks similar to that of Wang et al. (2008). They showed not only a shortening of reaction time, but also that reaction time depended on the magnitude of the spatial and temporal offsets between the visual and auditory stimuli; smaller distances and closer timing yielded shorter reaction times. These results demonstrate spatial and temporal constraints on the effect of sound on visual localization. They are compatible with human psychophysical studies of AV integration (Frens et al. 1995; Diederich and Colonius 2004; Perrott et al. 1990) and suggest that the underlying mechanism may be common to human and nonhuman primates.

Like humans, monkeys have also been shown to have shorter manual reaction times to bimodal targets than to unimodal targets. In a simple detection task in which a monkey had to report the detection of a light flash (V alone), a noise sound (A alone), or both (AV) by a manual response, reaction times to AV stimuli were faster than to V alone regardless of the flash’s brightness (Cappe et al. 2010; see also Miller et al. 2001, showing similar results for smaller data sets). When the sound was loud, reaction times to AV stimuli and A alone did not differ. When sound intensity was low, overall reaction times were longer, and responses to AV stimuli were still faster than to A alone. A study from our laboratory showed that reaction times to perceptual “oddballs,” or novel stimuli in a train of standard stimuli, were faster for AV tokens than for visual or auditory tokens presented alone (Kajikawa and Schroeder 2008). Monkeys were presented with a series of standard AV stimuli (a monkey picture and a vocal sound) with an occasional oddball embedded in the series that differed from the standard in image (V alone), sound (A alone), or both (AV). The monkey had to respond manually upon detecting such oddballs. In that study, although intensity levels were fixed, reaction times to AV oddballs were faster than to either A alone or V alone oddballs. In addition, the probability of a correct response was highest for the AV oddball and lowest for the A alone condition. Therefore, not only the detection of signals, but also their categorization, benefited from AV integration.

This pattern of reaction times conforms to the results of human psychophysics studies showing faster reaction times to bimodal than to unimodal stimuli (Frens et al. 1995; Diederich and Colonius 2004; Perrott et al. 1990). Observations of faster reactions to bimodal than to unimodal stimuli across different motor systems suggest that AV integration occurs in sensory systems before the motor system is engaged to generate a behavioral response (or that a similar integration mechanism is present in several motor systems).
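A standard way to ask whether such bimodal speedups reflect genuine integration rather than mere statistical facilitation is Miller’s (1982) race model inequality, which bounds the bimodal reaction time distribution by the sum of the unimodal ones: F_AV(t) ≤ F_A(t) + F_V(t). Below is a minimal sketch of the test under simulated reaction times; the sample values, seed, and function names are our own assumptions, not data from the studies cited.

```python
import numpy as np

def ecdf(samples, t):
    """Empirical CDF of reaction times evaluated at times t."""
    samples = np.sort(samples)
    return np.searchsorted(samples, t, side="right") / len(samples)

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Positive values indicate violation of the race model bound
    F_AV(t) <= F_A(t) + F_V(t), i.e., evidence for coactivation."""
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return ecdf(rt_av, t_grid) - bound

# Hypothetical RT samples (ms) for A-alone, V-alone, and AV trials.
rng = np.random.default_rng(0)
rt_a = rng.normal(320, 40, 200)
rt_v = rng.normal(340, 40, 200)
rt_av = rng.normal(270, 35, 200)  # faster than either unimodal condition

t_grid = np.arange(150, 500, 10)
violation = race_model_violation(rt_a, rt_v, rt_av, t_grid)
print("max violation:", violation.max())  # > 0 suggests integration
```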

Differences in task demands complicate efforts to define the role of attention in the effect of AV integration on reaction times. In the study by Wang et al. (2008), monkeys were required to monitor only the occurrence of the visual stimulus. The task-irrelevant sound therefore acted exogenously, from outside the attended sensory domain; it likely drew the monkey’s attention, but this possibility is impossible to assess. In contrast, Cappe et al. (2010) and Kajikawa and Schroeder (2008) studied monkeys that were actively attending to both the visual and auditory modalities on every trial. It is worth noting that the sound stimuli used by Wang et al. (2008) did not act as distracters. Hence, it is possible that the monkeys performed the task by attending to both the task-relevant visual stimuli and the task-irrelevant sound (see Section 5.6).

5.2. NEUROANATOMICAL AND NEUROPHYSIOLOGICAL SUBSTRATES

In the following sections, we describe AV interactions in numerous monkey brain regions (Figure 5.1). Investigators have identified AV substrates in two broad ways: by showing that (1) a region responds to both auditory and visual stimuli or (2) AV stimuli produce neural activity that differs from the responses to either unimodal stimulus presented alone. AV integration has been shown at early stages of processing, including primary sensory and subcortical areas (for review, see Ghazanfar and Schroeder 2006; Musacchia and Schroeder 2009; Schroeder and Foxe 2005; Stein and Stanford 2008). Other areas that respond to both modalities have been identified in the prefrontal cortex (PFC), the posterior parietal cortex (PPC), the superior temporal polysensory area (STP), and the medial temporal lobe (MTL). Even though most studies could not elucidate the relationship between behavior and physiology, because they did not test the monkey’s behavior in conjunction with physiological measures, they provide promising indirect evidence that is useful in directing future behavioral/physiological studies.

FIGURE 5.1 (See color insert.) Connections mediating multisensory interactions in primate auditory cortex. Primate auditory cortices receive a variety of inputs from other sensory and multisensory areas. Somatosensory areas (PV, parietoventral area; Ri, retroinsular (more...)

5.2.1. Prefrontal Cortex

In the PFC, broad regions have been reported to be multisensory. The PFC is proposed to contain “what” and “where” pathways for visual object and spatial information processing, segregated into its ventrolateral (VLPFC) and dorsolateral (DLPFC) parts, respectively (Goldman-Rakic et al. 1996; Levy and Goldman-Rakic 2000; Ungerleider et al. 1998). Although numerous studies support the idea of segregated information processing in PFC (Wilson et al. 1993), others found that single PFC neurons integrated “what” and “where” information during a task that required monitoring of both object and location (Rao et al. 1997).

Auditory information processing in PFC also appears to divide into analogous “what” (e.g., speaker-specific) and “where” (e.g., location-specific) domains. The proposed “what” and “where” pathways of the auditory cortical system (Kaas and Hackett 2000; Rauschecker and Tian 2000) have been shown to project to VLPFC and DLPFC, respectively (Hackett et al. 1999; Romanski et al. 1999a, 1999b). Broad areas of the DLPFC were shown to be sensitive to sound location (Artchakov et al. 2007; Azuma and Suzuki 1984; Kikuchi-Yorioka and Sawaguchi 2000; Vaadia et al. 1986). Conversely, response selectivity to macaque vocal sounds was found in VLPFC (Cohen et al. 2009; Gifford et al. 2005; Romanski and Goldman-Rakic 2002; Romanski et al. 2005) and orbitofrontal cortex (Rolls et al. 2006). These two areas may correspond to the face-selective regions of the frontal lobe in nonhuman primates (Parr et al. 2009; Tsao et al. 2008b). Taken together, these findings support the notion that, as in the visual system, sensitivity to spatial and nonspatial features of sounds is segregated in PFC.

Although the dorsolateral stream in PFC has largely been shown to be sensitive to location, auditory responses to species-specific vocalizations were also found in regions of the DLPFC in squirrel monkeys (Newman and Lindsley 1976; Wollberg and Sela 1980) and macaque monkeys (Bon and Lucchetti 2006). Interestingly, visual fixation diminished responses to vocal sounds in some neurons (Bon and Lucchetti 2006). Taken together with the results of Rao et al. (1997), showing that neurons of the “what” and “where” visual streams are distributed over a region spanning both the DLPFC and VLPFC, these studies suggest that the “what” auditory stream might extend outside the VLPFC.

Apart from showing signs of analogous processing streams in the auditory and visual pathways, the PFC is anatomically primed to process multisensory stimuli. In addition to auditory cortical afferents, the DLPFC and VLPFC have reciprocal connections with the rostral and caudal STP subdivisions (Seltzer and Pandya 1989). The VLPFC also receives inputs from the PPC, a presumed “where” visual region (Petrides and Pandya 2009). Within both the DLPFC and VLPFC, segregated projections of different sensory afferents exist. Area 8 receives projections from visual cortices [occipital cortex and intraparietal sulcus (IPS)] in its caudal part, and from auditory-responsive cortices [superior temporal gyrus (STG) and STP] in its rostral part (Barbas and Mesulam 1981). A similar segregation of visual [inferior temporal (IT)] and auditory (STG and STP) afferents exists within VLPFC (Petrides and Pandya 2002). Thus, the DLPFC and VLPFC contain intermingled regions that receive auditory projections, visual projections, or both. Additionally, the orbitofrontal cortex and medial PFC receive inputs from IT, STP, and STG (Barbas et al. 1999; Carmichael and Price 1995; Cavada et al. 2000; Kondo et al. 2003; Saleem et al. 2008) and may contribute to AV integration (see Poremba et al. 2003).

Not surprisingly, bimodal properties of PFC neurons have been described in numerous studies. Some early studies described neurons responsive to both tones and visual stimuli (Kubota et al. 1980; Aou et al. 1983). However, because these studies used sound as a cue to initiate an immediate behavioral response, the neuronal response to the sound might have been related to motor execution. Other studies of PFC employed tasks in which oculomotor or manual responses were delayed from the sensory cues (Artchakov et al. 2007; Ito 1982; Joseph and Barone 1987; Kikuchi-Yorioka and Sawaguchi 2000; Vaadia et al. 1986; Watanabe 1992). Despite the delayed response, populations of neurons still responded to both visual and auditory stimuli. Such responses showed spatial tuning and depended on task conditions, such as the modality of the task and whether it demanded discrimination, active detection, or passive reception (Vaadia et al. 1986), or on reward/no-reward contingency (Watanabe 1992). One report shows that visuospatial and audiospatial working memory processes seem to share a common neural mechanism (Kikuchi-Yorioka and Sawaguchi 2000).

The behavioral tasks used in the studies described so far did not require any comparison of visual and auditory events. Fuster et al. (2000) trained monkeys to learn pairings of tones and colors and to perform a cross-modal delayed matching task using tones as the sample cue and colors as the target. They found that PFC neurons in those monkeys had elevated firing during the delay period that was absent on error trials. Therefore, the PFC has many neurons that respond to both auditory and visual signals in a manner that depends on behavioral conditions, and it possibly associates the two.

Romanski’s group explored multisensory responses in VLPFC (Sugihara et al. 2006) and found that this region contains unimodal visual, unimodal auditory, and bimodal AV responsive regions (Romanski et al. 2002, 2005). The group used movies, images, and sounds of monkeys producing vocalizations as stimuli and presented them unimodally or bimodally while subjects fixated. Whereas some neurons responded exclusively to one modality or the other, about half of the neurons examined exhibited AV integration, as either enhancement or suppression of the unimodal response. Because subjects were not required to maintain working memory or make decisions, these responses are considered sensory.

In addition to the regions described above, premotor (PM) areas between the primary motor cortex and the arcuate sulcus contain neurons sensitive to sound and vision. Although most neurons in PM respond to somatosensory stimuli, some also respond to sounds and visual stimuli and have receptive fields spatially registered across modalities (Graziano et al. 1994, 1999). Those neurons are located in caudal PM, particularly coding the space proximal to the face (Fogassi et al. 1996; Graziano et al. 1997; Graziano and Gandhi 2000) as well as defensive actions (Cooke and Graziano 2004a, 2004b). Rostral PM contains audiovisual mirror neurons, whose activity is elevated not only during the execution of actions but also during the observation of the same actions performed by others. Those neurons discharge during specific manual actions and respond to the sound, in addition to the sight, of such actions (Keysers et al. 2003; Kohler et al. 2002; Rizzolatti et al. 1996; Rizzolatti and Craighero 2004) and to the goal objects of those actions (Murata et al. 1997). Whereas AV sensitivity in caudal PM seems directly tied to the subject’s own actions, that in rostral PM presumably reflects the cognitive processing of others’ actions.

In summary, the PFC is subdivided into various regions based on sensory, motor, and other cognitive processes. Each subdivision contains AV sensitivity that could serve to code locations or objects. There are neurons specialized for coding vocalizations, for associating sound and visual signals, or for the representation/execution of particular motor actions.

5.2.2. Posterior Parietal Cortex

The PPC in the monkey responds to different modalities (Cohen 2009), is known to be a main station of the “where” pathway before the information enters PFC (Goodale and Milner 1992; Ungerleider and Mishkin 1982), and is highly interconnected with multisensory areas (see below).

PPC receives afferents from various cortices involved in visual spatial and motion processing (Baizer et al. 1991; Cavada and Goldman-Rakic 1989a; Lewis and Van Essen 2000; Neal et al. 1990). The caudal area of PPC has reciprocal connections with multisensory parts of PFC and STS, suggesting that the PPC plays a key role in multisensory integration (Cavada and Goldman-Rakic 1989b; Neal et al. 1990). The ventral intraparietal area receives input from the auditory association cortex of the temporoparietal area (Lewis and Van Essen 2000). The anterior intraparietal area also receives projections from the auditory cortex (Padberg et al. 2005). PPC receives subcortical inputs from the medial pulvinar (Baizer et al. 1993) and superior colliculus (SC; Clower et al. 2001) that may subserve multisensory responses in PPC.

Several subregions of PPC are known to be bimodal. An auditory responsive zone in PPC overlaps with visually responsive areas (Poremba et al. 2003). Space-sensitive responses to sound (noise) have been observed in several areas of PPC typically thought to be primarily visual: the lateral intraparietal cortex (LIP; Stricanne et al. 1996), the ventral intraparietal area (Schlack et al. 2005), the medial intraparietal cortex, and the parietal reach region (Cohen and Andersen 2000, 2002). The auditory space-sensitive neurons in PPC also respond to visual stimulation with similar spatial tuning (Mazzoni et al. 1996; Schlack et al. 2005). Furthermore, the spatial tuning of the auditory and visual response properties was sufficiently correlated to be mutually predictive, indicating a shared spatial reference frame across modalities (Mullette-Gilman et al. 2005, 2009).

PPC also plays a major role in motor preparation during localization tasks (Andersen et al. 1997). Auditory responses in LIP appeared only after training on memory-guided delayed reaction tasks with auditory and visual stimuli (Grunewald et al. 1999) and disappeared when the sound cue became irrelevant to the task (Linden et al. 1999). These results suggest that auditory responses in PPC are not purely sensory. In LIP and the parietal reach region, the information encoding spatial auditory cues evolves as the task progresses, but remains consistently below that for visual cues (Cohen et al. 2002, 2004). Thus, there is a difference in processing between modalities.

Even though most PPC studies used simple stimuli such as LED flashes and noise bursts, one study examined LIP responses to vocal sounds and showed that LIP neurons can carry information about the acoustic features of sounds in addition to their spatial location (Gifford and Cohen 2005). In that study, sounds were delivered passively to monkeys during visual fixation. This seems inconsistent with the aforementioned findings that auditory responses in PPC emerge only when sounds are behaviorally relevant (Grunewald et al. 1999; Linden et al. 1999). Nevertheless, that study raised the possibility that auditory coding in PPC is not limited to spatial information. Similarly, the existence of face-selective patches was shown in the PPC of chimpanzees using PET (Parr et al. 2009).

Although these studies suggest AV integration in PPC, responses to stimuli in bimodal conditions have not yet been directly examined in monkeys.

5.2.3. STP Area

The STP, located in the anterior region of the superior temporal sulcus, from the fundus to the upper bank, responds to multisensory stimuli in monkeys (Bruce et al. 1981; Desimone and Gross 1979; Schroeder and Foxe 2002; Poremba et al. 2003) and is putatively a key site for AV integration in both monkeys and humans.

STP is highly connected with subcortical and cortical multisensory regions. It receives inputs from presumed multisensory thalamic structures (Yeterian and Pandya 1989) and the medial pulvinar (Burton and Jones 1976), and it has reciprocal connections with the PFC and other higher-order cortical regions such as PPC, IT cortex, cingulate cortex, MTL, and the auditory parabelt (Barnes and Pandya 1992; Cusick et al. 1995; Padberg et al. 2003; Saleem et al. 2000; Seltzer et al. 1996; Seltzer and Pandya 1978, 1994). Based on connectivity patterns, area STP can be subdivided into rostral and caudal regions: its anterior part is connected with the ventral PFC, whereas its caudal part appears to be connected with the dorsal PFC (Seltzer and Pandya 1989).

STP exhibits particular selectivity for complex objects, faces, and moving stimuli. STP was shown to respond to visual objects (Oram and Perrett 1996) and, in particular, to show some degree of face selectivity (Bruce et al. 1981; Baylis et al. 1987). Face selectivity was shown to exist in discrete patches in monkeys (Pinsk et al. 2005; Tsao et al. 2006, 2008a) and chimpanzees (Parr et al. 2009), although others found responses to faces over a wide area (Hoffman et al. 2007). Responses to faces are further selective to the identity, gaze direction, and/or viewing angle of the presented face (De Souza et al. 2005; Eifuku et al. 2004). Regions of the caudal STS, such as MT (Born and Bradley 2005; Duffy and Wurtz 1991; Felleman and Kaas 1984) and MST (Gu et al. 2008; Tanaka et al. 1986), as well as the anterior STP (Anderson and Siegel 1999, 2005; Nelissen et al. 2006; Oram et al. 1993), are sensitive to directional movement patterns. Although the caudal STS is regarded as a part of the “where” pathway, the anterior STP probably is not, because of its large spatial receptive fields (Bruce et al. 1981, 1986; Oram et al. 1993). Given this, together with its face selectivity, it stands to reason that the anterior STP may be important for the perception or recognition of facial gestures, such as mouth movements.

In addition, STP responds to somatosensory, auditory, and visual stimulation. Multisensory responsiveness of neurons in STS was tested in anesthetized (Benevento et al. 1977; Bruce et al. 1981; Hikosaka et al. 1988) and alert monkeys (Baylis et al. 1987; Perrett et al. 1982; Watanabe and Iwai 1991). In both cases, stimuli were delivered unimodally (Baylis et al. 1987; Bruce et al. 1981; Hikosaka et al. 1988), or simple bimodal stimuli (a tone and an LED flash) were used (Benevento et al. 1977; Watanabe and Iwai 1991). Although auditory and visually selective neurons were present in STG and formed segregated clusters in STP (Dahl et al. 2009), a population of neurons responded to both visual and auditory stimuli (Baylis et al. 1987; Bruce et al. 1981; Hikosaka et al. 1988). When responses to bimodal stimuli were examined, neural firing rates were either enhanced or reduced compared with those to unimodal stimuli (Benevento et al. 1977; Watanabe and Iwai 1991). The laminar profile of the current source density (CSD) response to sounds (clicks) and lights (flashes), which reflects the pattern of afferent termination across cortical layers, indicated that STP receives feedforward auditory and visual inputs to layer IV (Schroeder and Foxe 2002).
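For readers unfamiliar with the technique, CSD is conventionally estimated as the negative second spatial derivative of the field potential across equally spaced laminar electrode contacts; current sinks (putative synaptic input) appear at the depths where afferents terminate. The following is a minimal sketch under common simplifying assumptions (uniform contact spacing, homogeneous tissue conductivity); the parameter values are nominal, not taken from the studies cited.

```python
import numpy as np

def csd(lfp, spacing_mm=0.15, conductivity=0.3):
    """One-dimensional CSD estimate from laminar LFP.

    lfp          : array (n_contacts, n_timepoints), in volts
    spacing_mm   : inter-contact distance in mm (assumed uniform)
    conductivity : tissue conductivity in S/m (nominal value)

    Returns CSD for the interior contacts; sinks are negative.
    """
    d = spacing_mm * 1e-3  # mm -> m
    # Second spatial derivative across depth (axis 0 = contacts).
    second_deriv = (lfp[2:] - 2 * lfp[1:-1] + lfp[:-2]) / d**2
    return -conductivity * second_deriv

# Usage: csd_profile = csd(lfp_trials.mean(axis=0)) on trial-averaged LFP;
# the laminar pattern of sinks then indicates the input layer.
```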

Lesion studies reveal that STP appears to process certain dimensions of sound and vision used for discrimination. Monkeys with lesions of the STG and STP areas showed impairments of auditory, but not visual, working memory and of auditory pattern discrimination, while hearing itself was spared (Iversen and Mishkin 1973; Colombo et al. 2006). Although IT lesions impair many visual tasks, combined IT and STP lesions (Aggleton and Mishkin 1990; Eacott et al. 1993) impair visual discrimination of objects more severely while sparing performance on other visual tasks. These findings suggest that multisensory responses in STP are not simply sensory, but are involved in the cognitive processing of certain aspects of sensory signals.

A series of recent studies examined AV integration in STS with more naturalistic stimuli presented during visual fixation: the sound and sight of conspecific vocalizations, naturally occurring scenes, and artificial movies (Barraclough et al. 2005; Dahl et al. 2009; Chandrasekaran and Ghazanfar 2009; Ghazanfar et al. 2008; Kayser and Logothetis 2009; Maier et al. 2008). As in previous studies (Benevento et al. 1977; Watanabe and Iwai 1991), neuronal firing to bimodal stimuli was either stronger or weaker than to unimodal stimuli. Barraclough et al. (2005) showed that the direction of the change in the magnitude of the AV response, relative to the visual response, depended on the size of the visual response. Incongruent pairs of sounds and scenes seemed to evoke weaker responses (Barraclough et al. 2005; Maier et al. 2008).

To our knowledge, no animal studies have used task conditions requiring active behavioral discrimination. Therefore, the results are not conclusive as to whether the STS can associate/integrate information from different modalities to form a recognizable identity. However, the bimodal responsiveness of STP areas, their specialization for objects such as faces in the visual modality, and their sensitivity to the congruence of signals in different modalities suggest that they are involved in such cognitive processes and/or AV perception.

5.2.4. MTL Regions

The MTL is composed of the hippocampus and the entorhinal, perirhinal, and parahippocampal cortices. These regions are involved in declarative memory formation (Squire et al. 2004) and place coding (McNaughton et al. 2006). The amygdala plays a predominant role in emotional processes (Phelps and LeDoux 2005), some of which may be affected by multisensory conjunction (e.g., in response to “dominant” conspecifics or looming stimuli, as discussed above).

The MTL receives various multisensory cortical inputs. Entorhinal cortex (EC), the cortical gate to the hippocampus, receives inputs from STG, STP, IT, and other nonprimary sensory cortices either directly or through parahippocampal and perirhinal cortices (Blatt et al. 2003; Mohedano-Moriano et al. 2007, 2008; Suzuki and Amaral 1994). Auditory, visual, and somatosensory association cortices also project to the nuclei of the amygdala (Kosmal et al. 1997; Turner et al. 1980).

Although IT, a part of the ventral “what” pathway (Ungerleider and Mishkin 1982) and the major input stage to MTL, responds mainly to complex visual stimuli, IT can exhibit delay activity after auditory samples during cross-modal delayed match-to-sample tasks, in which auditory sample stimuli (tones or broadband sounds) indicated the visual stimulus to be selected (Colombo and Gross 1994; Gibson and Maunsell 1997). During the same task, greater auditory responses and delay activity were observed in the hippocampus. This delay activity presumably reflected the working memory of a visual object associated with a sound after learning. In a visual discrimination task that used a tone to signal the start of trials, ventral IT neurons responded to this warning sound (Ringo and O’Neill 1993). Such auditory responses did not appear when identical tones were used to signal the end of a trial, indicating that the effects were context-dependent.

In the hippocampus, a small population of neurons responds to both auditory and visual cues in navigation tasks in which monkeys control their own translation and position in space (Ono et al. 1993). Even without task demands, hippocampal neurons exhibit spatial tuning to auditory and visual stimuli (Tamura et al. 1992).

Neurons in the amygdala respond to passively presented faces or vocalizations of conspecifics (Brothers et al. 1990; Kuraoka and Nakamura 2007; Leonard et al. 1985). Some neurons respond selectively to emotional content (Hoffman et al. 2007; Kuraoka and Nakamura 2007). Multisensory responses to different sensory cues were also shown in the amygdala of monkeys performing several kinds of tasks: retrieving food or drink, avoiding aversive stimuli, or discriminating sounds associated with reward (Nishijo et al. 1988a). These responses reflected the affective values of the stimuli rather than their sensory aspects (Nishijo et al. 1988b).

These data corroborate the notion that sensory activity in MTL contributes less to detection than to sensory association, evaluation, and other cognitive processes (Murray and Richmond 2001). The integrity of these structures is presumably needed for the formation and retention of cross-modal associative memory (Murray and Gaffan 1994; Squire et al. 2004).

5.2.5. Auditory Cortex

Recent findings of multisensory sensitivity in early sensory cortical areas, including primary areas, have revised our understanding of cortical AV integration (for review, see Ghazanfar and Schroeder 2006). Before these findings came to light, it was thought that AV integration occurred in higher-order cortices during complex component processing. To date, a large body of work has focused on multisensory mechanisms in the auditory cortex (AC). Like some of the seminal findings with human subjects in this field (Sams et al. 1991; Calvert and Campbell 2003), the monkey AC appears to respond to visual stimuli presented alone. Kayser et al. (2007) measured the BOLD signal to natural unimodal and bimodal stimuli over the superior temporal plane. They observed that visual stimuli alone could induce activity in the caudal auditory cortex. In this same area, the auditory-evoked signal was also modulated by cross-modal stimuli.

The primate auditory cortex stretches from the fundus of the lateral sulcus (LS) medially to the STG laterally and comprises more than 10 defined areas (Hackett 2002; Hackett et al. 2001; Kaas and Hackett 2000). Among auditory cortical areas, the first in which multisensory responsiveness was examined was the caudal–medial area (CM; Schroeder et al. 2001). In addition to CM, other auditory areas, including the primary auditory cortex (A1), were also shown to receive somatosensory inputs (Cappe and Barone 2005; Disbrow et al. 2003; de la Mothe et al. 2006a; Kayser et al. 2005; Lakatos et al. 2007; Smiley et al. 2007; for review, see Musacchia and Schroeder 2009). Most areas also receive multisensory thalamic inputs (de la Mothe et al. 2006b; Hackett et al. 2007; Kosmal et al. 1997). Documented visual inputs to the auditory cortex have thus far originated from STP (Cappe and Barone 2005) as well as from the peripheral visual field representations of V2 and prostriata (Falchier et al. 2010).

Schroeder and Foxe (2002) reported CSD responses to unimodal and bimodal combinations of auditory, visual, and somatosensory stimuli in area CM of the awake macaque. The laminar profiles of CSD activity in response to visual stimuli differed from those of auditory and somatosensory responses. Analysis of activity in different cortical layers revealed that visual inputs targeted the extragranular layers, whereas auditory and somatosensory inputs terminated in the granular layers of area CM. These two termination profiles are in accordance with the laminar patterns of visual corticocortical projections (Falchier et al. 2002; Rockland and Ojima 2003) and of primary-like thalamocortical projections (Jones 1998), respectively. In contrast, A1 receives auditory and somatosensory inputs in the granular and supragranular cortical layers, respectively (Lakatos et al. 2007). This suggests that somatosensory input to A1 arrives through lateral connections, feedback connections, or nonspecific thalamic nuclei. Our laboratory showed that attended visual stimuli presented in isolation modulate activity in the extragranular layers of A1 (Lakatos et al. 2009), and the same pattern is observed with attended auditory stimuli in V1 (Lakatos et al. 2008). These findings strengthen the hypothesis that nonspecific thalamic projections (Sherman and Guillery 2002) or pulvinar-mediated lateral connections (Cappe et al. 2009) contribute to AV integration in A1.

The groups of Ghazanfar and Logothetis have shown that concurrent visual stimuli systematically influence auditory cortical responses in A1 as well as in the lateral associative auditory cortices and STP (Ghazanfar et al. 2005; Hoffman et al. 2008; Kayser et al. 2007, 2008). These studies used complex, natural AV stimuli, which are more effective in evoking responses in some nonprimary auditory areas (Petkov et al. 2008; Rauschecker et al. 1995; Russ et al. 2008). Their initial study (Ghazanfar et al. 2005) revealed that movies of vocalizations presented with the associated sounds could modulate local field potential (LFP) responses in A1 and the lateral belt. Kayser et al. (2008) showed visual responses in the LFP at frequency bands near 10 Hz. This frequency component responded preferentially to faces, and the preference was stronger in the lateral belt than in A1 (Hoffman et al. 2008). However, multiunit activity (MUA) barely showed visual responses that correlated in magnitude with the LFP response. AV interactions occurred as a small enhancement in the LFP and suppression in MUA (see also Kayser and Logothetis 2009).

Although AV integration in areas previously thought to be unisensory is intriguing and provocative, the use of a behavioral task is imperative to determine the significance of this phenomenon. Brosch et al. (2005) employed a task in which an LED flash cued the beginning of an auditory sequence. Monkeys were trained to touch a bar to initiate the trial and to signal the detection of a change in the auditory sequence. Some neurons in AC responded to the LED, but only when the monkey touched the bar after detecting the auditory change. This response disappeared when the monkey had to perform a visual task that did not require auditory attention. Although these results may be due in part to the fact that the monkeys were highly trained (or potentially overtrained) on the experimental task, they also point to the importance of engaging auditory attention in evoking responses to visual stimuli. Findings like these, which elucidate the integrative responses of individual neurons and small populations, can provide key substrates for understanding the effects of bimodal versus unimodal attention on cross-modal responses demonstrated in humans (Jääskeläinen et al. 2007; McDonald et al. 2003; Rahne and Böckmann-Barthel 2009; Talsma et al. 2009; von Kriegstein and Giraud 2006).

The timing of cross-modal effects in primary auditory and posterior auditory association cortices in resting or anesthetized monkeys seems consistent with the cross-modal influences of touch and sight in monkeys engaged in an auditory task. In resting monkeys, the somatosensory CSD response elicited by electrical stimulation of the median nerve had an onset latency as short as 9 ms (Lakatos et al. 2007; Schroeder et al. 2001), and single neurons responded to air-puff stimulation of the dorsum of the hand in anesthetized monkeys with a latency of about 30 ms (Fu et al. 2003). Cutaneous responses of single units in AC during an active task peaked at 20 ms (Brosch et al. 2005), slower than responses to direct electrical activation of afferent fibers but faster than those in the passive condition. Similarly, visual responses of single units in AC during an active task began around 60 ms and peaked at around 100 ms after LED onset (Brosch et al. 2005). That is within the same range as the onset latency of neuronal firing and the peak timing of LFP responses, both around 100 ms, to complex visual stimuli in AC when monkeys simply fixated (Hoffman et al. 2007; Kayser et al. 2008). The effects of gaze direction and saccades will also need to be taken into account in future studies, because they have been proposed to affect auditory processing considerably (Fu et al. 2004; Groh et al. 2001; Werner-Reiss et al. 2006).

5.2.6. Visual Cortex

There has been much less multisensory research in visual cortex than in auditory cortex, although it has been shown that the peripheral visual field representations of the primary visual cortex (V1) receive inputs from auditory cortical areas A1, the parabelt areas on STG, and STP (Falchier et al. 2002). The peripheral visual field representation of area V2 also receives feedback inputs from the caudal STG/auditory belt region (Rockland and Ojima 2003). A preference for vocal sounds, relative to other sounds, was found in nonprimary visual cortex using functional MRI (fMRI) in monkeys (Petkov et al. 2008).

In contrast to studies of visual responses in the auditory cortex, few studies have recorded auditory responses in visual cortex during task performance. Wang et al. (2008) recorded V1 single-unit firing while monkeys performed a visual detection task. Concurrent presentation of auditory and visual stimuli not only shortened saccadic reaction times, but also increased neuronal response magnitudes and reduced response latencies. This effect was greatest when the intensity of the visual stimuli was low to moderate, and it disappeared when the luminance of the visual stimuli was high. When monkeys were not performing a task, no auditory effect was observed in V1 (see Section 5.6.1).

In a series of studies from our laboratory, a selective attention task was employed to determine whether attention to auditory stimuli influences neuronal activity in V1 (Lakatos et al. 2008, 2009; Mehta et al. 2000a, 2000b). In these studies, tones and flashes were presented in alternation, and monkeys had to monitor the series of either visual or auditory stimuli while ignoring the other modality. The visual response was stronger when monkeys tracked the visual series than when they tracked the auditory series. In the attend-auditory condition, a phase reset of ongoing neuronal oscillations appeared to occur earlier than the visual response (Lakatos et al. 2009). This effect disappeared when the same stimuli were ignored. Thus, auditory influences on V1 were observed only when auditory stimuli were attended. This contrasts with the findings of Wang et al. (2008), in which sound affected V1 activity in monkeys performing a visual task. As we propose later, the control of attention likely plays a major role in the manifestation of auditory effects in V1 (see Section 5.6.2).
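Phase reset of ongoing oscillations, as opposed to an additive evoked response, is commonly indexed by inter-trial coherence (ITC): if a stimulus realigns the phase of an ongoing rhythm, phase angles become consistent across trials even when single-trial amplitude does not change. Below is a minimal sketch of an ITC estimate at one frequency via complex demodulation; the approach and parameters are generic illustrations, not the specific analysis pipeline of the studies cited.

```python
import numpy as np

def inter_trial_coherence(trials, fs, freq):
    """ITC at one frequency via complex demodulation.

    trials : array (n_trials, n_timepoints) of LFP/CSD traces
    fs     : sampling rate (Hz)
    freq   : frequency of interest (Hz)

    Returns ITC(t) in [0, 1]; values near 1 indicate phase alignment
    across trials, consistent with stimulus-driven phase reset.
    """
    n = trials.shape[1]
    t = np.arange(n) / fs
    carrier = np.exp(-2j * np.pi * freq * t)
    analytic = trials * carrier  # demodulate each trial to baseband
    # Smooth over ~2 cycles to estimate the instantaneous phase.
    win = max(1, int(2 * fs / freq))
    kernel = np.ones(win) / win
    smoothed = np.apply_along_axis(
        lambda x: np.convolve(x, kernel, mode="same"), 1, analytic)
    phases = smoothed / (np.abs(smoothed) + 1e-12)  # unit phase vectors
    return np.abs(phases.mean(axis=0))  # resultant length across trials
```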

5.2.7. Subcortical Regions

The basal ganglia comprise several nuclei, each with distinct functions such as motor planning and execution, habit learning, and motivation. Several studies show auditory, visual, and bimodally responsive neurons in basal ganglia nuclei. Even though multisensory responses could be observed under passive conditions (Santos-Benitez et al. 1995), many studies showed that these responses were related to reinforcement (Wilson and Rolls 1990) or sensorimotor association (Aosaki et al. 1995; Hikosaka et al. 1989; Kimura 1992).

Although the SC is well known as a control station for orienting movements (Wurtz and Albano 1980), its multisensory properties have been a hotbed of research for decades in monkeys (Allon and Wollberg 1978; Cynader and Berman 1972; Updyke 1974; Wallace et al. 1996) and other animal models (Meredith and Stein 1983; Meredith et al. 1987; Rauschecker and Harris 1989; Stein et al. 2001, 2002). Neurons in the monkey SC adhered to well-established principles of multisensory integration, such as spatial contiguity and inverse effectiveness (for review, see Stein and Stanford 2008), whether the animals were engaged in tasks (Frens and Van Opstal 1998) or under anesthesia (Wallace et al. 1996). In the SC of awake animals, AV integration depended on task conditions, namely whether the animals fixated on visible or memory-guided spots during AV stimulation (Bell et al. 2003). The presence of a visual fixation spot decreased unimodal responses and nearly abolished response enhancement by AV stimuli. Bell et al. (2003) attributed the weaker AV integration during visually guided fixation to fixation-mediated inhibition in the SC, consistent with the fact that, whereas activity in the SC is coupled to eye movements, fixation requires the monkey to refrain from gaze shifts.

Although the inferior colliculus (IC) has generally been assumed to be a passive relay for primarily auditory information, immune to nonauditory or cognitive influences, recent AV studies challenge this view. Neuronal activity in the IC has been shown to be influenced by eye position (Groh et al. 2001), saccades, and visual stimuli (Porter et al. 2007), suggesting that the IC may be influenced by covert orienting to concurrent visual events. Such covert orienting may contribute to the visual influence observed on portions of the human auditory brainstem response that are roughly localized to the IC (Musacchia et al. 2006).

Studies of thalamic projections to the primary auditory cortex show that multisensory connections are present in centers previously thought to be “unisensory” (de la Mothe et al. 2006b; Hackett et al. 2007; Jones 1998). Multiple auditory cortices also receive divergent afferents originating from common thalamic nuclei (Cappe et al. 2009; Jones 1998). In addition, the connections between thalamic nuclei and cortices are largely reciprocal. Even though the functions of these thalamic nuclei remain to be clarified, they may contribute to multisensory responsiveness in cerebral cortices. Bimodal responsiveness has been shown in a few thalamic nuclei (Matsumoto et al. 2001; Tanibuchi and Goldman-Rakic 2003).

5.3. FUNCTIONAL SIGNIFICANCE OF MULTISENSORY INTERACTIONS

It has been shown in monkeys that, under certain circumstances, audition influences vision (Wang et al. 2008), vision influences audition (Woods and Recanzone 2004), or the two senses influence each other (Cappe et al. 2010). For AV integration of any form, auditory and visual information has to converge. As described in the previous section, most brain regions have the potential to support that interaction (for review, see Ghazanfar and Schroeder 2006; Musacchia and Schroeder 2009), but the importance of that potential can only be determined by assessing the functional role that each region plays in helping to achieve perceptual integration of sight and sound. This can be achieved by observing the behavioral effects of cortical lesions or electrical stimulation in different areas and by simultaneously measuring behavioral performance and neural activity in normally functioning and impaired populations.

5.3.1. Influences on Unimodal Perception

Neural activity in a unimodal area is thought to give rise to sensations only in the preferred modality of that area. It is not surprising, therefore, that lesions of these areas extinguish only sensations of the “primary” modality. For example, STG lesions impair auditory memory retention but leave visual memory retention intact (Colombo et al. 1996). One exception to this rule lies in cases of acquired cross-modal activity, such as auditory responses in the occipital cortex of blind people (Théoret et al. 2004). Despite this reorganization, direct cortical stimulation of the visual cortex of blind people elicits photic sensations of simple patterns, such as letters (Dobelle et al. 1974). Similar sensations of phosphenes can also be induced in sighted individuals using transcranial magnetic stimulation (TMS) (Bolognini et al. 2010; Ramos-Estebanez et al. 2007; Romei et al. 2007, 2009). But does such stimulation also induce auditory sensations? Our opinion is that auditory activity in the visual cortex does not induce visual sensations, and visual activity in the auditory cortex does not induce auditory sensations, although this may depend on the subject’s experience with the stimuli (Meyer et al. 2010). In humans, cross-modal attention is known to influence the activity of sensory cortices during cross-modal stimulus presentation; for example, visual attention gates visual modulation of the auditory cortex (Ciaramitaro et al. 2007; Lehman et al. 2006; Nager et al. 2006; Teder-Sälejärvi et al. 1999). In particular, the functional role of visual information in speech perception, and the underlying auditory cortical modulation, is well documented (Besle et al. 2009; van Atteveldt et al. 2009; Schroeder et al. 2008). The findings described below also suggest that the functional role of cross-modal activation in early sensory cortices is likely the modulation of primitive (low-level) sensory perception and detection.

5.3.1.1. Influence on Temporal Dynamics of Visual Processing

In sensory systems, more intense stimuli generally produce higher neuronal firing rates, faster response onset latencies, and stronger sensations. AV interactions often have a facilitative effect on the neural response, through either an increased firing rate or a faster response (for review, see Stein and Stanford 2008), suggesting that AV stimuli should sharpen behavioral sensitivity in some fashion. In humans, AV stimuli speed reaction times during target detection (Diederich and Colonius 2004; Giard and Peronnet 1999; Molholm et al. 2002, 2007) and improve temporal order judgments (Hairston et al. 2006; Santangelo and Spence 2009).

In the monkey, Wang et al. (2008) reported electrophysiological results consistent with this notion. During a visual localization task, AV enhancement in V1 appeared as a shortening of response latency. Interestingly, no appreciable enhancement of the visual response by auditory stimuli was found when monkeys were not engaged in the task.

The auditory stimuli by themselves did not evoke firing responses in V1, suggesting that the auditory influence on V1 activity is a subthreshold phenomenon. Suprathreshold responses in V1 begin at about 25 to 30 ms poststimulation (Chen et al. 2007; Musacchia and Schroeder 2009). For auditory input to influence visual responses, it must arrive within a short temporal window, a few milliseconds before the visual input arrives (Lakatos et al. 2007; Schroeder et al. 2008). Auditory responses in the auditory system generally begin much earlier than visual responses in V1. Moreover, for some natural events such as speech, the visible signals lead the sounds that follow them (Chandrasekaran et al. 2009; for review, see Musacchia and Schroeder 2009). For such events, the precedence of visual input relative to auditory input is likely a requirement for very early AV interaction in early sensory areas.

5.3.1.2. Sound Localization

The ventriloquist aftereffect observed by Woods and Recanzone (2004) involves the alteration of auditory spatial perception by vision. This phenomenon implies the recruitment of structures whose auditory responses depend on or encode sound location. Several brain structures in monkeys are sensitive to the spatial location of sounds. These include the IC (Groh et al. 2001), SC (Wallace et al. 1996), the ventral division of the medial geniculate body (Starr and Don 1972), caudal areas of the auditory cortex (Recanzone et al. 2000; Tian et al. 2001), PPC (Cohen 2009), and PFC (Artchakov et al. 2007; Kikuchi-Yorioka and Sawaguchi 2000).

Woods and Recanzone (2004) used two tasks to test for bimodal interaction during sound localization: one a training task to induce the ventriloquist aftereffect, the other a test of sound lateralization. In the test task, monkeys maintained fixation except when making a saccade to the target sound location. The location of the LED on which monkeys fixated during the training task differed between sessions and affected performance in the subsequent sound localization test. The monkeys’ “sound mislocalization” was predicted by the deviation of the LED position during the training task from the true center position on which the monkey fixated during the test task. Because monkeys always fixated on the LED, the retinotopic locus of the LED was identical across the tasks. However, there was a small difference in gaze direction that played a key role in causing the “mislocalization,” presumably by inducing a plastic change in the proprioceptive alignment of gaze position to the sensed LED position. An additional key point is that, even though the LED positions were not identical between tasks, they were close enough that monkeys presumably treated the slightly different fixation points as the same and did not notice the differences in gaze direction. Therefore, plasticity of visual spatial localization presumably affected auditory spatial localization.

Although the precise substrate for the ventriloquist aftereffect in the macaque has not been established, several structures are candidates: IC (Groh et al. 2001), SC (Jay and Sparks 1984), AC (Werner-Reiss et al. 2006), and LIP and MIP (Mullette-Gilman et al. 2005). However, in all of these structures except the SC, the observed effects varied among simple gain modulation without alteration of the spatial receptive field (head-centered coordinates), systematic changes that followed gaze direction (eye-centered coordinates), and other complex changes. Plastic change in either coordinate frame, or in both, could presumably contribute to the ventriloquist aftereffect.
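The head-centered versus eye-centered distinction can be stated simply: an eye-centered response field shifts with gaze, whereas a head-centered one does not. The toy model below contrasts the two coding schemes; the Gaussian tuning shape and all parameter values are our own illustrative assumptions, not measurements from the studies above.

```python
import numpy as np

def tuning(center, width, x):
    """Gaussian spatial tuning curve over azimuth x (degrees)."""
    return np.exp(-0.5 * ((x - center) / width) ** 2)

def response(sound_azimuth_head, gaze_azimuth, frame, center=0.0, width=15.0):
    """Response of a model neuron to a sound at a head-centered azimuth.

    frame = "head": receptive field fixed in head-centered space.
    frame = "eye" : receptive field shifts with gaze direction.
    """
    if frame == "eye":
        stimulus = sound_azimuth_head - gaze_azimuth  # eye-centered location
    else:
        stimulus = sound_azimuth_head
    return tuning(center, width, stimulus)

# The same sound (0 deg, straight ahead) with gaze shifted 10 deg right:
print(response(0.0, 10.0, "head"))  # unchanged by gaze
print(response(0.0, 10.0, "eye"))   # reduced: the field moved with the eyes
```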

Fixation during head restraint does not allow any eye movement. During fixation, subjects can direct visual attention to locations away from the fixated spot (covert attention) or listen carefully. Neuronal correlates of such processes have been seen in PFC (Artchakov et al. 2007; Kikuchi-Yorioka and Sawaguchi 2000) and PPC (Andersen et al. 1997). Meanwhile, subjects must keep issuing oculomotor commands to maintain a steady eye position. Therefore, a signal that conveys the fixated location and differentiates between the center and deviant positions should be present. A possible correlate of such a signal, a change in spontaneous activity dependent on gaze direction, was described in AC, whereas it was not observed in IC (Werner-Reiss et al. 2006). Even though the source of this eye position signal to AC is unknown, it suggests AC as one of the candidate sites for inducing the ventriloquist aftereffect.

It is worth mentioning that, despite its name, the ventriloquist aftereffect is quite different from the ventriloquist effect. The ventriloquist effect happens when auditory and visual signals stem from a shared vicinity, and it requires neither fixation on a visual spot nor a steady eye position signal. The ventriloquist aftereffect, in contrast, concerns the spatial coding of solely auditory events. Hence, the study of this phenomenon may help clarify which type of neuronal coding is the main strategy for the cortical encoding of sound location.

5.3.2. AV Recognition

Identifying a previously known AV object, such as a speaker’s face and voice, requires AV integration, discrimination, and retention. This process likely relies on the accurate encoding of complex stimulus features in sensory cortices and on more complex multiplexing in higher-order multisensory association cortices. Multisensory cortices in the “what” pathway probably function to unite these sensory attributes. In humans, audiovisual integration plays an important role in person recognition (Campanella and Belin 2007). Several studies have shown that unimodal memory retrieval of multisensory experiences activates unisensory cortices, presumably because of multisensory association (Wheeler et al. 2000; Nyberg et al. 2000; Murray et al. 2004, 2005; von Kriegstein and Giraud 2006), and that such memory depends on the meaningfulness of the combined signals (Lehmann and Murray 2005).

Differential responses to vocal sounds have been observed in PFC (Gifford et al. 2005; Romanski et al. 2005), STG (Rauschecker et al. 1995; Russ et al. 2008), and AC (Ghazanfar et al. 2005). Differential responses to faces have been found in PFC (Rolls et al. 2006), temporal lobe cortices (Eifuku et al. 2004), and the amygdala (Kuraoka and Nakamura 2007). Some of these structures may be selective for both vocal sounds and faces. Recognition of a previously learned object further implies a reliance, at least in part, on working and long-term memory centers. The fact that the identification of the correspondence between vocal sound and face is better when the individuals are socially familiar (Martinez and Matsuzawa 2009) supports this notion. PFC and MTL are also involved in the association of simple auditory and visual stimuli, as shown by delayed match-to-sample studies (Colombo and Gross 1994; Fuster et al. 2000; Gibson and Maunsell 1997). Lesions in MTL (Murray and Gaffan 1994) or PFC (Gaffan and Harrison 1991) impaired performance in tasks requiring memory and AV association. These findings implicate PFC, STG, and MTL in AV recognition.

5.4. PRINCIPLES OF MULTISENSORY INTERACTION

Relationships between multisensory responses and stimulus parameters, derived primarily from single-unit studies in the cat SC, are summarized in three principles of multisensory interaction: the inverse effectiveness, temporal, and spatial principles (Stein and Meredith 1993). These organizing principles have been shown to hold for other sensory combinations (e.g., auditory–somatosensory; Lakatos et al. 2007) and in humans (Stevenson and James 2009); however, systematic examination of these principles for AV integration in the monkey cerebral cortex is limited to the auditory cortex.

5.4.1. Inverse Effectiveness

The inverse effectiveness principle of multisensory interaction states that the interaction of weaker unimodal inputs results in a larger gain of the multisensory response. In the case of audition, the response to a softer sound should be enhanced more by visual input than the response to a louder sound. In the case of vision, the response to a dimmer object should be enhanced more by sound than the response to a brighter object.
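To make this concrete, multisensory gain is conventionally quantified as the percent difference between the bimodal response and the best unimodal response (Stein and Meredith 1993). The following minimal sketch applies that index to made-up firing rates (the numbers are illustrative, not values from any study cited here):

```python
def enhancement_index(av_response, best_unimodal):
    """Percent multisensory enhancement relative to the best unimodal
    response, as in Stein and Meredith (1993)."""
    return 100.0 * (av_response - best_unimodal) / best_unimodal

# Hypothetical mean firing rates (spikes/s).
weak_a, weak_v, weak_av = 4.0, 3.0, 9.0           # near-threshold stimuli
strong_a, strong_v, strong_av = 30.0, 25.0, 33.0  # intense stimuli

print(enhancement_index(weak_av, max(weak_a, weak_v)))        # 125.0
print(enhancement_index(strong_av, max(strong_a, strong_v)))  # 10.0
```

Consistent with inverse effectiveness, the weak stimulus pair yields a far larger proportional gain (125%) than the strong pair (10%).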

Cappe et al. (2010) showed a behavioral correlate of inverse effectiveness in monkeys. Manual reaction times to soft sounds were slower than those to loud sounds, and only the reaction time to soft sounds was shortened by simultaneous visual stimuli. Responses to AV stimuli were also more accurate than responses to sounds alone at the lowest sound intensities. The same group also showed that the effect of sound on saccades, as well as on V1 neuronal response latencies, is larger for less salient visual stimuli (Wang et al. 2008).
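Reaction-time facilitation of this sort is usually evaluated against a race model, in which the faster of two independent unimodal processes triggers the response; bimodal reaction-time distributions that beat Miller’s race-model bound imply genuine integration rather than mere statistical facilitation. The sketch below runs that standard test on simulated reaction times; the distributions are invented for illustration, and the code is not a reconstruction of the analysis in Cappe et al. (2010):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated reaction times (ms); purely illustrative distributions.
rt_a = rng.normal(320, 40, 1000)    # auditory-alone trials
rt_v = rng.normal(350, 45, 1000)    # visual-alone trials
rt_av = rng.normal(270, 35, 1000)   # bimodal trials

def cdf(samples, t):
    """Empirical cumulative probability P(RT <= t)."""
    return float(np.mean(samples <= t))

# Miller's inequality: under a race of independent channels,
# P(RT_av <= t) <= P(RT_a <= t) + P(RT_v <= t) for every t.
for t in (240, 270, 300):
    bound = cdf(rt_a, t) + cdf(rt_v, t)
    observed = cdf(rt_av, t)
    print(f"t = {t} ms: observed {observed:.2f}, race bound {bound:.2f}, "
          f"violated: {observed > bound}")
```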

fMRI studies show that degraded auditory and visual stimuli each evoke weaker BOLD signal responses in the macaque AC than intact stimuli (Kayser et al. 2007). When the degraded stimuli were presented simultaneously, the enhancement of BOLD signal responses was larger than that for simultaneous intact stimuli. Even though the combination of degraded and intact stimuli was not tested, the results suggest synergistic inverse effectiveness between modalities.

Electrophysiologically, Ghazanfar et al. (2005) showed that weaker LFP responses to vocal sounds were enhanced more by concurrent viewing of a movie clip of a vocalizing monkey than were stronger responses. Another study showed that responses to vocal stimuli were modulated by movie stimuli differentially depending on loudness: responses to loud vocal stimuli were suppressed when the movie was added, whereas responses to soft sounds were enhanced (Kayser et al. 2008). These studies are compatible with the idea that weak responses are preferentially enhanced by AV integration. Additionally, a recent study reported a small but significant visually driven increase in the information capacity of auditory cortical activity (Kayser et al. 2010).

Thus, visual stimuli may not only enhance responses but also recruit more cortical neurons into the computational analysis of auditory signals, creating redundancy in the processed information that makes perception more robust.

5.4.2. Temporal Contiguity

The temporal principle of multisensory processing (Stein and Meredith 1993) predicts that integration effects will be greatest when the neuronal responses evoked by stimuli of the two modalities fall within a small temporal window. Quite a few studies have investigated the temporal and spatial contiguity principles of AV integration in nonhuman primates.

Overall, results in the monkey SC and A1 conform to the principle of temporal contiguity and describe a range of enhancement and suppression effects. In the SC, Wallace et al. (1996) showed that visual stimuli preceding auditory stimuli tended to produce stronger interactions. This condition corresponds to the natural order of physical events in everyday stimuli, in which the visual stimulus precedes the accompanying auditory one.

Ghazanfar et al. (2005) described neural responses in A1 and the lateral belt areas to the presentation of conspecific vocal sounds, with and without accompanying movies, at different SOAs. In this region, bimodal stimulation can elicit suppression or enhancement, depending on the neural population. Results showed that the proportion of sites exhibiting bimodal enhancement depended on the SOA: at SOAs longer than 100 ms, fewer AC sites showed enhancement. When the auditory response was suppressed by a movie, the proportion of suppressed sites peaked at SOAs shorter than 80 ms and longer than 100 ms, interestingly sparing the peak timing of visually evoked LFPs.

Kayser et al. (2008) tested responses in A1 and belt areas to systematic combinations of noise bursts and flashes in 20 ms steps. Bimodal suppression was observed only when the flash preceded the noise by 20 to 80 ms. For natural AV stimuli, bimodal enhancement was observed in some populations of auditory cortex at an SOA of 0 ms, an effect abolished by introducing a perceivable delay (160 ms) between the stimuli.

These results suggest that AV interaction in AC can appear as either enhancement (when audio and visual stimuli are nearly synchronous or separated by less than about 100 ms) or suppression (at longer delays). Interpretations of these data should be approached with some caution. In the first study, the effect of AV interaction was attributed to the interaction between mouth movements and the following vocal sound (Ghazanfar et al. 2005). However, because the mouth movement started immediately after the abrupt appearance of the first movie frame, the sudden change in the screen image could capture visual attention. In other studies, an abrupt visual change was shown to elicit a brief freeze of gaze position in monkeys (Cui et al. 2009) and in humans (e.g., Engbert and Kliegl 2003). Therefore, the onset of the movie itself could evoke transient activity, suggesting that the observed effects were related simply to a visual response or to a transient change in covert visual attention. Because LFPs capture the response of a large population of neurons, such activity generated in non-AC structures may be superimposed on the recorded signal. Further studies are necessary to dissociate the AV interaction into mouth movement-related and other components.
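Setting these caveats aside, the SOA dependence summarized at the start of this section can be reduced to a rough rule of thumb. The sketch below is only a mnemonic for the approximate boundaries discussed above; the 100 ms cutoff is a simplification, and Kayser et al. (2008), for instance, also observed suppression at short flash-to-noise SOAs for artificial stimuli:

```python
def av_interaction(soa_ms):
    """Crude classification of AV interaction in AC as a function of
    the visual-leading SOA (ms). The 100 ms boundary is an
    approximation drawn from the studies discussed in the text."""
    if soa_ms < 0:
        return "auditory leads: not covered by this summary"
    if soa_ms <= 100:
        return "enhancement more likely"
    return "suppression more likely"

for soa in (0, 60, 160):
    print(f"SOA {soa} ms -> {av_interaction(soa)}")
```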

5.4.3. Spatial Contiguity

The spatial principle of multisensory integration states that integration is greatest when events of different modalities fall within the receptive fields of a neuron and when those receptive fields overlap with each other. Although there are few data on this topic for AV integration in monkey cortex, we can speculate how it operates based on anatomical and electrophysiological findings.
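One way to make the spatial principle concrete is to treat each modality’s receptive field as a tuning curve over azimuth and to gate integration by how well each stimulus falls within its field. The sketch below assumes Gaussian fields with illustrative centers and widths; none of these parameters are taken from the studies cited here:

```python
import numpy as np

def rf_weight(azimuth_deg, center_deg, width_deg):
    """Gaussian receptive-field profile over azimuth (degrees)."""
    return float(np.exp(-0.5 * ((azimuth_deg - center_deg) / width_deg) ** 2))

def spatial_gate(vis_az, aud_az,
                 v_center=0.0, a_center=0.0, v_width=10.0, a_width=20.0):
    """Integration weight: product of each stimulus's position within its
    modality's receptive field. The weight collapses as either stimulus
    leaves the overlapping region of the two fields."""
    return (rf_weight(vis_az, v_center, v_width)
            * rf_weight(aud_az, a_center, a_width))

print(spatial_gate(0.0, 0.0))    # coincident stimuli inside both fields -> ~1.0
print(spatial_gate(0.0, 40.0))   # sound 40 deg outside the auditory field -> ~0.14
```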

Anatomical studies predict that peripheral representations of visual stimuli should be more susceptible to auditory influences. The representation of the visual periphery is retinotopically organized in visual cortex and is interconnected with caudal auditory cortices (Falchier et al. 2002, 2010; Rockland and Ojima 2003). In accordance with this prediction, Wang et al. (2008) observed auditory influences on V1 responses to visual stimuli presented more peripherally than 10°, although central vision was not tested. Similarly, in humans, auditory activation of visual cortex subserving the peripheral visual fields has been shown (Cate et al. 2009). However, many human studies used central and parafoveal stimuli, for which anatomical substrates or other physiological mechanisms remain to be found.

Other studies used different types of visual stimuli to study auditory cortical responses. Flashes (e.g., Kayser et al. 2008; Lakatos et al. 2009) excite a wide area of the retinotopic map. Images and movies have been presented around a central fixation point (Ghazanfar et al. 2005, 2008); in that case, visual stimulation did not extend into peripheral visual space. In addition, when monkey faces are used, the subjects tend to look at the mouth and eyes, near the center of the face (Ghazanfar et al. 2006).

These findings suggest that visual influence may have different sources depending on the stimulus preferences of each area. For example, cortices along the STS possess face preference, large receptive fields, and position invariance of object selectivity. Therefore, facial influences on AC may originate from the STS, as proposed by recent studies (Ghazanfar et al. 2008; Kayser and Logothetis 2009; see below). Such speculation could be tested by comparing the effect of vocalization movies on AC across face positions relative to gaze, taking into account differences in receptive field size among visually responsive cortices.

In the PPC, common spatial tuning to visual and auditory stimuli has been observed (Mazzoni et al. 1996; Schlack et al. 2005). Even though the PPC response to simultaneous AV stimuli has not been investigated, integration there likely depends on spatial congruency between the modalities. Further studies are needed to verify this.

5.5. MECHANISMS AND DYNAMICS OF MULTISENSORY INTERACTION

Traditionally, multisensory integration is indexed at the neuronal level by a change in the averaged magnitude of evoked activity relative to the sum of the unimodal responses. This type of effect was most often studied in the classical higher-order multisensory regions of the temporal, parietal, and frontal cortices, and generally manifested as a simple enhancement of the excitatory response beginning at the initial input stage in layer 4, as reviewed by Schroeder and Foxe (2002). Recent studies have shown that cross-modal influences on traditionally unisensory cortices can also operate by manipulating ongoing oscillatory activity in the supragranular layers, which in turn modulates the probability that neurons will fire in response to the dominant (driving) auditory input (Lakatos et al. 2007; Schroeder and Lakatos 2009). Similarly, modulatory rather than driving multisensory influences have been found in single-unit studies (Allman and Meredith 2007; Allman et al. 2008; Dehner et al. 2004; Meredith et al. 2009). This newer, modulatory mechanism is the focus of the discussion here.
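The traditional index mentioned above amounts to simple bookkeeping: compare the bimodal response against the sum of the unimodal responses and classify it as supra-additive, additive, or sub-additive. A minimal sketch, with an arbitrary tolerance and illustrative response values:

```python
def additivity(resp_av, resp_a, resp_v, tol=0.05):
    """Classify a bimodal response relative to the unimodal sum.
    `tol` is an arbitrary tolerance for calling a response additive."""
    unimodal_sum = resp_a + resp_v
    if resp_av > unimodal_sum * (1 + tol):
        return "supra-additive"
    if resp_av < unimodal_sum * (1 - tol):
        return "sub-additive"
    return "additive"

# Hypothetical evoked-response magnitudes (arbitrary units).
print(additivity(18.0, 6.0, 5.0))  # supra-additive enhancement
print(additivity(8.0, 6.0, 5.0))   # sub-additive (suppressive) interaction
```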

5.5.1. Phase Reset: Mechanisms

Somatosensory stimuli evoked a modulatory response in the supragranular layers of A1, with an onset time even faster than that of the auditory response (Lakatos et al. 2007). When paired with synchronized auditory stimuli, this faster somatosensory activation influenced the forthcoming auditory response. However, somatosensory activity did not evoke a single rapid bolus of afferent activity like a click, which elevates signal power across a broad frequency range at once. Instead, the somatosensory effect appeared as a modulation by phase reset of certain dominant neuronal oscillations observed in the CSD. In other words, the somatosensory stimulus shifted the randomly fluctuating excitability of auditory neuronal ensembles to a particular excitability state (represented by the oscillatory phase), thereby determining the effect of the auditory input. The modulatory effect differs across somatosensory–auditory SOAs depending on how a given SOA relates to the periods of delta, theta, and gamma oscillations: facilitation is maximal at SOAs corresponding to full gamma, theta, and delta cycles, and these peaks are separated by “suppressive” troughs, particularly at SOAs corresponding to half a theta cycle and half a delta cycle.
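This SOA dependence can be illustrated by assuming that the cross-modal input resets each band’s oscillation to its high-excitability phase, so that the excitability met by the auditory input peaks at full-cycle SOAs and dips near half cycles. The toy model below makes that assumption explicit; the band frequencies and equal weighting are illustrative choices, not fitted values:

```python
import numpy as np

# Nominal band frequencies (Hz); illustrative picks within the
# conventional delta, theta, and gamma ranges.
BANDS = {"delta": 1.5, "theta": 7.0, "gamma": 35.0}

def excitability(soa_ms):
    """Toy net excitability at a given cross-modal-to-auditory SOA,
    assuming each oscillation is reset to its peak phase at SOA = 0.
    A cosine peaks at full cycles and troughs at half cycles."""
    t = soa_ms / 1000.0
    return sum(np.cos(2 * np.pi * f * t) for f in BANDS.values())

theta_cycle_ms = 1000.0 / BANDS["theta"]   # ~143 ms
print(excitability(theta_cycle_ms))        # near a facilitation peak
print(excitability(theta_cycle_ms / 2))    # near a suppressive trough
```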

In contrast with the somatosensory activation of A1, visual responses are relatively slow even within the visual system (Chen et al. 2007; Musacchia and Schroeder 2009; Schmolesky et al. 1998). In both A1 and V1, visually driven activity arrives later than auditory-driven activity (Lakatos et al. 2009). Therefore, for visual signals to coincide with or precede auditory signals in AC, visual stimuli have to occur earlier than auditory stimuli, which is the case for many natural forms of AV stimulation, particularly speech (Chandrasekaran et al. 2009).

Cross-modal auditory modulation of V1 activity and visual modulation of A1 activity were observed in monkeys performing an intermodal selective attention task, in which auditory and visual stimuli were presented in alternation at a rate in the delta frequency range (Lakatos et al. 2009). As with the somatosensory modulation of A1 activity, cross-modal responses occurred as a modulatory phase reset of ongoing oscillatory activity in the supragranular layers, without a significant change in neuronal firing, while those stimuli were attended.

Supragranular layers receive corticocortical and nonspecific thalamocortical inputs, whereas granular layers receive sensory-specific thalamocortical inputs. Modulatory phase reset in the supragranular layers, without any change in neuronal firing in granular or even supragranular layers, suggests that cross-modal activation occurs as a transient, subthreshold change in supragranular cellular excitability. This is consistent with the fact that cross-modal sensory firing responses have not been reported in primary sensory cortices by the many studies that relied on action potentials as the sole dependent measure. The presence of multiple poststimulus time windows of excitability is consistent with the nested hierarchical structure of the frequency bands of ongoing neuronal activity (Lakatos et al. 2005).

Cross-modal responses during an intermodal selective attention task were observed in response to unimodal stimuli (Lakatos et al. 2008, 2009). What would be the effect of a phase reset when auditory and visual stimuli are presented simultaneously? Wang et al. (2008) analyzed neuronal firing responses to light with or without paired auditory noise stimuli using single-unit recordings in V1. When stimuli were presented passively, the firing rate in a population of V1 neurons increased and remained high for 500 ms. V1 population responses to a visual target without sound during visual detection tasks showed a double-peaked temporal pattern, with the timing of each peak after response onset in the range of a gamma or theta cycle length. In response to AV stimuli, an additional peak appeared in the temporal firing pattern near the time frame of a full delta cycle. Although the translation of firing activity into the underlying membrane potential is not straightforward, these activity parameters are roughly monotonically related (e.g., Anderson et al. 2000). Thus, the oscillatory pattern of neuronal firing suggests oscillatory modulation of neuronal excitability by the nonauditory stimuli.

5.5.2. Phase Reset: Dependence on Types of Stimuli

How would phase reset work for stimuli with complex temporal envelopes? Sounds and movies of vocalizations are popular stimuli in studies of AV integration in auditory cortical areas and STP in nonhuman primates. Because vocalization begins with visible facial movement before any sound is generated, phase reset by the visible movement is positioned to affect the processing of the subsequent sound. Kayser et al. (2008) showed changes in LFP frequency bands (around and below 10 Hz) consistent with this prediction; that is, they observed phase reset and excitability increases in A1 at the time the response to the sound of complex AV stimuli began. When phase reset occurred, it was accompanied by enhanced firing responses.

The frequency bands in which visual inputs produce phase reset differed between Kayser et al. (2008) and Lakatos et al. (2009). Lakatos et al. found that cross-modal phase reset in A1 and V1 occurred in the theta (below 10 Hz) and gamma (above 25 Hz) bands, sparing the 10 to 25 Hz range, whereas Kayser et al. observed phase reset by visual input alone across the 5 to 25 Hz range. The differences between these results are likely attributable to differences in the visual stimuli.

Lakatos et al. (2009) did not examine whether phase reset of ongoing oscillatory activity in the theta and gamma bands contributed to AV integration, because their task did not present auditory and visual stimuli simultaneously. Kayser et al. (2008) showed that enhanced neuronal firing responses to AV stimuli, compared with auditory stimuli, correlated with the occurrence of phase reset at about 10 Hz, underscoring the importance of reset in that band for AV response enhancement. The differences in the frequency bands of visually induced phase reset between the Lakatos et al. and Kayser et al. studies also suggest that the frequency of oscillation influenced by cross-modal inputs depends on the conditions of attention and stimulation.

Is phase reset a phenomenon beyond primary sensory cortices? This question remains open. At the least, STP clearly receives feedforward excitatory input from several modalities (Schroeder and Foxe 2002). The contribution of oscillatory phase reset in STP and other higher-order multisensory areas has not been examined in detail, although phase reset there may have more to do with attentional modulation than with multisensory representation.

5.6. IMPORTANCE OF SALIENCE IN LOW-LEVEL MULTISENSORY INTERACTIONS

Variations in AV integration effects with saliency and attentional conditions are so pervasive that some have begun to wonder whether attention is a prerequisite for integration (Navarra et al. 2010). However, AV integration has been observed in many higher cortical areas even when subjects were required only to maintain visual fixation, without further task demands (PFC, Sugihara et al. 2006; STP, Barraclough et al. 2005; AC, Ghazanfar et al. 2005; Kayser et al. 2008). Does this mean that audiovisual interactions happen automatically? The answer may depend on the level of the system being studied, as well as on the behavioral state, as discussed below.

5.6.1. Role of (Top-Down) Attention

There is strong evidence that top-down attention is required in order for AV integration to take place in primary sensory cortices. Using an intermodal selective attention task, Lakatos et al. (2008, 2009) showed that the manifestation of visual influence in A1 and auditory influence in V1 was dependent on attention. If a stimulus was ignored, its cross-modal influence could not be detected.

The selective role of sensory attention illustrated above contrasts with findings showing that attention to either modality can elicit AV effects. Wang et al. (2008) showed that neurons in V1 responded to auditory stimuli only when monkeys performed a purely visual localization task. Similarly, in humans, task-irrelevant sound facilitated the detection of phosphenes induced by TMS over visual cortex, in a task requiring only visual attention (Romei et al. 2007, 2009). Thus, tasks requiring either auditory (Lakatos et al. 2009) or visual (Romei et al. 2007, 2009; Wang et al. 2008) attention both rendered auditory influences observable in V1. This apparent disagreement most likely reflects differences in the role of the nominally unattended sensory stimuli in those tasks.

In the visual localization task (Wang et al. 2008), monkeys had to react quickly to localize visual targets. Task-irrelevant auditory stimuli occurred in half of the trials and were always delivered temporally congruent with the visual targets, from a fixed central location. In this task, the status of the sound is key. The auditory stimuli, when delivered, were always informative and thus could act as an instruction, much like the verbal instructions given to subjects performing visual localization in Posner’s classic study (Posner et al. 1980). Therefore, it is possible that monkeys paid attention to these informative auditory stimuli, in addition to the visual stimuli, to perform the visual localization task. In a similar vein, responses to visual events in the auditory discrimination task of Brosch et al. (2005) may be regarded as responses to an informative cross-modal cue for performing the task, although, again, the effects of overtraining must also be considered.

In the intermodal attention task (Lakatos et al. 2008, 2009), subjects did not have to spread their spatial attention across locations because the visual and auditory stimuli were spatially congruent. However, the stimuli were temporally incongruent, divided into two asynchronous streams. Furthermore, while monkeys monitored the sequence in one modality, deviants also appeared in the other sequence, and monkeys had to refrain from responding to them. The easiest way to perform such a task would be to plug one’s ears while watching and to close one’s eyes while listening. Prevented from such strategies, all the monkeys could actually do was attend to the cued modality while simultaneously ignoring the other stream.

Although it may be impossible to determine what monkeys are actually attending to during any given task, it can be argued, from the observation of auditory influences on visual responses in V1 (Wang et al. 2008), that monkeys do not ignore informative sounds. Further studies are needed to determine how attentive conditions influence AV integration. It would be interesting to see whether an auditory influence remains observable in a visual localization task, as in the study of Wang et al. (2008), but with auditory stimuli that are spatially and temporally incongruent with the visual stimuli and therefore act as distracters.

Auditory attention has also been suggested to play a role in evoking auditory responses in LIP (Linden et al. 1999) and PFC (Vaadia et al. 1986). Further clarification of the role of attention in higher associative areas, such as PFC, is very important because many models assume that those cortices impose attentional control over lower cortices.

5.6.2. Attention or Saliency of Stimuli

Degrees of attentional focus and ranges of stimulus saliency surely have differential effects on AV integration. It is difficult to argue that monkeys monitor AV stimuli during simple tasks such as fixation, because monkeys receive their reward regardless of what happens during stimulus presentation. Nevertheless, monkeys are certainly alert in such conditions. Even though the mandated level of attention differs from active monitoring, such weak attention, or the lack of competing stimulation, may be enough to induce audiovisual integration.

Besides attentional requirements, there are differences in stimulus saliency between simple stimuli, such as flashes and tones, and complex stimuli, such as faces. It is well known that meaningful visual stimuli attract attention in a behaviorally observable manner; the eyes and mouths of vocalizing individuals draw a subject’s gaze (Ghazanfar et al. 2006). Thus, highly salient stimuli may passively induce AV effects in the absence of explicit requirements to attend.

Certain forms of AV effects in adult animals occur only after training (Grunewald et al. 1999; Woods and Recanzone 2004). In that sense, the perception of vocalizations has already been acquired through lifelong training in monkeys. We may suppose that AV integration is essential for the acquisition of communication skills in nonhuman primates. Once acquired, AV integration may become “pre-potent,” requiring less attention and proceeding “effortlessly.”

5.7. CONCLUSIONS, UNRESOLVED ISSUES, AND QUESTIONS FOR FUTURE STUDIES

Compared with human studies, behavioral studies of AV integration in nonhuman primates are still relatively rare. The ability to record behavior and local neural activity simultaneously has helped to reconcile the multisensory findings in humans and to expand our understanding of how AV integration occurs in the nervous system. Below, we list several issues to be addressed in the future.

5.7.1. Complex AV Interactions

Tasks requiring linguistic ability may be out of reach for experiments involving nonhuman primates; however, visual tasks of high complexity have been used in previous studies. Considering that many AV effects in humans were seen with purely visual tasks, it may be possible to train monkeys to perform complex visual tasks and then study the effect of auditory stimulation on visual performance.

5.7.2. Anatomical Substrates of AV Interaction

The anatomical substrates of cross-modal inputs to primary sensory cortices (de la Mothe et al. 2006b; Cappe and Barone 2005; Cappe et al. 2009; Falchier et al. 2002, 2010; Hackett et al. 2007; Rockland and Ojima 2003; Smiley et al. 2007) provide the basis for models of the routes of AV integration. These data show that two types of corticocortical input (feedback and lateral connections), together with thalamocortical and subcortical inputs from nonspecific and multisensory thalamic nuclei, are potential pathways mediating early multisensory convergence and integration. The challenge is to discriminate the influence of each of these pathways during a behavioral task. It is probable that the weighting of these different pathways is defined by the sensory context as well as by the nature of the task objective.

5.7.3. Implication of Motor Systems in Modulation of Reaction Time

Brain structures showing AV responses include parts of not only sensory but also motor systems. Facilitated reaction times for both saccadic and manual responses raise the question of whether enhancement occurs only in sensory systems or elsewhere as well. Because Miller et al. (2001) showed that motor cortical activation triggered by sensory stimuli reflects sensory signals already integrated by the stage of primary motor cortex, it is possible that activation of PPC, PFC, and particularly PM areas or the SC is facilitated by redundant sensory inputs. These possibilities have not yet been fully distinguished. The possibility of additional sources of facilitated reaction time was also suggested by the findings of Wang et al. (2008): when intense visual stimuli were presented, additional auditory stimuli did not affect the visual response in V1 but did influence saccadic reaction time. This suggests either that the visual response is facilitated somewhere in the visual system outside of V1 or that auditory stimuli directly affect motor responses.

5.7.4. Facilitation or Information?

In general, larger neuronal responses can be beneficial for faster reactions to, and discrimination of, events because they have faster onset latencies and better signal-to-noise ratios. Which coding strategy, or strategies, neurons adopt as they respond to stimuli remains to be discerned. For example, visual localization tasks require not only fast reaction times but also good discrimination of the visual target location. Visual influences on ongoing oscillations through phase reset mechanisms, and the consequences of such modulation for response magnitude, have been shown by several groups. Additionally, Kayser et al. (2010) showed that visual influences can tune the auditory response by increasing its signal-to-noise ratio and thereby its information capacity. Because it is not known which aspect of the neuronal response the brain utilizes, it is desirable to compare mechanisms of modulation with behavioral responses.
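The information-capacity point can be illustrated with a toy calculation: if concurrent visual input reduces the trial-to-trial variability of an auditory response without changing its mean, single-trial responses carry more bits about which stimulus occurred. The sketch below uses simulated responses and a simple plug-in mutual information estimate; it parallels, but does not reproduce, the analysis of Kayser et al. (2010):

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_info_bits(stim, resp, n_bins=8):
    """Plug-in estimate of I(stimulus; response) in bits for a binary
    stimulus and a continuous response discretized into n_bins bins."""
    edges = np.histogram_bin_edges(resp, bins=n_bins)
    binned = np.digitize(resp, edges[1:-1])  # bin indices 0..n_bins-1
    joint = np.zeros((2, n_bins))
    for s, r in zip(stim, binned):
        joint[s, r] += 1
    joint /= joint.sum()
    p_s = joint.sum(axis=1, keepdims=True)
    p_r = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_r)[nz])))

stim = rng.integers(0, 2, 4000)             # two possible auditory stimuli
mean_rate = np.where(stim == 1, 10.0, 6.0)  # stimulus-dependent response mean
resp_aud = rng.normal(mean_rate, 4.0)       # auditory alone: high variability
resp_av = rng.normal(mean_rate, 2.0)        # with vision: reduced variability

print(mutual_info_bits(stim, resp_aud))     # lower information
print(mutual_info_bits(stim, resp_av))      # higher information
```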

5.7.5. Inverse Effectiveness and Temporal Interaction

Inverse effectiveness states that multisensory integration is most effective when weak stimuli are presented. Most electrophysiological studies of AV integration in monkey auditory cortex utilize loud sounds; however, low stimulus intensity can degrade the temporal response pattern of sensory neurons. Such an effect would be more prominent for complex stimuli, such as vocal sounds, because smaller peaks in the temporal envelope (e.g., the first envelope peak of a macaque grunt call) may be missed in auditory encoding. The weak-sound condition is relevant to Sumby and Pollack’s (1954) classic observation of inverse effectiveness in human speech. It is thus important to investigate how AV integration works under degraded conditions. Degraded stimuli might also reveal a more central role of attention, because weaker stimuli require more attention to discern. Furthermore, the altered timing of response peaks to weak vocal sounds may interact differently with the excitability phases of ongoing oscillations, leading to different patterns of enhancement.

5.7.6. What Drives and What Is Driven by Oscillations?

Recent studies of AV integration in AC and STP stress the importance of oscillatory neuronal activity. Oscillations in field potentials and the CSD reflect rhythmic fluctuations in the net excitability of the local neuronal ensemble in sensory cortical areas. Although numerous hypotheses are available, the role of oscillatory modulation in other structures is unknown. Endogenous attention may also be reflected in ongoing activity through top-down modulation, and its interaction with bottom-up sensory activation can both contribute to and be influenced by oscillatory dynamics. This is an extremely fruitful area for future studies.

5.7.7. Role of Attention

Although some multisensory studies in monkeys did control for attention, most were done without specific control of attention. The former studies provide ample evidence for a definitive role of sensory attention in AV integration. To get a clear picture of the role attention plays in multisensory interactions, more studies are needed in which attention, even unimodal, is controlled through behavioral tasks and stimuli. It will also be important to investigate attentional load, because differences in selective attention may emerge only under high-load conditions; under high attentional load in the attended modality, subjects may try, consciously or unconsciously, to ignore stimuli of the irrelevant modality.

ACKNOWLEDGMENT

This work was supported by grant nos. K01MH082415, R21DC10415, and R01MH61989.

REFERENCES

  1. Aggleton J.P, Mishkin M. Visual impairments in macaques following inferior temporal lesions are exacerbated selectively by additional damage to superior temporal sulcus. Behavioural Brain Research. 1990;39:262–274. [PubMed: 2244972]
  2. Allman B.L, Keniston L.P, Meredith M.A. Subthreshold auditory inputs to extrastriate visual neurons are responsive to parametric changes in stimulus quality: Sensory-specific versus non-specific coding. Brain Research. 2008;1242:95–101. [PMC free article: PMC2645081] [PubMed: 18479671]
  3. Allman B.L, Meredith M.A. Multisensory processing in “unimodal” neurons: Cross-modal subthreshold auditory effects in cat extrastriate visual cortex. Journal of Neurophysiology. 2007;98:545–549. [PubMed: 17475717]
  4. Allon N, Wollberg Z. Responses of cells in the superior colliculus of the squirrel monkey to auditory stimuli. Brain Research. 1978;159:321–330. [PubMed: 103598]
  5. Andersen R.A, Snyder L.H, Bradley D.C, Xing J. Multimodal representation of space in the posterior parietal cortex and its use in planning movements. Annual Review of Neuroscience. 1997;20:303–330. [PubMed: 9056716]
  6. Anderson J, Lampl I, Reichova I, Carandini M, Ferster D. Stimulus dependence of two-state fluctuations of membrane potential in cat visual cortex. Nature Neuroscience. 2000;3:617–621. [PubMed: 10816319]
  7. Anderson K.C, Siegel R.M. Optic flow selectivity in the anterior superior temporal polysensory area, STPa, of the behaving monkey. Journal of Neuroscience. 1999;19:2681–2691. [PubMed: 10087081]
  8. Anderson K.C, Siegel R.M. Three-dimensional structure-from-motion selectivity in the anterior superior temporal polysensory area STPs of the behaving monkey. Cerebral Cortex. 2005;15:1299–1307. [PMC free article: PMC1859860] [PubMed: 15647529]
  9. Aosaki T, Kimura M, Graybiel A.M. Temporal and spatial characteristics of tonically active neurons of the primate’s striatum. Journal of Neurophysiology. 1995;73:1234–1252. [PubMed: 7608768]
  10. Aou S, Oomura Y, Nishino H, et al. Functional heterogeneity of single neuronal activity in the monkey dorsolateral prefrontal cortex. Brain Research. 1983;260:121–124. [PubMed: 6402271]
  11. Artchakov D, Tikhonravov D, Vuontela V, Linnankoski I, Korvenoja A, Carlson S. Processing of auditory and visual location information in the monkey prefrontal cortex. Experimental Brain Research. 2007;180:469–479. [PubMed: 17390128]
  12. Azuma M, Suzuki H. Properties and distribution of auditory neurons in the dorsolateral prefrontal cortex of the alert monkey. Brain Research. 1984;298:343–346. [PubMed: 6722560]
  13. Baizer J.S, Ungerleider L.G, Desimone R. Organization of visual inputs to the inferior temporal and posterior parietal cortex in macaques. Journal of Neuroscience. 1991;11:168–190. [PubMed: 1702462]
  14. Baizer J.S, Desimone R, Ungerleider L.G. Comparison of subcortical connections of inferior temporal and posterior parietal cortex in monkeys. Visual Neuroscience. 1993;10:59–72. [PubMed: 8424928]
  15. Barbas H, Ghashghaei H, Dombrowski S.M, Rempel-Clower N.L. Medial prefrontal cortices are unified by common connections with superior temporal cortices and distinguished by input from memory-related areas in the rhesus monkey. Journal of Comparative Neurology. 1999;410:343–367. [PubMed: 10404405]
  16. Barbas H, Mesulam M.M. Organization of afferent input to subdivisions of area 8 in the rhesus monkey. Journal of Comparative Neurology. 1981;200:407–431. [PubMed: 7276245]
  17. Barnes C.L, Pandya D.N. Efferent cortical connections of multimodal cortex of the superior temporal sulcus in the rhesus monkey. Journal of Comparative Neurology. 1992;318:222–244. [PubMed: 1583161]
  18. Barraclough N.E, Xiao D, Baker C.I, Oram M.W, Perrett D.I. Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. Journal of Cognitive Neuroscience. 2005;17:377–391. [PubMed: 15813999]
  19. Baylis G.C, Rolls E.T, Leonard C.M. Functional subdivisions of the temporal lobe neocortex. Journal of Neuroscience. 1987;7:330–342. [PubMed: 3819816]
  20. Bell A.H, Corneil B.D, Munoz D.P, Meredith M.A. Engagement of visual fixation suppresses sensory responsiveness and multisensory integration in the primate superior colliculus. European Journal of Neuroscience. 2003;18:2867–2873. [PubMed: 14656336]
  21. Benevento L.A, Fallon J, Davis B.J, Rezak M. Auditory–visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey. Experimental Neurology. 1977;57:849–872. [PubMed: 411682]
  22. Besle J, Bertrand O, Giard M.H. Electrophysiological (EEG, sEEG, MEG) evidence for multiple audiovisual interactions in the human auditory cortex. Hearing Research. 2009;258:143–151. [PubMed: 19573583]
  23. Blatt G.J, Pandya D.N, Rosene D.L. Parcellation of cortical afferents to three distinct sectors in the parahippocampal gyrus of the rhesus monkey: An anatomical and neurophysiological study. Journal of Comparative Neurology. 2003;466:161–179. [PubMed: 14528446]
  24. Bolognini N, Senna I, Maravita A, Pascual-Leone A, Merabet L.B. Auditory enhancement of visual phosphene perception: The effect of temporal and spatial factors and of stimulus intensity. Neuroscience Letters. 2010;477:109–114. [PMC free article: PMC3538364] [PubMed: 20430065]
  25. Bon L, Lucchetti C. Auditory environmental cells and visual fixation effect in area 8B of macaque monkey. Experimental Brain Research. 2006;168:441–449. [PubMed: 16317576]
  26. Born R.T, Bradley D.C. Structure and function of visual area MT. Annual Review of Neuroscience. 2005;28:157–189. [PubMed: 16022593]
  27. Brosch M, Selezneva E, Scheich H. Nonauditory events of a behavioral procedure activate auditory cortex of highly trained monkeys. Journal of Neuroscience. 2005;25:6797–6806. [PubMed: 16033889]
  28. Brothers L, Ring B, Kling A. Response of neurons in the macaque amygdala to complex social stimuli. Behavioural Brain Research. 1990;41:199–213. [PubMed: 2288672]
  29. Bruce C.J, Desimone R, Gross C.G. Visual properties of neurons in polysensory area in superior temporal sulcus of the macaque. Journal of Neurophysiology. 1981;46:369–384. [PubMed: 6267219]
  30. Bruce C.J, Desimone R, Gross C.G. Both striate cortex and superior colliculus contribute to visual properties of neurons in superior temporal polysensory area of macaque monkey. Journal of Neurophysiology. 1986;55:1057–1075. [PubMed: 3711967]
  31. Burton H, Jones E.G. The posterior thalamic region and its cortical projection in new world and old world monkeys. Journal of Comparative Neurology. 1976;168:249–302. [PubMed: 821975]
  32. Carmichael S.T, Price J.L. Sensory and premotor connections of the orbital and medial prefrontal cortex of macaque monkeys. Journal of Comparative Neurology. 1995;363:642–664. [PubMed: 8847422]
  33. Calvert G.A. Crossmodal processing in the human brain: Insights from functional neuroimaging studies. Cerebral Cortex. 2001;11:1110–1123. [PubMed: 11709482]
  34. Calvert G.A, Campbell R. Reading speech from still and moving faces: The neural substrates of visible speech. Journal of Cognitive Neuroscience. 2003;15:57–70. [PubMed: 12590843]
  35. Campanella S, Belin P. Integrating face and voice in person perception. Trends in Cognitive Sciences. 2007;11:535–543. [PubMed: 17997124]
  36. Cappe C, Barone P. Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey. European Journal of Neuroscience. 2005;22:2886–2902. [PubMed: 16324124]
  37. Cappe C, Morel A, Barone P, Rouiller E. The thalamocortical projection systems in primate: An anatomical support for multisensory and sensorimotor interplay. Cerebral Cortex. 2009;19:2025–2037. [PMC free article: PMC2722423] [PubMed: 19150924]
  38. Cappe C, Murray M.M, Barone P, Rouiller E.M. Multisensory facilitation of behavior in monkeys: Effects of stimulus intensity. Journal of Cognitive Neuroscience. 2010;22:2850–2863. [PubMed: 20044892]
  39. Cate A.D, Herron T.J, Yund E.W, et al. Auditory attention activates peripheral visual cortex. PLoS ONE. 2009;4:e4645. [PMC free article: PMC2644787] [PubMed: 19247451]
  40. Cavada C, Goldman-Rakic P.S. Posterior parietal cortex in rhesus monkey: I. Parcellation of areas based on distinctive limbic and sensory corticocortical connections. Journal of Comparative Neurology. 1989a;287:393–421. [PubMed: 2477405]
  41. Cavada C, Goldman-Rakic P.S. Posterior parietal cortex in rhesus monkey: II. Evidence for segregated corticocortical networks linking sensory and limbic areas with the frontal lobe. Journal of Comparative Neurology. 1989b;287:422–445. [PubMed: 2477406]
  42. Cavada C, Company T, Tejedor J, Cruz-Rizzolo R.J, Reinoso-Suarez F. The anatomical connections of the macaque monkey orbitofrontal cortex. A review. Cerebral Cortex. 2000;10:220–242. [PubMed: 10731218]
  43. Chakladar S, Logothetis N.K, Petkov C.I. Morphing rhesus monkey vocalizations. Journal of Neuroscience Methods. 2008;170:45–55. [PubMed: 18289695]
  44. Chandrasekaran C, Ghazanfar A.A. Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus. Journal of Neurophysiology. 2009;101:773–788. [PMC free article: PMC2657063] [PubMed: 19036867]
  45. Chandrasekaran C, Trubanova A, Stillittano S, Caplier A, Ghazanfar A.A. The natural statistics of audiovisual speech. PLoS Computational Biology. 2009;5:e1000436. [PMC free article: PMC2700967] [PubMed: 19609344]
  46. Chen C.M, Lakatos P, Shah A.S, et al. Functional anatomy and interaction of fast and slow visual pathways in macaque monkeys. Cerebral Cortex. 2007;17:1561–1569. [PubMed: 16950866]
  47. Cheney D.L, Seyfarth R.M. How Monkeys See the World. Chicago: Univ. of Chicago Press; 1990.
  48. Ciaramitaro V.M, Buracas G.T, Boynton G.M. Spatial and crossmodal attention alter responses to unattended sensory information in early visual and auditory human cortex. Journal of Neurophysiology. 2007;98:2399–2413. [PubMed: 17715196]
  49. Clower D.M, West R.A, Lynch J.C, Strick P.L. The inferior parietal lobule is the target of output from the superior colliculus, hippocampus, and cerebellum. Journal of Neuroscience. 2001;21:6283–6291. [PubMed: 11487651]
  50. Cohen Y.E. Multimodal activity in the parietal cortex. Hearing Research. 2009;258:100–105. [PMC free article: PMC2810529] [PubMed: 19450431]
  51. Cohen Y.E, Andersen R.A. Reaches to sounds encoded in an eye-centered reference frame. Neuron. 2000;27:647–652. [PubMed: 11055445]
  52. Cohen Y.E, Andersen R.A. A common reference frame for movement plans in the posterior parietal cortex. Nature Reviews. Neuroscience. 2002;3:553–562. [PubMed: 12094211]
  53. Cohen Y.E, Batista A.P, Andersen R.A. Comparison of neural activity preceding reaches to auditory and visual stimuli in the parietal reach region. Neuroreport. 2002;13:891–894. [PubMed: 11997708]
  54. Cohen Y.E, Cohen I.S, Gifford III G.W. Modulation of LIP activity by predictive auditory and visual cues. Cerebral Cortex. 2004;14:1287–1301. [PubMed: 15166102]
  55. Cohen Y.E, Russ B.E, Davis S.J, Baker A.E, Ackelson A.L, Nitecki R. A functional role for the ventrolateral prefrontal cortex in non-spatial auditory cognition. Proceedings of the National Academy of Sciences of the United States of America. 2009;106:20045–20050. [PMC free article: PMC2785289] [PubMed: 19897723]
  56. Colombo M, Gross C.G. Responses of inferior temporal cortex and hippocampal neurons during delayed matching to sample in monkeys (Macaca fascicularis) Behavioral Neuroscience. 1994;108:443–455. [PubMed: 7917038]
  57. Colombo M, Rodman H.R, Gross C.G. The effects of superior temporal cortex lesions on the processing and retention of auditory information in monkeys (Cebus apella) Journal of Neuroscience. 1996;16:4501–4517. [PubMed: 8699260]
  58. Cooke D.F, Graziano M.S.A. Super-flinchers and nerves of steel: Defensive movements altered by chemical manipulation of a cortical motor area. Neuron. 2004a;43:585–593. [PubMed: 15312656]
  59. Cooke D.F, Graziano M.S.A. Sensorimotor integration in the precentral gyrus: Polysensory neurons and defensive movements. Journal of Neurophysiology. 2004b;91:1648–1660. [PubMed: 14586035]
  60. Cui Q.N, Bachus L, Knoth E, O’Neill W.E, Paige G.D. Eye position and cross-sensory learning both contribute to prism adaptation of auditory space. Progress in Brain Research. 2008;171:265–270. [PubMed: 18718311]
  61. Cui J, Wilke M, Logothetis N.K, Leopold D.A, Liang H. Visibility states modulate microsaccade rate and direction. Vision Research. 2009;49:228–236. [PMC free article: PMC3427003] [PubMed: 19007803]
  62. Cusick C.G, Seltzer B, Cola M, Griggs E. Chemoarchitectonics and corticocortical terminations within the superior temporal sulcus of the rhesus monkey: Evidence for subdivisions of superior temporal polysensory cortex. Journal of Comparative Neurology. 1995;360:513–535. [PubMed: 8543656]
  63. Cynader M, Berman N. Receptive field organization of monkey superior colliculus. Journal of Neurophysiology. 1972;35:187–201. [PubMed: 4623918]
  64. Dahl C.D, Logothetis N.K, Kayser C. Spatial organization of multisensory responses in temporal association cortex. Journal of Neuroscience. 2009;29:11924–11932. [PubMed: 19776278]
  65. de la Mothe L.A, Blumell S, Kajikawa Y, Hackett T.A. Cortical connections of the auditory cortex in marmoset monkeys: Core and medial belt regions. Journal of Comparative Neurology. 2006a;496:27–71. [PubMed: 16528722]
  66. de la Mothe L.A, Blumell S, Kajikawa Y, Hackett T.A. Thalamic connections of the auditory cortex in marmoset monkeys: Core and medial belt regions. Journal of Comparative Neurology. 2006b;496:72–96. [PubMed: 16528728]
  67. De Souza W.C, Eifuku S, Tamura R, Nishijo H, Ono T. Differential characteristics of face neuron responses within the anterior superior temporal sulcus of macaques. Journal of Neurophysiology. 2005;94:1251–1266. [PubMed: 15857968]
  68. Dehner L.R, Keniston L.P, Clemo H.R, Meredith M.A. Cross-modal circuitry between auditory and somatosensory areas of the cat anterior ectosylvian sulcal cortex: A ‘new’ inhibitory form of multisensory convergence. Cerebral Cortex. 2004;14:387–403. [PubMed: 15028643]
  69. Desimone R, Gross C.G. Visual areas in the temporal cortex of the macaque. Brain Research. 1979;178:363–380. [PubMed: 116712]
  70. Diederich A, Colonius H. Modeling the time course of multisensory interaction in manual and saccadic responses. In: Handbook of Multisensory Processes. Cambridge: MA: MIT Press; 2004. pp. 373–394.
  71. Disbrow E, Litinas E, Recanzone G.H, Padberg J, Krubitzer L. Cortical connections of the second somatosensory area and the parietal ventral area in macaque monkeys. Journal of Comparative Neurology. 2003;462:382–399. [PubMed: 12811808]
  72. Dobelle W.H, Mladejovsky M.G, Girvin J.P. Artificial vision for the blind: Electrical stimulation of visual cortex offers hope for a functional prosthesis. Science. 1974;183:440–444. [PubMed: 4808973]
  73. Duffy C.J, Wurtz R.H. Sensitivity of MST neurons to optic flow stimuli: I. A continuum of response selectivity to large-field stimuli. Journal of Neurophysiology. 1991;65:1329–1345. [PubMed: 1875243]
  74. Eaccott M.J, Heywood C.A, Gross C.G, Cowey A. Visual discrimination impairments following lesions of the superior temporal sulcus are not specific for facial stimuli. Neuropsychologia. 1993;31:609–619. [PubMed: 8341417]
  75. Eifuku S, De Souza W.C, Tamura R, Nishijo H, Ono T. Neuronal correlates of face identification in the monkey anterior temporal cortical areas. Journal of Neurophysiology. 2004;91:358–371. [PubMed: 14715721]
  76. Engbert R, Kliegl R. Microsaccades uncover the orientation of covert attention. Vision Research. 2003;43:1035–1045. [PubMed: 12676246]
  77. Evans T.A, Howell S, Westergaard G.C. Auditory–visual cross-modal perception of communicative stimuli in tufted capuchin monkeys (Cebus apella) Journal of Experimental Psychology. Animal Behavior Processes. 2005;31:399–406. [PubMed: 16248726]
  78. Falchier A, Clavagnier S, Barone P, Kennedy H. Anatomical evidence of multimodal integration in primate striate cortex. Journal of Neuroscience. 2002;22:5749–5759. [PubMed: 12097528]
  79. Falchier A, Schroeder C.E, Hackett T.A, et al. Projection from visual areas V2 and prostriata to caudal auditory cortex in the monkey. Cerebral Cortex. 2010;20:1529–1538. [PMC free article: PMC2882821] [PubMed: 19875677]
  80. Felleman D.J, Kaas J.H. Receptive field properties of neurons in middle temporal visual area (MT) of owl monkeys. Journal of Neurophysiology. 1984;52:488–513. [PubMed: 6481441]
  81. Fogassi L, Gallese V, Fadiga L, Luppino F, Matelli M, Rizzolatti G. Coding of peripersonal space in inferior premotor cortex (area F4) Journal of Neurophysiology. 1996;76:141–157. [PubMed: 8836215]
  82. Frens M.A, Van Opstal A.J. Visual–auditory interactions modulate saccade-related activity in monkey superior colliculus. Brain Research Bulletin. 1998;46:211–224. [PubMed: 9667814]
  83. Frens M.A, Van Opstal A.J, Van der Willigen R.F. Spatial and temporal factors determine auditory–visual interactions in human saccadic eye movements. Perception & Psychophysics. 1995;57:802–816. [PubMed: 7651805]
  84. Fu K.G, Johnston T.A, Shah A.S, et al. Auditory cortical neurons respond to somatosensory stimulation. Journal of Neuroscience. 2003;23:7510–7515. [PubMed: 12930789]
  85. Fu K.G, Shah A.S, O’Connell M.N, et al. Timing and laminar profile of eye-position effects on auditory responses in primate auditory cortex. Journal of Neurophysiology. 2004;92:3522–3531. [PubMed: 15282263]
  86. Fuster J.M, Bodner M, Kroger J.K. Cross-modal and cross-temporal association in neurons of frontal cortex. Nature. 2000;405:347–351. [PubMed: 10830963]
  87. Gaffan D, Harrison S. Auditory–visual associations, hemispheric specialization and temporal–frontal interaction in the rhesus monkey. Brain. 1991;114:2133–2144. [PubMed: 1933238]
  88. Ghazanfar A.A, Logothetis N.K. Facial expressions linked to monkey calls. Nature. 2003;423:934–934. [PubMed: 12827188]
  89. Ghazanfar A.A, Santos L.R. Primate brains in the wild: The sensory bases for social interactions. Nature Reviews. Neuroscience. 2004;5:603–616. [PubMed: 15263891]
  90. Ghazanfar A.A, Schroeder C.E. Is neocortex essentially multisensory? Trends in Cognitive Sciences. 2006;10:278–285. [PubMed: 16713325]
  91. Ghazanfar A.A, Neuhoff J.G, Logothetis N.K. Auditory looming perception in rhesus monkeys. Proceedings of the National Academy of Sciences of the United States of America. 2002;99:15755–15757. [PMC free article: PMC137788] [PubMed: 12429855]
  92. Ghazanfar A.A, Maier J.X, Hoffman K.L, Logothetis N.K. Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. Journal of Neuroscience. 2005;25:5004–5012. [PubMed: 15901781]
  93. Ghazanfar A.A, Nielsen K, Logothetis N.K. Eye movements of monkey observers viewing vocalizing conspecifics. Cognition. 2006;101:515–529. [PubMed: 16448641]
  94. Ghazanfar A.A, Chandrasekaran C, Logothetis N.K. Interactions between the superior temporal sulcus and auditory cortex mediate dynamic face/voice integration in rhesus monkeys. Journal of Neuroscience. 2008;28:4457–4469. [PMC free article: PMC2663804] [PubMed: 18434524]
  95. Giard M.H, Peronnet F. Auditory–visual integration during multimodal object recognition in humans: A behavioral and electrophysiological study. Journal of Cognitive Neuroscience. 1999;11:473–490. [PubMed: 10511637]
  96. Gibson J.R, Maunsell J.H.R. Sensory modality specificity of neural activity related to memory in visual cortex. Journal of Neurophysiology. 1997;78:1263–1275. [PubMed: 9310418]
  97. Gifford III G.W, Cohen Y.E. Spatial and non-spatial auditory processing in the lateral intraparietal area. Experimental Brain Research. 2005;162:509–512. [PubMed: 15864568]
  98. Gifford III G.W, MacLean K.A, Hauser M.D, Cohen Y.E. The neurophysiology of functionally meaningful categories: Macaque ventrolateral prefrontal cortex plays a critical role in spontaneous categorization of species-specific vocalizations. Journal of Cognitive Neuroscience. 2005;17:1471–1482. [PubMed: 16197700]
  99. Goldman-Rakic P.S, Cools A.R, Srivastava K. The prefrontal landscape: Implications of functional architecture for understanding human mentation and the central executive. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. 1996;351:1445–1453. [PubMed: 8941956]
  100. Goodale M.A, Milner A.D. Separate visual pathways for perception and action. Trends in Neurosciences. 1992;15:20–25. [PubMed: 1374953]
  101. Graziano M.S.A, Gandhi S. Location of the polysensory zone in the precentral gyrus of anesthetized monkeys. Experimental Brain Research. 2000;135:259–266. [PubMed: 11131511]
  102. Graziano M.S.A, Hu X.T, Gross C.G. Visuospatial properties of ventral premotor cortex. Journal of Neurophysiology. 1997;77:2268–2292. [PubMed: 9163357]
  103. Graziano M.S.A, Reiss L.A.J, Gross C.G. A neuronal representation of the location of nearby sounds. Nature. 1999;397:428–430. [PubMed: 9989407]
  104. Graziano M.S.A, Yap G.S, Gross C.G. Coding of visual space by premotor neurons. Science. 1994;266:1054–1057. [PubMed: 7973661]
  105. Green K.P, Kuhl P.K, Meltzoff A.N, Stevens E.B. Integrating speech information across talkers, gender, and sensory modality: Female faces and male voices in the McGurk effect. Perception & Psychophysics. 1991;50:524–536. [PubMed: 1780200]
  106. Groh J.M, Trause A.S, Underhill A.M, Clark K.R, Inati S. Eye position influences auditory responses in primate inferior colliculus. Neuron. 2001;29:509–518. [PubMed: 11239439]
  107. Grunewald A, Linden J.F, Andersen R.A. Responses to auditory stimuli in macaque lateral intraparietal area I. Effects of training. Journal of Neurophysiology. 1999;82:330–342. [PubMed: 10400962]
  108. Gu Y, Angelaki D.E, DeAngelis G.C. Neural correlates of multisensory cue integration in macaque MSTd. Nature Neuroscience. 2008;11:1201–1210. [PMC free article: PMC2713666] [PubMed: 18776893]
  109. Hackett T.A. The comparative anatomy of the primate auditory cortex. In: Primate Audition: Ethology and Neurobiology. Boca Raton, FL: CRC; 2002. pp. 199–226.
  110. Hackett T.A, de la Mothe L.A, Ulbert I, Karmos G, Smiley J.F, Schroeder C.E. Multisensory convergence in auditory cortex: II. Thalamocortical connections of the caudal superior temporal plane. Journal of Comparative Neurology. 2007;502:894–923. [PubMed: 17447261]
  111. Hackett T.A, Preuss T.M, Kaas J.H. Architectonic identification of the core region in auditory cortex of macaques, chimpanzees, and humans. Journal of Comparative Neurology. 2001;441:197–222. [PubMed: 11745645]
  112. Hackett T.A, Stepniewska I, Kaas J.H. Prefrontal connections of the parabelt auditory cortex in macaque monkeys. Brain Research. 1999;817:45–58. [PubMed: 9889315]
  113. Hairston W.D, Hodges D.A, Burdette J.H, Wallace M.T. Auditory enhancement of visual temporal order judgment. Neuroreport. 2006;17:791–795. [PubMed: 16708016]
  114. Hikosaka K, Iwai E, Saito H, Tanaka K. Polysensory properties of neurons in the anterior bank of the caudal superior temporal sulcus of the macaque monkey. Journal of Neurophysiology. 1988;60:1615–1637. [PubMed: 2462027]
  115. Hikosaka O, Sakamoto M, Usui S. Functional properties of monkey caudate neurons: II. Visual and auditory responses. Journal of Neurophysiology. 1989;61:799–813. [PubMed: 2723721]
  116. Hoffman K.L, Ghazanfar A.A, Gauthier I, Logothetis N.K. Category-specific responses to faces and objects in primate auditory cortex. Frontiers in Systems Neuroscience. 2008;1:2. [PMC free article: PMC2526270] [PubMed: 18958243]
  117. Hoffman K.L, Gothard K.M, Schmid M.C, Logothetis N.K. Facial-expression and gaze-selective responses in the monkey amygdala. Current Biology. 2007;17:766–772. [PubMed: 17412586]
  118. Ito S. Prefrontal activity of macaque monkeys during auditory and visual reaction time tasks. Brain Research. 1982;247:39–47. [PubMed: 7127120]
  119. Iversen S.D, Mishkin M. Comparison of superior temporal and inferior prefrontal lesions on auditory and non-auditory task in rhesus monkeys. Brain Research. 1973;55:355–367. [PubMed: 4197428]
  120. Izumi A, Kojima S. Matching vocalizations to vocalizing faces in chimpanzee (Pan troglodytes) Animal Cognition. 2004;7:179–184. [PubMed: 15015035]
  121. Jääskeläinen I.P, Ahveninen J, Belliveau J.W, Raij T, Sams M. Short-term plasticity in auditory cognition. Trends in Neurosciences. 2007;30:653–661. [PubMed: 17981345]
  122. Jay M.F, Sparks D.L. Auditory receptive fields in primate superior colliculus shift with changes in eye position. Nature. 1984;309:345–347. [PubMed: 6727988]
  123. Jones E.G. Viewpoint: The core and matrix of thalamic organization. Neuroscience. 1998;85:331–345. [PubMed: 9622234]
  124. Jordan K.E, Brannon E.M, Logothetis N.K, Ghazanfar A.A. Monkeys match the number of voices they hear to the number of faces they see. Current Biology. 2005;15:1034–1038. [PubMed: 15936274]
  125. Joseph J.P, Barone P. Prefrontal unit activity during a delayed oculomotor task in the monkey. Experimental Brain Research. 1987;67:460–468. [PubMed: 3653308]
  126. Kaas J.H, Hackett T.A. Subdivisions of auditory cortex and processing streams in primates. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:11793–11799. [PMC free article: PMC34351] [PubMed: 11050211]
  127. Kajikawa Y, Schroeder C.E. Face–voice integration and vocalization processing in the monkey. Abstracts Society for Neuroscience. 2008:852–22.
  128. Kayser C, Logothetis N.K. Directed interactions between auditory and superior temporal cortices and their role in sensory integration. Frontiers in Integrative Neuroscience. 2009;3:7. [PMC free article: PMC2691153] [PubMed: 19503750]
  129. Kayser C, Petkov C.I, Augath M, Logothetis N.K. Integration of touch and sound in auditory cortex. Neuron. 2005;48:373–384. [PubMed: 16242415]
  130. Kayser C, Petkov C.I, Augath M, Logothetis N.K. Functional imaging reveals visual modulation of specific fields in auditory cortex. Journal of Neuroscience. 2007;27:1824–1835. [PubMed: 17314280]
  131. Kayser C, Petkov C.I, Logothetis N.K. Visual modulation of neurons in auditory cortex. Cerebral Cortex. 2008;18:1560–1574. [PubMed: 18180245]
  133. Kayser C, Logothetis N.K, Panzeri S. Visual enhancement of the information representation in auditory cortex. Current Biology. 2010;20:19–24. [PubMed: 20036538]
  134. Keysers C, Kohler E, Umilta M.A, Nanetti L, Fogassi L, Gallese V. Audiovisual mirror neurons and action recognition. Experimental Brain Research. 2003;153:628–636. [PubMed: 12937876]
  135. Kikuchi-Yorioka Y, Sawaguchi T. Parallel visuospatial and audiospatial working memory processes in the monkey dorsolateral prefrontal cortex. Nature Neuroscience. 2000;3:1075–1076. [PubMed: 11036261]
  136. Kimura M. Behavioral modulation of sensory responses of primate putamen neurons. Brain Research. 1992;578:204–214. [PubMed: 1511278]
  137. Knudsen E.I, Knudsen P.F. Vision calibrates sound localization in developing barn owls. Journal of Neuroscience. 1989;9:3306–3313. [PubMed: 2795164]
  138. Kohler E, Keysers C, Umilta M.A, Fogassi L, Gallese V, Rizzolatti G. Hearing sounds, understanding actions: Action representation in mirror neurons. Science. 2002;297:846–848. [PubMed: 12161656]
  139. Kojima S, Izumi A, Ceugniet M. Identification of vocalizers by pant hoots, pant grunts and screams in a chimpanzee. Primates. 2003;44:225–230. [PubMed: 12884113]
  140. Kondo H, Saleem K.S, Price J.L. Differential connections of the temporal pole with the orbital and medial prefrontal networks in macaque monkeys. Journal of Comparative Neurology. 2003;465:499–523. [PubMed: 12975812]
  141. Kosmal A, Malinowska M, Kowalska D.M. Thalamic and amygdaloid connections of the auditory association cortex of the superior temporal gyrus in rhesus monkey (Macaca mulatta). Acta Neurobiologiae Experimentalis. 1997;57:165–188. [PubMed: 9407703]
  142. Kubota K, Tonoike M, Mikami A. Neuronal activity in the monkey dorsolateral prefrontal cortex during a discrimination task with delay. Brain Research. 1980;183:29–42. [PubMed: 6766776]
  143. Kuraoka K, Nakamura K. Responses of single neurons in monkey amygdala to facial and vocal emotions. Journal of Neurophysiology. 2007;97:1379–1387. [PubMed: 17182913]
  144. Lakatos P, Chen C.-M, O’Connell M, Mills A, Schroeder C.E. Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron. 2007;53:279–292. [PMC free article: PMC3717319] [PubMed: 17224408]
  145. Lakatos P, Karmos G, Mehta A.D, Ulbert I, Schroeder C.E. Entrainment of neural oscillations as a mechanism of attentional selection. Science. 2008;320:110–113. [PubMed: 18388295]
  146. Lakatos P, O’Connell M.N, Barczak A, Mills A, Javitt D.C, Schroeder C.E. The leading sense: Supramodal control of neurophysiological context by attention. Neuron. 2009;64:419–430. [PMC free article: PMC2909660] [PubMed: 19914189]
  147. Lakatos P, Shaw A.S, Knuth K.H, Ulbert I, Karmos G, Schroeder C.E. An oscillatory hierarchy controlling neuronal excitability and stimulus processing in the auditory cortex. Journal of Neurophysiology. 2005;94:1904–1911. [PubMed: 15901760]
  148. Lehmann C, Herdener M, Esposito F, et al. Differential patterns of multisensory interactions in core and belt areas of human auditory cortex. Neuroimage. 2006;31:294–300. [PubMed: 16473022]
  149. Lehmann S, Murray M.M. The role of multisensory memories in unisensory object discrimination. Brain Research. Cognitive Brain Research. 2005;24:326–334. [PubMed: 15993770]
  150. Leonard C.M, Rolls E.T, Wilson F.A, Baylis G.C. Neurons in the amygdala of the monkey with responses selective for faces. Behavioural Brain Research. 1985;15:159–176. [PubMed: 3994832]
  151. Levy R, Goldman-Rakic P.S. Segregation of working memory functions within the dorsolateral prefrontal cortex. Experimental Brain Research. 2000;133:23–32. [PubMed: 10933207]
  152. Lewis J.W, Van Essen D.C. Corticocortical connections of visual, sensorimotor, and multimodal processing areas in the parietal lobe of the macaque monkey. Journal of Comparative Neurology. 2000;428:112–137. [PubMed: 11058227]
  153. Linden J.F, Grunewald A, Andersen R.A. Responses to auditory stimuli in macaque lateral intraparietal area: II. Behavioral modulation. Journal of Neurophysiology. 1999;82:343–358. [PubMed: 10400963]
  154. Maier J.X, Neuhoff J.G, Logothetis N.K, Ghazanfar A.A. Multisensory integration of looming signals by rhesus monkeys. Neuron. 2004;43:177–181. [PubMed: 15260954]
  155. Maier J.X, Chandrasekaran C, Ghazanfar A.A. Integration of bimodal looming signals through neuronal coherence in the temporal lobe. Current Biology. 2008;18:963–968. [PubMed: 18585039]
  156. Martinez L, Matsuzawa T. Auditory–visual intermodal matching based on individual recognition in a chimpanzee (Pan troglodytes). Animal Cognition. 2009;12:S71–S85. [PubMed: 19701656]
  157. Matsumoto N, Minamimoto T, Graybiel A.M, Kimura M. Neurons in the thalamic CM-Pf complex supply striatal neurons with information about behaviorally significant sensory events. Journal of Neurophysiology. 2001;85:960–976. [PubMed: 11160526]
  158. Mazzoni P, Bracewell R.P, Barash S, Andersen R.A. Spatially tuned auditory responses in area LIP of macaques performing delayed memory saccades to acoustic targets. Journal of Neurophysiology. 1996;75:1233–1241. [PubMed: 8867131]
  159. McDonald J.J, Teder-Sälejärvi W.A, Di Russo F, Hillyard S.A. Neural substrates of perceptual enhancement by cross-modal spatial attention. Journal of Cognitive Neuroscience. 2003;15:10–19. [PubMed: 12590839]
  160. McGurk H, MacDonald J. Hearing lips and seeing voices. Nature. 1976;264:746–748. [PubMed: 1012311]
  161. McNaughton B.L, Battaglia F.P, Jensen O, Moser E.I, Moser M.B. Path integration and the neural basis of the ‘cognitive map.’ Nature Reviews. Neuroscience. 2006;7:663–678. [PubMed: 16858394]
  162. Mehta A.D, Ulbert I, Schroeder C.E. Intermodal selective attention in monkeys: I. Distribution and timing of effects across visual areas. Cerebral Cortex. 2000a;10:343–358. [PubMed: 10769247]
  163. Mehta A.D, Ulbert I, Schroeder C.E. Intermodal selective attention in monkeys: II. Physiological mechanisms of modulation. Cerebral Cortex. 2000b;10:359–370. [PubMed: 10769248]
  164. Meredith M.A, Allman B.L, Keniston L.P, Clemo H.R. Auditory influences on non-auditory cortices. Hearing Research. 2009;258:64–71. [PMC free article: PMC2787633] [PubMed: 19303926]
  165. Meredith M.A, Nemitz J.W, Stein B.E. Determinants of multisensory integration in superior colliculus neurons: I. Temporal factors. Journal of Neuroscience. 1987;7:3215–3229. [PubMed: 3668625]
  166. Meredith M.A, Stein B.E. Interactions among converging sensory inputs in the superior colliculus. Science. 1983;221:389–391. [PubMed: 6867718]
  167. Meyer K, Kaplan J.T, Essex R, Webber C, Damasio H, Damasio A. Predicting visual stimuli on the basis of activity in auditory cortices. Nature Neuroscience. 2010;13:667–668. [PubMed: 20436482]
  168. Miller J.O. Divided attention: Evidence for coactivation with redundant signals. Cognitive Psychology. 1982;14:247–279. [PubMed: 7083803]
  169. Miller J, Ulrich R, Lamarre Y. Locus of the redundant-signals effect in bimodal divided attention: A neurophysiological analysis. Perception & Psychophysics. 2001;63:555–562. [PubMed: 11414141]
  170. Mohedano-Moriano A, Pro-Sistiaga P, Arroyo-Jimenez M.M, et al. Topographical and laminar distribution of cortical input to the monkey entorhinal cortex. Journal of Anatomy. 2007;211:250–260. [PMC free article: PMC2375768] [PubMed: 17573826]
  171. Mohedano-Moriano A, Martinez-Marcos A, Pro-Sistiaga P, et al. Convergence of unimodal and polymodal sensory input to the entorhinal cortex in the fascicularis monkey. Neuroscience. 2008;151:255–271. [PubMed: 18065153]
  172. Molholm S, Ritter W, Murray M.M, Javitt D.C, Schroeder C.E, Foxe J.J. Multisensory auditory–visual interactions during early sensory processing in humans: A high-density electrical mapping study. Brain Research. Cognitive Brain Research. 2002;14:115–128. [PubMed: 12063135]
  173. Molholm S, Martinez A, Shpaner M, Foxe J.J. Object-based attention is multisensory: Co-activation of an object’s representations in ignored sensory modalities. European Journal of Neuroscience. 2007;26:499–509. [PubMed: 17650120]
  174. Mullette-Gillman O.A, Cohen Y.E, Groh J.M. Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus. Journal of Neurophysiology. 2005;94:2331–2352. [PubMed: 15843485]
  175. Mullette-Gillman O.A, Cohen Y.E, Groh J.M. Motor-related signals in the intraparietal cortex encode locations in a hybrid, rather than eye-centered reference frame. Cerebral Cortex. 2009;19:1761–1775. [PMC free article: PMC2705694] [PubMed: 19068491]
  176. Murata A, Fadiga L, Fogassi L, Gallese V, Raos V, Rizzolatti G. Object representation in the ventral premotor cortex (area F5) of the monkey. Journal of Neurophysiology. 1997;78:2226–2230. [PubMed: 9325390]
  177. Murray E.A, Gaffan D. Removal of the amygdala plus subjacent cortex disrupts the retention of both intramodal and crossmodal associative memories in monkeys. Behavioral Neuroscience. 1994;108:494–500. [PubMed: 7917043]
  178. Murray E.A, Richmond B.J. Role of perirhinal cortex in object perception, memory, and associations. Current Opinion in Neurobiology. 2001;11:188–193. [PubMed: 11301238]
  179. Murray M.M, Michel C.M, de Peralta R.G, et al. Rapid discrimination of visual and multisensory memories revealed by electrical neuroimaging. Neuroimage. 2004;21:125–135. [PubMed: 14741649]
  180. Murray M.M, Foxe J.J, Wylie G.R. The brain uses single-trial multisensory memories to discriminate without awareness. Neuroimage. 2005;27:473–478. [PubMed: 15894494]
  181. Musacchia G, Sams M, Nicol T, Kraus N. Seeing speech affects acoustic information processing in the human brainstem. Experimental Brain Research. 2006;168:1–10. [PMC free article: PMC2535928] [PubMed: 16217645]
  182. Musacchia G, Schroeder C.E. Neuronal mechanisms, response dynamics and perceptual functions of multisensory interactions in auditory cortex. Hearing Research. 2009;258:72–79. [PMC free article: PMC2989528] [PubMed: 19595755]
  183. Nager W, Estorf K, Münte T.F. Crossmodal attention effects on brain responses to different stimulus classes. BMC Neuroscience. 2006;7:31. [PMC free article: PMC1456980] [PubMed: 16608524]
  184. Navarra J, Alsius A, Soto-Faraco S, Spence C. Assessing the role of attention in the audiovisual integration of speech. Information Fusion. 2010;11:4–11.
  185. Neal J.W, Pearson R.C, Powell T.P. The connections of area PG, 7a, with cortex in the parietal, occipital and temporal lobes of the monkey. Brain Research. 1990;532:249–264. [PubMed: 2282518]
  186. Nelissen K, Vanduffel W, Orban G.A. Charting the lower superior temporal region, a new motion-sensitive region in monkey superior temporal sulcus. Journal of Neuroscience. 2006;26:5929–5947. [PubMed: 16738235]
  187. Newman J.D, Lindsley D.F. Single unit analysis of auditory processing in squirrel monkey frontal cortex. Experimental Brain Research. 1976;25:169–181. [PubMed: 819284]
  188. Nishijo H, Ono T, Nishino H. Topographic distribution of modality-specific amygdalar neurons in alert monkey. Journal of Neuroscience. 1988a;8:3556–3569. [PubMed: 3193170]
  189. Nishijo H, Ono T, Nishino H. Single neuron responses in amygdala of alert monkey during complex sensory stimulation with affective significance. Journal of Neuroscience. 1988b;8:3570–3583. [PubMed: 3193171]
  190. Nyberg L, Habib R, McIntosh A.R, Tulving E. Reactivation of encoding-related brain activity during memory retrieval. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:11120–11124. [PMC free article: PMC27158] [PubMed: 11005878]
  191. Ono T, Nakamura K, Nishijo H, Eifuku S. Monkey hippocampal neurons related to spatial and nonspatial functions. Journal of Neurophysiology. 1993;70:1516–1529. [PubMed: 8283212]
  192. Oram M.W, Perrett D.I. Integration of form and motion in the anterior superior temporal polysensory area (STPa) of the macaque monkey. Journal of Neurophysiology. 1996;76:109–129. [PubMed: 8836213]
  193. Oram M.W, Perrett D.I, Hietanen J.K. Directional tuning of motion-sensitive cells in the anterior superior temporal polysensory area of the macaque. Experimental Brain Research. 1993;97:274–294. [PubMed: 8150046]
  194. Padberg J, Seltzer B, Cusick C.G. Architectonics and cortical connections of the upper bank of the superior temporal sulcus in the rhesus monkey: An analysis in the tangential plane. Journal of Comparative Neurology. 2003;467:418–434. [PubMed: 14608603]
  195. Padberg J, Disbrow E, Krubitzer L. The organization and connections of anterior and posterior parietal cortex in titi monkeys: Do new world monkeys have an area 2? Cerebral Cortex. 2005;15:1938–1963. [PubMed: 15758196]
  196. Parr L.A, Hecht E, Barks S.K, Preuss T.M, Votaw J.R. Face processing in the chimpanzee brain. Current Biology. 2009;19:50–53. [PMC free article: PMC2651677] [PubMed: 19097899]
  197. Partan S.R. Single and multichannel signal composition: Facial expressions and vocalizations of rhesus macaques (Macaca mulatta). Behaviour. 2002;139:993–1027.
  198. Perrett D.I, Rolls E.T, Caan W. Visual neurones responsive to faces in the monkey temporal cortex. Experimental Brain Research. 1982;47:329–342. [PubMed: 7128705]
  199. Perrott D.R, Saberi K, Brown K, Strybel T.Z. Auditory psychomotor coordination and visual search performance. Perception & Psychophysics. 1990;48:214–226. [PubMed: 2216648]
  200. Petkov C.I, Kayser C, Steudel T, Whittingstall K, Augath M, Logothetis N.K. A voice region in the monkey brain. Nature Neuroscience. 2008;11:367–374. [PubMed: 18264095]
  201. Petrides M, Pandya D.N. Comparative cytoarchitectonic analysis of the human and the macaque ventrolateral prefrontal cortex and corticocortical connection patterns in the monkey. European Journal of Neuroscience. 2002;16:291–310. [PubMed: 12169111]
  202. Petrides M, Pandya D.N. Distinct parietal and temporal pathways to the homologues of Broca’s area in the monkey. PLoS Biology. 2009;7:e1000170. [PMC free article: PMC2714989] [PubMed: 19668354]
  203. Phelps E.A, LeDoux J.E. Contributions of the amygdala to emotion processing: From animal models to human behavior. Neuron. 2005;48:175–187. [PubMed: 16242399]
  204. Pinsk M.A, DeSimone K, Moore T, Gross C.G, Kastner S. Representations of faces and body parts in macaque temporal cortex: A functional MRI study. Proceedings of the National Academy of Sciences of the United States of America. 2005;102:6996–7001. [PMC free article: PMC1100800] [PubMed: 15860578]
  205. Poremba A, Saunders R.C, Crane A.M, Cook M, Sokoloff L, Mishkin M. Functional mapping of the primate auditory system. Science. 2003;299:568–572. [PubMed: 12543977]
  206. Porter K.K, Metzger R.R, Groh J.M. Visual- and saccade-related signals in the primate inferior colliculus. Proceedings of the National Academy of Sciences of the United States of America. 2007;104:17855–17860. [PMC free article: PMC2077072] [PubMed: 17978183]
  207. Posner M.I, Snyder C.R.R, Davidson D.J. Attention and the detection of signals. Journal of Experimental Psychology. General. 1980;109:160–174. [PubMed: 7381367]
  208. Raab D.H. Statistical facilitation of simple reaction times. Transactions of the New York Academy of Sciences. 1962;24:574–590. [PubMed: 14489538]
  209. Rahne T, Böckmann-Barthel M. Visual cues release the temporal coherence of auditory objects in auditory scene analysis. Brain Research. 2009;1300:125–134. [PubMed: 19747455]
  210. Ramos-Estebanez C, Merabet L.B, Machii K, et al. Visual phosphene perception modulated by subthreshold crossmodal sensory stimulation. Journal of Neuroscience. 2007;27:4178–4181. [PubMed: 17428995]
  211. Rao S.C, Rainer G, Miller E.K. Integration of what and where in the primate prefrontal cortex. Science. 1997;276:821–824. [PubMed: 9115211]
  212. Rauschecker J.P, Tian B. Mechanisms and streams for processing of “what” and “where” in auditory cortex. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:11800–11806. [PMC free article: PMC34352] [PubMed: 11050212]
  213. Rauschecker J.P, Tian B, Hauser M. Processing of complex sounds in the macaque nonprimary auditory cortex. Science. 1995;268:111–114. [PubMed: 7701330]
  214. Rauschecker J.P, Harris L.R. Auditory and visual neurons in the cat’s superior colliculus selective for the direction of apparent motion stimuli. Brain Research. 1989;490:56–63. [PubMed: 2758330]
  215. Recanzone G.H, Guard D.C, Phan M.L, Su T.K. Correlation between the activity of single auditory cortical neurons and sound-localization behavior in the macaque monkey. Journal of Neurophysiology. 2000;83:2723–2739. [PubMed: 10805672]
  216. Ringo J.L, O’Neill S.G. Indirect inputs to ventral temporal cortex of monkey: The influence on unit activity of alerting auditory input, interhemispheric subcortical visual input, reward, and the behavioral response. Journal of Neurophysiology. 1993;70:2215–2225. [PubMed: 8120578]
  217. Rizzolatti G, Craighero L. The mirror-neuron system. Annual Review of Neuroscience. 2004;27:169–192. [PubMed: 15217330]
  218. Rizzolatti G, Fadiga L, Gallese V, Fogassi L. Premotor cortex and the recognition of motor actions. Brain Research. Cognitive Brain Research. 1996;3:131–141. [PubMed: 8713554]
  219. Rockland K.S, Ojima H. Multisensory convergence in calcarine visual areas in macaque monkey. International Journal of Psychophysiology. 2003;50:19–26. [PubMed: 14511833]
  220. Rolls E.T, Critchley H.D, Browning A.S, Inoue K. Face-selective and auditory neurons in the primate orbitofrontal cortex. Experimental Brain Research. 2006;170:74–87. [PubMed: 16328289]
  221. Romanski L.M, Averbeck B.B, Diltz M. Neural representation of vocalizations in the primate ventrolateral prefrontal cortex. Journal of Neurophysiology. 2005;93:734–747. [PubMed: 15371495]
  222. Romanski L.M, Bates J.F, Goldman-Rakic P.S. Auditory belt and parabelt projections to the prefrontal cortex in the rhesus monkey. Journal of Comparative Neurology. 1999a;403:141–157. [PubMed: 9886040]
  223. Romanski L.M, Goldman-Rakic P.S. An auditory domain in primate prefrontal cortex. Nature Neuroscience. 2002;5:15–16. [PMC free article: PMC2793092] [PubMed: 11753413]
  224. Romanski L.M, Tian B, Fritz J, Mishkin M, Goldman-Rakic P.S, Rauschecker J.P. Dual streams of auditory afferents target multiple domains in the primate prefrontal cortex. Nature Neuroscience. 1999b;2:1131–1136. [PMC free article: PMC2778291] [PubMed: 10570492]
  225. Romei V, Murray M.M, Merabet L.B, Thut G. Occipital transcranial magnetic stimulation has opposing effects on visual and auditory stimulus detection: Implications for multisensory interactions. Journal of Neuroscience. 2007;27:11465–11472. [PubMed: 17959789]
  226. Romei V, Murray M.M, Cappe C, Thut G. Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Current Biology. 2009;19:1799–1805. [PubMed: 19836243]
  227. Russ B.E, Ackelson A.L, Baker A.E, Cohen Y.E. Coding of auditory-stimulus identity in the auditory non-spatial processing stream. Journal of Neurophysiology. 2008;99:87–95. [PMC free article: PMC4091985] [PubMed: 18003874]
  228. Saleem K.S, Suzuki W, Tanaka K, Hashikawa T. Connections between anterior inferotemporal cortex and superior temporal sulcus regions in the macaque monkey. Journal of Neuroscience. 2000;20:5083–5101. [PubMed: 10864966]
  229. Saleem K.S, Kondo H, Price J.L. Complementary circuits connecting the orbital and medial prefrontal networks with the temporal, insular, and opercular cortex in the macaque monkey. Journal of Comparative Neurology. 2008;506:659–693. [PubMed: 18067141]
  230. Sams M, Aulanko R, Hämäläinen M, et al. Seeing speech: Visual information from lip movements modifies activity in the human auditory cortex. Neuroscience Letters. 1991;127:141–145. [PubMed: 1881611]
  231. Santangelo V, Spence C. Crossmodal exogenous orienting improves the accuracy of temporal order judgments. Experimental Brain Research. 2009;194:577–586. [PubMed: 19242685]
  232. Santos-Benitez H, Magarinos-Ascone C.M, Garcia-Austt E. Nucleus basalis of Meynert cell responses in awake monkeys. Brain Research Bulletin. 1995;37:507–511. [PubMed: 7633898]
  233. Schiff W, Caviness J.A, Gibson J.J. Persistent fear responses in rhesus monkeys to the optical stimulus of “looming.” Science. 1962;136:982–983. [PubMed: 14498362]
  234. Schlack A, Sterbing-D’Angelo S.J, Hartung K, Hoffmann K.-P, Bremmer F. Multisensory space representations in the macaque ventral intraparietal area. Journal of Neuroscience. 2005;25:4616–4625. [PubMed: 15872109]
  235. Schmolesky M.T, Wang Y, Hanes D.P, et al. Signal timing across the macaque visual system. Journal of Neurophysiology. 1998;79:3272–3278. [PubMed: 9636126]
  236. Schroeder C.E, Foxe J.J. The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Brain Research. Cognitive Brain Research. 2002;14:187–198. [PubMed: 12063142]
  237. Schroeder C.E, Foxe J.J. Multisensory contributions to low-level, ‘unisensory’ processing. Current Opinion in Neurobiology. 2005;15:454–458. [PubMed: 16019202]
  238. Schroeder C.E, Lakatos P. Low-frequency neuronal oscillations as instruments of sensory selection. Trends in Neurosciences. 2009;32:9–18. [PMC free article: PMC2990947] [PubMed: 19012975]
  239. Schroeder C.E, Lakatos P, Kajikawa Y, Partan S, Puce A. Neuronal oscillations and visual amplification of speech. Trends in Cognitive Sciences. 2008;12:106–113. [PMC free article: PMC3987824] [PubMed: 18280772]
  240. Schroeder C.E, Lindsley R.W, Specht C, Marcovici A, Smiley J.F, Javitt D.C. Somatosensory input to auditory association cortex in the macaque monkey. Journal of Neurophysiology. 2001;85:1322–1327. [PubMed: 11248001]
  241. Seltzer B, Cola M.G, Gutierrez C, Massee M, Weldon C, Cusick C.G. Overlapping and nonoverlapping cortical projections to cortex of the superior temporal sulcus in the rhesus monkey: Double anterograde tracer studies. Journal of Comparative Neurology. 1996;370:173–190. [PubMed: 8808729]
  242. Seltzer B, Pandya D.N. Afferent cortical connections and architectonics of the superior temporal sulcus and surrounding cortex in the rhesus monkey. Brain Research. 1978;149:1–24. [PubMed: 418850]
  243. Seltzer B, Pandya D.N. Frontal lobe connections of the superior temporal sulcus in the rhesus monkey. Journal of Comparative Neurology. 1989;281:97–113. [PubMed: 2925903]
  244. Seltzer B, Pandya D.N. Parietal, temporal, and occipital projections to cortex of the superior temporal sulcus in the rhesus monkey: A retrograde tracer study. Journal of Comparative Neurology. 1994;343:445–463. [PubMed: 8027452]
  245. Sherman S.M, Guillery R.W. The role of the thalamus in the flow of information to the cortex. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. 2002;357:1695–1708. [PMC free article: PMC1693087] [PubMed: 12626004]
  246. Sliwa J, Duhamel J.-R, Pascalis O, Wirth S.C. Cross-modal recognition of identity in rhesus monkeys for familiar conspecifics and humans. Abstracts Society for Neuroscience. 2009:684.14.
  247. Smiley J.F, Hackett T.A, Ulbert I, et al. Multisensory convergence in auditory cortex, I. Cortical connections of the caudal superior temporal plane in macaque monkeys. Journal of Comparative Neurology. 2007;502:894–923. [PubMed: 17447261]
  248. Soto-Faraco S, Alsius A. Deconstructing the McGurk-MacDonald illusion. Journal of Experimental Psychology. Human Perception and Performance. 2009;35:580–587. [PubMed: 19331510]
  249. Squire L.R, Stark C.E.L, Clark R.E. The medial temporal lobe. Annual Review of Neuroscience. 2004;27:279–306. [PubMed: 15217334]
  250. Starr A, Don M. Responses of squirrel monkey (Saimiri sciureus) medial geniculate units to binaural click stimuli. Journal of Neurophysiology. 1972;35:501–517. [PubMed: 4624738]
  251. Stein B.E, Meredith M.A. The Merging of the Senses. Cambridge, MA: MIT Press; 1993.
  252. Stein B.E, Jiang W, Wallace M.T, Stanford T.R. Nonvisual influences on visual-information processing in the superior colliculus. Progress in Brain Research. 2001;134:143–156. [PubMed: 11702540]
  253. Stein B.E, Wallace M.T, Stanford T.R, Jiang W. Cortex governs multisensory integration in the midbrain. Neuroscientist. 2002;8:306–314. [PubMed: 12194499]
  254. Stein B.E, Stanford T.R. Multisensory integration: Current issues from the perspective of the single neuron. Nature Reviews. Neuroscience. 2008;9:255–266. [PubMed: 18354398]
  255. Stevenson R.A, James T.W. Audiovisual integration in human superior temporal sulcus: Inverse effectiveness and the neural processing of speech and object recognition. Neuroimage. 2009;44:1210–1223. [PubMed: 18973818]
  256. Stricanne B, Andersen R.A, Mazzoni P. Eye-centered, head-centered, and intermediate coding of remembered sound locations in area LIP. Journal of Neurophysiology. 1996;76:2071–2076. [PubMed: 8890315]
  257. Sugihara T, Diltz M.D, Averbeck B.B, Romanski L.M. Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex. Journal of Neuroscience. 2006;26:11138–11147. [PMC free article: PMC2767253] [PubMed: 17065454]
  258. Sumby W.H, Pollack I. Visual contribution to speech intelligibility in noise. Journal of the Acoustical Society of America. 1954;26:212–215.
  259. Suzuki W.A, Amaral D.G. Perirhinal and parahippocampal cortices of the macaque monkey: Cortical afferents. Journal of Comparative Neurology. 1994;350:497–533. [PubMed: 7890828]
  260. Talsma D, Senkowski D, Woldorff M.G. Intermodal attention affects the processing of the temporal alignment of audiovisual stimuli. Experimental Brain Research. 2009;198:313–328. [PMC free article: PMC2733193] [PubMed: 19495733]
  261. Tamura R, Ono T, Fukuda M, Nakamura K. Spatial responsiveness of monkey hippocampal neurons to various visual and auditory stimuli. Hippocampus. 1992;2:307–322. [PubMed: 1308190]
  262. Tanaka K, Hikosaka K, Saito H, Yukie M, Fukada Y, Iwai E. Analysis of local and wide-field movements in the superior temporal visual areas of the macaque monkey. Journal of Neuroscience. 1986;6:134–144. [PubMed: 3944614]
  263. Tanibuchi I, Goldman-Rakic P.S. Dissociation of spatial-, object-, and sound-coding neurons in the mediodorsal nucleus of the primate thalamus. Journal of Neurophysiology. 2003;89:1067–1077. [PubMed: 12574481]
  264. Teder-Sälejärvi W.A, Münte T.F, Sperlich F, Hillyard S.A. Intra-modal and cross-modal spatial attention to auditory and visual stimuli. An event-related brain potential study. Brain Research. Cognitive Brain Research. 1999;8:327–343. [PubMed: 10556609]
  265. Théoret H, Merabet L, Pascual-Leone A. Behavioral and neuroplastic changes in the blind: Evidence for functionally relevant cross-modal interactions. Journal of Physiology, Paris. 2004;98:221–233. [PubMed: 15477034]
  266. Tian B, Reser D, Durham A, Kustov A, Rauschecker J.P. Functional specialization in rhesus monkey auditory cortex. Science. 2001;292:290–293. [PubMed: 11303104]
  267. Tsao D.Y, Freiwald W.A, Tootell R.B.H, Livingstone M.S. A cortical region consisting entirely of face-selective cells. Science. 2006;311:670–674. [PMC free article: PMC2678572] [PubMed: 16456083]
  268. Tsao D.Y, Moeller S, Freiwald W.A. Comparing face patch systems in macaques and humans. Proceedings of the National Academy of Sciences of the United States of America. 2008a;105:19514–19519. [PMC free article: PMC2614792] [PubMed: 19033466]
  269. Tsao D.Y, Schweers N, Moeller S, Freiwald W.A. Patches of face-selective cortex in the macaque frontal lobe. Nature Neuroscience. 2008b;11:877–879. [PubMed: 18622399]
  270. Turner B.H, Mishkin M, Knapp M. Organization of the amygdalopetal projections from modality-specific cortical association areas in the monkey. Journal of Comparative Neurology. 1980;191:515–543. [PubMed: 7419732]
  271. Ungerleider L.G, Mishkin M. Two cortical visual systems. In: Analysis of Visual Behavior. Cambridge, MA: MIT Press; 1982. pp. 549–586.
  272. Ungerleider L.G, Courtney S.M, Haxby J.V. A neural system for human visual working memory. Proceedings of the National Academy of Sciences of the United States of America. 1998;95:883–890. [PMC free article: PMC33812] [PubMed: 9448255]
  273. Updyke B.V. Characteristics of unit responses in superior colliculus of the cebus monkey. Journal of Neurophysiology. 1974;37:896–909. [PubMed: 4212987]
  274. Vaadia E, Benson D.A, Hienz R.D, Goldstein M.H Jr. Unit study of monkey frontal cortex: Active localization of auditory and of visual stimuli. Journal of Neurophysiology. 1986;56:934–952. [PubMed: 3783237]
  275. van Atteveldt N, Roebroeck A, Goebel R. Interaction of speech and script in human auditory cortex: Insights from neuro-imaging and effective connectivity. Hearing Research. 2009;258:152–164. [PubMed: 19500658]
  276. Vatakis A, Ghazanfar A.A, Spence C. Facilitation of multisensory integration by the “unity effect” reveals that speech is special. Journal of Vision. 2008;8(9):14. [PubMed: 18831650]
  277. von Kriegstein K, Giraud A.-L. Implicit multisensory associations influence voice recognition. PLoS Biology. 2006;4:e326. [PMC free article: PMC1570760] [PubMed: 17002519]
  278. Wallace M.T, Wilkinson L.K, Stein B.E. Representation and integration of multiple sensory inputs in primate superior colliculus. Journal of Neurophysiology. 1996;76:1246–1266. [PubMed: 8871234]
  279. Wang Y, Celebrini S, Trotter Y, Barone P. Visuo-auditory interactions in the primary visual cortex of the behaving monkey: Electrophysiological evidence. BMC Neuroscience. 2008;9:79. [PMC free article: PMC2527609] [PubMed: 18699988]
  280. Watanabe M. Frontal units of the monkey coding the associative significance of visual and auditory stimuli. Experimental Brain Research. 1992;89:233–247. [PubMed: 1623971]
  281. Watanabe J, Iwai E. Neuronal activity in visual, auditory and polysensory areas in the monkey temporal cortex during visual fixation task. Brain Research Bulletin. 1991;26:583–592. [PubMed: 1868357]
  282. Welch R, Warren D. Intersensory interactions. In: Handbook of Perception and Human Performance. New York: Wiley; 1986. pp. 21–36.
  283. Werner-Reiss U, Kelly K.A, Trause A.S, Underhill A.M, Groh J.M. Eye position affects activity in primary auditory cortex of primates. Current Biology. 2003;13:554–562. [PubMed: 12676085]
  284. Wheeler M.E, Petersen S.E, Buckner R.L. Memory’s echo: Vivid remembering reactivates sensory-specific cortex. Proceedings of the National Academy of Sciences of the United States of America. 2000;97:11125–11129. [PMC free article: PMC27159] [PubMed: 11005879]
  285. Wilson F.A.W, Rolls E.T. Neuronal responses related to reinforcement in the primate basal forebrain. Brain Research. 1990;509:213–231. [PubMed: 2322819]
  286. Wilson F.A.W, Scalaidhe S.P.O, Goldman-Rakic P.S. Dissociation of object and spatial processing in primate prefrontal cortex. Science. 1993;260:1955–1958. [PubMed: 8316836]
  287. Wollberg Z, Sela J. Frontal cortex of the awake squirrel monkey: Responses of single cells to visual and auditory stimuli. Brain Research. 1980;198:216–220. [PubMed: 7407587]
  288. Woods T.M, Recanzone G.H. Visually induced plasticity of auditory spatial perception in macaques. Current Biology. 2004;14:1559–1564. [PubMed: 15341742]
  289. Wurtz R.H, Albano J.E. Visual–motor function of the primate superior colliculus. Annual Review of Neuroscience. 1980;3:189–226. [PubMed: 6774653]
  290. Yeterian E.H, Pandya D.N. Thalamic connections of the cortex of the superior temporal sulcus in the rhesus monkey. Journal of Comparative Neurology. 1989;282:80–97. [PubMed: 2468699]
  291. Zangenehpour S, Ghazanfar A.A, Lewkowicz D.J, Zatorre R.J. Heterochrony and cross-species intersensory matching by infant vervet monkeys. PLoS ONE. 2009;4:e4302. [PMC free article: PMC2627929] [PubMed: 19172998]