NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Murray MM, Wallace MT, editors. The Neural Bases of Multisensory Processes. Boca Raton (FL): CRC Press/Taylor & Francis; 2012.


Chapter 7. Multisensory Integration through Neural Coherence



The inputs delivered by different sensory organs provide us with complementary information about the environment. Multisensory interactions occur constantly in the brain to evaluate whether such signals match or conflict across modalities. The outcome of these interactions is of critical importance for perception, cognitive processing, and the control of action (Meredith and Stein 1983, 1985; Stein and Meredith 1993; Macaluso and Driver 2005; Kayser and Logothetis 2007). Recent studies have revealed that many cortical operations, including those carried out by primary regions, are shaped by inputs from multiple sensory modalities (Amedi et al. 2005; Ghazanfar and Schroeder 2006; Kayser and Logothetis 2007, 2009). Multisensory integration is highly automatic and can occur even when there is no meaningful relationship between the different sensory inputs, and even in the absence of perceptual awareness, as demonstrated in pioneering research on multisensory interactions in the superior colliculus of anesthetized cats (Meredith and Stein 1983, 1985; Stein and Meredith 1993; Stein et al. 2002). These findings underscore the fundamental importance of multisensory processing for the development (Sur et al. 1990; Shimojo and Shams 2001; Bavelier and Neville 2002) and normal functioning of the nervous system.

In recent years, an increasing number of studies have aimed at characterizing multisensory cortical regions, revealing multisensory processing in the superior temporal sulcus, the intraparietal sulcus, and frontal regions, as well as in the insula and claustrum (Calvert 2001; Ghazanfar and Schroeder 2006; Kayser and Logothetis 2007). Interestingly, there is increasing evidence that neurons in areas formerly considered unimodal, such as auditory belt areas (Foxe et al. 2002; Kayser et al. 2005; Macaluso and Driver 2005; Ghazanfar and Schroeder 2006; Kayser and Logothetis 2007), can also exhibit multisensory characteristics. Furthermore, numerous subcortical structures are involved in multisensory processing. In addition to the superior colliculus (Meredith and Stein 1983, 1985), these include the striatum (Nagy et al. 2006), the cerebellum (Baumann and Greenlee 2007), and the amygdala (Nishijo et al. 1988), and there is evidence for cross-modal interactions at the level of the thalamus (Komura et al. 2005).

Whereas the ubiquity and fundamental relevance of multisensory processing have become increasingly clear, the neural mechanisms underlying multisensory interaction are much less well understood. In this chapter, we review recent studies that may cast new light on this issue. Although classical studies have postulated a feedforward convergence of unimodal signals as the primary mechanism for multisensory integration (Stein and Meredith 1993; Meredith 2002), there is now evidence that both feedback and lateral interaction may also be relevant (Driver and Spence 2000; Foxe and Schroeder 2005; Ghazanfar and Schroeder 2006; Kayser and Logothetis 2007). Beyond this changing view on the anatomical substrate, there is increasing awareness that complex dynamic interactions of cell populations, leading to coherent oscillatory firing patterns, may be crucial for mediating cross-systems integration in the brain (von der Malsburg and Schneider 1986; Singer and Gray 1995; Singer 1999; Engel et al. 1992, 2001; Varela et al. 2001; Herrmann et al. 2004a; Fries 2005). Here, we will consider the hypothesis that synchronized oscillations may also provide a potential mechanism for cross-modal integration and for the selection of information that is coherent across different sensory channels. We will (1) contrast the two different views on cross-modal integration that imply different mechanisms (feedforward convergence vs. neural coherence), (2) review recent studies on oscillatory responses and cross-modal processing, and (3) discuss functional aspects and scenarios for the involvement of neural coherence in cross-modal interaction.


7.2.1. Integration by Convergence

The classical view posits that multisensory integration occurs in a hierarchical manner by progressive convergence of pathways and, thus, sensory signals are integrated only in higher association areas and in specialized subcortical regions (Stein and Meredith 1993; Meredith 2002). A core assumption of this approach is that the neural representation of an object is primarily reflected in a firing rate code. Multisensory integration, accordingly, is expressed by firing rate changes in neurons or neural populations receiving convergent inputs from different modalities. A frequently used approach to investigate multisensory processing at the level of single neurons is the comparison of spike rate in response to multisensory stimuli with the firing rate observed when presenting the most effective of these stimuli alone (Meredith and Stein 1983, 1985; Stein and Meredith 1993; Stein et al. 2002). In more recent studies, an approach in which the neuronal responses to multisensory inputs are directly compared with the algebraic sum of the neuronal responses to the unisensory constituents has been applied (Rowland et al. 2007; Stanford et al. 2005). In this approach, multisensory responses that are larger than the sum of the unisensory responses are referred to as superadditive, whereas multisensory responses that are smaller are classified as subadditive. A large body of evidence demonstrates such multisensory response patterns in a wide set of brain regions (Calvert 2001; Macaluso and Driver 2005; Ghazanfar and Schroeder 2006).
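The additivity criterion described above can be illustrated with a minimal sketch (Python; the function name, baseline handling, and firing rates are hypothetical examples, not values from the cited studies):

```python
def classify_additivity(av_rate, a_rate, v_rate, baseline=0.0):
    """Classify a multisensory response relative to the algebraic sum
    of its unisensory constituents (baseline-corrected firing rates)."""
    av = av_rate - baseline
    unisensory_sum = (a_rate - baseline) + (v_rate - baseline)
    if av > unisensory_sum:
        return "superadditive"
    elif av < unisensory_sum:
        return "subadditive"
    return "additive"

# Hypothetical firing rates (spikes/s) for one neuron:
# 30 Hz to the audiovisual pair vs. 10 + 8 Hz to its unisensory parts
print(classify_additivity(av_rate=30.0, a_rate=10.0, v_rate=8.0))  # superadditive
```

Note that this sum-based criterion differs from the classical comparison against the most effective unisensory stimulus alone; both conventions appear in the literature cited above.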

However, as recognized by numerous authors in recent years, a pure convergence model would probably not suffice to account for all aspects of multisensory processing (Driver and Spence 2000; Foxe and Schroeder 2005; Ghazanfar and Schroeder 2006; Kayser and Logothetis 2007). First, strong cross-modal interactions and modulations occur in primary cortices, an observation that is difficult to reconcile with the notion of hierarchical convergence. Second, a convergence scenario does not appear flexible enough because it does not allow for rapid recombination of cross-modal signals into completely novel percepts. Furthermore, a feedforward convergence model does not explain how low-level information about objects can remain accessible, because the high-level representation is noncompositional, i.e., it does not explicitly make reference to elementary features.

7.2.2. Integration through Neural Coherence

A different account of multisensory interaction can be derived from data on the functional role of correlated neural activity, which is likely to play a key role for feature integration and response selection in various sensory modalities (von der Malsburg and Schneider 1986; Singer and Gray 1995; Singer 1999; Tallon-Baudry and Bertrand 1999; Engel et al. 1992, 2001; Herrmann et al. 2004a; Fries 2005). As shown by numerous studies in both animals and humans, synchronized oscillatory activity, in particular at frequencies in the gamma band (>30 Hz), is related to a large variety of cognitive and sensorimotor functions. The majority of these studies were conducted in the visual modality, relating gamma band coherence of neural assemblies to processes such as feature integration over short and long distances (Engel et al. 1991a, 1991b; Tallon-Baudry et al. 1996), surface segregation (Gray et al. 1989; Castelo-Branco et al. 2000), perceptual stimulus selection (Fries et al. 1997; Siegel et al. 2007), and attention (Müller et al. 2000; Fries et al. 2001; Siegel et al. 2008). Beyond the visual modality, gamma band synchrony has also been observed in the auditory (Brosch et al. 2002; Debener et al. 2003), somatosensory (Bauer et al. 2006), and olfactory systems (Bressler and Freeman 1980; Wehr and Laurent 1996). Moreover, gamma band synchrony has been implicated in processes such as sensorimotor integration (Roelfsema et al. 1997; Womelsdorf et al. 2006), movement preparation (Sanes and Donoghue 1993; Farmer 1998) or memory formation (Csicsvari et al. 2003; Gruber and Müller 2005; Herrmann et al. 2004b).

Collectively, these data provide strong support for the hypothesis that synchronization of neural signals is a key mechanism for integrating and selecting information in distributed networks. This so-called “temporal correlation hypothesis” (Singer and Gray 1995; Singer 1999; Engel et al. 2001) predicts that coherence of neural signals makes it possible to set up highly specific patterns of effective neuronal coupling, thus enabling flexible and context-dependent binding, the selection of relevant information, and the efficient routing of signals through processing pathways (Salinas and Sejnowski 2001; Fries 2005; Womelsdorf et al. 2007). Based on experimental evidence discussed in the subsequent sections, we suggest that the same mechanism may also serve to establish specific relationships across different modalities, allowing cross-modal interactions of sensory inputs and the preferential routing of matching cross-modal information to downstream assemblies (Senkowski et al. 2008). We would like to note that this view does not contradict the notion that cross-modal interactions have strong effects on neuronal firing rates, but it shifts emphasis to a richer dynamic repertoire of neural interactions and a more flexible scenario of cross-modal communication in the brain.


A variety of different paradigms have been used to study the role of oscillatory responses and neural coherence during multisensory processing. Most studies have been performed in humans using electroencephalography (EEG) or magnetoencephalography (MEG), whereas only a few animal studies are available. The approaches used address different aspects of multisensory processing, including (1) bottom-up processing of multisensory information, (2) cross-modal semantic matching, (3) modulation by top-down attention, as well as (4) cross-modally induced perceptual changes. In all these approaches, specific changes in oscillatory responses or coherence of neural activity have been observed, suggesting that temporally patterned neural signals may be relevant for more than just one type of multisensory interaction.

7.3.1. Oscillations Triggered by Multisensory Stimuli

The first attempt to investigate neural synchronization of oscillatory responses in the human EEG was the comparison of phase coherence patterns for multiple pairs of electrodes during the presentation of auditory and visual object names, as well as pictures of objects (von Stein et al. 1999). Under conditions of passive stimulation (i.e., subjects were not required to perform any task), the authors reported an increase of phase coherence in the lower beta band between temporal and parietal electrode sites. The authors therefore suggested that meaningful semantic inputs are processed in a modality-independent network of temporal and parietal areas.
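Phase coherence between pairs of recording sites, of the kind analyzed by von Stein et al. (1999), is commonly quantified as a phase-locking value. The following sketch (Python with NumPy/SciPy; the signal parameters and variable names are illustrative assumptions, not details of that study) computes the phase-locking value between two channels from band-pass-filtered trial data:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value (PLV) between two channels.

    x, y: arrays of shape (n_trials, n_samples), already band-pass
    filtered in the frequency range of interest (e.g., lower beta).
    Returns the PLV per time sample: 0 means random phase relations
    across trials, 1 means a perfectly constant phase difference.
    """
    phase_x = np.angle(hilbert(x, axis=-1))
    phase_y = np.angle(hilbert(y, axis=-1))
    # Average unit vectors at the phase difference across trials
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y)), axis=0))

# Toy demonstration: a 20 Hz oscillation with a fixed lag between the
# two channels yields a PLV near 1; independent random phases do not.
rng = np.random.default_rng(0)
t = np.arange(500) / 500.0
phases = rng.uniform(0, 2 * np.pi, size=50)
x = np.cos(2 * np.pi * 20 * t + phases[:, None])
y_locked = np.cos(2 * np.pi * 20 * t + phases[:, None] + 0.8)  # constant lag
y_random = np.cos(2 * np.pi * 20 * t
                  + rng.uniform(0, 2 * np.pi, size=(50, 1)))   # unrelated
print(phase_locking_value(x, y_locked)[100:400].mean())  # close to 1
print(phase_locking_value(x, y_random)[100:400].mean())  # much lower
```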

Additional evidence for the involvement of oscillatory beta responses in multisensory processing comes from a study in which subjects were instructed to respond to the appearance of any stimulus in a stream of semantically meaningless auditory, visual, and multisensory audiovisual stimuli (Senkowski et al. 2006). In the cross-modal condition, an enhancement was observed for evoked oscillations, i.e., early oscillatory activity that is phase-locked to stimulus onset. This integration effect, which specifically occurred in the beta band, predicted the shortening of reaction times observed for multisensory audiovisual stimuli, suggesting an involvement of beta activity in the multisensory processing of behaviorally relevant stimuli. Cross-modal effects on evoked beta responses have also been reported in a sensory gating paradigm (Kisley and Cornwell 2006), in which auditory and somatosensory stimuli were presented at short or long interstimulus intervals under conditions of passive stimulation. Auditory and somatosensory evoked beta responses were higher when the preceding stimulus came from the other modality than when it came from the same modality, suggesting a cross-modal gating effect on oscillatory activity in this frequency range. Further EEG investigations have focused on the examination of oscillatory activity in response to basic auditory, visual, and audiovisual stimuli during passive stimulation (Sakowitz et al. 2000, 2001, 2005). In these studies, multisensory interactions were found in evoked oscillatory responses across a wide range of frequencies and across various scalp sites, indicating an involvement of neural synchronization of cell assemblies in different frequency bands and brain regions.
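The distinction between evoked (phase-locked) and induced (non-phase-locked) oscillations used throughout this chapter can be made concrete with a small sketch (Python with NumPy/SciPy; the filter settings and variable names are illustrative, not those of the studies cited):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def evoked_and_induced_power(trials, fs, band):
    """Split band-limited power into evoked (phase-locked) and
    induced (non-phase-locked) parts.

    trials: (n_trials, n_samples) single-channel epochs; fs in Hz;
    band: (low, high) pass band in Hz.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=-1)
    erp = filtered.mean(axis=0)
    # Evoked: only activity phase-locked to stimulus onset survives
    # averaging across trials.
    evoked = np.abs(hilbert(erp)) ** 2
    # Induced: power remaining after the phase-locked average is
    # removed from every single trial.
    induced = (np.abs(hilbert(filtered - erp, axis=-1)) ** 2).mean(axis=0)
    return evoked, induced

# Toy check: a 25 Hz beta oscillation with identical phase on every
# trial shows up as evoked power; with random phase per trial, as induced.
rng = np.random.default_rng(1)
fs = 500
t = np.arange(500) / fs
locked = np.tile(np.cos(2 * np.pi * 25 * t), (40, 1))
jittered = np.cos(2 * np.pi * 25 * t + rng.uniform(0, 2 * np.pi, (40, 1)))
ev1, in1 = evoked_and_induced_power(locked, fs, (20, 30))
ev2, in2 = evoked_and_induced_power(jittered, fs, (20, 30))
```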

Compelling evidence for an association between oscillatory responses and multisensory processing comes from a recent study on somatosensory modulation of processing in primary auditory cortex of alert monkeys (Lakatos et al. 2007). The authors investigated the effect of median nerve stimulation on auditory responses and observed a pronounced augmentation of oscillations in the delta, theta, and gamma frequency ranges. Further analysis revealed that this effect was mainly due to a phase resetting of auditory oscillations by the somatosensory inputs. Another intriguing observation in the same study was that systematic variation of the relative delay between somatosensory and auditory inputs led to multisensory response enhancements at intervals corresponding to the cycle length of gamma, theta, and delta band oscillations. In contrast, for intermediate delays, the paired stimulus response was smaller than the responses to auditory stimuli alone. Further support for phase resetting as a potential mechanism of cross-modal interaction comes from a recent study focusing on visual modulation of auditory processing in the monkey (Kayser et al. 2008). Using auditory and visual stimuli while recording in the auditory core and belt regions of awake behaving monkeys, the authors observed both enhancement and suppression of unit and field potential responses. Importantly, visual stimuli could be shown to modulate the phase angle of auditory alpha and theta band activity.
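Phase resetting of the kind reported in these studies is typically detected as an increase in phase consistency across trials, without a necessary increase in power. A minimal inter-trial coherence sketch under simulated assumptions (Python with NumPy/SciPy; the theta frequency, reset time, and all other parameters are hypothetical):

```python
import numpy as np
from scipy.signal import hilbert

def inter_trial_coherence(trials):
    """Inter-trial phase coherence of a single band-limited channel.

    trials: (n_trials, n_samples). Returns the ITC per sample:
    0 if the oscillatory phase is random across trials, 1 if an
    input has reset it to the same angle on every trial.
    """
    phase = np.angle(hilbert(trials, axis=-1))
    return np.abs(np.mean(np.exp(1j * phase), axis=0))

# Toy phase-reset scenario: ongoing 8 Hz theta with a random phase per
# trial is reset to a common phase at sample 250 (the "cross-modal input").
rng = np.random.default_rng(2)
fs, f = 500, 8.0
t = np.arange(500) / fs
trials = []
for _ in range(60):
    phi = rng.uniform(0, 2 * np.pi)
    pre = np.cos(2 * np.pi * f * t[:250] + phi)        # random phase
    post = np.cos(2 * np.pi * f * (t[250:] - t[250]))  # reset to zero phase
    trials.append(np.concatenate([pre, post]))
itc = inter_trial_coherence(np.array(trials))
print(itc[80:200].mean(), itc[300:440].mean())  # low before, high after reset
```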

Two recent studies have addressed interactions between auditory and multisensory regions in the superior temporal sulcus in behaving monkeys. One of the studies examined the effect of audiovisual looming signals on neural oscillations in the two regions (Maier et al. 2008). The main finding of this study was enhanced gamma band coherence between the two structures for cross-modally coherent looming signals compared to unimodal or receding motion inputs. This suggests that coupling of neuronal populations between primary sensory areas and higher-order multisensory structures may be functionally relevant for the integration of audiovisual signals. In a recent study, Kayser and Logothetis (2009) have investigated directed interactions between auditory cortex and multisensory sites in the superior temporal sulcus. Their analysis, which was confined to frequencies below the gamma band, suggests that superior temporal regions provide one major source of visual influences to the auditory cortex and that the beta band is involved in directed information flow through coupled oscillations.

In line with other studies (Foxe et al. 2002; Kayser et al. 2005; Ghazanfar and Schroeder 2006; Kayser and Logothetis 2007), these data support the notion that inputs from other modalities and from multisensory association regions can shape, in a context-dependent manner, the processing of stimuli in presumed unimodal cortices. Taken together, the findings discussed above suggest that modulation of both the power and the phase of oscillatory activity could be important mechanisms of cross-modal interaction.

7.3.2. Effects of Cross-Modal Semantic Matching on Oscillatory Activity

In addition to spatial and temporal congruency (Stein and Meredith 1993), an important factor influencing cross-modal integration is the semantic matching of information across sensory channels. A recent study has addressed this issue during audiovisual processing in an object recognition task, in which sounds of animals were presented in combination with a picture of either the same or a different animal. Larger gamma band activity (GBA) was observed for semantically congruent compared to semantically incongruent audiovisual stimuli (Yuval-Greenberg and Deouell 2007). We have recently been able to obtain similar results using a visual-to-auditory semantic priming paradigm (Schneider et al. 2008), in which we also observed stronger GBA for trials with cross-modal semantic congruence as compared to incongruent trials (Figure 7.1). Source localization using the method of “linear beamforming” revealed that the matching operation presumably reflected in the GBA involves multisensory regions in the left lateral temporal cortex (Schneider et al. 2008). In line with these results, we have recently observed an enhanced GBA for the matching of visual and auditory inputs in working memory in a visual-to-auditory object-matching paradigm (Senkowski et al. 2009).

FIGURE 7.1. Enhanced gamma band activity during semantic cross-modal matching. (a) Semantically congruent and incongruent objects were presented in a cross-modal visual-to-auditory priming paradigm. (b) GBA in response to auditory target stimuli (S2) was enhanced following (more...)

The effect of multisensory matching of meaningful stimuli on oscillatory activity has also been the subject of studies using socially important stimuli such as faces and voices. One study exploited the interesting case of synchronous versus asynchronous audiovisual speech (Doesburg et al. 2007) and showed that changes in phase synchrony occur in a transiently activated gamma oscillatory network. Between frontal and left posterior sensors, gamma band phase-locking values were increased for asynchronous as compared to synchronous speech, whereas gamma band amplitude was enhanced for synchronous compared to asynchronous speech at long latencies after stimulus onset. A more complex pattern of multisensory interactions between faces and voices of conspecifics has recently been observed in the superior temporal sulcus of macaques (Chandrasekaran and Ghazanfar 2009). Importantly, this study demonstrates that faces and voices elicit distinct bands of activity in the theta, alpha, and gamma frequency ranges in the superior temporal sulcus and, moreover, that these frequency band activities show differential patterns of cross-modal integration effects.

The relationship between the early evoked auditory GBA and multisensory processing has also been investigated in an audiovisual symbol-to-sound-matching paradigm (Widmann et al. 2007). An enhanced left-frontally distributed evoked GBA and a later parietally distributed induced (i.e., non-phase-locked) GBA were found for auditory stimuli that matched the elements of a visual pattern compared to auditory inputs that did not match the visual pattern. In another study, the role of neural synchronization between visual and sensorimotor cortex was examined in a multisensory matching task in which tactile Braille stimuli and visual dot patterns had to be compared (Hummel and Gerloff 2005). This study revealed enhanced phase coherence in the alpha band between occipital and lateral central regions in trials in which subjects performed well compared to trials in which they performed poorly, whereas no significant effects were found in other frequency bands. In summary, the available studies suggest that cross-modal matching may be reflected in both local and long-range changes of neural coherence.

7.3.3. Modulation of Cross-Modal Oscillatory Responses by Attention

One of the key functions of attention is to enhance perceptual salience and reduce stimulus ambiguity. Behavioral, electrophysiological, and functional imaging studies have shown that attention plays an important role in multisensory processing (Driver and Spence 2000; Macaluso et al. 2000; Foxe et al. 2005; Talsma and Woldorff 2005). The effect of spatial selective attention on GBA in a multisensory setting has recently been investigated (Senkowski et al. 2005). Subjects were presented with a stream of auditory, visual, and combined audiovisual stimuli to the left and right hemispaces and had to attend to a designated side to detect occasional target stimuli in either modality. An enhancement of the evoked GBA was found for attended compared to unattended multisensory stimuli. In contrast, no effect of spatial attention was observed for unimodal stimuli. An additional analysis of the gamma band phase distribution suggested that attention primarily acts to enhance GBA phase-locking, compatible with the idea already discussed above that cross-modal interactions can affect the phase of neural signals.

The effects of nonspatial intermodal attention and the temporal relation between auditory and visual inputs on the early evoked GBA have been investigated in another EEG study (Senkowski et al. 2007). Subjects were presented with a continuous stream of centrally presented unimodal and bimodal stimuli while they were instructed to detect an occasional auditory or visual target. Using combined auditory and visual stimuli with different onset delays revealed clear effects on the evoked GBA. Although there were no significant differences between the two attention conditions, an enhancement of the GBA was observed when auditory and visual inputs of multisensory stimuli were presented simultaneously (i.e., 0 ± 25 ms; Figure 7.2). This suggests that the integration of auditory and visual inputs, as reflected in high-frequency oscillatory activity, is sensitive to the relative onset timing of the sensory inputs.

FIGURE 7.2. Effect of relative timing of multisensory stimuli on gamma band oscillations. (a) Horizontal gratings and sinusoidal tones were presented with different stimulus onset asynchronies. (b) GBA to auditory and visual components of multisensory audiovisual (more...)

7.3.4. Percept-Related Multisensory Oscillations

A powerful approach to studying cross-modal integration is the use of physically identical multisensory events that can lead to different percepts across trials. A well-known example of this approach is the sound-induced visual flash illusion, which exploits the effect that a single flash of light accompanied by rapidly presented auditory beeps is often perceived as multiple flashes (Shams et al. 2000). This illusion allows the direct comparison of neural responses in illusory trials (i.e., when more than one flash is perceived) with those in nonillusory trials (i.e., when a single flash is perceived), while keeping the physical parameters of the presented stimuli constant. In an early attempt to study GBA during the sound-induced visual flash illusion, an increase in induced GBA was observed over occipital sites in an early (around 100 ms) and a late time window (around 450 ms) for illusory but not for nonillusory trials (Bhattacharya et al. 2002). Confirming these data, a more recent study also observed enhanced induced GBA over occipital areas around 130 and 220 ms for illusory compared to nonillusory trials (Mishra et al. 2007).

Using a modified version of the McGurk effect, the link between induced GBA and illusory perception during audiovisual speech processing has been addressed in MEG investigations (Kaiser et al. 2005, 2006). In the McGurk illusion, an auditory phoneme is dubbed onto a video showing an incongruent lip movement, which often leads to an illusory auditory percept (McGurk and MacDonald 1976). Exploiting this cross-modal effect, an enhanced GBA was observed in epochs in which an illusory auditory percept was induced by a visual deviant within a continuous stream of multisensory audiovisual speech stimuli (Kaiser et al. 2005). Remarkably, the topography of this effect was comparable with the frontal topography of a GBA enhancement obtained in an auditory mismatch study (Kaiser et al. 2002), suggesting that the GBA effect in the McGurk illusion study may represent a perceived auditory pattern change caused by the visual lip movement. Moreover, across subjects, the amplitude of induced GBA over the occipital cortex and the degree of the illusory acoustic change were closely correlated, suggesting that the induced GBA in early visual areas may be directly related to the generation of the illusory auditory percept (Kaiser et al. 2006).

Further evidence for a link of gamma band oscillations to illusory cross-modal perception comes from a study on the rubber hand illusion (Kanayama et al. 2007). In this study, a rubber hand was placed atop a box in which the subject’s own hand was located. In such a setting, subjects can have the illusory impression that a tactile input presented to one of their fingers actually stimulated the rubber hand. Interestingly, there was a strong effect of cross-modal congruence of the stimulation site. Stronger induced GBA and phase synchrony between distant electrodes occurred when a visual stimulus was presented near the finger of the rubber hand that corresponded to the subject’s finger receiving a tactile stimulus, as compared to a spatial cross-modal misalignment. This finding suggests a close relationship between multisensory tactile-visual stimulation and phase coherence in gamma band oscillations. In sum, the findings discussed in this section suggest that oscillatory activity, in particular at gamma band frequencies, can reflect perceptual changes resulting from cross-modal interactions.


The data available support the hypothesis that synchronization of oscillatory responses plays a role in multisensory processing (Senkowski et al. 2008). They consistently show that multisensory interactions are accompanied by condition-specific changes in oscillatory responses which often, albeit not always, occur in the gamma band (Sakowitz et al. 2000, 2001, 2005; Bhattacharya et al. 2002; Kaiser et al. 2005, 2006; Senkowski et al. 2005, 2006, 2007; Lakatos et al. 2007; Mishra et al. 2007; Kanayama et al. 2007; Doesburg et al. 2007; Widmann et al. 2007; Schneider et al. 2008). These effects, as observed in EEG or MEG signals, likely result not only from changes in oscillatory power, but also from altered phase coherence in the underlying neuronal populations. Several of the studies reviewed above have addressed this directly, providing evidence that coherence of neural signals across cortical areas may be a crucial mechanism involved in multimodal processing (von Stein et al. 1999; Hummel and Gerloff 2005; Doesburg et al. 2007; Maier et al. 2008; Kayser and Logothetis 2009).

Theoretical arguments suggest that coherent oscillatory signals may be well suited to serve cross-modal integration. It has been argued that synchronization of neural activity may help to cope with binding problems that occur in distributed architectures (von der Malsburg and Schneider 1986; Singer and Gray 1995; Engel et al. 1992, 2001; Singer 1999). Clearly, multisensory processing poses binding problems in at least two respects (Foxe and Schroeder 2005): information must be integrated across different neural systems; moreover, real-world scenes comprise multiple objects, creating the need for segregating unrelated neural signals within processing modules while, at the same time, selectively coordinating signals across channels in the correct combination. It seems unlikely that such complex coordination could be achieved by anatomical connections alone, because these would not provide sufficient flexibility to cope with a fast-changing multisensory world. In contrast, establishing relations between signals through neural coherence may provide both the required flexibility and selectivity, because transient phase-locking of oscillatory signals allows for the dynamic modulation of effective connectivity between spatially distributed neuronal populations (König et al. 1995; Salinas and Sejnowski 2001; Fries 2005; Womelsdorf et al. 2007).

If neural coherence indeed supports multisensory integration, a number of scenarios seem possible regarding the interaction of “lower-order” and “higher-order” regions. The studies reviewed above demonstrate the effects of multisensory interactions on oscillatory responses at multiple levels, including primary sensory areas (Kaiser et al. 2006; Kayser et al. 2008; Lakatos et al. 2007) as well as higher-order multimodal and frontal areas (Kaiser et al. 2005; Senkowski et al. 2006), suggesting that coherent neural activity might play a role in both “early” and “late” integration of multisensory signals. However, the available data do not yet allow us to decide conclusively which interaction patterns are most plausibly involved; these will likely also depend on the nature of the task and the stimuli. Using the case of audiovisual interactions, a number of hypothetical scenarios are schematically depicted in Figure 7.3. The simplest scenario predicts that during multisensory interactions, neural synchronization changes between early sensory areas. An alternative possibility is that changes in neural coherence or power occur mainly within cell assemblies of multisensory association cortices, such as superior temporal regions. More complex scenarios would result from a combination of these patterns. For instance, changes in neural synchrony among unimodal regions could also be associated with enhanced oscillatory activity in multisensory areas. This could result from reentrant bottom-up and top-down interactions between unimodal and multimodal cortices. In addition, changes in multisensory perception will often also involve frontal regions, which might exert a modulatory influence on temporal patterns in multisensory parietotemporal regions through oscillatory coupling. Most likely, at least for multisensory processing in naturalistic environments, these interactions will combine into a highly complex pattern involving the frontal cortex, temporoparietal regions, and unimodal cortices, and presumably also subcortical structures.

FIGURE 7.3. Scenarios for large-scale neural communication during cross-modal perception. The model proposed here is compatible with a number of different patterns of neural interactions. The figure refers to the case of audiovisual interactions. (a) Multisensory (more...)

Exploiting coherent oscillations as a potential mechanism would be compatible with various modes, or outcomes, of cross-modal interaction. An important case is the integration of spatially or semantically matching cross-modal signals. Congruent multisensory information would very likely lead to coherent activation of neurons processing sensory inputs from different modalities. This, in turn, will lead to stronger activation of cells in multisensory temporal, parietal, or frontal regions that receive input from such a synchronized assembly (Figure 7.3). Thus, cross-modal coherence might provide a plausible mechanism to implement the binding of features across different sensory pathways. In addition, cross-modal integration may be considerably facilitated by top-down influences from higher-order regions (Engel et al. 2001; Herrmann et al. 2004a). During the processing of natural multimodal scenes or semantically complex cross-modal information, such top-down influences might express a dynamic “prediction” (Engel et al. 2001) about expected multisensory inputs. In the case of a match with newly arriving sensory inputs, “resonance” is likely to occur, which would augment and accelerate the processing and selection of matching multisensory information (Widmann et al. 2007; Schneider et al. 2008; Senkowski et al. 2009).

The mechanism postulated here may also account for the processing of conflicting cross-modal information. In this case, the mismatching of spatiotemporal phase patterns would presumably lead to competition between different assemblies and a winner-take-all scenario (Fries et al. 2007). Evidence from work in the visual system suggests that such a competition would lead to an augmentation of temporal coherence in the dominant assembly, but a weakening of the temporal binding in other assemblies (Roelfsema et al. 1994; Fries et al. 1997, 2001). Because synchronized signals are particularly efficient in driving downstream cell populations (König et al. 1996; Womelsdorf et al. 2007) and in modulating synaptic weights (Markram et al. 1997; Bi and Poo 2001), such a mechanism would then lead to a selection of strongly synchronized populations and suppression of decorrelated activity.

A third case may be cross-modal modulation, i.e., the biasing of a percept by concurrent input from a different sensory modality. The model suggested here predicts that the inputs from the second modality can change the temporal structure of activity patterns in the first modality. One possible mechanism for such a modulation by oscillatory inputs is suggested by studies discussed above (Lakatos et al. 2007; Kayser et al. 2008). Both “lateral” interactions between assemblies in early areas and top-down influences could lead to a shift in phase of the respective local oscillations, thus entraining the local population into a temporal pattern that may be optimally suited to enhance the effect on downstream assemblies. The prediction is that this phase resetting or phase shifting should be maximally effective in the case of spatially, temporally, or semantically matching cross-modal information. Such a mechanism might help to explain why cross-modal context can often lead to biases in the processing of information in one particular sensory system and might contribute to understanding the nature of “early” multisensory integration (Foxe and Schroeder 2005). Because such modulatory effects might occur on a range of time scales (defined by different frequency bands in oscillatory activity), this mechanism may also account for broader temporal integration windows that have been reported for multisensory interactions (Vroomen and Keetels 2010).
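The phase-resetting idea can be made concrete with a toy model (not taken from the chapter; every number below is illustrative). An ongoing oscillation either keeps a random phase or is reset to its peak by a cross-modal input; a second input arriving at a fixed lag is then scaled by the excitability at the phase on which it lands. Averaged over trials, the reset condition yields a reliably stronger response:

```python
import math
import random

F = 8.0            # illustrative ongoing oscillation frequency (Hz)
LAG = 1.0 / F      # second input arrives exactly one cycle after the reset

def response_gain(phase):
    """Toy excitability profile: maximal at the oscillation's peak phase."""
    return 0.5 * (1.0 + math.cos(phase))

def trial(reset, rng):
    phase0 = rng.uniform(0.0, 2.0 * math.pi)  # random ongoing phase
    if reset:
        phase0 = 0.0   # cross-modal input resets the oscillation to its peak
    # Phase at which the second modality's input arrives, LAG seconds later.
    phase_at_input = phase0 + 2.0 * math.pi * F * LAG
    return response_gain(phase_at_input)

rng = random.Random(0)
no_reset = sum(trial(False, rng) for _ in range(2000)) / 2000
with_reset = sum(trial(True, rng) for _ in range(2000)) / 2000
print(round(no_reset, 2), round(with_reset, 2))  # reset gain is near maximal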

Finally, our hypothesis might also help to account for key features of multisensory processing such as the superadditivity or subadditivity of responses (Stein and Meredith 1993; Meredith 2002) and the principle of “inverse effectiveness” (Kayser and Logothetis 2007). Because of nonlinear dendritic processing, appropriately timed inputs will generate a much stronger postsynaptic response in target neuronal populations than temporally uncoordinated afferent signals (König et al. 1996; Singer 1999; Fries 2005) and, therefore, matching cross-modal inputs can have an impact that differs strongly from the sum of the unimodal responses. Conversely, incongruent signals from two modalities might result in temporally desynchronized inputs and, therefore, in “multisensory depression” in downstream neural populations (Stein et al. 2002).
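The coincidence-detection argument above can be illustrated with a minimal leaky integrate-and-fire sketch (a toy model of our own, not the authors' simulation; all parameters are arbitrary). The same number of input spikes is delivered either tightly synchronized or temporally dispersed; only the synchronized volleys reliably drive the downstream unit past threshold, producing the supralinear advantage for temporally coordinated input:

```python
import random

def lif_response(spike_times, t_end=200.0, dt=0.1, tau=10.0,
                 w=1.2, threshold=7.0):
    """Count output spikes of a leaky integrate-and-fire unit driven
    by a list of input spike times (ms). Toy model, arbitrary units."""
    v, spikes, idx = 0.0, 0, 0
    times = sorted(spike_times)
    t = 0.0
    while t < t_end:
        v *= (1.0 - dt / tau)                       # passive leak
        while idx < len(times) and times[idx] <= t:
            v += w                                  # fixed-size EPSP per spike
            idx += 1
        if v >= threshold:                          # fire and reset
            spikes += 1
            v = 0.0
        t += dt
    return spikes

random.seed(1)
n_inputs = 8
events = [20.0 * k for k in range(1, 10)]           # nine input volleys

# Synchronized: all inputs arrive within ~1 ms of each volley time.
sync = [e + random.gauss(0, 0.5) for e in events for _ in range(n_inputs)]
# Desynchronized: same spike count, jittered by ~15 ms.
desync = [e + random.gauss(0, 15.0) for e in events for _ in range(n_inputs)]

print(lif_response(sync), lif_response(desync))
```

The point of the sketch is only the ordering of the two counts, not their absolute values, which depend entirely on the arbitrary parameters chosen here.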


Although partially supported by data, the hypothesis that neural synchrony may play a role in multisensory processing clearly requires further experimental testing. Thus far, only a relatively small number of multisensory studies have used coherence measures to explicitly address interactions across different neural systems. Very likely, substantial progress can be achieved by studies in humans if the approaches are suitable for capturing dynamic cross-system interactions among specific brain regions. Such investigations may be carried out using MEG (Gross et al. 2001; Siegel et al. 2007, 2008), the combination of EEG with functional magnetic resonance imaging (Debener et al. 2006), or intracerebral multisite recordings (Lachaux et al. 2003), if the recordings are combined with advanced source modeling techniques (Van Veen et al. 1997) and analysis methods that quantify, e.g., directed information transfer between the activated regions (Supp et al. 2007). In addition, some of the earlier EEG studies on multisensory oscillations involving visual stimuli (e.g., Yuval-Greenberg and Deouell 2007) seem to be confounded by artifacts relating to microsaccades (Yuval-Greenberg et al. 2008), a methodological issue that needs to be clarified and can possibly be avoided by using MEG (Fries et al. 2008). To characterize the role of correlated activity for multisensory processing at the cellular level, further microelectrode studies in higher mammals will be indispensable.
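A widely used coherence measure in such studies is the phase-locking value, which quantifies how consistent the phase difference between two signals is across trials. As a minimal, stdlib-only sketch (illustrative data, not from any of the studies cited here):

```python
import cmath
import math
import random

def plv(phases_a, phases_b):
    """Phase-locking value: |mean of exp(i * (phase_a - phase_b))|.
    1 = perfectly constant phase relation, near 0 = no consistent relation."""
    diffs = [cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)]
    return abs(sum(diffs) / len(diffs))

rng = random.Random(42)
n_trials = 500

# "Locked" pair: phase difference fixed at pi/4 plus small jitter.
a1 = [rng.uniform(0, 2 * math.pi) for _ in range(n_trials)]
b1 = [p - math.pi / 4 + rng.gauss(0, 0.2) for p in a1]

# "Unlocked" pair: independent random phases on every trial.
a2 = [rng.uniform(0, 2 * math.pi) for _ in range(n_trials)]
b2 = [rng.uniform(0, 2 * math.pi) for _ in range(n_trials)]

print(round(plv(a1, b1), 2), round(plv(a2, b2), 2))  # high vs. near zero
```

In practice the single-trial phases would first be estimated from the recorded signals (e.g., by band-pass filtering and a Hilbert or wavelet transform), a step omitted here for brevity.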

The model put forward here has several implications. We believe that the study of synchronization phenomena may lead to a new view on multisensory processing that considers the dynamic interplay of neural populations as a key to cross-modal integration and stimulates the development of new research approaches and experimental strategies. Conversely, the investigation of multisensory interactions may also provide a crucial test bed for further validation of the temporal correlation hypothesis (Engel et al. 1992; Singer and Gray 1995; Singer 1999), because task- or percept-related changes in coherence between independent neural sources have hardly been shown in humans thus far. In this context, the role of oscillations in different frequency bands is yet another unexplored issue that future studies will have to address. As discussed above, multisensory effects are often, but not exclusively, observed in higher frequency ranges, and it is unclear why gamma-band oscillations figure so prominently.

Finally, abnormal synchronization across sensory channels may play a role in conditions of abnormal cross-modal perception such as synesthesia (Hubbard and Ramachandran 2005) or in disorders such as schizophrenia or autism. In synesthesia, excessively strong multisensory coherence might occur, which then would not just modulate processing in unimodal regions but actually drive sensory neurons even in the absence of a proper stimulus. In contrast, abnormal weakness of cross-modal coupling might account for the impairment of multisensory integration that is observed in patients with schizophrenia (Ross et al. 2007) or autism (Iarocci and McDonald 2006). Thus, research on cross-modal binding may help to advance our understanding of brain disorders that partly result from dysfunctional integrative mechanisms (Schnitzler and Gross 2005; Uhlhaas and Singer 2006).


  1. Amedi A, von Kriegstein K, van Atteveldt N.M, Beauchamp M.S, Naumer M.J. Functional imaging of human crossmodal identification and object recognition. Experimental Brain Research. 2005;166:559–571. [PubMed: 16028028]
  2. Bauer M, Oostenveld R, Peeters M, Fries P. Tactile spatial attention enhances gamma-band activity in somatosensory cortex and reduces low-frequency activity in parieto-occipital areas. Journal of Neuroscience. 2006;26:490–501. [PubMed: 16407546]
  3. Baumann O, Greenlee M.W. Neural correlates of coherent audiovisual motion perception. Cerebral Cortex. 2007;17:1433–1443. [PubMed: 16928890]
  4. Bavelier D, Neville H.J. Cross-modal plasticity: Where and how? Nature Reviews. Neuroscience. 2002;3:443–452. [PubMed: 12042879]
  5. Bhattacharya J, Shams L, Shimojo S. Sound-induced illusory flash perception: Role of gamma band responses. Neuroreport. 2002;13:1727–1730. [PubMed: 12395112]
  6. Bi G.-Q, Poo M.-M. Synaptic modification by correlated activity: Hebb’s postulate revisited. Annual Review of Neuroscience. 2001;24:139–166. [PubMed: 11283308]
  7. Bressler S.L, Freeman W.J. Frequency analysis of olfactory system EEG in cat, rabbit, and rat. Electroencephalography and Clinical Neurophysiology. 1980;50:19–24. [PubMed: 6159187]
  8. Brosch M, Budinger E, Scheich H. Stimulus-related gamma oscillations in primate auditory cortex. Journal of Neurophysiology. 2002;87:2715–2725. [PubMed: 12037173]
  9. Calvert G.A. Crossmodal processing in the human brain: Insights from functional neuroimaging studies. Cerebral Cortex. 2001;11:1110–1123. [PubMed: 11709482]
  10. Castelo-Branco M, Goebel R, Neuenschwander S, Singer W. Neural synchrony correlates with surface segregation rules. Nature. 2000;405:685–689. [PubMed: 10864325]
  11. Csicsvari J, Jamieson B, Wise K.D, Buzsaki G. Mechanisms of gamma oscillations in the hippocampus of the behaving rat. Neuron. 2003;37:311–322. [PubMed: 12546825]
  12. Chandrasekaran C, Ghazanfar A.A. Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus. Journal of Neurophysiology. 2009;101:773–788. [PMC free article: PMC2657063] [PubMed: 19036867]
  13. Debener S, Herrmann C.S, Kranczioch C, Gembris D, Engel A.K. Top-down attentional processing enhances auditory evoked gamma band activity. Neuroreport. 2003;14:683–686. [PubMed: 12692463]
  14. Debener S, Ullsperger M, Siegel M, Engel A.K. Single-trial EEG-fMRI reveals the dynamics of cognitive function. Trends in Cognitive Sciences. 2006;10:558–563. [PubMed: 17074530]
  15. Doesburg S.M, Emberson L.L, Rahi A, Cameron D, Ward L.M. Asynchrony from synchrony: Long-range gamma-band neural synchrony accompanies perception of audiovisual speech asynchrony. Experimental Brain Research. 2007;185:11–20. [PubMed: 17922119]
  16. Driver J, Spence C. Multisensory perception: Beyond modularity and convergence. Current Biology. 2000;10:R731–R735. [PubMed: 11069095]
  17. Engel A.K, König P, Singer W. Direct physiological evidence for scene segmentation by temporal coding. Proceedings of the National Academy of Sciences of the United States of America. 1991a;88:9136–9140. [PMC free article: PMC52667] [PubMed: 1924376]
  18. Engel A.K, König P, Kreiter A.K, Singer W. Interhemispheric synchronization of oscillatory neuronal responses in cat visual cortex. Science. 1991b;252:1177–1179. [PubMed: 2031188]
  19. Engel A.K, König P, Kreiter A.K, Schillen T.B, Singer W. Temporal coding in the visual cortex: New vistas on integration in the nervous system. Trends in Neurosciences. 1992;15:218–226. [PubMed: 1378666]
  20. Engel A.K, Fries P, Singer W. Dynamic predictions: Oscillations and synchrony in top-down processing. Nature Reviews. Neuroscience. 2001;2:704–716. [PubMed: 11584308]
  21. Farmer S.F. Rhythmicity, synchronization and binding in human and primate motor systems. Journal of Physiology. 1998;509:3–14. [PMC free article: PMC2230956] [PubMed: 9547376]
  22. Foxe J.J, Schroeder C.E. The case for feedforward multisensory convergence during early cortical processing. Neuroreport. 2005;16:419–423. [PubMed: 15770144]
  23. Foxe J.J, Wylie G.R, Martinez A, et al. Auditory-somatosensory multisensory processing in auditory association cortex: An fMRI study. Journal of Neurophysiology. 2002;88:540–543. [PubMed: 12091578]
  24. Foxe J.J, Simpson G.V, Ahlfors S.P, Saron C.D. Biasing the brain’s attentional set: I. cue driven deployments of intersensory selective attention. Experimental Brain Research. 2005;166:370–392. [PubMed: 16086144]
  25. Fries P. A mechanism for cognitive dynamics: Neuronal communication through neuronal coherence. Trends in Cognitive Sciences. 2005;9:474–480. [PubMed: 16150631]
  26. Fries P, Roelfsema P.R, Engel A.K, König P, Singer W. Synchronization of oscillatory responses in visual cortex correlates with perception in interocular rivalry. Proceedings of the National Academy of Sciences of the United States of America. 1997;94:12699–12704. [PMC free article: PMC25091] [PubMed: 9356513]
  27. Fries P, Neuenschwander S, Engel A.K, Goebel R, Singer W. Modulation of oscillatory neuronal synchronization by selective visual attention. Science. 2001;291:1560–1563. [PubMed: 11222864]
  28. Fries P, Nikolic D, Singer W. The gamma cycle. Trends in Neurosciences. 2007;30:309–316. [PubMed: 17555828]
  29. Fries P, Scheeringa R, Oostenveld R. Finding gamma. Neuron. 2008;58:303–305. [PubMed: 18466741]
  30. Ghazanfar A.A, Schroeder C.E. Is neocortex essentially multisensory? Trends in Cognitive Sciences. 2006;10:278–285. [PubMed: 16713325]
  31. Gray C.M, König P, Engel A.K, Singer W. Oscillatory responses in cat visual cortex exhibit intercolumnar synchronization which reflects global stimulus properties. Nature. 1989;338:334–337. [PubMed: 2922061]
  32. Gross J, Kujala J, Hamalainen M, et al. Dynamic imaging of coherent sources: Studying neural interactions in the human brain. Proceedings of the National Academy of Sciences of the United States of America. 2001;98:694–699. [PMC free article: PMC14650] [PubMed: 11209067]
  33. Gruber T, Müller M.M. Oscillatory brain activity dissociates between associative stimulus content in a repetition priming task in the human EEG. Cerebral Cortex. 2005;15:109–116. [PubMed: 15238442]
  34. Herrmann C.S, Munk M.H, Engel A.K. Cognitive functions of gamma-band activity: Memory match and utilization. Trends in Cognitive Sciences. 2004a;8:347–355. [PubMed: 15335461]
  35. Herrmann C.S, Lenz D, Junge S, Busch N.A, Maess B. Memory-matches evoke human gamma-responses. BMC Neuroscience. 2004b;5:13. [PMC free article: PMC419345] [PubMed: 15084225]
  36. Hubbard E.M, Ramachandran V.S. Neurocognitive mechanisms of synesthesia. Neuron. 2005;48:509–520. [PubMed: 16269367]
  37. Hummel F, Gerloff C. Larger interregional synchrony is associated with greater behavioral success in a complex sensory integration task in humans. Cerebral Cortex. 2005;15:670–678. [PubMed: 15342429]
  38. Iarocci G, McDonald J. Sensory integration and the perceptual experience of persons with autism. Journal of Autism and Developmental Disorders. 2006;36:77–90. [PubMed: 16395537]
  39. Kaiser J, Lutzenberger W, Ackermann H, Birbaumer N. Dynamics of gamma-band activity induced by auditory pattern changes in humans. Cerebral Cortex. 2002;12:212–221. [PubMed: 11739268]
  40. Kaiser J, Hertrich I, Ackermann H, Mathiak K, Lutzenberger W. Hearing lips: Gamma-band activity during audiovisual speech perception. Cerebral Cortex. 2005;15:646–653. [PubMed: 15342432]
  41. Kaiser J, Hertrich I, Ackermann W, Lutzenberger W. Gamma-band activity over early sensory areas predicts detection of changes in audiovisual speech stimuli. NeuroImage. 2006;30:1376–1382. [PubMed: 16364660]
  42. Kanayama N, Sato A, Ohira H. Crossmodal effect with rubber hand illusion and gamma-band activity. Psychophysiology. 2007;44:392–402. [PubMed: 17371495]
  43. Kayser C, Logothetis N.K. Do early sensory cortices integrate crossmodal information? Brain Structure and Function. 2007;212:121–132. [PubMed: 17717687]
  44. Kayser C, Logothetis N.K. Directed interactions between auditory and superior temporal cortices and their role in sensory integration. Frontiers in Integrative Neuroscience. 2009;3:7. [PMC free article: PMC2691153] [PubMed: 19503750]
  45. Kayser C, Petkov C.I, Augath M, Logothetis N.K. Integration of touch and sound in auditory cortex. Neuron. 2005;48:373–384. [PubMed: 16242415]
  46. Kayser C, Petkov C.I, Logothetis N.K. Visual modulation of neurons in auditory cortex. Cerebral Cortex. 2008;18:1560–1574. [PubMed: 18180245]
  47. Kisley M.A, Cornwell Z.M. Gamma and beta neural activity evoked during a sensory gating paradigm: Effects of auditory, somatosensory and cross-modal stimulation. Clinical Neurophysiology. 2006;117:2549–2563. [PMC free article: PMC1773003] [PubMed: 17008125]
  48. Komura Y, Tamura R, Uwano T, Nishijo H, Ono T. Auditory thalamus integrates visual inputs into behavioral gains. Nature Neuroscience. 2005;8:1203–1209. [PubMed: 16116444]
  49. König P, Engel A.K, Singer W. Relation between oscillatory activity and long-range synchronization in cat visual cortex. Proceedings of the National Academy of Sciences of the United States of America. 1995;92:290–294. [PMC free article: PMC42864] [PubMed: 7816836]
  50. König P, Engel A.K, Singer W. Integrator or coincidence detector? The role of the cortical neuron revisited. Trends in Neurosciences. 1996;19:130–137. [PubMed: 8658595]
  51. Lachaux J.P, Rudrauf D, Kahane P. Intracranial EEG and human brain mapping. Journal of Physiology. 2003;97:613–628. [PubMed: 15242670]
  52. Lakatos P, Chen C.M, O’Connell M.N, Mills A, Schroeder C.E. Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron. 2007;53:279–292. [PMC free article: PMC3717319] [PubMed: 17224408]
  53. Macaluso E, Driver J. Multisensory spatial interactions: A window onto functional integration in the human brain. Trends in Neurosciences. 2005;28:264–271. [PubMed: 15866201]
  54. Macaluso E, Frith C.D, Driver J. Modulation of human visual cortex by crossmodal spatial attention. Science. 2000;289:1206–1208. [PubMed: 10947990]
  55. Maier J.X, Chandrasekaran C, Ghazanfar A.A. Integration of bimodal looming signals through neuronal coherence in the temporal lobe. Current Biology. 2008;18:963–968. [PubMed: 18585039]
  56. Markram H, Lübke J, Frotscher M, Sakmann B. Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs. Science. 1997;275:213–215. [PubMed: 8985014]
  57. McGurk H, MacDonald J. Hearing lips and seeing voices. Nature. 1976;264:746–748. [PubMed: 1012311]
  58. Meredith M.A. On the neuronal basis for multisensory convergence: A brief overview. Brain Research. Cognitive Brain Research. 2002;14:31–40. [PubMed: 12063128]
  59. Meredith M.A, Stein B.E. Interactions among converging sensory inputs in the superior colliculus. Science. 1983;221:389–391. [PubMed: 6867718]
  60. Meredith M.A, Stein B.E. Descending efferents from the superior colliculus relay integrated multisensory information. Science. 1985;227:657–659. [PubMed: 3969558]
  61. Mishra J, Martinez A, Sejnowski T.J, Hillyard S.A. Early cross-modal interactions in auditory and visual cortex underlie a sound-induced visual illusion. Journal of Neuroscience. 2007;27:4120–4131. [PMC free article: PMC2905511] [PubMed: 17428990]
  62. Müller M.M, Gruber T, Keil A. Modulation of induced gamma band activity in the human EEG by attention and visual information processing. International Journal of Psychophysiology. 2000;38:283–299. [PubMed: 11102668]
  63. Nagy A, Eördegh G, Paroczy Z, Markus Z, Benedek G. Multisensory integration in the basal ganglia. European Journal of Neuroscience. 2006;24:917–924. [PubMed: 16930419]
  64. Nishijo H, Ono T, Nishino H. Topographic distribution of modality-specific amygdalar neurons in alert monkey. Journal of Neuroscience. 1988;8:3556–3569. [PubMed: 3193170]
  65. Roelfsema P.R, König P, Engel A.K, Sireteanu R, Singer W. Reduced synchronization in the visual cortex of cats with strabismic amblyopia. European Journal of Neuroscience. 1994;6:1645–1655. [PubMed: 7874303]
  66. Roelfsema P.R, Engel A.K, König P, Singer W. Visuomotor integration is associated with zero time-lag synchronization among cortical areas. Nature. 1997;385:157–161. [PubMed: 8990118]
  67. Ross L.A, Saint-Amour D, Leavitt V.M, Molholm S, Javitt D.C, Foxe J.J. Impaired multisensory processing in schizophrenia: Deficits in the visual enhancement of speech comprehension under noisy environmental conditions. Schizophrenia Research. 2007;97:173–183. [PubMed: 17928202]
  68. Rowland B.A, Quessy S, Stanford T.R, Stein B.E. Multisensory integration shortens physiological response latencies. Journal of Neuroscience. 2007;27:5879–5884. [PubMed: 17537958]
  69. Sakowitz O.W, Schürmann M, Basar E. Oscillatory frontal theta responses are increased upon bisensory stimulation. Clinical Neurophysiology. 2000;111:884–893. [PubMed: 10802460]
  70. Sakowitz O.W, Quiroga R.Q, Schürmann M, Basar E. Bisensory stimulation increases gamma-responses over multiple cortical regions. Brain Research. Cognitive Brain Research. 2001;11:267–279. [PubMed: 11275488]
  71. Sakowitz O.W, Quiroga R.Q, Schürmann M, Basar E. Spatio-temporal frequency characteristics of intersensory components in audiovisual evoked potentials. Brain Research. Cognitive Brain Research. 2005;23:316–326. [PubMed: 15820639]
  72. Salinas E, Sejnowski T.J. Correlated neuronal activity and the flow of neural information. Nature Reviews Neuroscience. 2001;2:539–550. [PMC free article: PMC2868968] [PubMed: 11483997]
  73. Sanes J.N, Donoghue J.P. Oscillations in local field potentials of the primate motor cortex during voluntary movement. Proceedings of the National Academy of Sciences of the United States of America. 1993;90:4470–4474. [PMC free article: PMC46533] [PubMed: 8506287]
  74. Schnitzler A, Gross J. Normal and pathological oscillatory communication in the brain. Nature Reviews Neuroscience. 2005;6:285–296. [PubMed: 15803160]
  75. Schneider T.R, Debener S, Oostenveld R, Engel A.K. Enhanced EEG gamma-band activity reflects multisensory semantic matching in visual-to-auditory object priming. NeuroImage. 2008;42:1244–1254. [PubMed: 18617422]
  76. Senkowski D, Talsma D, Herrmann C.S, Woldorff M.G. Multisensory processing and oscillatory gamma responses: Effects of spatial selective attention. Experimental Brain Research. 2005;3–4:411–426. [PubMed: 16151775]
  77. Senkowski D, Molholm S, Gomez-Ramirez M, Foxe J.J. Oscillatory beta activity predicts response speed during a multisensory audiovisual reaction time task: A high-density electrical mapping study. Cerebral Cortex. 2006;16:1556–1565. [PubMed: 16357336]
  78. Senkowski D, Talsma D, Grigutsch M, Herrmann C.S, Woldorff M.G. Good times for multisensory integration: Effects of the precision of temporal synchrony as revealed by gamma-band oscillations. Neuropsychologia. 2007;45:561–571. [PubMed: 16542688]
  79. Senkowski D, Schneider T.R, Foxe J.J, Engel A.K. Crossmodal binding through neural coherence: Implications for multisensory processing. Trends in Neurosciences. 2008;31:401–409. [PubMed: 18602171]
  80. Senkowski D, Schneider T.R, Tandler R, Engel A.K. Gamma-band activity reflects multisensory matching in working memory. Experimental Brain Research. 2009;198:363–372. [PubMed: 19458939]
  81. Shams L, Kamitani Y, Shimojo S. Illusions. What you see is what you hear. Nature. 2000;408:788. [PubMed: 11130706]
  82. Shimojo S, Shams L. Sensory modalities are not separate modalities: Plasticity and interactions. Current Opinion in Neurobiology. 2001;11:505–509. [PubMed: 11502399]
  83. Siegel M, Donner T.H, Oostenveld R, Fries P, Engel A.K. High-frequency activity in human visual cortex is modulated by visual motion strength. Cerebral Cortex. 2007;17:732–741. [PubMed: 16648451]
  84. Siegel M, Donner T.H, Oostenveld R, Fries P, Engel A.K. Neuronal synchronization along the dorsal visual pathway reflects the focus of spatial attention. Neuron. 2008;60:709–719. [PubMed: 19038226]
  85. Singer W. Neuronal synchrony: A versatile code for the definition of relations? Neuron. 1999;24:49–65. [PubMed: 10677026]
  86. Singer W, Gray C.M. Visual feature integration and the temporal correlation hypothesis. Annual Review of Neuroscience. 1995;18:555–586. [PubMed: 7605074]
  87. Stanford T.R, Quessy S, Stein B.E. Evaluating the operations underlying multisensory integration in the cat superior colliculus. Journal of Neuroscience. 2005;25:6499–6508. [PMC free article: PMC1237124] [PubMed: 16014711]
  88. Stein B.E, Meredith M.A. The Merging of the Senses. Cambridge, MA: MIT Press; 1993.
  89. Stein B.E, Wallace M.W, Stanford T.R, Jiang W. Cortex governs multisensory integration in the midbrain. Neuroscientist. 2002;8:306–314. [PubMed: 12194499]
  90. Supp G.G, Schlögl A, Trujillo-Barreto N, Müller M.M, Gruber T. Directed cortical information flow during human object recognition: Analyzing induced EEG gamma-band responses in brain’s source space. PLoS ONE. 2007;2:e684. [PMC free article: PMC1925146] [PubMed: 17668062]
  91. Sur M, Pallas S.L, Roe A.W. Cross-modal plasticity in cortical development: Differentiation and specification of sensory neocortex. Trends in Neurosciences. 1990;13:227–233. [PubMed: 1694329]
  92. Tallon-Baudry C, Bertrand O. Oscillatory gamma activity in humans and its role in object representation. Trends in Cognitive Sciences. 1999;3:151–162. [PubMed: 10322469]
  93. Tallon-Baudry C, Bertrand O, Delpuech C, Pernier J. Stimulus specificity of phase-locked and non-phase-locked 40 Hz visual responses in human. Journal of Neuroscience. 1996;16:4240–4249. [PubMed: 8753885]
  94. Talsma D, Woldorff M.G. Selective attention and multisensory integration: Multiple phases of effects on the evoked brain activity. Journal of Cognitive Neuroscience. 2005;17:1098–1114. [PubMed: 16102239]
  95. Uhlhaas P.J, Singer W. Neural synchrony in brain disorders: Relevance for cognitive dysfunctions and pathophysiology. Neuron. 2006;52:155–168. [PubMed: 17015233]
  96. Van Veen B.D, van Drongelen W, Yuchtman M, Suzuki A. Localization of brain electrical activity via linearly constrained minimum variance spatial filtering. IEEE Transactions on Bio-Medical Engineering. 1997;44:867–880. [PubMed: 9282479]
  97. Varela F, Lachaux J.P, Rodriguez E, Martinerie J. The brainweb: Phase synchronization and largescale integration. Nature Reviews. Neuroscience. 2001;2:229–239. [PubMed: 11283746]
  98. von der Malsburg C, Schneider W. A neural cocktail-party processor. Biological Cybernetics. 1986;54:29–40. [PubMed: 3719028]
  99. von Stein A, Rappelsberger P, Sarnthein J, Petsche H. Synchronization between temporal and parietal cortex during multimodal object processing in man. Cerebral Cortex. 1999;9:137–150. [PubMed: 10220226]
  100. Vroomen J, Keetels M. Perception of intersensory synchrony: A tutorial review. Attention, Perception, & Psychophysics. 2010;72:871–884. [PubMed: 20436185]
  101. Wehr M, Laurent G. Odour encoding by temporal sequences of firing in oscillating neural assemblies. Nature. 1996;384:162–166. [PubMed: 8906790]
  102. Widmann A, Gruber T, Kujala T, Tervaniemi M, Schröger E. Binding symbols and sounds: Evidence from event-related oscillatory gamma-band activity. Cerebral Cortex. 2007;17:2696–2702. [PubMed: 17272264]
  103. Womelsdorf T, Fries P, Mitra P.P, Desimone R. Gamma-band synchronization in visual cortex predicts speed of change detection. Nature. 2006;439:733–736. [PubMed: 16372022]
  104. Womelsdorf T, Schoffelen J.M, Oostenveld R, et al. Modulation of neuronal interactions through neuronal synchronization. Science. 2007;316:1609–1612. [PubMed: 17569862]
  105. Yuval-Greenberg S, Deouell L.Y. What you see is not (always) what you hear: Induced gamma band responses reflect cross-modal interactions in familiar object recognition. Journal of Neuroscience. 2007;27:1090–1096. [PubMed: 17267563]
  106. Yuval-Greenberg S, Tomer O, Keren A.S, Nelken I, Deouell L.Y. Transient induced gamma-band response in EEG as a manifestation of miniature saccades. Neuron. 2008;58:429–441. [PubMed: 18466752]
Copyright © 2012 by Taylor & Francis Group, LLC.
Bookshelf ID: NBK92855; PMID: 22593880

