Rev Neurosci. 2012;23(4):381-92. doi: 10.1515/revneuro-2012-0040.

Multisensory emotions: perception, combination and underlying neural processes.

Author information

Department of Psychiatry, RWTH Aachen University, Aachen, Germany. mklasen@ukaachen.de

Abstract

In our everyday lives, we perceive emotional information via multiple sensory channels. This is particularly evident for emotional faces and voices in a social context. In recent years, a multitude of studies has addressed the question of how affective cues conveyed by the auditory and visual channels are integrated. Behavioral studies show that hearing and seeing emotional expressions can support and influence each other, a notion supported by investigations of the underlying neurobiology. Numerous electrophysiological and neuroimaging studies have identified brain regions subserving the integration of multimodal emotions and have provided new insights into the neural processing steps underlying the synergistic confluence of affective information from voice and face. In this paper, we provide a comprehensive review of current behavioral, electrophysiological, and functional neuroimaging findings on the combination of emotions from the auditory and visual domains. Behavioral advantages arising from multimodal redundancy are paralleled by specific integration patterns at the neural level, from encoding in early sensory cortices to late cognitive evaluation in higher association areas. In summary, these findings indicate that bimodal emotions interact at multiple stages of the audiovisual integration process.

PMID: 23089604
DOI: 10.1515/revneuro-2012-0040
[Indexed for MEDLINE]