Trends Cogn Sci. 2017 Mar;21(3):216-228. doi: 10.1016/j.tics.2017.01.001. Epub 2017 Feb 4.

Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence.

Author information

1. Chinese University of Hong Kong, Hong Kong; Max Planck Institute for Human Cognitive and Brain Sciences, Germany; National University of Singapore, Singapore. Electronic address: schirmer@cuhk.edu.hk.
2. California Institute of Technology, Pasadena, CA, USA. Electronic address: radolphs@caltech.edu.

Abstract

Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly nonoverlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments.

PMID: 28173998
PMCID: PMC5334135
DOI: 10.1016/j.tics.2017.01.001
[Indexed for MEDLINE]
Free PMC Article
