Exp Brain Res. 2006 Oct;174(3):588-94. Epub 2006 Aug 10.

Multisensory integration of speech signals: the relationship between space and time.

Author information

  • 1. Centre for Cognitive Neuroscience, Wilfrid Laurier University, Waterloo, ON, Canada. jjones@wlu.ca

Abstract

Integrating audiovisual cues for simple events is affected when sources are separated in space and time. By contrast, audiovisual perception of speech appears resilient when either spatial or temporal disparities exist. We investigated whether speech perception is sensitive to the combination of spatial and temporal inconsistencies. Participants heard the bisyllable /aba/ while seeing a face produce the incongruent bisyllable /ava/. We tested the level of visual influence over auditory perception when the sound was asynchronous with respect to facial motion (from -360 to +360 ms) and emanated from five locations equidistant to the participant. Although an interaction was observed, it was not related to participants' perception of synchrony, nor did it indicate a linear relationship between the effect of spatial and temporal discrepancies. We conclude that either the complexity of the signal or the nature of the task reduces reliance on spatial and temporal contiguity for audiovisual speech perception.

PMID: 16900363 [PubMed - indexed for MEDLINE]