J Opt Soc Am A Opt Image Sci Vis. 2003 Jul;20(7):1391-7.

Bayesian integration of visual and auditory signals for spatial localization.

Author information

Department of Brain and Cognitive Sciences, The Center for Visual Science, University of Rochester, Rochester, New York 14627, USA.


Human observers localize events in the world by using sensory signals from multiple modalities. We evaluated two theories of spatial localization that predict how visual and auditory information are weighted when these signals specify different locations in space. According to one theory (visual capture), the signal that is typically most reliable dominates in a winner-take-all competition, whereas the other theory (maximum-likelihood estimation) proposes that perceptual judgments are based on a weighted average of the sensory signals in proportion to each signal's relative reliability. Our results indicate that both theories are partially correct, in that relative signal reliability significantly altered judgments of spatial location, but these judgments were also characterized by an overall bias to rely on visual over auditory information. These results have important implications for the development of cue integration and for neural plasticity in the adult brain that enables humans to optimally integrate multimodal information.
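The maximum-likelihood estimation rule described above has a standard closed form: each cue is weighted by its reliability (inverse variance), normalized by the sum of the reliabilities. The sketch below is an illustration of that textbook rule, not the authors' code; the function name and the example variances are assumptions for demonstration.

```python
def mle_estimate(s_vis, var_vis, s_aud, var_aud):
    """Reliability-weighted average of a visual and an auditory location cue.

    Reliability is the inverse of a cue's variance; the MLE weight of each
    cue is its reliability divided by the sum of both reliabilities.
    """
    r_vis, r_aud = 1.0 / var_vis, 1.0 / var_aud   # reliabilities
    w_vis = r_vis / (r_vis + r_aud)
    w_aud = r_aud / (r_vis + r_aud)
    estimate = w_vis * s_vis + w_aud * s_aud
    # The combined variance is lower than either cue's variance alone,
    # which is why integration is statistically optimal.
    var_combined = 1.0 / (r_vis + r_aud)
    return estimate, var_combined
```

For example, with a visual cue at 0 deg (variance 1) and an auditory cue at 5 deg (variance 4), the visual weight is 0.8 and the combined estimate is pulled strongly toward vision. The overall visual bias the authors report corresponds to visual weights larger than this reliability-based prediction.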

