Nature. 2002 Jan 24;415(6870):429-33.

Humans integrate visual and haptic information in a statistically optimal fashion.

Author information

  • 1Vision Science Program, School of Optometry, University of California, Berkeley 94720-2020, USA. marc.ernst@tuebingen.mpg.de

Abstract

When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual-haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This principle is realized by using maximum-likelihood estimation to combine the inputs. To investigate cue combination quantitatively, we first measured the variances associated with visual and haptic estimation of height. We then used these measurements to construct a maximum-likelihood integrator. This model behaved very similarly to humans in a visual-haptic task. Thus, the nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator. Visual dominance occurs when the variance associated with visual estimation is lower than that associated with haptic estimation.
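The maximum-likelihood integrator described in the abstract reduces, for independent Gaussian noise on each cue, to an inverse-variance weighted average: the less variable cue gets the larger weight, and the combined estimate has lower variance than either cue alone. A minimal sketch of that rule (the function name and the numbers in the usage note are illustrative, not from the paper):

```python
def mle_combine(s_v, var_v, s_h, var_h):
    """Combine a visual estimate s_v and a haptic estimate s_h by
    maximum likelihood, assuming independent Gaussian noise with
    variances var_v and var_h.

    Each cue is weighted by its relative reliability (inverse variance),
    so the cue with lower variance dominates the combined percept.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)
    w_h = 1.0 - w_v
    s_combined = w_v * s_v + w_h * s_h
    # The combined variance is smaller than either single-cue variance.
    var_combined = (var_v * var_h) / (var_v + var_h)
    return s_combined, var_combined
```

For example, with a visual height estimate of 50 mm (variance 1.0) and a haptic estimate of 60 mm (variance 4.0), vision receives weight 0.8 and the combined estimate is 52 mm with variance 0.8, illustrating the visual dominance the abstract attributes to vision's lower variance.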

PMID: 11807554 [PubMed - indexed for MEDLINE]