Neuroimage. 2016 May 15;132:59-70. doi: 10.1016/j.neuroimage.2016.02.019. Epub 2016 Feb 16.

Perceptual similarity of visual patterns predicts dynamic neural activation patterns measured with MEG.

Author information

1. Department of Cognitive Science and ARC Centre of Excellence in Cognition and Its Disorders and Perception in Action Research Centre, Macquarie University, Sydney, New South Wales 2109, Australia.
2. Medical Research Council, Cognition and Brain Sciences Unit, Cambridge CB2 7EF, UK.
3. Department of Cognitive Science and ARC Centre of Excellence in Cognition and Its Disorders and Perception in Action Research Centre, Macquarie University, Sydney, New South Wales 2109, Australia; Department of Psychology, University of Maryland, College Park, MD, USA. Electronic address: thomas.carlson@mq.edu.au.

Abstract

Perceptual similarity is a cognitive judgment that represents the end-stage of a complex cascade of hierarchical processing throughout visual cortex. Previous studies have shown a correspondence between the similarity of coarse-scale fMRI activation patterns and the perceived similarity of visual stimuli, suggesting that visual objects that appear similar also share similar underlying patterns of neural activation. Here we explore the temporal relationship between the human brain's time-varying representation of visual patterns and behavioral judgments of perceptual similarity. The visual stimuli were abstract patterns constructed from identical perceptual units (oriented Gabor patches), so that each pattern had a unique global form, or perceptual 'Gestalt'. The visual stimuli were decodable from evoked neural activation patterns measured with magnetoencephalography (MEG); however, the stimuli differed in the similarity of their neural representations, as estimated by differences in decodability. Early after stimulus onset (from 50 ms), a model based on retinotopic organization predicted the representational similarity of the visual stimuli. Following the peak correlation between the retinotopic model and the neural data at 80 ms, the neural representations quickly evolved, such that retinotopy no longer provided a sufficient account of the brain's time-varying representation of the stimuli. Overall, the strongest predictor of the brain's representation was a model based on human judgments of perceptual similarity, which reached the maximum correlation with the neural data attainable given the noise, as defined by the 'noise ceiling'. Our results show that large-scale brain activation patterns contain a neural signature for the perceptual Gestalt of composite visual features, and they demonstrate a strong correspondence between perception and complex patterns of brain activity.
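The comparison described in the abstract follows the standard representational similarity analysis (RSA) logic: a neural representational dissimilarity matrix (RDM), here built from pairwise decodability, is correlated with candidate model RDMs (retinotopic, perceptual-similarity) at each time point. A minimal sketch of that core step, not the authors' actual code; the matrix contents, the 4-stimulus toy data, and the use of Spearman rank correlation are illustrative assumptions:

```python
import numpy as np
from scipy.stats import spearmanr

def rdm_correlation(neural_rdm, model_rdm):
    """Correlate the upper triangles of two representational
    dissimilarity matrices (RDMs), ignoring the zero diagonal."""
    iu = np.triu_indices_from(neural_rdm, k=1)
    rho, _ = spearmanr(neural_rdm[iu], model_rdm[iu])
    return rho

# Toy example: 4 stimuli, symmetric dissimilarity matrices
# with zeros on the diagonal (stimulus vs. itself).
rng = np.random.default_rng(0)
neural = rng.random((4, 4))
neural = (neural + neural.T) / 2
np.fill_diagonal(neural, 0)

model = rng.random((4, 4))
model = (model + model.T) / 2
np.fill_diagonal(model, 0)

print(rdm_correlation(neural, model))  # a value in [-1, 1]
```

In a time-resolved MEG analysis this correlation would be computed once per time point, tracing how well each model accounts for the neural representational geometry as it evolves after stimulus onset.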

KEYWORDS:

Decoding; Gestalt perception; Magnetoencephalography (MEG); Perceptual similarity; Representational geometry; Representational similarity analysis

[Indexed for MEDLINE]
