Hum Brain Mapp. 2016 Jun;37(6):2319-30. doi: 10.1002/hbm.23176. Epub 2016 Mar 11.

Neural correlates of context-dependent feature conjunction learning in visual search tasks.

Author information

1. Department of Psychological and Brain Sciences, Dartmouth College, Hanover, New Hampshire, USA.
2. Institute of Experimental Psychology, University of Regensburg, Regensburg, Germany.

Abstract

Many perceptual learning experiments show that repeated exposure to a basic visual feature such as a specific orientation or spatial frequency can modify perception of that feature, and that those perceptual changes are associated with changes in neural tuning early in visual processing. Such perceptual learning effects thus exert a bottom-up influence on subsequent stimulus processing, independent of task-demands or endogenous influences (e.g., volitional attention). However, it is unclear whether such bottom-up changes in perception can occur as more complex stimuli such as conjunctions of visual features are learned. It is not known whether changes in the efficiency with which people learn to process feature conjunctions in a task (e.g., visual search) reflect true bottom-up perceptual learning versus top-down, task-related learning (e.g., learning better control of endogenous attention). Here we show that feature conjunction learning in visual search leads to bottom-up changes in stimulus processing. First, using fMRI, we demonstrate that conjunction learning in visual search has a distinct neural signature: an increase in target-evoked activity relative to distractor-evoked activity (i.e., a relative increase in target salience). Second, we demonstrate that after learning, this neural signature is still evident even when participants passively view learned stimuli while performing an unrelated, attention-demanding task. This suggests that conjunction learning results in altered bottom-up perceptual processing of the learned conjunction stimuli (i.e., a perceptual change independent of the task). We further show that the acquired change in target-evoked activity is contextually dependent on the presence of distractors, suggesting that search array Gestalts are learned. Hum Brain Mapp 37:2319-2330, 2016.

KEYWORDS:

cognition; learning; neuronal plasticity; visual cortex; visual perception

PMID: 26970441
DOI: 10.1002/hbm.23176
[Indexed for MEDLINE]
