Trends Cogn Sci. 2015 Apr;19(4):215-26. doi: 10.1016/j.tics.2015.02.005. Epub 2015 Mar 11.

Visual attention mitigates information loss in small- and large-scale neural codes.

Author information

1. Neurosciences Graduate Program, University of California San Diego, La Jolla, CA 92093-0109, USA. Electronic address: tsprague@ucsd.edu.
2. Department of Biomedical Engineering, Columbia University, New York, NY, USA.
3. Neurosciences Graduate Program, University of California San Diego, La Jolla, CA 92093-0109, USA; Department of Psychology, University of California San Diego, La Jolla, CA 92093-0109, USA. Electronic address: jserences@ucsd.edu.

Abstract

The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing, or selective attention, is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding.
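The reconstruction approach described above (recovering feature representations from large-scale activity patterns) can be illustrated with a generic linear encoding-model inversion. The sketch below is a toy example in Python/NumPy with simulated data; the channel basis, shapes, noise level, and variable names are all illustrative assumptions, not the authors' analysis pipeline.

```python
import numpy as np

# Toy illustration: reconstruct a circular stimulus feature (e.g. orientation)
# from a large-scale population response via a linear encoding-model inversion.
# All names, shapes, and parameter values are illustrative assumptions.

rng = np.random.default_rng(0)
n_chan, n_units, n_train, n_test = 8, 100, 200, 50

def channel_responses(features_deg):
    """Idealized tuning curves: half-rectified cosines tiling 0-180 degrees."""
    centers = np.linspace(0, 180, n_chan, endpoint=False)
    d = np.deg2rad(features_deg[:, None] - centers[None, :])
    return np.maximum(np.cos(2 * d), 0) ** 5          # trials x channels

# Simulated data: each unit is a noisy linear mixture of the channel responses.
W_true = rng.normal(size=(n_chan, n_units))           # channels x units
feat_train = rng.uniform(0, 180, n_train)
C_train = channel_responses(feat_train)               # trials x channels
B_train = C_train @ W_true + 0.5 * rng.normal(size=(n_train, n_units))

# Step 1 (training): estimate encoding weights by least squares, B = C @ W.
W_hat, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)

# Step 2 (test): invert the estimated model to recover channel-response
# profiles, i.e. reconstructed representations of the test stimuli.
feat_test = rng.uniform(0, 180, n_test)
B_test = channel_responses(feat_test) @ W_true \
         + 0.5 * rng.normal(size=(n_test, n_units))
C_recon = B_test @ np.linalg.pinv(W_hat)              # trials x channels

# The peak of each reconstructed profile gives a feature estimate; its
# amplitude and width can index representational fidelity.
centers = np.linspace(0, 180, n_chan, endpoint=False)
feat_est = centers[np.argmax(C_recon, axis=1)]
```

In practice this kind of reconstruction would be applied to measured responses (for example, voxel or neuronal activation patterns), and comparing the reconstructed profiles across attention conditions is one way to obtain the holistic, population-level readout the abstract describes.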

KEYWORDS:

information theory; neural coding; stimulus reconstruction; vision; visual attention

PMID: 25769502
PMCID: PMC4532299
DOI: 10.1016/j.tics.2015.02.005
[Indexed for MEDLINE]
Free PMC Article
