Neural Comput. 2008 Oct;20(10):2526-63. doi: 10.1162/neco.2008.03-07-486.

Sparse coding via thresholding and local competition in neural circuits.

Author information

1. Department of Electrical and Computer Engineering, Rice University, Houston, TX 77251-1892, U.S.A. crozell@gatech.edu

Abstract

While evidence indicates that neural systems may be employing sparse approximations to represent sensed stimuli, the mechanisms underlying this ability are not understood. We describe a locally competitive algorithm (LCA) that solves a family of sparse coding problems, each minimizing a weighted combination of mean-squared error (MSE) and a coefficient cost function. LCAs are designed to be implemented in a dynamical system composed of many neuron-like elements operating in parallel. These algorithms use thresholding functions to induce local (usually one-way) inhibitory competition between nodes to produce sparse representations. LCAs produce coefficients with sparsity levels comparable to the most popular centralized sparse coding algorithms while being readily suited for neural implementation. Additionally, LCA coefficients for video sequences demonstrate inertial properties that are both qualitatively and quantitatively more regular (i.e., smoother and more predictable) than the coefficients produced by greedy algorithms.
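To make the dynamics described above concrete, the sketch below simulates LCA-style coding in NumPy: each node leakily integrates a driving input (the match between its dictionary element and the stimulus), thresholded activations inhibit overlapping nodes through the dictionary's Gram matrix, and a soft threshold plays the role of the coefficient cost. This is an illustrative sketch only, not the authors' reference implementation; the function name, parameter values, step sizes, and the random dictionary are assumptions made for the example.

    import numpy as np

    def lca_sparse_code(x, Phi, lam=0.1, tau=10.0, dt=1.0, n_steps=500):
        """Illustrative LCA-style dynamics: evolve internal states u toward a
        sparse code a of signal x in a column-normalized dictionary Phi.
        The soft threshold used here corresponds to an l1-style coefficient cost."""
        soft = lambda u: np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
        b = Phi.T @ x                              # driving input to each node
        G = Phi.T @ Phi - np.eye(Phi.shape[1])     # lateral inhibition weights (no self-inhibition)
        u = np.zeros(Phi.shape[1])                 # internal (membrane-like) state
        for _ in range(n_steps):
            a = soft(u)                            # thresholded activations
            u += (dt / tau) * (b - u - G @ a)      # leaky integration with local competition
        return soft(u)

    # Usage sketch with a random dictionary and a stimulus built from 3 atoms.
    rng = np.random.default_rng(0)
    Phi = rng.normal(size=(64, 256))
    Phi /= np.linalg.norm(Phi, axis=0)             # normalize dictionary columns
    x = Phi[:, :3] @ np.array([1.0, -0.5, 0.8])
    a = lca_sparse_code(x, Phi)
    print("active coefficients:", np.count_nonzero(a))

Because inhibition is routed only through nodes whose activations exceed threshold, most nodes never fire, which is the sense in which the competition is local and the resulting code sparse.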

PMID: 18439138
DOI: 10.1162/neco.2008.03-07-486
[Indexed for MEDLINE]
