Neural Comput. 2009 Apr;21(4):1038-67. doi: 10.1162/neco.2008.03-08-727.

Recurrent infomax generates cell assemblies, neuronal avalanches, and simple cell-like selectivity.

Author information

Department of Morphological Brain Science, Graduate School of Medicine, Kyoto University, Kyoto 606-8501, Japan. ttakuma@mbs.med.kyoto-u.ac.jp

Abstract

Recently, multineuronal recording has made it possible to observe patterned firing, synchronization, oscillation, and global state transitions in the recurrent networks of central nervous systems. We propose a learning algorithm based on information maximization in a recurrent network, which we call recurrent infomax (RI). RI maximizes information retention, and thereby minimizes information loss through time, in a network. We find that feeding external inputs derived from photographs of natural scenes into an RI-based model of a recurrent network gives rise to Gabor-like selectivity quite similar to that of simple cells in the primary visual cortex. Without external input, the network exhibits cell-assembly-like and synfire-chain-like spontaneous activity as well as critical neuronal avalanches. In addition, we find that RI embeds externally input temporal firing patterns into the network so that it spontaneously reproduces these patterns after learning. RI provides a simple framework that explains a wide range of phenomena observed in in vivo and in vitro neuronal networks, and it offers a novel, information-theoretic understanding of experimental results on multineuronal activity and plasticity.
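The core objective of RI — increasing the information that the network state at one time step retains about the previous state — can be illustrated with a toy sketch. The snippet below is not the authors' algorithm: it simulates a small stochastic binary network, estimates the information retention I(x_t; x_{t+1}) from the empirical joint distribution of consecutive states, and improves the recurrent weights by a crude random hill climb. The network size, step counts, and the hill-climbing procedure are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 3      # number of binary neurons (toy size, illustrative)
T = 3000   # simulated time steps per weight evaluation

def simulate(W, b, T, rng):
    """Run the stochastic binary network; return states encoded as integers."""
    x = rng.integers(0, 2, N)
    states = np.empty(T, dtype=int)
    for t in range(T):
        p = 1.0 / (1.0 + np.exp(-(W @ x + b)))   # firing probabilities
        x = (rng.random(N) < p).astype(int)       # stochastic binary update
        states[t] = int(x @ (1 << np.arange(N)))  # e.g. (1,0,1) -> 5
    return states

def info_retention(states):
    """Empirical mutual information I(x_t; x_{t+1}) in bits."""
    S = 1 << N
    joint = np.zeros((S, S))
    for a, b_ in zip(states[:-1], states[1:]):
        joint[a, b_] += 1
    joint /= joint.sum()
    px, py = joint.sum(1), joint.sum(0)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])).sum())

# Random initial weights, then a crude hill climb on the retention objective.
W = rng.normal(0.0, 0.5, (N, N))
b = np.zeros(N)
mi0 = info_retention(simulate(W, b, T, rng))
mi = mi0
for _ in range(100):
    W_try = W + rng.normal(0.0, 0.2, (N, N))     # random weight perturbation
    mi_try = info_retention(simulate(W_try, b, T, rng))
    if mi_try > mi:                              # keep only improvements
        W, mi = W_try, mi_try
print(f"I(x_t; x_t+1): {mi0:.3f} -> {mi:.3f} bits")
```

Because the hill climb only accepts weight changes that raise the empirical estimate, the retained information is non-decreasing over the optimization; the paper itself uses a gradient-based update rule rather than this random search.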

PMID: 18928369
DOI: 10.1162/neco.2008.03-08-727
[Indexed for MEDLINE]
