Science. 1995 Sep 29;269(5232):1860-3.

Replicator neural networks for universal optimal source coding.

Abstract

Replicator neural networks self-organize by using their inputs as desired outputs; they internally form a compressed representation for the input data. A theorem shows that a class of replicator networks can, through the minimization of mean squared reconstruction error (for instance, by training on raw data examples), carry out optimal data compression for arbitrary data vector sources. Data manifolds, a new general model of data sources, are then introduced and a second theorem shows that, in a practically important limiting case, optimal-compression replicator networks operate by creating an essentially unique natural coordinate system for the manifold.
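The training scheme the abstract describes — a network whose desired output is its own input, squeezed through a narrow internal representation and trained to minimize mean squared reconstruction error — can be illustrated with a minimal sketch. The example below is not the paper's architecture; it is a hypothetical linear replicator (autoencoder) in NumPy, trained by plain gradient descent on synthetic data that lies on a 2-dimensional linear manifold inside R^5, so the 2-unit internal layer can in principle reconstruct it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic source: 200 data vectors in R^5 that lie on a 2-D linear manifold
# (a toy stand-in for the paper's "data manifold" model of a source).
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing                      # shape (200, 5)

d, k = 5, 2                              # input dimension, bottleneck size
W_enc = rng.normal(scale=0.2, size=(d, k))  # encoder weights (hypothetical init)
W_dec = rng.normal(scale=0.2, size=(k, d))  # decoder weights

def reconstruct(X):
    """Compress each input to k numbers, then map back to R^d."""
    return X @ W_enc @ W_dec

def mse(X):
    """Mean squared reconstruction error: input serves as its own target."""
    return np.mean((X - reconstruct(X)) ** 2)

lr = 0.05
initial_error = mse(X)
n = X.shape[0]
for _ in range(3000):
    code = X @ W_enc                     # compressed internal representation
    E = code @ W_dec - X                 # reconstruction error
    # Gradients of the per-entry MSE with respect to each weight matrix.
    grad_dec = code.T @ E * (2.0 / (n * d))
    grad_enc = X.T @ (E @ W_dec.T) * (2.0 / (n * d))
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final_error = mse(X)
```

After training, `final_error` is far below `initial_error`: because the data occupy only a 2-dimensional subspace, the 2-unit bottleneck suffices, and the learned `code` is a coordinate system for that subspace — a linear caricature of the "natural coordinate system" the second theorem concerns.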
