Neural Comput. 2017 Nov;29(11):2979-3013. doi: 10.1162/neco_a_01015. Epub 2017 Sep 28.

Discrete Sparse Coding.

Author information

1. Machine Learning Lab, Cluster of Excellence Hearing4all and Department for Medical Physics and Acoustics, Carl-von-Ossietzky University Oldenburg, 26111 Oldenburg, Germany. georgios.exarchakis@uol.de
2. Machine Learning Lab, Cluster of Excellence Hearing4all and Department for Medical Physics and Acoustics, Carl-von-Ossietzky University Oldenburg, 26111 Oldenburg, Germany. joerg.luecke@uol.de

Abstract

Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) we use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) we use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) we apply the algorithm to the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
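The generative model sketched in the abstract can be illustrated with a short simulation: each latent independently takes a value from a finite set according to learned prior probabilities (with most mass on zero, which enforces sparsity), and observations are noisy linear combinations of dictionary columns. This is a hedged toy sketch, not the authors' implementation; the names `phi`, `pi`, and `W` and all numerical settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, H, N = 16, 8, 100               # observed dim, number of latents, samples
phi = np.array([0.0, 1.0, 2.0])    # finite set of latent values (0 = inactive)
pi = np.array([0.8, 0.15, 0.05])   # prior over values; mass on 0 gives sparsity

W = rng.normal(size=(D, H))        # dictionary (the generating parameters)
sigma = 0.1                        # observation noise standard deviation

# Sample discrete latents: each entry drawn independently from phi with prior pi
idx = rng.choice(len(phi), size=(N, H), p=pi)
S = phi[idx]                       # latent matrix, mostly zeros

# Observations: linear combination of dictionary columns plus Gaussian noise
Y = S @ W.T + sigma * rng.normal(size=(N, D))

# Empirical fraction of inactive latents, close to pi[0] by construction
sparsity = np.mean(S == 0.0)
```

Learning would then proceed in the opposite direction: given only `Y`, estimate `W`, `sigma`, and the prior probabilities `pi` (here via the truncated EM scheme the abstract describes, which restricts the E-step sums over the exponentially many discrete latent states to a small candidate subset per data point).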

PMID:
28957027
DOI:
10.1162/neco_a_01015
