PLoS One. 2014 Mar 26;9(3):e93250. doi: 10.1371/journal.pone.0093250. eCollection 2014.

Deep belief networks learn context dependent behavior.

Author information

1. Center for Computational Neuroscience and Neural Technology, Boston University, Boston, Massachusetts, United States of America; Center of Excellence for Learning in Education, Science, and Technology, Boston University, Boston, Massachusetts, United States of America.
2. Facebook, Menlo Park, California, United States of America.
3. Center for Computational Neuroscience and Neural Technology, Boston University, Boston, Massachusetts, United States of America; Center of Excellence for Learning in Education, Science, and Technology, Boston University, Boston, Massachusetts, United States of America; Department of Psychology and Graduate Program for Neuroscience, Boston University, Boston, Massachusetts, United States of America.

Abstract

With the goal of understanding behavioral mechanisms of generalization, we analyzed the ability of neural networks to generalize across context. We modeled a behavioral task in which the correct responses to a set of specific sensory stimuli varied systematically across different contexts. The correct response depended on the stimulus (A, B, C, D) and the context quadrant (1, 2, 3, 4). Each of the 16 possible stimulus-context combinations was associated with one of two responses (X, Y), and each response was correct for half of the combinations. The correct responses varied symmetrically across contexts, which allowed responses to previously unseen stimuli (probe stimuli) to be generalized from stimuli that had been presented previously. By presenting the simulation with two or more stimuli that the network had never seen in a particular context, we could test whether the correct response to these novel stimuli could be generated from knowledge of the correct responses in other contexts. We tested this generalization capability with a Deep Belief Network (DBN), a Multi-Layer Perceptron (MLP) network, and the combination of a DBN with a linear perceptron (LP). Overall, the combination of the DBN and LP had the highest success rate for generalization.
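The task structure described in the abstract can be sketched in a few lines of Python. This is an illustrative reconstruction, not the paper's code: the one-hot encoding, the specific parity-based response rule, and the choice of held-out probe combinations are all assumptions made for the example (the paper states only that the mapping "varied symmetrically across contexts").

```python
# Sketch of the stimulus-context task: 4 stimuli x 4 context quadrants
# give 16 combinations, half mapped to response X and half to Y.
from itertools import product

stimuli = ["A", "B", "C", "D"]
contexts = [1, 2, 3, 4]

def one_hot(index, size):
    """Return a one-hot list of the given size."""
    v = [0] * size
    v[index] = 1
    return v

def encode(stimulus, context):
    """Concatenate one-hot codes for stimulus and context quadrant (assumed input format)."""
    return one_hot(stimuli.index(stimulus), 4) + one_hot(context - 1, 4)

def response(stimulus, context):
    """Hypothetical symmetric rule: the response depends on stimulus/context parity."""
    return "X" if (stimuli.index(stimulus) + context) % 2 == 0 else "Y"

# All 16 stimulus-context combinations with their target responses.
dataset = [(encode(s, c), response(s, c)) for s, c in product(stimuli, contexts)]

# Hold out "probe" combinations never shown in a given context; generalizing
# to these from the symmetric structure is the test described in the abstract.
probes = {("A", 1), ("B", 2)}
train = [(encode(s, c), response(s, c))
         for s, c in product(stimuli, contexts) if (s, c) not in probes]
```

Under the assumed parity rule, each response is correct for exactly 8 of the 16 combinations, matching the balance described in the task; the held-out probe pairs play the role of the novel stimulus-context combinations used to score generalization.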

PMID: 24671178
PMCID: PMC3966868
DOI: 10.1371/journal.pone.0093250
[Indexed for MEDLINE]
Free PMC Article
