Neural Comput. 2012 Jan;24(1):104-33. doi: 10.1162/NECO_a_00200. Epub 2011 Aug 18.

Recurrent kernel machines: computing with infinite echo state networks.

Author information

  • Department of Electronics and Information Systems, Ghent University, 9000 Ghent, Belgium. michiel.hermans@ugent.be

Abstract

Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Despite the untrained nature of the recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success lies in the fact that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can easily be performed. One could consider the reservoir as a spatiotemporal kernel, in which the mapping to a high-dimensional space is computed explicitly. In this letter, we build on this idea and extend the concept of ESNs to infinite-sized recurrent neural networks, which can be considered recursive kernels that subsequently can be used to create recursive support vector machines. We present the theoretical framework, provide several practical examples of recursive kernels, and apply them to typical temporal tasks.
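The abstract describes the standard ESN setup: a fixed, random recurrent reservoir driven by the input, with only a linear readout trained on the reservoir states. The sketch below is a minimal illustration of that setup, assuming a tanh reservoir, a ridge-regression readout, and hypothetical hyperparameter choices (reservoir size, spectral radius, regularization, toy delay task); it shows the general architecture only and is not the authors' implementation or the recursive kernels developed in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes and hyperparameters (not taken from the paper).
    n_in, n_res, spectral_radius, ridge = 1, 200, 0.9, 1e-6

    # Fixed random input and recurrent weights; only the readout is trained.
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.normal(size=(n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # scale toward the echo state property

    def run_reservoir(u):
        """Drive the reservoir with an input sequence u (T x n_in); return states (T x n_res)."""
        x = np.zeros(n_res)
        states = []
        for u_t in u:
            x = np.tanh(W @ x + W_in @ u_t)  # nonlinear spatiotemporal mapping of the input
            states.append(x.copy())
        return np.array(states)

    # Toy task (illustrative only): reproduce the input delayed by 5 steps.
    T, delay = 2000, 5
    u = rng.uniform(-1, 1, size=(T, n_in))
    y = np.roll(u[:, 0], delay)

    X = run_reservoir(u)
    # Linear readout trained by ridge regression on the reservoir states.
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
    y_hat = X @ W_out

In the kernel view sketched in the abstract, the reservoir states x_t act as an explicit feature map, so inner products between state trajectories define a spatiotemporal kernel; the letter's contribution is to take the reservoir size to infinity, where such kernels can be evaluated recursively without simulating a finite network.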

PMID: 21851278
DOI: 10.1162/NECO_a_00200
[PubMed - indexed for MEDLINE]