Neural Comput. 2017 Aug;29(8):2123-2163. doi: 10.1162/NECO_a_00984. Epub 2017 May 31.

Deep Restricted Kernel Machines Using Conjugate Feature Duality.

Author information

KU Leuven ESAT-STADIUS, B-3001 Leuven, Belgium. johan.suykens@esat.kuleuven.be

Abstract

The aim of this letter is to propose a theory of deep restricted kernel machines, offering new foundations for deep learning with kernel machines. From the viewpoint of deep learning, it is partially related to restricted Boltzmann machines, which are characterized by visible and hidden units in a bipartite graph without hidden-to-hidden connections, and to deep learning extensions such as deep belief networks and deep Boltzmann machines. From the viewpoint of kernel machines, it includes least squares support vector machines for classification and regression, kernel principal component analysis (PCA), matrix singular value decomposition, and Parzen-type models. A key element is to first characterize these kernel machines in terms of so-called conjugate feature duality, yielding a representation with visible and hidden units. It is shown how this is related to the energy form in restricted Boltzmann machines, with continuous variables in a nonprobabilistic setting. In this new framework of so-called restricted kernel machine (RKM) representations, the dual variables correspond to hidden features. Deep RKMs are obtained by coupling RKMs. The method is illustrated for a deep RKM consisting of three levels: a least squares support vector machine regression level and two kernel PCA levels. In its primal form, deep feedforward neural networks can also be trained within this framework.
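The least squares support vector machine regression model mentioned in the abstract has a well-known dual formulation: fitting reduces to solving one linear system in the bias b and the dual variables alpha. The sketch below illustrates that standard LS-SVM dual system with an RBF kernel; it is a minimal illustration only, not the paper's deep RKM or conjugate-feature-duality training procedure, and the function names and hyperparameter values are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """RBF kernel matrix K_ij = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    sq = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=1e3, sigma=0.25):
    """Solve the standard LS-SVM regression dual linear system:

        [ 0        1^T       ] [  b   ]   [ 0 ]
        [ 1   K + I / gamma  ] [alpha ] = [ y ]

    where K is the kernel matrix on the training data and gamma is the
    regularization constant (illustrative default values).
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_train, alpha, b, sigma, X_test):
    """Dual prediction: f(x) = sum_i alpha_i k(x, x_i) + b."""
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b
```

In the RKM view of the paper, the dual variables alpha play the role of hidden features; the snippet above only shows the classical single-level kernel machine that forms one building block of the deep architecture.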

PMID: 28562217
DOI: 10.1162/NECO_a_00984
