Learning With Jensen-Tsallis Kernels

IEEE Trans Neural Netw Learn Syst. 2016 Oct;27(10):2108-19. doi: 10.1109/TNNLS.2016.2550578. Epub 2016 Apr 14.

Abstract

Jensen-type kernels [Jensen-Shannon (JS) and Jensen-Tsallis] were first proposed by Martins et al. (2009). These kernels are based on JS divergences, which originated in information theory. In this paper, we extend the Jensen-type kernels on probability measures to define positive-definite kernels on Euclidean space, and we show that dot-product kernels arise as special cases. Since Jensen-type divergences are multidistribution divergences, we propose their multipoint variants and study spectral clustering and kernel methods based on them. We also provide experimental studies on a benchmark image database and a gene expression database that show the benefits of the proposed kernels compared with existing kernels. The clustering experiments also demonstrate the usefulness of constructing multipoint similarities.
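As a minimal sketch of the building block the abstract refers to, the JS kernel of Martins et al. (2009) on probability distributions can be written as k(p, q) = ln 2 − JSD(p, q), where JSD is the Jensen-Shannon divergence; the function names below are illustrative, not from the paper:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats; zero entries contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

def js_kernel(p, q):
    """Jensen-Shannon kernel k(p, q) = ln 2 - JSD(p, q).

    JSD(p, q) = H((p + q)/2) - (H(p) + H(q))/2 lies in [0, ln 2]
    for probability vectors, so the kernel value is nonnegative,
    and it is known to be positive definite on the simplex.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
    return np.log(2.0) - jsd
```

The kernel attains its maximum, ln 2, when the two distributions coincide and drops to 0 for distributions with disjoint support; the paper's contribution extends such Jensen-type constructions beyond the simplex to Euclidean space and to multipoint (more than two arguments) variants.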

MeSH terms

  • Algorithms
  • Cluster Analysis*
  • Machine Learning*
  • Neural Networks, Computer*
  • Probability