Canonical dependency analysis based on squared-loss mutual information

Neural Netw. 2012 Oct;34:46-55. doi: 10.1016/j.neunet.2012.06.009. Epub 2012 Jul 11.

Abstract

Canonical correlation analysis (CCA) is a classical dimensionality reduction technique for two sets of variables that sequentially finds the projection directions along which the two sets are maximally correlated. Although CCA remains in active use in many practical application areas, recent real-world data often contain complicated nonlinear dependencies that classical CCA, being limited to linear correlation, cannot properly capture. In this paper, we therefore propose an extension of CCA that effectively captures such nonlinear dependencies through statistical dependency maximization. The proposed method, which we call least-squares canonical dependency analysis (LSCDA), is based on a squared-loss variant of mutual information, and it has several useful properties beyond its ability to capture higher-order correlations: it can simultaneously find multiple projection directions (i.e., subspaces), it does not involve density estimation, and it is equipped with a model selection strategy. We demonstrate the usefulness of LSCDA through experiments on artificial and real-world datasets.
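
For reference, the squared-loss mutual information (SMI) on which LSCDA is built is the Pearson-divergence analogue of ordinary mutual information. The following sketch states the standard definition of SMI from the authors' related density-ratio estimation work; the exact formulation and constraints used in this paper may differ:

  SMI(X, Y) = \frac{1}{2} \iint p_x(x)\, p_y(y) \left( \frac{p_{xy}(x, y)}{p_x(x)\, p_y(y)} - 1 \right)^2 dx\, dy

SMI is zero if and only if x and y are statistically independent, so maximizing it captures dependencies beyond linear correlation. Under this definition, the dependency-maximization step of LSCDA can be sketched as the search for projection matrices

  (\hat{U}, \hat{V}) = \operatorname*{arg\,max}_{U,\, V} \widehat{\mathrm{SMI}}\!\left(U^{\top} x,\; V^{\top} y\right),

where \widehat{\mathrm{SMI}} is an estimate obtained by fitting the density ratio p_{xy}/(p_x p_y) directly by least squares, which is why no density estimation is involved.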

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Artificial Intelligence* / statistics & numerical data
  • Databases, Factual
  • Least-Squares Analysis