Int J Bioinform Res Appl. 2014;10(1):43-58. doi: 10.1504/IJBRA.2014.058777.

Learning dependence from samples.

Author information

  • 1Helsinki Institute for Information Technology HIIT, Department of Information and Computer Science, Aalto University, Espoo, Finland.
  • 2Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611, USA.


Mutual information, conditional mutual information and interaction information have been widely used in the scientific literature as measures of dependence, conditional dependence and mutual dependence. However, these concepts suffer from several computational issues: they are difficult to estimate in continuous domains, the existing regularised estimators are almost always defined only for real- or vector-valued random variables, and these measures characterise dependence, conditional dependence and mutual dependence in terms of the underlying random variables rather than finite sets of realisations. In this paper, we address the question of what characteristic of a given set of realisations in an arbitrary metric space makes them dependent, conditionally dependent or mutually dependent. With this novel understanding, we develop new estimators of association, conditional association and interaction association. Attractive properties of these estimators are that they require no free parameters, they are computationally simpler, and they apply to arbitrary metric spaces.
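The abstract does not reproduce the paper's estimators. As a rough illustration of the general family it describes, i.e. parameter-free association measures computed purely from pairwise distances and therefore applicable in arbitrary metric spaces, here is a sketch of the well-known (biased) sample distance correlation of Székely et al.; this is a related technique, not the authors' estimator, and the default metrics are illustrative assumptions:

```python
import math

def _double_centred(m):
    """Double-centre a square distance matrix: subtract each row mean
    and column mean, then add back the grand mean."""
    n = len(m)
    row = [sum(r) / n for r in m]
    col = [sum(m[i][j] for i in range(n)) / n for j in range(n)]
    grand = sum(row) / n
    return [[m[i][j] - row[i] - col[j] + grand for j in range(n)]
            for i in range(n)]

def dcor(x, y, dx=lambda a, b: abs(a - b), dy=lambda a, b: abs(a - b)):
    """Biased sample distance correlation between paired samples x and y.

    dx and dy are the metrics on the two spaces; any metric works,
    which is what makes the measure usable beyond real-valued data.
    (Defaults assume real-valued samples with the absolute difference.)
    Returns a value in [0, 1]; 0 when either sample is degenerate.
    """
    n = len(x)
    A = _double_centred([[dx(a, b) for b in x] for a in x])
    B = _double_centred([[dy(a, b) for b in y] for a in y])
    # Squared sample distance covariance and variances.
    dcov2 = sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / n**2
    dvarx = sum(v * v for r in A for v in r) / n**2
    dvary = sum(v * v for r in B for v in r) / n**2
    denom = math.sqrt(dvarx * dvary)
    return math.sqrt(dcov2 / denom) if denom > 0 else 0.0
```

Because only pairwise distances enter the computation, the same code handles strings, graphs or any other metric space by swapping in appropriate `dx`/`dy`, with no bandwidth or binning parameter to tune.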


Keywords: bioinformatics; causality; conditional association; conditional dependence; conditional mutual information; interaction association; interaction information; learning dependence; metric space; mutual dependence; mutual information; variable selection

[PubMed - indexed for MEDLINE]