Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Jun;69(6 Pt 2):066138. Epub 2004 Jun 23.

Estimating mutual information.

Kraskov A, Stögbauer H, Grassberger P.

Author information

John von Neumann Institute for Computing, Forschungszentrum Jülich, D-52425 Jülich, Germany.

Erratum in

  • Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Jan;83(1 Pt 1):019903.


We present two classes of improved estimators for the mutual information M(X,Y), from samples of random points distributed according to some joint probability density μ(x,y). In contrast to conventional estimators based on binnings, they are based on entropy estimates from k-nearest-neighbor distances. This means that they are data efficient (with k=1 we resolve structures down to the smallest possible scales), adaptive (the resolution is higher where data are more numerous), and have minimal bias. Indeed, the bias of the underlying entropy estimates is mainly due to the nonuniformity of the density at the smallest resolved scale, typically giving systematic errors which scale as functions of k/N for N points. Numerically, we find that both families become exact for independent distributions, i.e., the estimator M(X,Y) vanishes (up to statistical fluctuations) if μ(x,y) = μ(x)μ(y). This holds for all tested marginal distributions and for all dimensions of x and y. In addition, we give estimators for redundancies between more than two random variables. We compare our algorithms in detail with existing algorithms. Finally, we demonstrate the usefulness of our estimators for assessing the actual independence of components obtained from independent component analysis (ICA), for improving ICA, and for estimating the reliability of blind source separation.
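The abstract's key idea is estimating entropies, and hence M(X,Y), from k-nearest-neighbor distances rather than from histogram bins. Below is a minimal sketch of the first of the paper's two estimator families (often called KSG algorithm 1), which combines the distance to the k-th neighbor in the joint space with marginal neighbor counts via digamma functions. The function name ksg_mi and the use of SciPy's cKDTree are illustrative choices, not part of the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """Sketch of the first KSG estimator:
    M(X,Y) ~ psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    z = np.hstack([x, y])

    # eps(i): distance from point i to its k-th nearest neighbor in the
    # joint space, using the max (Chebyshev) norm; we query k+1 neighbors
    # because the point itself is returned as its own nearest neighbor.
    eps = cKDTree(z).query(z, k=k + 1, p=np.inf)[0][:, k]

    # n_x(i), n_y(i): points strictly closer than eps(i) in each marginal
    # space (the small offset enforces the strict inequality); the -1
    # removes the point itself from the count.
    def marginal_counts(m):
        tree = cKDTree(m)
        return np.array([
            len(tree.query_ball_point(m[i], eps[i] - 1e-12, p=np.inf)) - 1
            for i in range(n)
        ])

    nx, ny = marginal_counts(x), marginal_counts(y)
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

As a sanity check of the kind reported in the abstract (the estimator vanishing for independent data), one can feed the sketch independent samples and expect a result near zero, or bivariate Gaussian samples with correlation r and compare against the closed form M = -0.5 log(1 - r^2) (in nats).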

