IEEE Trans Neural Netw. 2007 Jan;18(1):97-106.

The AIC criterion and symmetrizing the Kullback-Leibler divergence.

Author information

  • 1Systems Engineering and Complex Systems Research Program, National ICT Australia, Canberra Research Laboratory, Canberra, ACT 2601, Australia.


The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as an asymptotically unbiased estimator of a model-ranking function that is a variant of the Kullback-Leibler divergence between the true model and the approximating candidate model. Despite the Kullback-Leibler divergence's computational and theoretical advantages, its lack of symmetry can be inconvenient in model selection applications: simple examples show that reversing the roles of the arguments in the Kullback-Leibler divergence can yield substantially different results. In this paper, three new functions for ranking candidate models are proposed. These functions are constructed by symmetrizing the Kullback-Leibler divergence between the true model and the approximating candidate model, using the arithmetic, geometric, and harmonic means. It is found that the original AIC criterion is an asymptotically unbiased estimator of all three of these functions. Using one of the proposed ranking functions, a new bias correction to AIC is derived for univariate linear regression models. A simulation study based on polynomial regression compares the proposed ranking functions with AIC, and the newly derived correction with AICc.
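The asymmetry the abstract refers to, and the three symmetrizing means, can be illustrated with a small numeric sketch. The snippet below (an illustration, not the paper's own code) uses the closed-form Kullback-Leibler divergence between two univariate Gaussians; the particular parameter values are arbitrary assumptions chosen only to show that the two directed divergences differ and that the three means give three distinct symmetrized values.

```python
import math

def kl_gauss(mu0, s0, mu1, s1):
    """KL( N(mu0, s0^2) || N(mu1, s1^2) ), closed form for univariate Gaussians."""
    return math.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

# Two example densities (arbitrary illustrative parameters)
d_fwd = kl_gauss(0.0, 1.0, 1.0, 2.0)  # KL(p || q)
d_rev = kl_gauss(1.0, 2.0, 0.0, 1.0)  # KL(q || p) -- generally different

# Three symmetrized versions, mirroring the means used in the paper:
sym_arith = (d_fwd + d_rev) / 2                 # arithmetic mean
sym_geom  = math.sqrt(d_fwd * d_rev)            # geometric mean
sym_harm  = 2 * d_fwd * d_rev / (d_fwd + d_rev) # harmonic mean

print(d_fwd, d_rev, sym_harm, sym_geom, sym_arith)
```

By the mean inequality, the harmonic-mean symmetrization never exceeds the geometric one, which never exceeds the arithmetic one, so the three ranking functions need not agree in general.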

[PubMed - indexed for MEDLINE]