
AIC stands for the Akaike Information Criterion, a technique for selecting the best model from a class of candidate models using a penalized likelihood. A smaller AIC indicates a better model.

$$\text{AIC} = 2k - 2\ln(L)$$

where $k$ is the number of parameters in the statistical model, and $L$ is the maximized value of the likelihood function for the estimated model.
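As a sketch of how the formula is used in practice, the snippet below fits a normal distribution to a toy data set by maximum likelihood and plugs the maximized log-likelihood into AIC. The data values and the helper name `aic` are illustrative, not from the original text.

```python
import math

def aic(log_likelihood, k):
    """Akaike Information Criterion: 2k - 2 ln(L)."""
    return 2 * k - 2 * log_likelihood

# Toy example: fit a normal distribution to data by maximum likelihood.
data = [2.1, 2.5, 1.9, 2.3, 2.8, 2.0, 2.4]
n = len(data)
mu = sum(data) / n                           # MLE of the mean
var = sum((x - mu) ** 2 for x in data) / n   # MLE of the variance (biased form)
# Maximized log-likelihood of a normal model: -n/2 * (ln(2*pi*var) + 1)
log_l = -0.5 * n * (math.log(2 * math.pi * var) + 1)
print(aic(log_l, k=2))  # k = 2 parameters: mean and variance
```

Note that only differences in AIC between models fitted to the same data are meaningful; the absolute value depends on constants in the likelihood.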

$$\text{AICc} = \text{AIC} + \frac{2k(k + 1)}{n - k - 1}$$

AICc is AIC with a correction for finite sample sizes, where $n$ denotes the sample size. Thus, AICc is AIC with a greater penalty for extra parameters.
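A minimal sketch of the correction term, showing that it is large for small samples and vanishes as $n$ grows (the function name `aicc` is an assumption for illustration):

```python
def aicc(log_likelihood, k, n):
    """Small-sample-corrected AIC; requires n > k + 1."""
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# With few observations the penalty is substantial...
print(aicc(-10.0, k=3, n=10))    # 26 + 24/6 = 30.0
# ...but the correction shrinks toward zero as n grows, recovering plain AIC.
print(aicc(-10.0, k=3, n=1000))  # 26 + 24/996 ≈ 26.02
```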

AIC was introduced by Hirotugu Akaike in his seminal 1973 paper "Information Theory and an Extension of the Maximum Likelihood Principle" (in: B. N. Petrov and F. Csaki, eds., 2nd International Symposium on Information Theory, Akadémiai Kiadó, Budapest, pp. 267–281).

References:

  • Wikipedia

  • Akaike, H. (1998). Information theory and an extension of the maximum likelihood principle.