Akaike information criterion

Akaike's information criterion, developed by Hirotsugu Akaike under the name "an information criterion" (AIC) in 1971 and proposed in Akaike (1974), is a measure of the goodness of fit of an estimated statistical model. It is grounded in the concept of entropy, in effect offering a relative measure of the information lost when a given model is used to describe reality. It can be said to describe the tradeoff between bias and variance in model construction or, loosely speaking, between the precision and the complexity of the model.

The AIC is not a test of the model in the sense of hypothesis testing; rather, it is a tool for model selection. Given a data set, several competing models may be ranked according to their AIC, with the one having the lowest AIC being the best. From the AIC values one may infer, for example, that the top three models are roughly tied and the rest are far worse, but one should not assign a threshold value above which a given model is 'rejected' (Burnham and Anderson, 1998, "Model Selection and Inference: A Practical Information-Theoretic Approach", ISBN 0-387-98504-2).

Definition

In the general case, the AIC is

:AIC = 2k - 2\ln(L),

where "k" is the number of parameters in the statistical model, and "L" is the maximized value of the likelihood function for the estimated model.

Over the remainder of this entry, it will be assumed that the model errors are normally and independently distributed. Let "n" be the number of observations and let

:RSS = \sum_{i=1}^n \hat{\varepsilon}_i^2

be the residual sum of squares. Then AIC becomes

:AIC = 2k + n\left[\ln(2\pi\,RSS/n) + 1\right].
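
Under this normality assumption, AIC can be computed straight from the residuals. A minimal sketch, assuming the residuals of the fitted model are available as a plain list (names and numbers illustrative):

```python
import math

def aic_normal(k, residuals):
    """AIC for normally, independently distributed errors:
    AIC = 2k + n [ln(2*pi*RSS/n) + 1]."""
    n = len(residuals)
    rss = sum(e * e for e in residuals)  # residual sum of squares
    return 2 * k + n * (math.log(2 * math.pi * rss / n) + 1)

# Illustrative residuals from some fitted model with k = 2 parameters
print(aic_normal(2, [0.3, -0.1, 0.4, -0.2, 0.1]))
```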

Increasing the number of free parameters to be estimated improves the goodness of fit, regardless of the number of free parameters in the data-generating process. Hence AIC not only rewards goodness of fit but also includes a penalty that is an increasing function of the number of estimated parameters. This penalty discourages overfitting: the preferred model is the one with the lowest AIC value. The AIC methodology attempts to find the model that best explains the data with a minimum of free parameters, whereas more traditional approaches to modeling start from a null hypothesis. The AIC penalizes free parameters less strongly than does the Schwarz criterion, as the comparison below illustrates.
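
To make the penalty comparison concrete, here is an illustrative calculation (not from the article) contrasting AIC's penalty of 2k with the Schwarz criterion's k ln(n):

```python
import math

# Penalty terms only (illustrative): AIC's penalty is 2k, while the
# Schwarz criterion's is k*ln(n), which exceeds 2k once ln(n) > 2,
# i.e. for n >= 8 observations.
k = 5
for n in (10, 100, 1000):
    print(n, 2 * k, round(k * math.log(n), 1))
```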

AIC judges a model by how close its fitted values tend to be to the true values, in terms of a certain expected value.

Relevance to χ² fitting (maximum likelihood)

Often, one wishes to select among competing models where the likelihood function assumes that the underlying errors are normally distributed. This assumption leads to χ² data fitting.

For any set of models in which the number of data points, "n", is the same, one can use a slightly altered AIC, here denoted AIC_{χ²}. It differs from the AIC only by an additive constant, which is a function of "n" alone; as only differences in the AIC are relevant, this constant can be ignored. AIC_{χ²} is given by

:AIC_{\chi^2} = \chi^2 + 2k.

This form is often convenient, as data-fitting programs produce χ² as a statistic for the fit. Among models fitted to the same number of data points, the one with the lowest AIC_{χ²} should be preferred.
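
A small hypothetical comparison shows how the 2k penalty can override a slightly better χ²; the model names and numbers are invented for illustration:

```python
def aic_chi2(chi2, k):
    """chi^2-based AIC variant; valid for comparing models fitted to the
    same n data points, since it omits an n-only additive constant."""
    return chi2 + 2 * k

# Illustrative fits: (chi^2, number of parameters k) for two candidate models
fits = {"model A": (12.3, 2), "model B": (9.8, 4)}
scores = {name: aic_chi2(c, k) for name, (c, k) in fits.items()}
print(min(scores, key=scores.get))  # model A: its simplicity outweighs
                                    # model B's slightly better chi^2
```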

AICc and AICu

AICc is AIC with a second-order correction for small sample sizes:

:AICc = AIC + \frac{2k(k+1)}{n-k-1}.

Since AICc converges to AIC as "n" gets large, AICc should be employed regardless of sample size (Burnham and Anderson, 2004).
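
A minimal sketch of the correction (function name and example values are illustrative):

```python
def aicc(aic_value, k, n):
    """AICc = AIC + 2k(k+1)/(n-k-1); requires n > k + 1."""
    return aic_value + (2 * k * (k + 1)) / (n - k - 1)

# Illustrative: with n = 20 the correction is noticeable,
# and it vanishes as n grows large.
print(aicc(247.0, 3, 20))    # 247.0 + 24/16 = 248.5
print(aicc(247.0, 3, 2000))  # ~247.0
```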

McQuarrie and Tsai (1998: 22) define AICc as:

:AICc = \ln\frac{RSS}{n} + \frac{n+k}{n-k-2},

and propose (p. 32) the closely related measure:

:AICu = \ln\frac{RSS}{n-k} + \frac{n+k}{n-k-2}.

McQuarrie and Tsai ground their high opinion of AICc and AICu on extensive simulation work.
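
The two measures translate directly into code; a sketch assuming RSS, "n", and "k" are known for the fitted model (all names and numbers illustrative):

```python
import math

def aicc_mt(rss, n, k):
    """AICc as defined by McQuarrie and Tsai (1998); requires n > k + 2."""
    return math.log(rss / n) + (n + k) / (n - k - 2)

def aicu_mt(rss, n, k):
    """The closely related AICu measure."""
    return math.log(rss / (n - k)) + (n + k) / (n - k - 2)

# Illustrative comparison on the same fit (rss = 0.31, n = 20, k = 2):
print(aicc_mt(0.31, 20, 2), aicu_mt(0.31, 20, 2))
```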

QAIC

QAIC (the quasi-AIC) is defined as:

:QAIC = 2k - \frac{1}{c}\,2\ln(L),

where "c" is a variance inflation factor. QAIC adjusts for over-dispersion or lack of fit. The small sample version of QAIC is

:QAICc = QAIC + \frac{2k(k+1)}{n-k-1}.
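
A sketch of both quantities, assuming the maximized log-likelihood and a variance inflation factor "c" have been obtained separately (all names and numbers illustrative):

```python
def qaic(k, log_likelihood, c):
    """QAIC = 2k - (2 ln L)/c, with c a variance inflation factor."""
    return 2 * k - (2 * log_likelihood) / c

def qaicc(k, log_likelihood, c, n):
    """Small-sample version of QAIC."""
    return qaic(k, log_likelihood, c) + (2 * k * (k + 1)) / (n - k - 1)

# Illustrative: over-dispersed data with c = 1.8, n = 40 observations
print(qaic(3, -120.5, 1.8), qaicc(3, -120.5, 1.8, 40))
```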

References

* Akaike, Hirotugu, 1974. "A new look at the statistical model identification". IEEE Transactions on Automatic Control, Vol. 19, No. 6, pp. 716–723. doi:10.1109/TAC.1974.1100705.

* Burnham, K. P., and D. R. Anderson, 2002. "Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach", 2nd ed. Springer-Verlag. ISBN 0-387-95364-7.
* Burnham, K. P., and D. R. Anderson, 2004. "[http://www2.fmg.uva.nl/modelselection/presentations/AWMS2004-Burnham-paper.pdf Multimodel Inference: understanding AIC and BIC in Model Selection]", Amsterdam Workshop on Model Selection.
* Hurvich, C. M., and Tsai, C.-L., 1989. "Regression and time series model selection in small samples". Biometrika, Vol. 76, pp. 297–307.
* McQuarrie, A. D. R., and Tsai, C.-L., 1998. "Regression and Time Series Model Selection". World Scientific.

See also

*Bayesian information criterion
*deviance
*deviance information criterion
*Hannan-Quinn information criterion
*Jensen-Shannon divergence
*Kullback-Leibler divergence
*Occam's Razor

External links

* [http://www.garfield.library.upenn.edu/classics1981/A1981MS54100001.pdf Hirotugu Akaike comments on how he arrived at the AIC in This Week's Citation Classic]

