13. Detecting Major Genes


Updated notes

More comments on Information Criterion

(Posted 12 April 1999)

Background: Page 363 discusses Akaike's information criterion, a measure for comparing the fit of different models by adjusting the likelihood for the number of parameters fit. The likelihood ratio test does not penalize for the number of parameters included in the model, while most information criteria do. Additional IC approaches were briefly discussed by Cuchan Wang (Pfizer) on the Animal Breeders Discussion group.

In general, an information criterion (IC) is defined as:

IC = L - P

where L is the maximized (restricted) log-likelihood and P is a penalty term, a function of the number of parameters in the model. The more parameters in the model, the bigger the penalty. A full model can thus have either a larger or a smaller IC than a sub-model or another model, which makes the IC a tool for model selection. Note that model selection based on an IC does not require the models to be nested. Different definitions of the penalty term P give different ICs, such as the Akaike IC, the Hannan and Quinn IC, and the Bayesian IC; for each IC there is also some variation in how P is defined. With the IC defined as above, the bigger the IC, the better the model fit. All of these criteria are justified by large-sample (asymptotic) arguments.
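The three ICs named above can be sketched in the "bigger is better" form IC = L - P used here. The penalty forms below (P = k for AIC, (k/2) ln n for BIC, and k ln ln n for Hannan-Quinn) are the standard definitions rescaled to fit this convention; the example values are hypothetical.

```python
import math

def aic(logL, k):
    # Akaike IC in the IC = L - P form: penalty P = k (number of parameters)
    return logL - k

def bic(logL, k, n):
    # Bayesian (Schwarz) IC: penalty P = (k/2) * ln(n), n = sample size
    return logL - (k / 2) * math.log(n)

def hqc(logL, k, n):
    # Hannan-Quinn IC: penalty P = k * ln(ln(n))
    return logL - k * math.log(math.log(n))

# Hypothetical comparison: a 3-parameter model vs. a 5-parameter model
# whose extra parameters raise the log-likelihood only slightly.
n = 200
ic_small = bic(-100.0, 3, n)
ic_full = bic(-98.5, 5, n)
print(ic_small, ic_full)  # the model with the larger IC is preferred
```

Note that BIC and Hannan-Quinn penalize extra parameters more heavily than AIC for any reasonably large n, so they tend to favor more parsimonious models.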

A few references:

Awad AM (1996) Properties of the Akaike information criterion. Microelectron. Reliab. 36: 457-464.

Bozdogan H (1987) Model selection and Akaike's information criterion (AIC): the general theory and its analytical extensions. Psychometrika 52: 345-370.

Hannan EJ and Quinn BG (1979) The determination of the order of an autoregression. Journal of the Royal Statistical Society, Series B 41: 190-195.

Schwarz G (1978) Estimating the dimension of a model. Annals of Statistics 6: 461-464.



Created 25 February 1995, last updated 30 Jan 1996

Bruce Walsh, jbwalsh@u.arizona.edu. Comments welcome.