Information Theory and Log-Likelihood Models: A Basis for Model Selection and Inference

  • Kenneth P. Burnham
  • David R. Anderson

Abstract

Full reality cannot be included in a model; thus we seek a good model to approximate the effects or factors supported by the empirical data. The selection of an appropriate approximating model is critical to statistical inference from many types of empirical data. This chapter introduces concepts from information theory (see Guiasu 1977), which has been a discipline only since the mid-1940s and covers a variety of theories and methods that are fundamental to many of the sciences (see Cover and Thomas 1991 for an exciting overview; Fig. 2.1 is reproduced from their book and shows their view of the relationship of information theory to several other fields). In particular, the Kullback-Leibler “distance,” or “information,” between two models (Kullback and Leibler 1951) is introduced, discussed, and linked to Boltzmann’s entropy in this chapter. Akaike (1973) found a simple relationship between the Kullback-Leibler distance and Fisher's maximized log-likelihood function (see deLeeuw 1992 for a brief review). This relationship leads to a simple, effective, and very general methodology for selecting a parsimonious model for the analysis of empirical data.
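
For readers unfamiliar with the quantities named above, the Kullback-Leibler distance between full reality f and an approximating model g, and the criterion Akaike derived from it, are conventionally written as follows (a standard formulation consistent with the chapter's framing; the chapter itself develops these results in detail):

\[
I(f, g) = \int f(x)\,\log\!\left(\frac{f(x)}{g(x \mid \theta)}\right) dx,
\qquad
\mathrm{AIC} = -2\log\!\bigl(\mathcal{L}(\hat{\theta} \mid \mathrm{data})\bigr) + 2K,
\]

where \(\hat{\theta}\) is the maximum likelihood estimate and \(K\) is the number of estimable parameters in the approximating model g. Akaike's insight was that the maximized log-likelihood provides a biased estimate of expected relative Kullback-Leibler distance, and that the bias is approximately corrected by the penalty term \(K\).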

Keywords

Model Selection · True Model · Candidate Model · Monte Carlo Study · Model Selection Criterion

Copyright information

© Springer Science+Business Media New York 1998

Authors and Affiliations

  • Kenneth P. Burnham (1)
  • David R. Anderson (1)

  1. Colorado Cooperative Fish and Wildlife Research Unit, Colorado State University, Fort Collins, USA
