
Theoretical Development and Asymptotic Properties of the GIC

Part of the Springer Series in Statistics book series (SSS)

Information criteria have been constructed as estimators of the Kullback–Leibler information, a measure of the discrepancy between two probability distributions, or, equivalently, of the expected log-likelihood of a statistical model used for prediction.
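
As a brief reminder of this equivalence (standard definitions; the symbols g for the true distribution, f for the model, and K for the Kullback–Leibler information are used here for illustration and may differ from the chapter's notation):

\[
  K(g\,;f) \;=\; \int g(x)\,\log\frac{g(x)}{f(x)}\,dx
           \;=\; E_g[\log g(X)] \;-\; E_g[\log f(X)].
\]

Because the entropy term E_g[\log g(X)] does not depend on the model, minimizing K(g; f) over candidate models is equivalent to maximizing the expected log-likelihood E_g[\log f(X)], which is the quantity that information criteria estimate from the observed data.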

In this chapter, we introduce a general framework for constructing information criteria in the context of functional statistics and give technical arguments and a detailed derivation of the generalized information criterion (GIC) defined in (5.64). We also investigate the asymptotic properties of information criteria in the estimation of the expected log-likelihood of a statistical model.
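
For orientation, the following sketch shows the general form the GIC takes for an estimator defined through a statistical functional, in the spirit of Konishi and Kitagawa (1996); consult (5.64) itself for the precise statement, since the notation below (T, T^{(1)}, \hat{G}) is chosen for illustration:

\[
  \mathrm{GIC} \;=\; -2\sum_{\alpha=1}^{n}\log f(x_\alpha;\hat{\theta})
   \;+\; \frac{2}{n}\sum_{\alpha=1}^{n}
   \operatorname{tr}\!\left\{ T^{(1)}(x_\alpha;\hat{G})\,
   \left.\frac{\partial \log f(x_\alpha;\theta)}{\partial \theta^{\top}}\right|_{\theta=\hat{\theta}} \right\},
\]

where \hat{\theta} = T(\hat{G}) expresses the estimator as a functional T of the empirical distribution \hat{G} and T^{(1)}(x;G) is the influence function of T at G. For the maximum likelihood estimator the influence function is J(G)^{-1}\,\partial \log f(x;\theta)/\partial\theta, with J(G) the expected negative Hessian of \log f under G, so the correction reduces to 2\,\operatorname{tr}(\hat{J}^{-1}\hat{I}), the TIC penalty, and further to the AIC penalty 2p when the specified model contains the true distribution.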

Keywords

Asymptotic Property, Bias Correction, Theoretical Development, True Distribution, Laplace Distribution
Copyright information

© Springer Science+Business Media, LLC 2008
