Theoretical Development and Asymptotic Properties of the GIC
Information criteria have been constructed as estimators of the Kullback–Leibler information discrepancy between two probability distributions or, equivalently, the expected log-likelihood of a statistical model for prediction.
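As a point of reference (this formula is the standard definition, not reproduced from the chapter itself), the Kullback–Leibler discrepancy between the true distribution $G$ with density $g(z)$ and a model $f(z)$ can be written as

\[
K(g;f) \;=\; \mathrm{E}_G\!\left[\log \frac{g(Z)}{f(Z)}\right]
\;=\; \int g(z)\log g(z)\,dz \;-\; \int g(z)\log f(z)\,dz ,
\]

so that, since the first term does not depend on the model, minimizing $K(g;f)$ over candidate models is equivalent to maximizing the expected log-likelihood $\mathrm{E}_G[\log f(Z)]$.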
In this chapter, we introduce a general framework for constructing information criteria in the context of functional statistics, and we give technical arguments and a detailed derivation of the generalized information criterion (GIC) defined in (5.64). We also investigate the asymptotic properties of information criteria in the estimation of the expected log-likelihood of a statistical model.
Keywords: Asymptotic Property · Bias Correction · Theoretical Development · True Distribution · Laplace Distribution