# Background: General inference

Part of the Statistics for Biology and Health book series (SBH)

We review the main theorems providing inference for sums of random variables. The theorem of de Moivre-Laplace, a well-known special case of the central limit theorem, helps provide the setting. Our main interest is in sums that can be considered to be composed of independent increments. The empirical distribution function F_n(t) is readily seen to be a consistent estimator of F(t) at all continuity points of F(t). However, we can also view F_n(t) as a constant multiple of a sum of independent Bernoulli variates, and this enables us to construct inference for F(t) on the basis of F_n(t). Such inference can then be extended to the more general context of estimating equations. Inference for counting processes and stochastic integrals is described, since this is commonly used in this area and, additionally, shares a number of features with an approach based on empirical processes. The importance of estimating equations is stressed, in particular equations based on the method of moments and equations derived from the likelihood. Resampling techniques can also be of great value for problems in inference. Our final goal is the use of inferential tools to construct models, and so the predictive power of a model is important. An approach to this question can be made via the idea of explained variation or that of explained randomness. Both are dealt with in later chapters. Here, since it does not appear to be well known, we present an outline of explained variation in general terms, i.e., without necessarily leaning on any specific model.
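The Bernoulli viewpoint described above can be sketched in code: F_n(t) is (1/n) times a sum of independent indicator variables 1{X_i ≤ t}, each Bernoulli with success probability F(t), so the de Moivre-Laplace normal approximation yields a Wald-type confidence interval for F(t). The function name and the uniform test sample below are illustrative choices, not part of the text.

```python
import math
import random

def empirical_cdf_ci(sample, t, z=1.96):
    """Estimate F(t) by the empirical distribution function F_n(t).

    F_n(t) = (1/n) * sum of indicators 1{X_i <= t}: a constant (1/n)
    multiplying a sum of independent Bernoulli(F(t)) variates. The
    de Moivre-Laplace / central limit approximation then gives an
    approximate 95% confidence interval for F(t).
    """
    n = len(sample)
    f_n = sum(1 for x in sample if x <= t) / n      # mean of Bernoulli indicators
    se = math.sqrt(f_n * (1.0 - f_n) / n)           # estimated standard error
    return f_n, (f_n - z * se, f_n + z * se)

# Illustrative check: for a Uniform(0,1) sample, F(0.3) = 0.3 exactly.
random.seed(1)
sample = [random.random() for _ in range(2000)]
est, (lo, hi) = empirical_cdf_ci(sample, 0.3)
```

With n = 2000 the standard error at t = 0.3 is roughly 0.01, so the estimate should fall close to the true value 0.3; the same construction applies at any continuity point of F.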

## Keywords

Brownian Motion · Central Limit Theorem · General Inference · Iterated Logarithm · Counting Process