Plane Answers to Complex Questions, pp. 16–44

# Estimation


## Abstract

In this chapter, properties of least squares estimates are examined for the model

$$Y = X\beta + e, \qquad E(e) = 0, \qquad \operatorname{Cov}(e) = \sigma^2 I.$$

The chapter begins with a discussion of the concept of estimability in linear models. Section 2 characterizes least squares estimates. Sections 3, 4, and 5 establish that least squares estimates are best linear unbiased estimates, maximum likelihood estimates, and minimum variance unbiased estimates. The last two of these properties require the additional assumption

$$e \sim N(0, \sigma^2 I).$$

Section 6 also assumes that the errors are normally distributed and presents the distributions of various estimates. From these distributions, various tests and confidence intervals are easily obtained. Section 7 examines the model

$$Y = X\beta + e, \qquad E(e) = 0, \qquad \operatorname{Cov}(e) = \sigma^2 V,$$

where $V$ is a known positive definite matrix. Section 7 introduces weighted least squares estimates and presents properties of those estimates. Section 8 presents the normal equations and establishes their relationship to least squares and weighted least squares estimation. Section 9 discusses Bayesian estimation.

## Keywords

Mean Square Error · Bayesian Analysis · Unbiased Estimate · Normal Equation · Prediction Interval
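To illustrate the estimates the abstract refers to, the following is a minimal NumPy sketch (all data and variable names here are hypothetical, not from the chapter): ordinary least squares obtained by solving the normal equations $X'X\beta = X'Y$, weighted least squares for the model with $\operatorname{Cov}(e) = \sigma^2 V$, and the usual unbiased estimate of $\sigma^2$.

```python
import numpy as np

# Hypothetical data for the model Y = X beta + e (illustrative only).
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Ordinary least squares: solve the normal equations X'X beta = X'y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Weighted least squares for Cov(e) = sigma^2 V with a known positive
# definite V: solve X'V^{-1}X beta = X'V^{-1}y. Here V is an arbitrary
# diagonal matrix chosen purely for illustration.
V = np.diag(rng.uniform(0.5, 2.0, size=n))
Vinv = np.linalg.inv(V)
beta_wls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

# Unbiased estimate of sigma^2: SSE / (n - rank(X)).
resid = y - X @ beta_ols
sigma2_hat = resid @ resid / (n - np.linalg.matrix_rank(X))
```

When $V = I$ the weighted and ordinary least squares estimates coincide, which matches the relationship between Sections 2 and 7 of the chapter.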


## Copyright information

© Springer Science+Business Media New York 1996