# Normal Theory Models and Some Extensions

James K. Lindsey
Part of the Lecture Notes in Statistics book series (LNS, volume 72)

## Abstract

One of the most widely used tools in all of statistics is linear regression. It is often misnamed least squares regression, but least squares estimation refers to a deterministic process whereby the best straight line is fitted through a series of points. In statistical analysis, the interpretation is quite different, although the technical calculations remain the same. Normal theory linear regression assumes that the response variable has a normal or Gaussian distribution:
$$f(y;\mu,\sigma^2) = \exp[-(y - \mu)^2/(2\sigma^2)]/\sqrt{2\pi\sigma^2}$$
(1.1)
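The density in equation (1.1) can be evaluated directly; a minimal sketch in Python (the function name and parameterization by the variance are illustrative choices, not from the text):

```python
import math

def normal_pdf(y, mu, sigma2):
    """Gaussian density f(y; mu, sigma^2) as in equation (1.1)."""
    return math.exp(-(y - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# At the mean, the density reaches its maximum value 1/sqrt(2*pi*sigma^2).
peak = normal_pdf(0.0, 0.0, 1.0)  # standard normal at y = mu
```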
The mean of this distribution changes in some deterministic way with the values of the explanatory variable(s), e.g.
$$\mu_i = \beta_0 + \sum_j \beta_j X_{ij}$$
(1.2)
while the variance remains constant. Then, the regression equation specifies how the mean of the distribution changes for each value of the explanatory variable(s); individual observations will be dispersed about the mean with the given variance. This is illustrated in Figure 1.1.
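The point that the least squares calculation and the normal-theory interpretation coincide can be sketched in Python with NumPy (the simulated data, true parameter values, and seed are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)            # hypothetical simulated data
n = 200
x = rng.uniform(0.0, 10.0, size=n)
beta_true = np.array([2.0, 0.5])          # assumed beta_0, beta_1
mu = beta_true[0] + beta_true[1] * x      # mean changes with x, as in (1.2)
y = mu + rng.normal(0.0, 1.0, size=n)     # constant variance about the mean

# Least squares fit: numerically the same calculation that maximum
# likelihood gives under the normal model with constant variance.
A = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

resid = y - A @ beta_hat
sigma2_hat = resid @ resid / n            # ML estimate of the variance
```

Under the normal model, the fitted line estimates how the mean of the response distribution moves with the explanatory variable, while `sigma2_hat` estimates the constant dispersion of observations about that mean.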

## Keywords

Scale Parameter, Gamma Distribution, Parameter List, Good Straight Line, Observation Parameter

*These keywords were added by machine and not by the authors.*
