The Multivariate Normal Distribution
In Chapter I we studied how to handle (linear transformations of) random vectors, that is, vectors whose components are random variables. Since the normal distribution is (one of) the most important distribution(s), and since there are special properties, methods, and devices pertaining to this distribution, we devote this chapter to the study of the multivariate normal distribution or, equivalently, to the study of normal random vectors.

We show, for example, that the sample mean and the sample variance in a (one-dimensional) sample are independent, a property that, in fact, characterizes this distribution and is essential, for example, in the so-called t-test, which is used to test hypotheses about the mean in the (univariate) normal distribution when the variance is unknown. Along the way we will encounter three different ways to show this independence. Another interesting fact that will be established is that if the components of a normal random vector are uncorrelated, then they are in fact independent.

One section is devoted to quadratic forms of normal random vectors, which are of great importance in many branches of statistics. The main result states that one can split the sum of the squares of the observations into a number of quadratic forms, each pertaining to some cause of variation in an experiment, in such a way that these quadratic forms are independent and (essentially) χ²-distributed random variables. This can be used to test whether or not a certain cause of variation influences the outcome of the experiment. For more on the statistical aspects, we refer to the literature cited in Appendix 1.
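The independence of the sample mean and sample variance mentioned above can be illustrated numerically. The following is a minimal sketch (not one of the chapter's three proofs): since independence implies zero correlation, the empirical correlation between the sample mean and the sample variance, computed across many repeated normal samples, should be close to zero. The sample size and number of repetitions are arbitrary choices for illustration.

```python
import numpy as np

# Draw many independent samples of size n from a standard normal distribution.
rng = np.random.default_rng(0)
reps, n = 200_000, 5
samples = rng.normal(loc=0.0, scale=1.0, size=(reps, n))

# For each sample, compute the sample mean and the (unbiased) sample variance.
means = samples.mean(axis=1)
variances = samples.var(axis=1, ddof=1)

# Independence implies zero correlation; empirically it should be near zero.
corr = np.corrcoef(means, variances)[0, 1]
print(f"empirical corr(mean, variance) = {corr:.4f}")
```

Note that a near-zero correlation is only a necessary condition for independence; the characterization in the text is stronger, since for non-normal samples the mean and variance are in general dependent even when uncorrelated.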
Keywords: Covariance Matrix · Quadratic Form · Random Vector · Conditional Distribution · Orthogonal Matrix