Multivariate Models

  • George A. F. Seber
Part of the Springer Series in Statistics book series (SSS)


Up till now we have been considering various univariate linear models of the form \(y_{i} =\theta _{i} +\varepsilon _{i}\) (i = 1, 2, …, n), where \(E[\varepsilon _{i}] = 0\) and the \(\varepsilon _{i}\) are independently and identically distributed. We assumed that \(\boldsymbol{\theta }\in \varOmega\), where Ω is a p-dimensional vector space in \(\mathbb{R}^{n}\). A natural extension is to replace the response variable \(y_{i}\) by a 1 × d row vector of response variables \(\mathbf{y}_{i}'\), and to replace the vector \(\mathbf{y} = (y_{i})\) by the data matrix
$$\displaystyle{\mathbf{Y} = \left (\begin{array}{c} \mathbf{y}_{1}' \\ \mathbf{y}_{2}'\\ \vdots \\ \mathbf{y}_{n}' \end{array} \right ) = (\mathbf{y}^{(1)},\mathbf{y}^{(2)},\ldots,\mathbf{y}^{(d)}),}$$
say. Here \(\mathbf{y}^{(j)}\) (j = 1, 2, …, d) represents n independent observations on the jth variable of \(\mathbf{y}\). Writing \(\mathbf{y}^{(j)} =\boldsymbol{\theta }^{(j)} + \mathbf{u}^{(j)}\) with \(E[\mathbf{u}^{(j)}] = \mathbf{0}\), we now have d univariate models, which will generally not be independent, and we can combine them into the single equation
$$\displaystyle{\mathbf{Y} =\boldsymbol{\varTheta } +\mathbf{U},}$$
where \(\boldsymbol{\varTheta }= (\boldsymbol{\theta }^{(1)},\boldsymbol{\theta }^{(2)},\ldots,\boldsymbol{\theta }^{(d)})\), \(\mathbf{U} = (\mathbf{u}^{(1)},\mathbf{u}^{(2)},\ldots,\mathbf{u}^{(d)})\), and E[U] = 0. Of particular interest are vector extensions of experimental designs where each observation is replaced by a vector observation. For example, we can extend the randomized block design
$$\displaystyle{\theta _{ij} =\mu +\alpha _{i} +\tau _{j}\quad (i = 1,2,\ldots,I;j = 1,2,\ldots,J),}$$
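The setup above can be illustrated numerically. The following is a minimal sketch (not from the text, with illustrative parameter values) of the multivariate model \(\mathbf{Y} =\boldsymbol{\varTheta } +\mathbf{U}\) for the vector extension of the randomized block design, in which μ, α<sub>i</sub>, and τ<sub>j</sub> become 1 × d row vectors and the error rows share a common d × d covariance matrix, so the d univariate models (the columns of Y) are correlated rather than independent:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, d = 3, 4, 2                      # blocks, treatments, response variables
n = I * J                              # one vector observation per (i, j) cell

# Hypothetical parameter values: each effect is now a 1 x d row vector
mu = np.array([10.0, 5.0])
alpha = rng.normal(0, 1, size=(I, d))  # block effects
tau = rng.normal(0, 1, size=(J, d))    # treatment effects
alpha -= alpha.mean(axis=0)            # usual identifiability constraints:
tau -= tau.mean(axis=0)                # effects sum to zero over i and over j

# Theta stacks the row vectors theta_ij' = mu' + alpha_i' + tau_j'
Theta = np.array([mu + alpha[i] + tau[j] for i in range(I) for j in range(J)])

# Error rows u' are i.i.d. with common covariance Sigma, so the columns
# y^(1), ..., y^(d) of Y are correlated univariate response vectors
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])
U = rng.multivariate_normal(np.zeros(d), Sigma, size=n)

Y = Theta + U                          # n x d data matrix
print(Y.shape)                         # (12, 2): rows y_i', columns y^(j)
```

The rows of `Y` are the vector observations \(\mathbf{y}_{i}'\) and the columns are the d univariate response vectors \(\mathbf{y}^{(j)}\); fitting the model column by column applies the same least-squares projection to each \(\mathbf{y}^{(j)}\).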



© Springer International Publishing Switzerland 2015

George A. F. Seber, Department of Statistics, The University of Auckland, Auckland, New Zealand
