Reduced-Rank Regression Model

  • Gregory C. Reinsel
  • Raja P. Velu
Part of the Lecture Notes in Statistics book series (LNS, volume 136)


The classical multivariate regression model presented in Chapter 1, as noted before, does not make direct use of the fact that the response variables are likely to be correlated. A more serious practical concern is that even for a moderate number of variables whose interrelationships are to be investigated, the number of parameters in the regression matrix can be large. For example, in a multivariate analysis of economic variables (see Example 2.2), Gudmundsson (1977) uses m = 7 response variables and n = 6 predictor variables, so that mn = 42 regression coefficient parameters (excluding intercepts) must be estimated in the classical regression setup. But the number of vector data points available for estimation is only T = 36; these are quarterly observations from 1948 to 1956 for the United Kingdom. Thus, in many practical situations, there is a need to reduce the number of parameters in model (1.1), and we approach this problem through the assumption of lower rank of the matrix C in model (1.1). More formally, in the model $Y_k = C X_k + \varepsilon_k$ we assume that
$$ {\mathop{\rm rank}\nolimits} (C) = r \le \min (m,n). $$
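The rank constraint above can be illustrated with a minimal numerical sketch. The simplest reduced-rank estimator truncates the singular value decomposition of the ordinary least-squares coefficient matrix to its r leading terms; this is an unweighted simplification for illustration only (the estimators developed in this chapter additionally weight by the error covariance matrix). The dimensions m = 7, n = 6 follow the Gudmundsson example, but the data here are simulated, not the UK series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions echoing the text: m responses, n predictors, T observations,
# with a true coefficient matrix of rank r < min(m, n).
m, n, T, r = 7, 6, 200, 2

# A rank-r coefficient matrix can always be written C = A @ B
# with A (m x r) and B (r x n).
A = rng.standard_normal((m, r))
B = rng.standard_normal((r, n))
C_true = A @ B

X = rng.standard_normal((n, T))                      # predictors X_k as columns
Y = C_true @ X + 0.1 * rng.standard_normal((m, T))   # Y_k = C X_k + eps_k

# Full-rank least-squares estimate (classical multivariate regression):
# this uses all mn = 42 free parameters.
C_ols = Y @ X.T @ np.linalg.inv(X @ X.T)

# Reduced-rank estimate: keep only the r leading SVD terms of C_ols.
# (Unweighted version; the chapter's estimator uses the error covariance.)
U, s, Vt = np.linalg.svd(C_ols)
C_rr = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

print(np.linalg.matrix_rank(C_rr, tol=1e-8))  # 2
print(C_rr.shape)                             # (7, 6)
```

A rank-2 C is described by r(m + n - r) = 22 parameters rather than 42, which is the parameter reduction motivating the model.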





Copyright information

© Springer Science+Business Media New York 1998

Authors and Affiliations

  • Gregory C. Reinsel
    • 1
  • Raja P. Velu
    • 2
  1. Department of Statistics, University of Wisconsin–Madison, Madison, USA
  2. School of Management, Syracuse University, Syracuse, USA
