This chapter deals with the problem of multicollinearity, also commonly termed collinearity or ill-conditioning. The problem arises when near-linear dependencies exist among the vectors of explanatory variables, so that if the Gaussian elimination of Section 3.2 were used to solve the least squares equations, at least one pivot would be near zero. Multicollinearity inflates the variance of the least squares estimator, and possibly of any predictions made, and it restricts the generality and applicability of the estimated model. When multicollinearities occur, therefore, they should be investigated thoroughly and, if they prove harmful, dealt with appropriately (see below).
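The variance inflation described above can be illustrated with a small numerical sketch (the data and noise level here are invented for illustration): when one explanatory variable is nearly a linear function of another, the condition number of the design matrix X becomes large, and the diagonal entries of (X'X)⁻¹, which scale the variances of the least squares coefficients, blow up.

```python
import numpy as np

rng = np.random.default_rng(0)          # assumed seed, for reproducibility
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)     # x2 nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])

# A large condition number signals ill-conditioning: Gaussian
# elimination on X'X would meet a near-zero pivot.
print("cond(X) =", np.linalg.cond(X))

# The diagonal of (X'X)^{-1} scales the coefficient variances;
# near-collinearity makes the entries for x1 and x2 explode.
XtX_inv = np.linalg.inv(X.T @ X)
print("diag of (X'X)^{-1}:", np.diag(XtX_inv))
```

Rerunning with, say, `x2 = rng.normal(size=n)` (independent regressors) brings both the condition number and the variance factors back down to modest values.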