Multicollinearity

  • G. Barrie Wetherill
  • P. Duncombe
  • M. Kenward
  • J. Köllerström
  • S. R. Paul
  • B. J. Vowden
Part of the Monographs on Statistics and Applied Probability book series (MSAP)

Abstract

This chapter deals with the problem of multicollinearity, also commonly termed collinearity or ill-conditioning. The problem arises when there are near-linear dependencies among the vectors of explanatory variables, so that if the Gaussian elimination of Section 3.2 were used to solve the least squares equations then at least one pivot would be near-zero. The effect of multicollinearity is to inflate the variance of the least squares estimator, and possibly of any predictions made, and also to restrict the generality and applicability of the estimated model. Therefore, when multicollinearities occur they should be investigated thoroughly and, if they prove harmful, an effort should be made to deal with them appropriately (see below).
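The variance inflation described above can be sketched numerically. The following is a minimal illustration (not from the chapter itself; all variable names are assumptions for the example): the variance of an OLS coefficient is the corresponding diagonal entry of σ²(XᵀX)⁻¹, and when one explanatory variable is nearly a linear function of another, that entry blows up.

```python
# Sketch: how a near-linear dependency between explanatory variables
# inflates the variance of the least squares estimator.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)

def coef_variance(x2, sigma2=1.0):
    """Diagonal of sigma^2 * (X'X)^{-1}: the variances of the
    OLS estimates for (intercept, x1, x2)."""
    X = np.column_stack([np.ones(n), x1, x2])
    return sigma2 * np.diag(np.linalg.inv(X.T @ X))

# Case 1: second variable essentially unrelated to x1.
v_ok = coef_variance(rng.normal(size=n))

# Case 2: near-linear dependency, x2 = x1 + small noise.
v_bad = coef_variance(x1 + 0.01 * rng.normal(size=n))

# The variance of the coefficient on x1 is inflated by several
# orders of magnitude in the collinear case.
print(v_bad[1] / v_ok[1])
```

The same mechanism shows up in the near-zero pivot mentioned above: XᵀX is nearly singular, so inverting it amplifies noise in the estimates.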



Copyright information

© G. Barrie Wetherill 1986

Authors and Affiliations

  • G. Barrie Wetherill (1)
  • P. Duncombe (2)
  • M. Kenward (3)
  • J. Köllerström (3)
  • S. R. Paul (4)
  • B. J. Vowden (3)

  1. Department of Statistics, The University of Newcastle upon Tyne, UK
  2. Applied Statistics Research Unit, University of Kent at Canterbury, UK
  3. Mathematical Institute, University of Kent at Canterbury, UK
  4. Department of Mathematics and Statistics, University of Windsor, Canada