Information-Theoretic Measures of Fit for Univariate and Multivariate Linear Regressions
For the purpose of measuring the relative importance of independent variables in a multiple regression, Kruskal (1987) proposed an averaging procedure over all possible orderings of these variables. The present article follows this suggestion but bases it on a different measure, drawn from statistical information theory, and extends the result to systems of equations.
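Kruskal's averaging procedure can be illustrated with a short sketch: each regressor's importance is its incremental contribution to R², averaged over every order in which the regressors could enter the model. The function names and the use of R² (rather than the article's information-theoretic measure) are illustrative assumptions, not the article's own implementation.

```python
# Sketch of Kruskal's (1987) averaging idea, using incremental R^2.
# Names and the choice of R^2 as the fit measure are illustrative.
from itertools import permutations
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on the columns of X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1.0 - (resid @ resid) / tss

def kruskal_importance(X, y):
    """Average each regressor's incremental R^2 over all orderings."""
    k = X.shape[1]
    scores = np.zeros(k)
    perms = list(permutations(range(k)))
    for order in perms:
        prev = 0.0
        for pos, j in enumerate(order):
            cols = list(order[: pos + 1])   # regressors entered so far
            r2 = r_squared(X[:, cols], y)
            scores[j] += r2 - prev          # j's increment in this ordering
            prev = r2
    return scores / len(perms)
```

Because each ordering's increments telescope to the full-model R², the averaged importances sum to the full-model R², which is the decomposition property that motivates the averaging.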
Key Words: Correlation; Information theory; Multiple regression; Systems of equations
- Anderson, T.W.: 1984, An Introduction to Multivariate Statistical Analysis, 2nd ed., John Wiley, New York.
- Chung, C.-F.: 1987, “On Calculating the Information-Theoretic Measure of Fit for Multivariate Linear Regressions,” McKethan-Matherly Discussion Paper 28, University of Florida, College of Business Administration.
- Hotelling, H.: 1936, “Relations Between Two Sets of Variates,” Biometrika, 28, 321–377.
- Kruskal, W.: 1987, “Relative Importance by Averaging Over Orderings,” The American Statistician, 41, 6–10.
- Lawley, D.N.: 1956, “Tests of Significance of the Latent Roots of Covariance and Correlation Matrices,” Biometrika, 43, 128–136.
- Lawley, D.N.: 1959, “Tests of Significance in Canonical Analysis,” Biometrika, 46, 59–66.
- Theil, H.: 1958, Economic Forecasts and Policy, North-Holland, Amsterdam.
- Theil, H.: 1971, Principles of Econometrics, John Wiley, New York.
- Theil, H., and Fiebig, D.G.: 1984, Exploiting Continuity: Maximum Entropy Estimation of Continuous Distributions, Ballinger, Cambridge, Massachusetts.