
Abstract

There is a very large literature on methods of choosing a regression model but, in spite of this, there is little clear guidance on what to do in a specific case. For access to the literature see Hocking (1976, 1983), Mosteller and Tukey (1977), Seber (1977), Thompson (1978), Daniel and Wood (1980), and Miller (1984). This chapter presents a review of the main points; for further details these references should be consulted.
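To fix ideas, one of the selection criteria surveyed in the literature cited above is Mallows' Cp (Mallows, 1973). The sketch below is illustrative only and is not taken from the chapter: it scores every non-empty subset of a small set of candidate predictors by Cp and reports the lowest-scoring subset. The toy data and all variable names are hypothetical.

```python
# Illustrative sketch (not from the chapter): all-subsets selection scored by
# Mallows' Cp = RSS_subset / sigma2_hat - n + 2 * (k + 1), where k predictors
# plus an intercept are fitted and sigma2_hat comes from the full model.
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.normal(size=(n, p))                               # candidate predictors
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)    # only x1 and x3 matter

def rss(X_sub, y):
    """Residual sum of squares of a least-squares fit with an intercept."""
    A = np.column_stack([np.ones(len(y)), X_sub])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return float(resid @ resid)

sigma2_hat = rss(X, y) / (n - p - 1)   # error variance estimated from the full model
best = None
for k in range(1, p + 1):
    for subset in combinations(range(p), k):
        cp = rss(X[:, subset], y) / sigma2_hat - n + 2 * (k + 1)
        if best is None or cp < best[0]:
            best = (cp, subset)

print("Lowest-Cp subset:", best[1], "Cp =", round(best[0], 2))
```

Exhaustive enumeration, as here, is feasible only for a handful of predictors; branch-and-bound schemes such as Furnival and Wilson (1974) avoid fitting every subset.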



References

  • Allen, D. M. (1974) The relationship between variable selection and data augmentation and a method for prediction. Technometrics, 16, 125–127.
  • Anscombe, F. J. (1981) Computing in Statistical Science Through APL. Springer-Verlag, New York.
  • Beale, E. M. L., Kendall, M. G. and Mann, D. W. (1967) The discarding of variables in multivariate analysis. Biometrika, 54, 357–366.
  • Belsley, D., Kuh, E. and Welsch, R. E. (1980) Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. Wiley, New York.
  • Berk, K. N. (1978) Comparing subset regression procedures. Technometrics, 20, 1–6.
  • Clarke, M. R. B. (1981) A Givens algorithm for moving from one linear model to another without going back to the data. Algorithm AS 163. Appl. Statist., 30, 198–203.
  • Clarke, M. R. B. (1982) The Gauss-Jordan sweep operator with detection of collinearity. Algorithm AS 178. Appl. Statist., 31, 166–168.
  • Copas, J. B. (1983) Regression, prediction and shrinkage. J. Roy. Statist. Soc. B, 45, 311–354.
  • Daniel, C. and Wood, F. S. (1980) Fitting Equations to Data. Wiley, New York.
  • Forsythe, A. B., Engelman, L., Jennrich, R. and May, P. R. A. (1973) A stopping rule for variable selection in multiple regression. J. Amer. Statist. Assoc., 68, 75–77.
  • Freedman, D. A. (1983) A note on screening regression equations. Amer. Statist., 37, 147–151.
  • Furnival, G. M. (1971) All possible regressions with less computation. Technometrics, 13, 403–408.
  • Furnival, G. M. and Wilson, R. W., Jr. (1974) Regression by leaps and bounds. Technometrics, 16, 499–512.
  • Gorman, J. W. and Toman, R. J. (1966) Selection of variables for fitting equations to data. Technometrics, 8, 27–51.
  • Henderson, H. V. and Velleman, P. F. (1981) Building multiple regression models interactively. Biometrics, 37, 391–411.
  • Hocking, R. R. (1976) The analysis and selection of variables in linear regression. Biometrics, 32, 1–49.
  • Hocking, R. R. (1983) Developments in linear regression methodology: 1959–1982. Technometrics, 25, 219–249.
  • Hocking, R. R. and Leslie, R. N. (1967) Selection of the best subset in regression analysis. Technometrics, 9, 531–540.
  • Judge, G. G., Griffiths, W. E., Hill, R. C. and Lee, Tsoung-Chao (1980) The Theory and Practice of Econometrics. Wiley, New York.
  • Mallows, C. L. (1973) Some comments on Cp. Technometrics, 15, 661–675.
  • Miller, A. J. (1984) Selection of subsets of regression variables. J. Roy. Statist. Soc. A, 147, 389–425.
  • Morgan, J. A. and Tatar, J. F. (1972) Calculation of the residual sum of squares for all possible regressions. Technometrics, 14, 317–325.
  • Mosteller, F. and Tukey, J. W. (1977) Data Analysis and Regression. Addison-Wesley, Reading, MA.
  • Newton, R. G. and Spurrell, D. J. (1967a) A development of multiple regression for the analysis of routine data. Appl. Statist., 16, 51–65.
  • Newton, R. G. and Spurrell, D. J. (1967b) Examples of the use of elements for clarifying regression analysis. Appl. Statist., 16, 165–171.
  • Pope, P. T. and Webster, J. T. (1972) The use of an F-statistic in stepwise regression procedures. Technometrics, 14, 327–340.
  • Preece, D. A. (1981) Distribution of final digits in data. The Statistician, 30, 31–60.
  • Rencher, A. C. and Pun, F. C. (1980) Inflation of R² in best subset regression. Technometrics, 22, 49–54.
  • Seber, G. A. F. (1977) Linear Regression Analysis. Wiley, New York.
  • Spjøtvoll, E. (1972a) Multiple comparison of regression functions. Ann. Math. Statist., 43, 1076–1088.
  • Spjøtvoll, E. (1972b) A note on a theorem of Forsythe and Golub. SIAM J. Appl. Math., 23, 307–311.
  • Stone, M. (1974) Cross-validatory choice and assessment of statistical predictions. J. Roy. Statist. Soc. B, 36, 111–147.
  • Thompson, M. L. (1978) Selection of variables in multiple regression. Part I: A review and evaluation. Part II: Chosen procedures, computations and examples. Int. Statist. Rev., 46, 1–19 and 129–146.
  • Wilkinson, L. and Dallal, G. E. (1981) Tests of significance in forward selection regression with an F-to-enter stopping rule. Technometrics, 23, 377–380.


Copyright information

© 1986 G. Barrie Wetherill

About this chapter

Cite this chapter

Barrie Wetherill, G., Duncombe, P., Kenward, M., Köllerström, J., Paul, S.R., Vowden, B.J. (1986). Choosing a regression model. In: Regression Analysis with Applications. Monographs on Statistics and Applied Probability. Springer, Dordrecht. https://doi.org/10.1007/978-94-009-4105-2_11


  • DOI: https://doi.org/10.1007/978-94-009-4105-2_11

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-010-8322-5

  • Online ISBN: 978-94-009-4105-2
