
Part of the book series: Mathematics and Its Applications ((MAIA,volume 306))


Abstract

Most of the difficulties arising in the interpretation of linear regression are due to collinearity, which is inherent in the structure of the design points (the X space) in the classical linear model

$${y_{n \times 1}} = {X_{n \times p}}{\Theta _{p \times 1}} + {e_{n \times 1}},$$

where the subscripts indicate the dimensions of the vectors and matrices. The structure of the X space must be analysed to delimit the region in which a regression model can properly be used for prediction; the portion of space where prediction is reliable was introduced as the ‘effective prediction domain’ (EPD) by Mandel (1985). This notion may be extended when a linear model contains x variables that are nonlinear functions of one or more of the other variables, such as $x_j^2$ or $x_j x_k$.
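The link between derived columns such as $x_j^2$ or $x_j x_k$ and collinearity can be illustrated with a condition-number check in the spirit of Belsley's (1990) diagnostics. The following sketch is illustrative only, not taken from the paper; the data, model columns, and threshold are assumptions.

```python
import numpy as np

# Illustrative design matrix: intercept, two base variables, and the
# derived columns x1**2 and x1*x2. Over a narrow range such as [1, 2],
# x1 and x1**2 are nearly linearly dependent, inducing collinearity.
rng = np.random.default_rng(0)
x1 = rng.uniform(1.0, 2.0, size=50)
x2 = rng.uniform(1.0, 2.0, size=50)
X = np.column_stack([np.ones(50), x1, x2, x1**2, x1 * x2])

# Scale columns to unit length, then compute the condition number;
# large values warn that the X space is close to degenerate.
Xs = X / np.linalg.norm(X, axis=0)
cond = np.linalg.cond(Xs)
print(round(cond, 1))
```

A large condition number signals that predictions are only trustworthy near the observed design points, which is the motivation for delimiting an effective prediction domain.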

In this paper we extend the notion of EPD to nonlinear models, which have the general form

$${y_{n \times 1}} = \eta ({X_{n \times p}},{\Theta _{p \times 1}}) + {e_{n \times 1}}.$$
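A nonlinear model of this general form can be fitted by iterative linearization; the Gauss–Newton sketch below is a generic illustration, not the paper's method, and the model $\eta(x, \theta) = \theta_0(1 - e^{-\theta_1 x})$, data, and starting values are all assumptions.

```python
import numpy as np

# Illustrative nonlinear model: eta(x, theta) = theta0 * (1 - exp(-theta1 * x)).
def eta(x, th):
    return th[0] * (1.0 - np.exp(-th[1] * x))

def jacobian(x, th):
    # Partial derivatives of eta with respect to theta0 and theta1.
    return np.column_stack([
        1.0 - np.exp(-th[1] * x),
        th[0] * x * np.exp(-th[1] * x),
    ])

rng = np.random.default_rng(1)
x = np.linspace(0.1, 5.0, 40)                      # observed design points
y = eta(x, np.array([3.0, 0.8])) + rng.normal(0.0, 0.05, size=x.size)

th = np.array([1.0, 1.0])                          # starting values
for _ in range(20):
    r = y - eta(x, th)                             # current residuals
    J = jacobian(x, th)                            # local linearization
    th = th + np.linalg.lstsq(J, r, rcond=None)[0] # Gauss-Newton step

print(np.round(th, 2))
```

As in the linear case, such fitted predictions are only reliable inside the region covered by the design points; far beyond max(x) they rest entirely on the assumed form of η.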


References

  • Belsley, D.A. (1990). Conditioning Diagnostics: Collinearity and Weak Data in Regression. Wiley, New York.

  • Brook, R.J. and Arnold, G.C. (1985). Applied Regression Analysis and Experimental Design. Dekker, New York.

  • Butler, R. and Rothman, E.D. (1980). Predictive intervals based on reuse of the sample. Journal of the American Statistical Association 75, 881–889.

  • Chambers, J.M. (1977). Computational Methods for Data Analysis. Wiley, New York.

  • Cook, R.D. and Weisberg, S. (1990). Confidence curves in nonlinear regression. Journal of the American Statistical Association 85, 544–551.

  • Droge, B. (1987). A note on estimating MSEP in nonlinear regression. Statistics 18, 499–520.

  • Efron, B. (1985). Bootstrap confidence intervals for a class of parametric problems. Biometrika 72, 45–58.

  • Efron, B. (1987). Better bootstrap confidence intervals. Journal of the American Statistical Association 82, 171–185.

  • Jackson, J.E. (1991). A User’s Guide to Principal Components. Wiley, New York.

  • Mandel, J. (1985). The regression analysis of collinear data. Journal of Research of the National Bureau of Standards 90, 465–476.

  • Sen, A. and Srivastava, M. (1990). Regression Analysis: Theory, Methods and Applications. Springer-Verlag, New York.

  • Snedecor, G.W. and Cochran, W.G. (1971). Méthodes Statistiques. ACTA, Paris.

  • Stine, R.A. (1985). Bootstrap prediction intervals for regression. Journal of the American Statistical Association 80, 1029–1031.

  • Thombs, L.A. and Schucany, W.R. (1990). Bootstrap prediction intervals for autoregression. Journal of the American Statistical Association 85, 486–492.

  • Tomassone, R., Audrain, S., Lesquoy-de Turckheim, E. and Millier, C. (1992). La Régression: Nouveaux Regards sur une Ancienne Méthode Statistique. Masson, Paris.

  • Tomassone, R., Dervin, C. and Masson, J-P. (1993). Biométrie, Modélisation de Phénomènes Biologiques. Masson, Paris.

  • Van Huffel, S. and Vandewalle, J. (1991). The Total Least Squares Problem: Computational Aspects and Analysis. SIAM, Philadelphia.

  • Wei, C.Z. (1992). On predictive least squares principles. The Annals of Statistics 20, 1–42.


Copyright information

© 1994 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

Audrain, S., Tomassone, R. (1994). Prediction Domain in Nonlinear Models. In: Caliński, T., Kala, R. (eds) Proceedings of the International Conference on Linear Statistical Inference LINSTAT ’93. Mathematics and Its Applications, vol 306. Springer, Dordrecht. https://doi.org/10.1007/978-94-011-1004-4_17


  • Print ISBN: 978-94-010-4436-3

  • Online ISBN: 978-94-011-1004-4
