Abstract
Most of the difficulties arising in the interpretation of linear regression are due to collinearity, which is inherent in the structure of the design points (the X space) of the classical linear model

$$y_{n\times 1} = X_{n\times p}\,\beta_{p\times 1} + \varepsilon_{n\times 1},$$

where the subscripts indicate the dimensions of the vectors and matrices. The structure of the X space has to be analysed in order to delimit the region in which a regression model can be used reliably for prediction; the portion of space where prediction is good was introduced as the ‘effective prediction domain’ (EPD) by Mandel (1985). This notion may be extended when a linear model contains x variables that are nonlinear functions of one or more of the other variables, such as $x_j^2$ or $x_j x_k$.
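The EPD idea can be made concrete with a simple diagnostic. As a minimal sketch (our illustration, not Mandel's actual construction), one common way to flag a candidate prediction point that falls outside the region covered by the design points is to compare its leverage, $h_0 = x_0'(X'X)^{-1}x_0$, with the leverages of the design points themselves:

```python
import numpy as np

def leverage(X, x0):
    """Leverage h0 = x0' (X'X)^{-1} x0 of a candidate prediction point."""
    XtX_inv = np.linalg.inv(X.T @ X)
    return float(x0 @ XtX_inv @ x0)

def inside_domain(X, x0):
    """Rough domain check: flag x0 as inside the prediction domain if its
    leverage does not exceed the largest leverage among the design points.
    (A heuristic stand-in for the EPD, not the paper's definition.)"""
    # Diagonal of the hat matrix H = X (X'X)^{-1} X' gives training leverages.
    H = X @ np.linalg.inv(X.T @ X) @ X.T
    h_max = float(np.max(np.diag(H)))
    return leverage(X, x0) <= h_max

# Hypothetical design: an intercept column plus one regressor on [0, 1].
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.uniform(0.0, 1.0, 20)])
inside_domain(X, np.array([1.0, 0.5]))   # a central point: inside
inside_domain(X, np.array([1.0, 10.0]))  # far outside the observed range
```

A point with leverage far above the training maximum sits outside the hull of the X space, precisely where the abstract warns that prediction degrades.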
In this paper we extend the notion of EPD to nonlinear models, which have the general form

$$y = f(x, \theta) + \varepsilon.$$
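The paper's own construction is not reproduced here, but one plausible way to carry the leverage idea over to a nonlinear model $y = f(x,\theta) + \varepsilon$ is the tangent-plane approximation: near the fitted parameters, the Jacobian $J$ with entries $J_{ik} = \partial f(x_i,\theta)/\partial\theta_k$ plays the role of the design matrix $X$. The model below, $f(x,\theta) = \theta_1(1 - e^{-\theta_2 x})$, is a hypothetical example chosen only for illustration:

```python
import numpy as np

def jacobian_exp(x, theta):
    """Jacobian of the illustrative model f(x, theta) = t0 * (1 - exp(-t1 * x))
    with respect to theta = (t0, t1), one row per design point."""
    t0, t1 = theta
    x = np.atleast_1d(np.asarray(x, dtype=float))
    e = np.exp(-t1 * x)
    return np.column_stack([1.0 - e, t0 * x * e])

def tangent_leverage(x_design, x0, theta):
    """Leverage of a candidate point x0 computed from the Jacobian J,
    i.e. j0' (J'J)^{-1} j0 -- the linear-model diagnostic applied to
    the tangent-plane approximation of the nonlinear model."""
    J = jacobian_exp(x_design, theta)
    j0 = jacobian_exp(x0, theta)[0]
    JtJ_inv = np.linalg.inv(J.T @ J)
    return float(j0 @ JtJ_inv @ j0)

# Design points spread over the observed range, assumed fitted parameters.
x_design = np.linspace(0.1, 2.0, 15)
tangent_leverage(x_design, 1.0, (1.0, 1.5))  # a point inside the design range
```

Note that in the nonlinear case the domain depends on $\theta$ as well as on the design, which is one reason the linear EPD notion needs extending rather than reusing verbatim.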
References
Belsley, D.A. (1990). Conditioning Diagnostics: Collinearity and Weak Data in Regression. Wiley, New York.
Brook, R.J. and Arnold, G.C. (1985). Applied Regression Analysis and Experimental Design. Dekker, New York.
Butler, R. and Rothman, E.D. (1980). Predictive intervals based on reuse of the sample. Journal of the American Statistical Association 75, 881–889.
Chambers, J.M. (1977). Computational Methods for Data Analysis. Wiley, New York.
Cook, R.D. and Weisberg, S. (1990). Confidence curves in nonlinear regression. Journal of the American Statistical Association 85, 544–551.
Droge, B. (1987). A note on estimating MSEP in nonlinear regression. Statistics 18, 499–520.
Efron, B. (1985). Bootstrap confidence intervals for a class of parametric problems. Biometrika 72, 45–58.
Efron, B. (1987). Better bootstrap confidence intervals. Journal of the American Statistical Association 82, 171–185.
Jackson, J.E. (1991). A User’s Guide to Principal Components. Wiley, New York.
Mandel, J. (1985). The regression analysis of collinear data. Journal of Research of the National Bureau of Standards 90, 465–476.
Sen, A. and Srivastava, M. (1990). Regression Analysis: Theory, Methods and Applications. Springer-Verlag, New York.
Snedecor, G.W. and Cochran, W.G. (1971). Méthodes Statistiques. ACTA, Paris.
Stine, R.A. (1985). Bootstrap prediction intervals for regression. Journal of the American Statistical Association 80, 1029–1031.
Thombs, L.A. and Schucany, W.R. (1990). Bootstrap prediction intervals for autoregression. Journal of the American Statistical Association 85, 486–492.
Tomassone, R., Audrain, S., Lesquoy-de Turckheim, E. and Millier, C. (1992). La Régression: Nouveaux Regards sur une Ancienne Méthode Statistique. Masson, Paris.
Tomassone, R., Dervin, C. and Masson, J-P. (1993). Biométrie, Modélisation de Phénomènes Biologiques. Masson, Paris.
Van Huffel, S. and Vandewalle, J. (1991). The Total Least Squares Problem: Computational Aspects and Analysis. SIAM, Philadelphia.
Wei, C.Z. (1992). On predictive least squares principles. The Annals of Statistics 20, 1–42.
Copyright information
© 1994 Springer Science+Business Media Dordrecht
Cite this chapter
Audrain, S., Tomassone, R. (1994). Prediction Domain in Nonlinear Models. In: Caliński, T., Kala, R. (eds) Proceedings of the International Conference on Linear Statistical Inference LINSTAT ’93. Mathematics and Its Applications, vol 306. Springer, Dordrecht. https://doi.org/10.1007/978-94-011-1004-4_17
DOI: https://doi.org/10.1007/978-94-011-1004-4_17
Publisher Name: Springer, Dordrecht
Print ISBN: 978-94-010-4436-3
Online ISBN: 978-94-011-1004-4