Abstract
In this paper we consider the problem of variable selection in a nonlinear regression model with dependent errors. In this framework, we discuss some measures of the relevance of each variable to the neural network model, and we propose the moving block bootstrap technique to estimate the variability of these measures. The performance of the procedure is evaluated in a small Monte Carlo experiment, which shows that the proposed approach yields a correct ranking of relevant and irrelevant variables.
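To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how a sensitivity-based relevance measure could be paired with the moving block bootstrap. The network architecture, block length, relevance measure, and all function names are illustrative assumptions, and the pairwise block resampling shown here is only one of several possible resampling schemes for dependent data.

import numpy as np
from sklearn.neural_network import MLPRegressor

def relevance(model, X, eps=1e-3):
    # Sensitivity-based relevance: mean squared finite-difference
    # derivative of the fitted network output w.r.t. each input.
    base = model.predict(X)
    rel = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += eps
        rel[j] = np.mean(((model.predict(Xp) - base) / eps) ** 2)
    return rel

def moving_block_bootstrap(X, y, block_len, B, rng):
    # Resample (X, y) jointly in overlapping blocks of consecutive
    # observations, preserving the short-range dependence of the errors.
    n = len(y)
    n_blocks = int(np.ceil(n / block_len))
    for _ in range(B):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
        yield X[idx], y[idx]

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 3))          # third input is irrelevant by construction
e = np.convolve(rng.normal(size=n + 9), np.ones(10) / 10, mode="valid")  # MA(9) errors
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + e

reps = []
for Xb, yb in moving_block_bootstrap(X, y, block_len=15, B=50, rng=rng):
    net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=0).fit(Xb, yb)
    reps.append(relevance(net, Xb))
reps = np.asarray(reps)
print("bootstrap mean of relevance:", reps.mean(axis=0).round(3))
print("bootstrap s.e. of relevance:", reps.std(axis=0).round(3))

Ranking each variable by its bootstrap mean relevance relative to its bootstrap standard error then separates the two informative inputs from the irrelevant one, in the spirit of the ranking studied in the paper.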
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Giordano, F., La Rocca, M., Perna, C. (2004). Bootstrap Variables Selection in Neural Network Regression Models. In: Bock, H.-H., Chiodi, M., Mineo, A. (eds) Advances in Multivariate Data Analysis. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17111-6_9
DOI: https://doi.org/10.1007/978-3-642-17111-6_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-20889-1
Online ISBN: 978-3-642-17111-6