Feedforward Neural Networks for Nonparametric Regression
Feedforward neural networks (FFNNs) with a random, unconstrained number of hidden neurons define flexible nonparametric regression models. In Müller and Rios Insua (1998) we argued that variable-architecture models with a random-size hidden layer significantly reduce the posterior multimodality typical of posterior distributions in neural network models. In this chapter we review the model proposed in Müller and Rios Insua (1998) and extend it to a nonparametric model by allowing the size of the hidden layer to be unconstrained. This is made possible by a Markov chain Monte Carlo posterior simulation scheme that uses reversible jump steps (Green 1995) to move between architectures of different sizes.
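As a concrete illustration (a sketch of ours, not the chapter's code), the regression function of an FFNN with M logistic hidden neurons has the form E[y | x] = β₀ + Σⱼ βⱼ ψ(γⱼ'x). The function name `ffnn_mean` and the parameter names below are assumptions for exposition only.

```python
import numpy as np

def ffnn_mean(x, beta0, beta, gamma):
    """Mean function of an FFNN regression model with M hidden neurons:
    E[y | x] = beta0 + sum_j beta[j] * psi(gamma[j] . x),
    where psi is the logistic activation. Under a variable-architecture
    model, M (the length of beta) itself is random, and a reversible
    jump sampler would propose adding or deleting rows of gamma."""
    hidden = 1.0 / (1.0 + np.exp(-gamma @ x))  # (M,) logistic activations
    return beta0 + beta @ hidden               # scalar regression mean

# Example: M = 2 hidden neurons, 2-dimensional input.
rng = np.random.default_rng(0)
x = np.array([0.5, -1.0])
beta0, beta = 0.1, np.array([1.0, -0.5])
gamma = rng.normal(size=(2, 2))  # one row of weights per hidden neuron
y_hat = ffnn_mean(x, beta0, beta, gamma)
```

In a reversible jump move between architectures, a "birth" step would append a new (βⱼ, γⱼ) pair and a "death" step would delete one, with the acceptance probability adjusted for the dimension change.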