Conclusions and Further Work

  • Radford M. Neal
Part of the Lecture Notes in Statistics book series (LNS, volume 118)


The preceding three chapters have examined the meaning of Bayesian neural network models, shown how these models can be implemented by Markov chain Monte Carlo methods, and demonstrated that such an implementation can be applied in practice to problems of moderate size, with good results. In this concluding chapter, I will review what has been accomplished in these areas, and describe ongoing and potential future work to extend these results, both for neural networks and for other flexible Bayesian models.





Copyright information

© Springer Science+Business Media New York 1996

Authors and Affiliations

  • Radford M. Neal
    1. Department of Statistics and Department of Computer Science, University of Toronto, Toronto, Canada
