Bayesian Regularization of Neural Networks

  • Frank Burden
  • Dave Winkler
Part of the Methods in Molecular Biology™ book series (MIMB, volume 458)

Abstract

Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a “well-posed” statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N²) in normal regression methods such as back-propagation, is unnecessary. These networks provide solutions to a number of problems that arise in QSAR modeling, such as choice of model, robustness of model, choice of validation set, size of validation effort, and optimization of network architecture. They are difficult to overtrain, since the evidence procedure provides an objective Bayesian criterion for stopping training. They are also difficult to overfit, because the BRANN calculates and trains on an effective number of network parameters or weights, effectively turning off those that are not relevant. This effective number is usually considerably smaller than the number of weights in a standard fully connected back-propagation neural net. Automatic relevance determination (ARD) of the input variables can be used with BRANNs, allowing the network to “estimate” the importance of each input. The ARD method ensures that irrelevant or highly correlated indices used in the modeling are neglected, and it also shows which variables are most important for modeling the activity data.
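The evidence procedure mentioned above is easiest to see in the linear (ridge) special case that the abstract alludes to. The sketch below is not the chapter's BRANN code; it is a minimal NumPy illustration, on synthetic data, of MacKay's re-estimation formulas α = γ / 2E_W and β = (N − γ) / 2E_D, where γ = Σ λᵢ/(λᵢ + α) is the effective number of well-determined parameters (for E_W = ½‖w‖², 2E_W is simply the squared weight norm). All variable names and data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y depends linearly on 3 of 10 descriptors, plus noise.
N, P = 50, 10
X = rng.normal(size=(N, P))
w_true = np.zeros(P)
w_true[:3] = [1.5, -2.0, 0.7]
y = X @ w_true + 0.3 * rng.normal(size=N)

eigs = np.linalg.eigvalsh(X.T @ X)  # data eigenvalues, reused each iteration

alpha, beta = 1.0, 1.0  # prior (weight-decay) precision and noise precision
for _ in range(100):
    # Posterior over weights for the current alpha, beta (exact in the linear case).
    A = alpha * np.eye(P) + beta * X.T @ X      # Hessian of the regularized error
    m = beta * np.linalg.solve(A, X.T @ y)      # posterior mean weights
    # MacKay's gamma: effective number of well-determined parameters.
    lam = beta * eigs
    gamma = np.sum(lam / (lam + alpha))
    # Evidence-maximizing re-estimates of the hyperparameters.
    alpha = gamma / (m @ m)                      # gamma / 2E_W
    beta = (N - gamma) / np.sum((y - X @ m) ** 2)  # (N - gamma) / 2E_D

print(f"effective parameters gamma = {gamma:.2f} of {P} weights")
print(f"estimated noise sd = {beta ** -0.5:.3f}")
```

In a BRANN the same updates are applied to a nonlinear network via a quadratic (Gauss–Newton) approximation of the Hessian, and gamma — not the raw weight count — reflects model complexity, which is why training can be stopped objectively rather than by cross-validated early stopping.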

This chapter outlines the equations that define the BRANN method, together with a flowchart for producing a BRANN-QSAR model. Results from applying BRANNs to a number of data sets are illustrated and compared with other linear and nonlinear models.
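ARD, as described in the abstract, replaces the single prior precision with one hyperparameter per input weight; inputs whose precision is driven very high are effectively switched off. The following is a minimal sketch of that idea on a linear model with synthetic descriptors — an illustrative assumption-laden toy, not the chapter's implementation (the precision cap and relevance threshold below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Ten candidate descriptors; only the first two actually drive the response.
N, P = 80, 10
X = rng.normal(size=(N, P))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.2 * rng.normal(size=N)

alphas = np.ones(P)   # one prior precision per input weight (the ARD step)
beta = 1.0            # noise precision
for _ in range(200):
    A = np.diag(alphas) + beta * X.T @ X
    Sigma = np.linalg.inv(A)                 # posterior covariance of weights
    m = beta * Sigma @ X.T @ y               # posterior mean weights
    # gamma_i: how well-determined weight i is (near 0 = pruned, near 1 = used).
    gammas = np.clip(1.0 - alphas * np.diag(Sigma), 0.0, 1.0)
    alphas = np.clip(gammas / (m ** 2 + 1e-12), 1e-6, 1e8)
    beta = (N - gammas.sum()) / np.sum((y - X @ m) ** 2)

# Inputs whose precision stayed moderate are the ones ARD judges relevant.
relevant = np.where(alphas < 1e2)[0]
print("descriptors kept by ARD:", relevant)
```

The irrelevant descriptors end up with very large precisions (weights shrunk to zero), while the two informative ones keep small precisions, which is the mechanism by which ARD "estimates" the importance of each input and neglects irrelevant or redundant indices.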

Keywords

QSAR, artificial neural network, Bayesian regularization, early stopping algorithm, automatic relevance determination (ARD), overtraining

Copyright information

© Humana Press, a part of Springer Science + Business Media, LLC 2008

Authors and Affiliations

  1. Carlton North, Australia
  2. CSIRO Molecular and Health Technologies, Clayton, Australia