
Sensitivity Analysis of Radial-Basis Function Neural Network due to the Errors of the I.I.D Input

  • Conference paper
  • First Online:
Machine Learning and Cybernetics (ICMLC 2014)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 481)


Abstract

An important issue in the design and implementation of a Radial-Basis Function Neural Network (RBFNN) is the sensitivity of its output to input perturbations. Based on the central limit theorem, this paper proposes a method to compute the sensitivity of an RBFNN to errors in the inputs of the network. For simplicity and practicality, all inputs are assumed to be independent and identically distributed (i.i.d.) with a uniform distribution on the interval (a, b). A number of simulations are conducted, and the good agreement between the experimental and theoretical results verifies the reliability and feasibility of the proposed method. With this method, both the relationship among the sensitivity of the RBFNN, the input error ratios, and the number of neurons in the input layer, and the relationship among the sensitivity, the input error ratios, and the number of neurons in the hidden layer are established.
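The paper's analytic formula is not reproduced on this page, but the experimental setup it describes — i.i.d. inputs uniform on (a, b), perturbed by a given error ratio, with sensitivity measured as the expected output deviation — can be checked by Monte Carlo simulation. The sketch below is an illustrative stand-in, not the authors' method: the network (Gaussian RBF units with hypothetical centers, widths, and weights), the error-ratio convention `r`, and the use of mean squared output deviation as the sensitivity measure are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small RBFNN: y(x) = sum_j w_j * exp(-||x - c_j||^2 / (2*sigma^2))
n_in, n_hidden = 5, 10                      # input and hidden layer sizes
centers = rng.uniform(0.0, 1.0, size=(n_hidden, n_in))
weights = rng.standard_normal(n_hidden)
sigma = 0.5

def rbfnn(x):
    # x: (batch, n_in); squared distance from each sample to each center
    d2 = np.sum((x[:, None, :] - centers) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ weights

# Inputs i.i.d. uniform on (a, b); additive errors i.i.d. uniform,
# scaled by an assumed "error ratio" r relative to the input range.
a, b, r = 0.0, 1.0, 0.05
n_mc = 20000
x = rng.uniform(a, b, size=(n_mc, n_in))
dx = rng.uniform(-r * (b - a), r * (b - a), size=(n_mc, n_in))

# Sensitivity estimated as the mean squared deviation of the output
sensitivity = np.mean((rbfnn(x + dx) - rbfnn(x)) ** 2)
print(f"estimated sensitivity: {sensitivity:.6f}")
```

Rerunning this sketch while varying `n_in`, `n_hidden`, or `r` mimics the kinds of relationships the abstract reports, with the Monte Carlo estimate playing the role of the paper's simulation results.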



Author information

Corresponding author

Correspondence to Jie Li.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, J., Li, J., Liu, Y. (2014). Sensitivity Analysis of Radial-Basis Function Neural Network due to the Errors of the I.I.D Input. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_35

Download citation

  • DOI: https://doi.org/10.1007/978-3-662-45652-1_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-45651-4

  • Online ISBN: 978-3-662-45652-1

  • eBook Packages: Computer Science (R0)
