Abstract
An important issue in the design and implementation of a Radial-Basis Function Neural Network (RBFNN) is the sensitivity of its output to input perturbations. Based on the central limit theorem, this paper proposes a method to compute the sensitivity of an RBFNN with respect to errors in the network's inputs. For simplicity and practicality, all inputs are assumed to be independent and identically distributed (i.i.d.) with a uniform distribution on the interval (a, b). A number of simulations are conducted, and the good agreement between the experimental and theoretical results verifies the reliability and feasibility of the proposed method. With this method, the relationship among the sensitivity of the RBFNN, the input error ratios, and the number of neurons in the input layer is established, as is the corresponding relationship involving the number of neurons in the hidden layer.
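The Monte-Carlo verification described in the abstract can be sketched as follows. This is a hypothetical minimal illustration, not the authors' implementation: the network sizes, Gaussian basis widths, centres, output weights, and the multiplicative error model are all illustrative assumptions, chosen only to show how an empirical sensitivity estimate under i.i.d. uniform input errors might be obtained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative network configuration (all values are assumptions)
n_in, n_hidden = 5, 10              # input / hidden layer sizes
a, b = 0.0, 1.0                     # inputs ~ i.i.d. Uniform(a, b)
centres = rng.uniform(a, b, size=(n_hidden, n_in))
widths = np.full(n_hidden, 0.5)     # Gaussian basis-function widths
weights = rng.standard_normal(n_hidden)  # output-layer weights

def rbfnn(x):
    """Gaussian RBFNN output for a batch of inputs x with shape (N, n_in)."""
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * widths ** 2)) @ weights

def output_sensitivity(error_ratio, n_samples=100_000):
    """Mean-squared output deviation under multiplicative i.i.d. input errors."""
    x = rng.uniform(a, b, size=(n_samples, n_in))
    noise = rng.uniform(-error_ratio, error_ratio, size=x.shape)
    return float(np.mean((rbfnn(x * (1 + noise)) - rbfnn(x)) ** 2))

# Sensitivity grows with the input error ratio, as the paper's analysis predicts
print(output_sensitivity(0.01), output_sensitivity(0.10))
```

Varying `n_in` or `n_hidden` in such a simulation is how the empirical side of the relationships mentioned in the abstract (sensitivity versus input-layer and hidden-layer size) could be explored.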
© 2014 Springer-Verlag Berlin Heidelberg
Cite this paper
Li, J., Li, J., Liu, Y. (2014). Sensitivity Analysis of Radial-Basis Function Neural Network due to the Errors of the I.I.D Input. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_35
DOI: https://doi.org/10.1007/978-3-662-45652-1_35
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-45651-4
Online ISBN: 978-3-662-45652-1