
A New Definition of Sensitivity for RBFNN and Its Applications to Feature Reduction

  • Conference paper
Advances in Neural Networks – ISNN 2005 (ISNN 2005)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3496)

Abstract

Due to the existence of redundant features, a Radial-Basis Function Neural Network (RBFNN) trained from a dataset is likely to be huge. Sensitivity analysis can help reduce the feature set by deleting insensitive features. Treating the perturbation of the network output as a random variable, this paper defines a new sensitivity measure as the limit of the variance of the output perturbation as the input perturbation goes to zero. To simplify the sensitivity expression and its computation, we prove that the exchange of limit and variance is valid. A formula for computing the new sensitivity of individual features is derived. Numerical simulations show that the new sensitivity definition can be used to remove irrelevant features effectively.

This research work is supported by NSFC (60473045) and Natural Science Foundation of Hebei Province (603137).
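
The sensitivity described in the abstract suggests a simple numerical reading: perturb one input feature at a time with a small random perturbation, measure the variance of the resulting change in the network output, and rank features by that quantity. The Python sketch below illustrates this idea for a Gaussian-kernel RBFNN; the normalisation by the squared perturbation scale, the function names, and the toy data are illustrative assumptions, not the exact formula derived in the paper.

    # Illustrative sketch: variance-based feature sensitivity for a
    # Gaussian-kernel RBF network. The normalisation by eps**2 and all
    # names below are assumptions for illustration only.
    import numpy as np

    def rbf_output(X, centers, width, weights):
        """Forward pass of a single-output Gaussian RBF network."""
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        phi = np.exp(-d2 / (2.0 * width ** 2))
        return phi @ weights

    def feature_sensitivity(X, centers, width, weights, eps=1e-3, n_draws=200, seed=0):
        """Variance of the output perturbation when only one feature is
        perturbed; a small finite eps stands in for the limit eps -> 0."""
        rng = np.random.default_rng(seed)
        y0 = rbf_output(X, centers, width, weights)
        sens = np.zeros(X.shape[1])
        for i in range(X.shape[1]):
            diffs = []
            for _ in range(n_draws):
                Xp = X.copy()
                Xp[:, i] += rng.normal(0.0, eps, size=X.shape[0])
                diffs.append(rbf_output(Xp, centers, width, weights) - y0)
            # Divide by eps**2 so the estimate stays finite as eps shrinks.
            sens[i] = np.var(np.concatenate(diffs)) / eps ** 2
        return sens

    # Toy usage: the target depends only on feature 0, so after a
    # least-squares fit of the output weights, feature 0 typically
    # receives the largest sensitivity and the other features are
    # candidates for removal.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 4))
    y = np.sin(X[:, 0])
    centers = X[:20]
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / 2.0)
    weights, *_ = np.linalg.lstsq(phi, y, rcond=None)
    print(feature_sensitivity(X, centers, 1.0, weights))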

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, X., Li, C. (2005). A New Definition of Sensitivity for RBFNN and Its Applications to Feature Reduction. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_12

  • DOI: https://doi.org/10.1007/11427391_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-25912-1

  • Online ISBN: 978-3-540-32065-4

  • eBook Packages: Computer Science (R0)
