Abstract
Among the traditional methods introduced in Chap. 2, none involves the activation function in its sensitivity calculation. This chapter attempts to generalize Piché's method by parameterizing antisymmetric squashing activation functions, through which a universal expression for the sensitivity of the multilayer perceptron (MLP) will be derived without any restriction on input or output perturbations.
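The chapter's closed-form result is not reproduced in this preview. As a rough illustration of the setting only, the sketch below pairs one antisymmetric squashing family, f(x) = tanh(lam * x) (odd, bounded, and monotone, standing in here for the chapter's parameterized family), with a plain Monte Carlo estimate of an MLP's expected squared output deviation under Gaussian input perturbation. Every function name and the choice of tanh(lam * x) are illustrative assumptions, not taken from the chapter.

# Illustrative sketch only; not the chapter's derivation.
import numpy as np

def squash(x, lam=1.0):
    # One antisymmetric squashing activation: odd, bounded, monotone.
    # tanh(lam * x) is a single member of such a parameterized family.
    return np.tanh(lam * x)

def mlp_output(x, weights, lam=1.0):
    # Forward pass through a fully connected MLP, applying the
    # parameterized squashing activation at every layer.
    a = x
    for W in weights:
        a = squash(W @ a, lam)
    return a

def sensitivity_mc(x, weights, sigma=0.01, lam=1.0, n_trials=10000, seed=0):
    # Estimate E[||y(x + dx) - y(x)||^2] over Gaussian input
    # perturbations dx ~ N(0, sigma^2 I) by Monte Carlo sampling.
    rng = np.random.default_rng(seed)
    y0 = mlp_output(x, weights, lam)
    dev = 0.0
    for _ in range(n_trials):
        dx = rng.normal(0.0, sigma, size=x.shape)
        dev += np.sum((mlp_output(x + dx, weights, lam) - y0) ** 2)
    return dev / n_trials

# Example: a 3-4-2 MLP with random weights and a small input perturbation.
rng = np.random.default_rng(42)
weights = [rng.normal(size=(4, 3)), rng.normal(size=(2, 4))]
x = np.ones(3)
print(sensitivity_mc(x, weights, sigma=0.01, lam=0.5))

Varying lam and sigma shows how the steepness of the squashing function and the perturbation magnitude jointly drive the sensitivity estimate; this is the kind of dependence the chapter captures analytically rather than by simulation.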
Cite this chapter
Yeung, D.S., Cloete, I., Shi, D., Ng, W.W. (2009). Sensitivity Analysis with Parameterized Activation Function. In: Sensitivity Analysis for Neural Networks. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02532-7_4