Activation Functions

  • N. C. Steele
  • C. R. Reeves
  • E. I. Gaura
Conference paper


This paper considers an alternative activation function for use with MLP networks. Performance on parity problems is examined, and it is found that only n-1 hidden units are needed to resolve the n-bit problem. Insight is also gained into the families of network parameters generated. Use as the kernel of a support vector machine for particular problems is anticipated.
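The n-bit parity benchmark referred to above can be generated exhaustively: the input set is all 2^n binary vectors, and the target is 1 exactly when an odd number of bits are set. This is the standard construction of the benchmark, not code from the paper (whose activation function and training procedure are not given here):

```python
from itertools import product

def parity_dataset(n):
    """All 2**n binary input patterns with their parity targets.

    The target is 1 when the number of set bits is odd, 0 otherwise;
    this is the n-bit generalisation of XOR used as an MLP benchmark.
    """
    X = list(product([0, 1], repeat=n))
    y = [sum(x) % 2 for x in X]
    return X, y

X, y = parity_dataset(3)
# 3-bit parity: 8 patterns, e.g. (1, 0, 1) -> 0 and (1, 1, 1) -> 1
```

Because the dataset is exhaustive, a network is said to "resolve" the n-bit problem only when it classifies all 2^n patterns correctly.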


Keywords: Support Vector Machine · Radial Basis Function · Activation Function · Sigmoid Function · Radial Basis Function Network





Copyright information

© Springer-Verlag Wien 2001

Authors and Affiliations

  • N. C. Steele
  • C. R. Reeves
  • E. I. Gaura
  School of MIS, Coventry University, Coventry, UK
