Abstract
Our recent work on artificial neural networks points to the possibility of extending the activation function of the standard artificial neuron model with a conditional signal accumulation technique, thus significantly enhancing the capabilities of neural networks. We present a new artificial neuron model, called Sigma-if, which can dynamically tune the size of the decision space under consideration, a property resulting from its novel activation function. The paper discusses the construction of the proposed neuron as well as the training of Sigma-if feedforward neural networks on well-known benchmark classification problems.
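As a rough illustration of the conditional signal accumulation idea, one can imagine a neuron whose input connections are partitioned into ordered groups, with weighted group sums accumulated one group at a time and aggregation stopping early once a threshold is crossed. The sketch below is an assumption-laden reading of the abstract, not the paper's exact formulation: the grouping scheme, the stopping rule, the threshold name `phi_star`, and all identifiers are hypothetical.

```python
import numpy as np

def sigma_if_activation(x, w, groups, phi_star,
                        f=lambda s: 1.0 / (1.0 + np.exp(-s))):
    """Hypothetical sketch of conditional signal accumulation.

    x, w      : input and weight vectors of equal length
    groups    : ordered list of index arrays (most important inputs first)
    phi_star  : aggregation threshold at which accumulation stops early
    f         : output activation applied to the accumulated signal
    """
    s = 0.0
    for g in groups:
        s += float(np.dot(w[g], x[g]))   # add this group's weighted contribution
        if abs(s) >= phi_star:           # conditional stop: later groups are ignored,
            break                        # shrinking the decision space considered
    return f(s)

# Example: with a low threshold, only the first group is ever consulted.
x = np.array([1.0, 1.0, 1.0])
w = np.array([2.0, -5.0, -5.0])
groups = [np.array([0]), np.array([1, 2])]
early = sigma_if_activation(x, w, groups, phi_star=1.5)    # stops after group 0
full  = sigma_if_activation(x, w, groups, phi_star=100.0)  # accumulates everything
```

With `phi_star = 1.5` the first group's contribution (2.0) already crosses the threshold, so the remaining inputs never influence the output; raising the threshold makes the neuron behave like an ordinary weighted-sum unit.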
Copyright information
© 2005 Springer-Verlag/Wien
Cite this paper
Huk, M., Kwasnicka, H. (2005). The Concept and Properties of Sigma-if Neural Network. In: Ribeiro, B., Albrecht, R.F., Dobnikar, A., Pearson, D.W., Steele, N.C. (eds) Adaptive and Natural Computing Algorithms. Springer, Vienna. https://doi.org/10.1007/3-211-27389-1_4
Print ISBN: 978-3-211-24934-5
Online ISBN: 978-3-211-27389-0