
Evolution of Cubic Spline Activation Functions for Artificial Neural Networks

  • Conference paper
Progress in Artificial Intelligence (EPIA 2001)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2258)

Abstract

The most common (often the only) choice of activation function (AF) for the multi-layer perceptrons (MLPs) widely used in research, engineering, and business is the logistic function. Among the reasons for this popularity are its boundedness in the unit interval, the fast computability of the function and of its derivative, and a number of amenable mathematical properties in the realm of approximation theory. However, considering the huge variety of problem domains in which MLPs are applied, it is intriguing to suspect that specific problems call for specific activation functions. Biological neural networks, with their enormous variety of neurons mastering a set of complex tasks, may be taken to motivate this hypothesis. We present a number of experiments evolving the structure and activation functions of generalized multi-layer perceptrons (GMLPs), using the parallel netGEN system to train the evolved architectures. For the evolution of activation functions we employ cubic splines, and we compare the evolved cubic spline ANNs with evolved sigmoid ANNs on synthetic classification problems that allow conclusions with respect to the shape of the decision boundaries. We also report an interesting observation concerning Minsky's Paradox.
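
As a minimal sketch of the idea (illustrative Python, not the authors' netGEN encoding), an activation function can be defined as a cubic spline through a handful of control points whose ordinates would, in the evolutionary setting described above, form part of the genotype. The knot values below are hypothetical; the standard logistic AF is included for comparison.

    # Minimal sketch: a cubic spline activation function over a fixed
    # input range, next to the standard logistic AF. The knot ordinates
    # are hypothetical stand-ins for evolvable genotype parameters.
    import numpy as np
    from scipy.interpolate import CubicSpline

    def logistic(x):
        # Standard logistic AF, bounded in the unit interval.
        return 1.0 / (1.0 + np.exp(-x))

    knots_x = np.linspace(-4.0, 4.0, 7)   # fixed knot abscissae
    knots_y = np.array([0.0, 0.05, 0.2, 0.5, 0.8, 0.95, 1.0])  # hypothetical, evolvable
    spline_af = CubicSpline(knots_x, knots_y)

    x = np.linspace(-4.0, 4.0, 9)
    print(np.round(spline_af(x), 3))   # spline activation values
    print(np.round(logistic(x), 3))    # logistic values for comparison

Like the logistic function, the spline and its derivative (available via spline_af.derivative()) are cheap to evaluate inside the knot range, so such an AF retains the computational convenience noted above while letting evolution shape the transfer characteristic.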

Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Mayer, H.A., Schwaiger, R. (2001). Evolution of Cubic Spline Activation Functions for Artificial Neural Networks. In: Brazdil, P., Jorge, A. (eds) Progress in Artificial Intelligence. EPIA 2001. Lecture Notes in Computer Science, vol 2258. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45329-6_10

  • DOI: https://doi.org/10.1007/3-540-45329-6_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43030-8

  • Online ISBN: 978-3-540-45329-1

  • eBook Packages: Springer Book Archive
