Abstract
In Chapter 7 we showed that the saturated-linear activation function is not unique in its Turing universality; rather, various sigmoidal-like activation functions can form finite-size architectures that are Turing universal as well. The class of activation functions considered in this chapter is much wider than that of the previous chapter, and as a result the lower bound on its computational power is weaker. We prove that any function whose limits at minus and plus infinity both exist and differ can serve as an activation function for the neurons, yielding a network that is at least as strong computationally as a finite automaton.
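The chapter's construction is general, but a minimal sketch can convey the idea: an activation with different limits at minus and plus infinity (here the logistic sigmoid, limits 0 and 1) behaves like a threshold gate once signals are saturated, and threshold gates suffice to simulate a finite automaton. The parity automaton below, the gain value, and all weights are illustrative choices for this sketch, not the book's construction.

```python
import math

def sigma(z, gain=50.0):
    # Logistic sigmoid: its limits at -inf (0) and +inf (1) exist and
    # differ, so it satisfies the chapter's "different limits" condition.
    # A high gain pushes saturated 0/1 signals back toward 0/1.
    return 1.0 / (1.0 + math.exp(-gain * z))

def step(q0, q1, x):
    # One network update per input symbol x in {0, 1}.
    # Hidden layer: soft AND gates (fire iff both inputs are ~1).
    a = sigma(q0 + (1 - x) - 1.5)  # in state q0 and read 0
    b = sigma(q1 + x - 1.5)        # in state q1 and read 1
    c = sigma(q0 + x - 1.5)        # in state q0 and read 1
    d = sigma(q1 + (1 - x) - 1.5)  # in state q1 and read 0
    # Output layer: soft OR gates produce the one-hot next state.
    return sigma(a + b - 0.5), sigma(c + d - 0.5)

def accepts(bits):
    # Parity automaton: q0 = even number of 1s seen, q1 = odd.
    q0, q1 = 1.0, 0.0  # start in q0
    for x in bits:
        q0, q1 = step(q0, q1, x)
    return q1 > 0.5  # accept iff the parity of 1s is odd

assert accepts([1, 0, 1, 1]) is True   # three 1s -> odd parity
assert accepts([1, 1, 0, 0]) is False  # two 1s -> even parity
```

Because the sigmoid only approximates a hard threshold, the state neurons carry values near, rather than exactly at, 0 and 1; the saturation of the activation keeps this error from accumulating across steps, which is the robustness property the different-limits condition buys.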
Copyright information
© 1999 Springer Science+Business Media New York
About this chapter
Cite this chapter
Siegelmann, H.T. (1999). Different-limits Networks. In: Neural Networks and Analog Computation. Progress in Theoretical Computer Science. Birkhäuser, Boston, MA. https://doi.org/10.1007/978-1-4612-0707-8_8
DOI: https://doi.org/10.1007/978-1-4612-0707-8_8
Publisher Name: Birkhäuser, Boston, MA
Print ISBN: 978-1-4612-6875-8
Online ISBN: 978-1-4612-0707-8