Abstract
In the dynamical analysis of RNNs, the activation function is a key factor shaping the network's dynamics, and many different activation functions have been employed. Recent studies [19, 51, 54, 75, 76, 203, 204, 208] focus on a class of RNNs with unsaturating linear threshold activation functions (LT networks). This class of neural networks has potential in many important applications. In [76], an efficient silicon design for LT networks was demonstrated, together with a discussion of the co-existence of analog amplification and digital selection in network circuits. This transfer function is also argued to be more appropriate for RNNs [51]. Because the unsaturating piecewise linear activation function is unbounded, the networks may exhibit more complex dynamic properties than their saturating counterparts.
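To make the class of networks discussed above concrete, the following sketch simulates a small LT network. It assumes the standard additive model dx/dt = -x + W σ(x) + h with σ(u) = max(0, u); the specific weight matrix, input vector, and Euler discretization here are illustrative choices, not taken from the chapter.

```python
import numpy as np

def lt(x):
    # Linear threshold (unsaturating piecewise linear) activation:
    # sigma(u) = max(0, u) -- linear for u > 0, zero otherwise, unbounded above.
    return np.maximum(0.0, x)

def simulate_lt_network(W, h, x0, dt=0.01, steps=1000):
    # Forward-Euler integration of the additive LT network model
    #   dx/dt = -x + W sigma(x) + h
    # (an assumed standard form; the chapter's exact model may differ).
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (-x + W @ lt(x) + h)
        traj.append(x.copy())
    return np.array(traj)

# Illustrative two-neuron network with mutual inhibition.
W = np.array([[0.0, -0.5],
              [-0.5, 0.0]])
h = np.array([1.0, 0.5])
traj = simulate_lt_network(W, h, x0=[0.2, 0.8])
```

For this choice of W and h the trajectory settles at the equilibrium (1, 0), where the second neuron is pinned at the threshold; with stronger self-excitation the unbounded activation can instead produce divergent trajectories, which is one source of the richer dynamics noted in the abstract.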
Copyright information
© 2004 Springer Science+Business Media Dordrecht
About this chapter
Cite this chapter
Yi, Z., Tan, K.K. (2004). Recurrent Neural Networks with Unsaturating Piecewise Linear Activation Functions. In: Convergence Analysis of Recurrent Neural Networks. Network Theory and Applications, vol 13. Springer, Boston, MA. https://doi.org/10.1007/978-1-4757-3819-3_4
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4757-3821-6
Online ISBN: 978-1-4757-3819-3
eBook Packages: Springer Book Archive