Recurrent Neural Networks with Unsaturating Piecewise Linear Activation Functions

  • Zhang Yi
  • K. K. Tan
Part of the Network Theory and Applications book series (NETA, volume 13)

Abstract

It is known that in the dynamical analysis of RNNs, the activation functions are important factors affecting the dynamics of neural networks. Various activation functions have been used for neural networks. Recently, the studies reported in [19, 51, 75, 76, 203, 204, 208, 54] have focused on a class of RNNs with unsaturating linear threshold activation functions (LT networks). This class of neural networks shows potential for many important applications. In [76], an efficient silicon design for LT networks has been demonstrated, with a discussion of the co-existence of analog amplification and digital selection in network circuits. It has also been argued that this transfer function is better suited to RNNs [51]. Since unsaturating piecewise linear activation functions are unbounded, the networks may exhibit more complex dynamical properties.
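As a minimal illustration (not the authors' code), the linear threshold activation is the piecewise linear function σ(s) = max(0, s), which is unbounded above. A commonly studied LT network model takes the form dx/dt = -x + Wσ(x) + h; the sketch below simulates such a system with forward Euler integration, where the weight matrix `W`, input `h`, step size, and horizon are all illustrative assumptions:

```python
import numpy as np

def lt(s):
    """Linear threshold (LT) activation: piecewise linear and
    unsaturating, i.e. unbounded above."""
    return np.maximum(0.0, s)

def simulate_lt_network(W, h, x0, dt=0.01, steps=1000):
    """Forward-Euler simulation of the LT network dynamics
    dx/dt = -x + W*sigma(x) + h (parameters are illustrative)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + W @ lt(x) + h)
    return x

# A small two-neuron example with mutual inhibition.
W = np.array([[0.0, -0.5],
              [-0.5, 0.0]])   # hypothetical symmetric inhibitory weights
h = np.array([1.0, 0.5])      # hypothetical external inputs
x_final = simulate_lt_network(W, h, np.zeros(2))
```

For these example parameters the trajectory settles near an equilibrium where one unit is active and the other is driven to zero, a simple instance of the "digital selection" behavior discussed in [76].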

Keywords

Equilibrium Point, Activation Function, Convergence Analysis, Recurrent Neural Network, Complete Convergence


Copyright information

© Springer Science+Business Media Dordrecht 2004

Authors and Affiliations

  • Zhang Yi (1)
  • K. K. Tan (2)
  1. School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, People’s Republic of China
  2. Department of Electrical and Computer Engineering, The National University of Singapore, Singapore
