Recurrent Neural Networks with Unsaturating Piecewise Linear Activation Functions
It is well known that in the dynamical analysis of RNNs, the activation function is an important factor affecting the dynamics of the network. Various activation functions have been used for neural networks. Recently, the studies reported in [19, 51, 75, 76, 203, 204, 208, 54] have focused on a class of RNNs with unsaturating linear threshold activation functions (LT networks). This class of neural networks has potential in many important applications. In , an efficient silicon design for LT networks was demonstrated, together with a discussion of the co-existence of analog amplification and digital selection in network circuits. This transfer function is also more appropriate for RNNs. Since the unsaturating piecewise linear activation function is unbounded, more complex dynamic properties may exist in such networks.
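The linear threshold activation and its effect on network dynamics can be illustrated with a minimal sketch. The discrete-time update rule, the weight matrix, and the external input below are illustrative assumptions, not taken from the cited works (which typically study continuous-time LT dynamics); they serve only to show the unsaturating activation sigma(u) = max(0, u) and how lateral inhibition yields both analog amplification and digital selection.

```python
def lt(u):
    """Unsaturating piecewise linear (linear threshold) activation:
    sigma(u) = max(0, u). Unbounded above, unlike sigmoid or tanh."""
    return max(0.0, u)

def step(W, h, x):
    """One synchronous update x(t+1) = sigma(W x(t) + h) of a small
    LT network with weight matrix W and external input h (an assumed
    discrete-time formulation, for illustration only)."""
    n = len(x)
    return [lt(sum(W[i][j] * x[j] for j in range(n)) + h[i])
            for i in range(n)]

# Two neurons with mutual inhibition and unequal external inputs.
W = [[0.0, -0.5],
     [-0.5, 0.0]]
h = [1.0, 0.2]
x = [0.0, 0.0]
for _ in range(50):
    x = step(W, h, x)

# The trajectory converges to the equilibrium (1.0, 0.0): the neuron
# with the stronger input keeps its analog value (amplification) while
# the weaker one is suppressed to zero (digital selection).
print(x)  # → [1.0, 0.0]
```

In this toy example the iteration reaches an equilibrium point, but because the LT activation is unbounded, other choices of W can produce trajectories that grow without bound, which is one reason the convergence analysis of LT networks is more delicate than for saturating activations.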
Keywords: Equilibrium Point, Activation Function, Convergence Analysis, Recurrent Neural Network, Complete Convergence