Neural Processing Letters, Volume 21, Issue 1, pp 53–60

Complete Convergence of Competitive Neural Networks with Different Time Scales

  • Mao Ye
  • Yi Zhang


This paper studies the complete convergence of a class of competitive neural networks with different time scales whose activation functions are unsaturated piecewise linear functions. Under this assumption the network can possess multiple equilibrium points, so traditional convergence methods, which presuppose a unique equilibrium, do not apply. Complete convergence is proved by constructing an energy-like function, and simulations are employed to illustrate the theory.
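The class of networks described above combines fast neural-activity dynamics with slow synaptic-memory dynamics and an unsaturated piecewise linear activation. A minimal simulation sketch of such a two-time-scale system is given below; the model form, parameter names (`W`, `b`, `eps`), and all numerical values are illustrative assumptions for demonstration, not the specific system or parameters from the paper.

```python
import numpy as np

def relu(x):
    """Unsaturated piecewise linear activation: f(x) = max(x, 0)."""
    return np.maximum(x, 0.0)

def simulate(W, b, x0, m0, eps=0.1, dt=0.005, steps=20000):
    """Euler integration of an illustrative two-time-scale network:
        eps * x' = -x + W f(x) + b * m   (fast neural activity)
              m' = -m + f(x)             (slow synaptic memory)
    eps << 1 makes x evolve much faster than m."""
    x = np.array(x0, dtype=float)
    m = np.array(m0, dtype=float)
    for _ in range(steps):
        x_new = x + (dt / eps) * (-x + W @ relu(x) + b * m)
        m = m + dt * (-m + relu(x))
        x = x_new
    return x, m

# With these (assumed) weights the trajectory settles to an equilibrium,
# i.e. a point satisfying x = W f(x) + b * m and m = f(x).
W = np.array([[0.2, -0.3],
              [-0.3, 0.2]])
x, m = simulate(W, b=0.1, x0=[1.0, -1.0], m0=[0.0, 0.0])
```

For the weights chosen here the only equilibrium is the origin and the state converges there; with other weight choices such networks can have multiple equilibria, which is the regime the paper addresses.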


Keywords: complete convergence; different time scales; recurrent neural network; unsaturated piecewise linear function





Copyright information

© Springer 2005

Authors and Affiliations

  1. Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, PR China
  2. Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, PR China
