Memory Capacity of Input-Driven Echo State Networks at the Edge of Chaos

  • Peter Barančok
  • Igor Farkaš
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8681)


Reservoir computing provides a promising approach to the efficient training of recurrent neural networks by exploiting the computational properties of the reservoir structure. Various approaches, ranging from suitable initialization to reservoir optimization by training, have been proposed. In this paper we take a closer look at short-term memory capacity, introduced by Jaeger in the case of echo state networks. Memory capacity has recently been investigated with respect to criticality, the so-called edge of chaos, the point at which the network switches from a stable to an unstable dynamical regime. We compute the memory capacity of the networks for various input data sets, both random and structured, and show how the data distribution affects network performance. We also investigate the effect of reservoir sparsity in this context.
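As a rough illustration of the quantity studied here, the following sketch estimates Jaeger's short-term memory capacity of a small echo state network: the sum, over delays k, of the squared correlation between the delayed input u(t−k) and a linear readout trained to reconstruct it. All sizes, the spectral radius of 0.9, and the uniform input distribution are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: reservoir size, time steps, maximum delay, washout
N, T, k_max, washout = 50, 2000, 40, 100

# Random reservoir weights rescaled to an assumed spectral radius of 0.9,
# and random input weights
W = rng.normal(size=(N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()
W_in = rng.uniform(-0.5, 0.5, size=N)

# Drive the reservoir with i.i.d. uniform input and record the states
u = rng.uniform(-1, 1, size=T)
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# For each delay k, fit a linear readout by least squares and add
# the squared correlation between its output and the delayed input
MC = 0.0
Xs = X[washout:]
for k in range(1, k_max + 1):
    target = u[washout - k:T - k]          # input delayed by k steps
    w, *_ = np.linalg.lstsq(Xs, target, rcond=None)
    r = np.corrcoef(Xs @ w, target)[0, 1]
    MC += r ** 2

print(f"estimated memory capacity: {MC:.2f}")
```

In theory MC is bounded above by the reservoir size N; in practice the estimate depends on the input distribution and the dynamical regime of the reservoir, which is exactly the dependence the paper examines.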


Keywords: echo state network, memory capacity, edge of chaos


References


  1. Bertschinger, N., Natschläger, T.: Real-time computation at the edge of chaos in recurrent neural networks. Neural Computation 16(7), 1413–1436 (2004)
  2. Boedecker, J., Obst, O., Lizier, J., Mayer, N., Asada, M.: Information processing in echo state networks at the edge of chaos. Theory in Biosciences 131, 205–213 (2012)
  3. Büsing, L., Schrauwen, B., Legenstein, R.: Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons. Neural Computation 22(5), 1272–1311 (2010)
  4. Hermans, M., Schrauwen, B.: Memory in linear recurrent neural networks in continuous time. Neural Networks 23, 341–355 (2010)
  5. Huebner, U., Abraham, N., Weiss, C.: Dimensions and entropies of chaotic intensity pulsations in a single-mode far-infrared NH3 laser. Physical Review A 40(11), 6354–6365 (1989)
  6. Jaeger, H.: Short term memory in echo state networks. Tech. Rep. GMD Report 152, German National Research Center for Information Technology (2002)
  7. Jaeger, H.: Echo state network. Scholarpedia 2(9) (2007)
  8. Legenstein, R., Maass, W.: What makes a dynamical system computationally powerful? In: New Directions in Statistical Signal Processing: From Systems to Brain, pp. 127–154. MIT Press (2007)
  9. Lukoševičius, M., Jaeger, H.: Reservoir computing approaches to recurrent neural network training. Computer Science Review 3(3), 127–149 (2009)
  10. Ozturk, M., Xu, C., Principe, J.: Analysis and design of echo state networks. Neural Computation 19, 111–138 (2006)
  11. Rodan, A., Tiňo, P.: Minimum complexity echo state network. IEEE Transactions on Neural Networks 21(1), 131–144 (2011)
  12. Schrauwen, B., Buesing, L., Legenstein, R.: On computational power and the order-chaos phase transition in reservoir computing. In: Advances in Neural Information Processing Systems, pp. 1425–1432 (2009)
  13. Sprott, J.: Chaos and Time-Series Analysis. Oxford University Press (2003)
  14. Verstraeten, D., Dambre, J., Dutoit, X., Schrauwen, B.: Memory versus non-linearity in reservoirs. In: International Joint Conference on Neural Networks, pp. 1–8 (2010)
  15. White, O., Lee, D., Sompolinsky, H.: Short-term memory in orthogonal neural networks. Physical Review Letters 92(14), 148102 (2004)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Peter Barančok (1)
  • Igor Farkaš (1)
  1. Faculty of Mathematics, Physics and Informatics, Comenius University in Bratislava, Slovakia
