
Determining Reservoir Topologies from Short-Term Memory in Echo State Networks

  • Qianli Ma
  • Weibiao Chen
Part of the Communications in Computer and Information Science book series (CCIS, volume 321)

Abstract

Reservoir computing (RC), represented by echo state networks (ESNs), is a novel class of recurrent neural networks (RNNs) that is increasingly used in classification, chaotic time-series prediction, speech recognition, and other tasks. An ESN consists of a large number of randomly connected neurons (the "reservoir") and an adaptable output layer. The short-term memory of the reservoir strongly affects the performance of an ESN. However, because the neurons in the reservoir are randomly connected, the relationship between the topological structure of the reservoir and the short-term memory of an ESN is not yet fully understood. In this paper, we establish a direct relationship between the memory of the network and its connectivity. We transform the iterative mathematical model of the ESN into a direct one, from which the reservoir topology can be determined inversely from a desired short-term memory. Furthermore, we find that several reservoir topologies proposed in previous papers are special solutions of our method.
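
To make the quantities discussed in the abstract concrete, the sketch below builds a minimal ESN with the standard iterative update x(t) = tanh(W x(t-1) + W_in u(t)) and estimates Jaeger-style short-term memory capacity by training one linear readout per input delay and summing the squared correlations. This is only an illustrative reconstruction of the conventional ESN setup, not the paper's inverse method; the reservoir size, spectral radius, input scaling, and delay horizon are all assumed values chosen for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Minimal ESN setup; all sizes and scalings here are illustrative
    # assumptions, not values taken from the paper.
    N = 100          # reservoir size
    T = 2000         # length of the driving input signal
    washout = 200    # initial transient discarded before training
    max_delay = 40   # longest input delay probed for memory

    # Random reservoir and input weights; rescale W to spectral radius
    # 0.9 so the echo state property plausibly holds.
    W = rng.normal(size=(N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-0.5, 0.5, size=N)

    # Drive the reservoir with i.i.d. uniform input using the iterative
    # model x(t) = tanh(W x(t-1) + W_in u(t)) and record all states.
    u = rng.uniform(-1.0, 1.0, size=T)
    X = np.zeros((T, N))
    x = np.zeros(N)
    for t in range(T):
        x = np.tanh(W @ x + W_in * u[t])
        X[t] = x

    # Jaeger-style memory capacity: for each delay k, fit a linear
    # readout reconstructing u(t-k) from x(t), then sum the squared
    # correlations between reconstruction and delayed input.
    mc = 0.0
    for k in range(1, max_delay + 1):
        states = X[washout:]
        target = u[washout - k : T - k]
        w_out, *_ = np.linalg.lstsq(states, target, rcond=None)
        r = np.corrcoef(states @ w_out, target)[0, 1]
        mc += r ** 2
    print(f"estimated short-term memory capacity: {mc:.2f}")

For a purely linear reservoir, the same iterative model unrolls into the direct form x(t) = sum_{k>=0} W^k W_in u(t-k), which expresses the state as an explicit function of past inputs. It is this kind of closed-form relation between the weight matrix W and the retrievable delays that makes the inverse direction conceivable: starting from a prescribed memory profile and solving back for a reservoir topology.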

Keywords

Reservoir computing · Echo state networks · Memory capability · Reservoir topology


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Qianli Ma (1)
  • Weibiao Chen (1)

  1. School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
