
Determining Reservoir Topologies from Short-Term Memory in Echo State Networks

  • Conference paper

Part of the book series: Communications in Computer and Information Science (CCIS, volume 321)

Abstract

Reservoir computing (RC), exemplified by echo state networks (ESNs), is a class of recurrent neural networks (RNNs) that is increasingly used in classification, chaotic time series prediction, speech recognition, and related tasks. An ESN consists of a large number of randomly connected neurons (the "reservoir") and an adaptable output layer. The short-term memory of the reservoir strongly affects the performance of an ESN. However, because the reservoir neurons are connected at random, the relationship between the topological structure of the reservoir and short-term memory in ESNs is not yet fully understood. In this paper, we establish a direct relationship between the memory of the network and its connectivity: we transform the iterative mathematical model of the ESN into a direct one, from which the reservoir topology can be determined inversely from the short-term memory. Furthermore, we find that several reservoir topologies proposed in previous work are special solutions of our method.
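
The abstract refers to the iterative ESN state update and to the reservoir's short-term memory. As background only, the sketch below illustrates the standard short-term memory capacity experiment from the ESN literature (Jaeger's 2002 technical report): a randomly connected reservoir is driven by an i.i.d. input, and linear readouts are trained to reconstruct delayed copies of that input. This is a minimal sketch of the quantity the paper works with, not the authors' inverse construction of a topology from a prescribed memory; all sizes and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and parameters (not taken from the paper).
N, T, washout, max_delay = 100, 4000, 200, 40

# Random reservoir, rescaled to spectral radius 0.9 (a common echo state heuristic).
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=N)

# Drive the reservoir with i.i.d. uniform input and record the states.
u = rng.uniform(-1.0, 1.0, size=T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])   # iterative ESN state update
    states[t] = x

# Short-term memory capacity: for each delay k, train a linear readout to
# reconstruct u(t - k) from x(t) and sum the squared correlations.
X = states[washout:]
mc = 0.0
for k in range(1, max_delay + 1):
    target = u[washout - k:T - k]                      # u(t - k) aligned with x(t)
    w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
    mc += np.corrcoef(X @ w_out, target)[0, 1] ** 2

print(f"Estimated short-term memory capacity: {mc:.2f} (bounded above by N = {N})")
```

For a reservoir of N neurons the memory capacity is bounded above by N, which is what makes the trade-off between reservoir connectivity and achievable short-term memory concrete.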

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ma, Q., Chen, W. (2012). Determining Reservoir Topologies from Short-Term Memory in Echo State Networks. In: Liu, CL., Zhang, C., Wang, L. (eds) Pattern Recognition. CCPR 2012. Communications in Computer and Information Science, vol 321. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33506-8_9

  • DOI: https://doi.org/10.1007/978-3-642-33506-8_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33505-1

  • Online ISBN: 978-3-642-33506-8

  • eBook Packages: Computer Science, Computer Science (R0)
