Determining Reservoir Topologies from Short-Term Memory in Echo State Networks
Reservoir computing (RC), exemplified by the echo state network (ESN), is a novel class of recurrent neural network (RNN) that is increasingly used in classification, chaotic time series prediction, speech recognition, and other tasks. An ESN consists of a large number of randomly connected neurons (the "reservoir") and an adaptable output layer. The short-term memory of the reservoir strongly affects the performance of the ESN. However, because the reservoir neurons are randomly connected, the relationship between the topological structure of the reservoir and short-term memory in the ESN is not yet fully understood. In this paper, we establish a direct relationship between the memory of the network and its connectivity: we transform the iterative mathematical model of the ESN into a direct one, from which the reservoir topology can be determined inversely from the desired short-term memory. Furthermore, we find that some reservoir topologies proposed in previous papers are special solutions of our method.
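As background for the short-term memory property discussed above, the sketch below builds a minimal ESN with a random reservoir and estimates its memory capacity in the usual way (Jaeger's sum of squared correlations between delayed inputs and trained linear readouts). All sizes, scalings, and the spectral radius of 0.9 are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout, max_delay = 100, 2000, 200, 20  # assumed sizes for illustration

# Random reservoir, rescaled to spectral radius 0.9 (common echo-state heuristic)
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, N)               # random input weights

u = rng.uniform(-0.5, 0.5, T)                  # i.i.d. scalar input signal
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])           # standard ESN state update
    states[t] = x

X = states[washout:]                           # discard initial transient
mc = 0.0
for k in range(1, max_delay + 1):
    target = u[washout - k:T - k]              # input delayed by k steps
    w_out, *_ = np.linalg.lstsq(X, target, rcond=None)  # least-squares readout
    y = X @ w_out
    r = np.corrcoef(y, target)[0, 1]
    mc += r ** 2                               # k-th term of the memory function
print(f"memory capacity over first {max_delay} delays: {mc:.2f}")
```

The resulting capacity is bounded by the number of reservoir neurons, and how close a given random topology comes to that bound is exactly the kind of structure–memory relationship the paper inverts.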
Keywords: Reservoir computing · Echo state networks · Memory capability · Reservoir topology