Encyclopedia of Computational Neuroscience

2015 Edition
Editors: Dieter Jaeger, Ranu Jung

Recurrent Network Models, Reservoir Computing

Reference work entry
DOI: https://doi.org/10.1007/978-1-4614-6675-8_796

Synonyms

Echo state networks; Liquid state machines

Definition

Reservoir computing (RC) is a general concept for computation and learning on temporal input streams with dynamical systems. Time can be either continuous or discrete; here we use continuous time for the general definition. An input-driven dynamical system (the reservoir) provides a nonlinearly transformed and temporally integrated representation of the input stream in terms of its internal state. This representation is utilized by a readout, which maps internal states to outputs of the system. The output of the readout can be fed back into the reservoir. The reservoir is typically a recurrent neural network, but other dynamical systems have recently been employed in the spirit of RC. The readout function is adapted through some learning procedure. Originally, this training was supervised and the reservoir was not adapted. Today, many...
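As a concrete illustration (not part of the original entry), the following minimal Python sketch implements an echo state network, one standard RC instantiation in discrete time: a fixed random tanh reservoir whose recurrent weights are rescaled to a spectral radius below 1 (a common heuristic for stable, fading dynamics), driven by a scalar input stream, with only the linear readout trained by ridge regression on a toy delay task. All names and parameter values here (n_res = 200, spectral radius 0.9, ridge constant 1e-6, a 5-step delay, a 100-step washout) are illustrative assumptions, not values from the entry.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200  # illustrative sizes, not from the entry

# Fixed random reservoir: input weights and recurrent weights,
# with the recurrent matrix rescaled to spectral radius 0.9.
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(0.0, 1.0, size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u (T x n_in) and
    return the internal states (T x n_res)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        # The reservoir nonlinearly transforms and temporally
        # integrates the input via its recurrent dynamics.
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: reproduce the input stream delayed by 5 steps.
T = 1000
u = rng.uniform(-1.0, 1.0, size=(T, n_in))
y = np.roll(u, 5, axis=0)  # target: u delayed by 5 steps

X = run_reservoir(u)

# Discard an initial washout so transients from the zero
# initial state do not enter the regression.
washout = 100
Xw, yw = X[washout:], y[washout:]

# Only the linear readout is trained (ridge regression);
# the reservoir itself stays fixed, as in the original RC setup.
ridge = 1e-6
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(n_res), Xw.T @ yw)

y_hat = Xw @ W_out
print("train MSE:", np.mean((y_hat - yw) ** 2))

Because only the linear readout is adapted while the recurrent reservoir stays fixed, training reduces to a linear regression problem; this separation is the main practical appeal of the original RC setup.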

Further Reading

Webpages: Reservoir Computing. http://reservoir-computing.org
Scholarpedia: Echo State Networks

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria