A New Approach towards Vision Suggested by Biologically Realistic Neural Microcircuit Models

  • Wolfgang Maass
  • Robert Legenstein
  • Henry Markram
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2525)


Abstract

We propose an alternative paradigm for processing time-varying visual inputs, in particular for tasks that require temporal and spatial integration, inspired by hypotheses about the computational role of cortical microcircuits. Because this approach does not require detailed knowledge of the precise structure of the microcircuit, it can in principle also be implemented with partially unknown or faulty analog hardware. In addition, it supports parallel real-time processing of time-varying visual inputs for diverse tasks, since different readouts can be trained to concurrently extract completely different information components from the same microcircuit.
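The parallel-readout idea can be illustrated with a generic reservoir-computing toy model (a hedged sketch only, not the authors' spiking microcircuit: a random rate-based recurrent network stands in for the microcircuit, and two independently trained linear readouts extract different information components from the same circuit state at the same time):

```python
# Sketch: a fixed random recurrent "reservoir" plays the role of the
# microcircuit; two linear readouts are trained on its shared states.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, n_steps = 1, 100, 500

# Fixed random recurrent weights, rescaled toward a stable regime.
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=(n_res, n_in))

# A time-varying input signal (a stand-in for a visual input stream).
u = np.sin(np.linspace(0, 20 * np.pi, n_steps)).reshape(-1, 1)

# Run the reservoir once and record its state trajectory.
x = np.zeros(n_res)
states = np.zeros((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Two different targets computed from the same input history:
# readout A tracks the current input, readout B a delayed copy.
delay = 5
y_a = u[:, 0]
y_b = np.roll(u[:, 0], delay)

# Each readout is trained separately by linear regression on the
# identical reservoir states -- no retraining of the circuit itself.
w_a, *_ = np.linalg.lstsq(states[delay:], y_a[delay:], rcond=None)
w_b, *_ = np.linalg.lstsq(states[delay:], y_b[delay:], rcond=None)

err_a = np.mean((states[delay:] @ w_a - y_a[delay:]) ** 2)
err_b = np.mean((states[delay:] @ w_b - y_b[delay:]) ** 2)
```

Both readouts achieve low error from the same untrained circuit, which is the point of the paradigm: task-specific knowledge lives entirely in the cheap linear readouts, not in the recurrent circuit.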


Keywords: cortical microcircuits, recurrent connections, spiking neurons, dynamic synapses, dynamical systems, movement prediction, direction of motion, novelty detection, parallel computing





Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Wolfgang Maass ¹
  • Robert Legenstein ¹
  • Henry Markram ²
  1. Institute for Theoretical Computer Science, Technische Universitaet Graz, Graz, Austria
  2. Brain Mind Institute, Ecole Polytechnique Federale de Lausanne, Lausanne, Switzerland
