Implementing Hebbian learning in a rank-based neural network

  • Manuel Samuelides
  • Simon Thorpe
  • Emmanuel Veneau
Part I: Coding and Learning in Biology
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1327)


Recent work has shown that biologically motivated networks of spiking neurons can potentially process information very quickly by encoding information in the latency at which different neurons fire, rather than by using the firing rate as the code. In this paper, the relevant information is the rank vector of the latency order of competing neurons. We propose a Hebbian reinforcement learning scheme that adjusts the weights of a terminal layer of decision neurons in order to process this information. This learning rule is then shown to be effective on a simple pattern recognition task. In conclusion, we discuss further extensions of this learning strategy for artificial vision.
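The rank-order idea the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exponential modulation factor `mod**rank`, the weight normalisation, and all parameter values are assumptions made for this sketch.

```python
import numpy as np

def rank_order_activation(latencies, weights, mod=0.9):
    """Activation of one decision neuron under rank-order coding:
    earlier-firing inputs (lower latency) contribute more, here via an
    assumed modulation factor mod**rank (rank 0 = earliest spike)."""
    ranks = np.argsort(np.argsort(latencies))
    return float(np.sum(weights * mod ** ranks))

def hebbian_rank_update(latencies, weights, lr=0.1, mod=0.9):
    """Illustrative Hebbian-style rule: reinforce each weight in
    proportion to how early its input fired, then renormalise so the
    weight vector keeps unit length (a choice made for this sketch)."""
    ranks = np.argsort(np.argsort(latencies))
    weights = weights + lr * mod ** ranks
    return weights / np.linalg.norm(weights)

# Toy usage: repeated presentation of one latency pattern drifts the
# weights toward that pattern's firing order, raising the response.
latencies = np.array([5.0, 1.0, 3.0, 2.0])  # input spike latencies (ms)
w = np.ones(4) / 2.0                        # unit-norm initial weights
before = rank_order_activation(latencies, w)
for _ in range(20):
    w = hebbian_rank_update(latencies, w)
after = rank_order_activation(latencies, w)
assert after > before  # the trained neuron responds more to its pattern
```

Because the weights are renormalised after each update, the activation is maximised (by Cauchy-Schwarz) exactly when the weight vector is proportional to the rank-modulation vector, i.e. when the neuron is tuned to that firing order.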


Keywords: Reinforcement Learning, Synaptic Weight, Hebbian Learning, Target Neuron, Hebbian Learning Rule





Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Manuel Samuelides (1, 2)
  • Simon Thorpe (3)
  • Emmanuel Veneau (1, 3)
  1. Ecole Nationale Supérieure de l'Aéronautique et de l'Espace, Toulouse Cedex, France
  2. Computer Science Department, ONERA-CERT, Toulouse Cedex, France
  3. Centre de Recherche Cerveau et Cognition, CNRS, Faculté de Médecine, Toulouse, France
