
Neural Networks with Fluctuating Synapses

  • Conference paper
Fourth Granada Lectures in Computational Physics

Part of the book series: Lecture Notes in Physics (LNP, volume 493)


Abstract

Hopfield-like models for associative memory [1, 2] consist of a set of $N$ binary neurons, $s_x = \pm 1$, whose activity evolves in time by some deterministic or stochastic process. The neurons interact with each other according to, for instance, Hebb's rule $J_{xy} = N^{-1}\sum_{\mu=1}^{P} \xi_x^\mu \xi_y^\mu$, where $\{\xi_x^\mu = \pm 1;\ x \in \Lambda_d\} \equiv \xi^\mu$, $\mu = 1,\ldots,P$, represent the memorized patterns. This is assumed to describe the case in which the intensities $J_{xy}$ have been fixed in a previous learning process, independent of the process by which the neurons evolve in time. Motivated by the situation in biological systems, we argue that neglecting time variations of the synapses beyond those occurring during the learning process is not realistic. We report on a neural-network model that, in addition to learning plasticity, involves relatively fast local fluctuations of the synapse intensities, which vary randomly with time in such a way that their average over the characteristic time scale of the neuron evolution equals the value prescribed by the learning rule involved [3]. The influence of such fluctuations on emergent properties turns out to be interesting. For specific fluctuation distributions we obtain some exact results, namely, effective Hamiltonians for both symmetric and asymmetric couplings. We use the replica-trick formalism to obtain explicit results from these effective Hamiltonians. The most general description is provided by a kinetic mean-field approach, which reveals a varied behavior. We show explicitly that allowing for fluctuations amounts to introducing an extra noise that significantly affects the associative-memory properties. In particular, the occurrence of the spin-glass phase at finite temperature is substantially restricted in the model, and the phase does not appear at zero temperature above a critical value of the number of stored patterns. On the other hand, this version of the model is not critically affected by the Almeida-Thouless line, i.e., the limit of stability of the replica-symmetric solution. We also show that an appropriate choice of the synaptic fluctuation distribution may significantly improve the retrieval process for a finite number of patterns.
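As a rough illustration of the kind of model described above, the following is a minimal sketch (not the authors' code) of a Hopfield network with Hebbian couplings whose synapses are rescaled, at every update, by independent random factors of unit mean; the uniform fluctuation distribution, the Glauber dynamics, and all parameter values are assumptions made only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hebbian couplings J_xy = (1/N) sum_mu xi_x^mu xi_y^mu for P random patterns.
N, P = 200, 5
xi = rng.choice([-1, 1], size=(P, N))      # stored patterns xi^mu
J = (xi.T @ xi) / N                        # Hebb's rule
np.fill_diagonal(J, 0.0)                   # no self-coupling

def sweep(s, T=0.1, noise_amp=0.5):
    """One asynchronous Glauber sweep with fluctuating synapses.

    Each synapse is scaled by an independent factor drawn uniformly from
    [1 - noise_amp, 1 + noise_amp], so its average over many updates equals
    the Hebbian value; this particular distribution is a hypothetical choice.
    """
    for x in rng.permutation(len(s)):
        eta = rng.uniform(1.0 - noise_amp, 1.0 + noise_amp, size=len(s))
        h = np.dot(J[x] * eta, s)                   # local field with noisy couplings
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))   # Glauber transition probability
        s[x] = 1 if rng.random() < p_up else -1
    return s

# Retrieval test: start near pattern 0 and track the overlap m = xi^0 . s / N.
s = xi[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
for _ in range(20):
    s = sweep(s)
print("overlap with pattern 0:", xi[0] @ s / N)
```

The only point the sketch is meant to convey is the separation of time scales: the couplings fluctuate at every neuron update, yet their time average reproduces the Hebbian learning rule, which is the ingredient whose consequences the paper analyzes.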


References

  • Hopfield J. J. (1982): Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. 79, 2554–2558.


  • Peretto P. (1992): An Introduction to the Modeling of Neural Networks (Cambridge University Press, Cambridge).


  • Torres J. J., Garrido P. L. and Marro J. (1997): preprint.




Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Torres, J.J., Garrido, P.L., Marro, J. (1997). Neural Networks with Fluctuating Synapses. In: Garrido, P.L., Marro, J. (eds) Fourth Granada Lectures in Computational Physics. Lecture Notes in Physics, vol 493. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-14148-9_20


  • DOI: https://doi.org/10.1007/978-3-662-14148-9_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-14150-2

  • Online ISBN: 978-3-662-14148-9

