Abstract
The Hopfield-like models for associative memory [1, 2] consist of a set of N binary neurons, $s_x = \pm 1$, whose activity state evolves with time by some either deterministic or stochastic process. The neurons interact with each other according, for instance, to Hebb's rule, $J_{xy} = N^{-1}\sum_{\mu=1}^{P} \xi_x^\mu \xi_y^\mu$, where $\{\xi_x^\mu = \pm 1;\ x \in \Lambda_d\} \equiv \xi^\mu$ represent $\mu = 1, \ldots, P$ memorized patterns. This is assumed to represent the case in which the intensities $J_{xy}$ have been fixed in a previous learning process, independent of the process in which the neurons evolve with time. Motivated by the situation in biological systems, we argue that neglecting time variations of the synapses beyond those during the learning process is not realistic. We report on a neural-network model that, in addition to learning plasticity, involves relatively fast local fluctuations of the synapse intensities, which vary randomly with time in such a way that their average over the characteristic time for the evolution of the neurons has the value corresponding to the involved learning rule [3]. The influence of such fluctuations on emergent properties turns out to be interesting. For specific distributions of the fluctuations we obtain some exact results, namely, effective Hamiltonians for both symmetric and asymmetric couplings. We use the replica-trick formalism to obtain explicit results from these effective Hamiltonians. The most general description is provided by a kinetic mean-field approach, which reveals a varied behavior.
We show explicitly that allowing for fluctuations amounts to introducing an extra noise that significantly affects the properties of associative memory. In particular, the occurrence of the spin-glass phase at finite temperature is substantially restricted in the model, and it does not appear at zero temperature above a critical value of the number of stored patterns. On the other hand, this version of the model is not critically affected by the Almeida-Thouless line, i.e., the limit of stability of the replica-symmetric solution. We also show that an appropriate choice of the synaptic-fluctuation distribution may significantly improve the retrieval process for a finite number of patterns.
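The mechanism described above can be illustrated with a minimal numerical sketch: Hebbian couplings built from random patterns, and each coupling perturbed by fast zero-mean noise before every use, so that its time average recovers the Hebb value. The Gaussian form of the perturbation, the noise amplitude, and all parameter values below are illustrative assumptions, not the specific distributions analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 3
# P random binary patterns xi^mu_x in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Hebb's rule: J_xy = (1/N) * sum_mu xi^mu_x xi^mu_y, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def sweep(s, J, sigma):
    """One asynchronous update sweep at zero temperature.

    Before each neuron update, the couplings are perturbed by zero-mean
    Gaussian noise of amplitude sigma, mimicking fast synaptic
    fluctuations whose average reproduces the Hebbian value.
    """
    for x in rng.permutation(len(s)):
        J_fluct = J[x] + sigma * rng.standard_normal(len(s))
        h = J_fluct @ s                    # local field at neuron x
        s[x] = 1 if h >= 0 else -1
    return s

# Start from pattern 0 with 10% of the spins flipped
s = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s[flip] *= -1

for _ in range(5):
    s = sweep(s, J, sigma=0.01)

# Overlap with the stored pattern; close to 1 when retrieval succeeds
overlap = (s @ xi[0]) / N
print(overlap)
```

For small fluctuation amplitudes the pattern is still retrieved; increasing `sigma` adds the extra effective noise discussed in the abstract and eventually destroys retrieval.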
References
Hopfield J. J. (1982): Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. 79, 2554–2558.
Peretto P. (1992): An Introduction to the Modeling of Neural Networks (Cambridge University Press, Cambridge).
Torres J. J., Garrido P. L. and Marro J. (1997): preprint.
© 1997 Springer-Verlag Berlin Heidelberg
Cite this paper
Torres, J.J., Garrido, P.L., Marro, J. (1997). Neural Networks with Fluctuating Synapses. In: Garrido, P.L., Marro, J. (eds) Fourth Granada Lectures in Computational Physics. Lecture Notes in Physics, vol 493. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-14148-9_20
Print ISBN: 978-3-662-14150-2
Online ISBN: 978-3-662-14148-9