
ASSCOUNT: Associative Memory for Time Sequences

  • Berndt Müller
  • Joachim Reinhardt
Part of the Physics of Neural Networks book series (NEURAL NETWORKS)

Abstract

An obvious way to teach a Hopfield-type network to reproduce time sequences instead of stationary patterns is to introduce synaptic coefficients with an intrinsic time dependence [Am88, Gu88], as discussed in Sect. 3.4. Thus instead of the synaptic matrix $w_{ij}$, which acts instantaneously to generate the local field defined in (20.1), one may introduce a collection of synapses $w_{ij}^{\tau}$ acting with a distribution of characteristic time delays of magnitude $\tau$. The local field, which according to (20.2) determines the probability for the updated neuron state $s_i(t+1)$, can then be replaced by a generalized convolution in time
$$ h_i(t) \;=\; \sum_{\tau=0}^{\tau_{\max}} \sum_{j=1}^{N} \lambda^{\tau}\, w_{ij}^{\tau}\, s_j(t-\tau). \qquad (21.1) $$
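As an illustration of how (21.1) can be evaluated, the following is a minimal NumPy sketch. The array layout (a delay-indexed coupling tensor `w[tau, i, j]` and a state history `s_history[tau, j]` holding $s_j(t-\tau)$), the function name `local_field`, and the toy parameters are assumptions made here for illustration; they are not taken from the ASSCOUNT program itself.

```python
import numpy as np

def local_field(w, s_history, lam):
    """Generalized local field with time-delayed synapses, cf. Eq. (21.1).

    w         : array of shape (tau_max + 1, N, N); w[tau, i, j] is the
                synaptic coupling acting with delay tau (assumed layout).
    s_history : array of shape (tau_max + 1, N); s_history[tau, j] holds the
                past neuron state s_j(t - tau), so row 0 is the current state.
    lam       : weighting factor lambda applied as lambda**tau to each delay.
    """
    tau_max = w.shape[0] - 1
    h = np.zeros(w.shape[1])
    for tau in range(tau_max + 1):
        # accumulate lambda^tau * sum_j w[tau, i, j] * s_j(t - tau)
        h += (lam ** tau) * (w[tau] @ s_history[tau])
    return h

# Toy usage: 4 neurons, delays up to tau_max = 2
rng = np.random.default_rng(0)
N, tau_max = 4, 2
w = rng.normal(size=(tau_max + 1, N, N))
s_hist = rng.choice([-1, 1], size=(tau_max + 1, N))
print(local_field(w, s_hist, lam=0.8))
```

The resulting field $h_i(t)$ would then enter the stochastic update rule (20.2) exactly as the instantaneous local field does in the standard Hopfield case.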

Keywords

Local Field · Learning Rule · Generalized Convolution · Program Description · Synaptic Coupling


Copyright information

© Springer-Verlag Berlin Heidelberg 1990

Authors and Affiliations

  • Berndt Müller¹
  • Joachim Reinhardt²
  1. Department of Physics, Duke University, Durham, USA
  2. Institut für Theoretische Physik, J.-W.-Goethe-Universität, Frankfurt 1, Fed. Rep. of Germany
