Abstract
A novel type of gradient attractor neural network is described, characterized by bistable dynamics of its nodes and by linear coupling between them. In contrast to traditional perceptron-based neural networks, which are plagued by spurious states, this Bistable Gradient Network (BGN) is found to be virtually free of spurious states. The consequences of this property (greatly enhanced memory capacity, fast training, and perfect recall) are illustrated by comparison with a small Hopfield network.
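The abstract describes the network only qualitatively. As a rough illustration of the idea it names, the sketch below assumes each node performs gradient descent on a double-well potential (giving it two stable states near +1 and -1) while a linear Hebbian coupling term pulls the state toward a stored pattern. The equation of motion, the coupling strength `gamma`, and the Euler integration scheme are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def hebbian_weights(patterns):
    """Outer-product (Hebbian) weight matrix with zero self-coupling."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, x0, gamma=1.0, dt=0.05, steps=2000):
    """Euler-integrate dx/dt = x - x**3 + gamma * (w @ x) from state x0.

    The x - x**3 term makes each node bistable (stable rest points near
    +1 and -1); the linear coupling term biases each node toward the
    stored pattern nearest the initial state.
    """
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += dt * (x - x**3 + gamma * (w @ x))
    return np.sign(x)

rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(2, 32))  # two random +/-1 patterns
w = hebbian_weights(patterns)

# Flip two bits of the first pattern and let the network relax.
noisy = patterns[0].copy()
noisy[:2] *= -1
restored = recall(w, noisy)
print(np.array_equal(restored, patterns[0]))
```

With the bistable term alone, every node would simply settle into whichever well it starts in; it is the coupling term that lets corrupted bits be pulled over the potential barrier and corrected.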
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Chinarov, V., Menzinger, M. (2001). Bistable Gradient Neural Networks: Their Computational Properties. In: Mira, J., Prieto, A. (eds) Connectionist Models of Neurons, Learning Processes, and Artificial Intelligence. IWANN 2001. Lecture Notes in Computer Science, vol 2084. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45720-8_38
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42235-8
Online ISBN: 978-3-540-45720-6