Since 1982, starting with the work of Hopfield, theoretical physics has contributed to the theory of neural networks. In his pioneering work, Hopfield pointed out a relation between models of disordered magnets (spin glasses) and models of neurons interacting through competing synaptic couplings. This work triggered an extensive research effort: using the models, methods, and principles of statistical physics, the cooperative behavior of large systems of interacting neurons has been described. Now, almost two decades later, much has been achieved in this field: associative memory, learning from examples, generalization from examples to an unknown rule, time series prediction, and the optimization of architectures and learning rules have all been expressed in a mathematical language that allows one to calculate the cooperative properties of infinitely large systems trained on infinitely many patterns [1, 2].
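As a concrete illustration of the associative memory mentioned above, here is a minimal sketch of a Hopfield network with Hebbian couplings and zero-temperature asynchronous dynamics; the network size, pattern count, and noise level are illustrative choices, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: P/N = 0.05 is well below the storage capacity alpha_c ~ 0.14.
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))  # random +/-1 patterns xi^mu

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, with no self-coupling.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(J, state, sweeps=20):
    """Zero-temperature asynchronous dynamics: S_i <- sign(sum_j J_ij S_j)."""
    s = state.copy()
    n = len(s)
    for _ in range(sweeps):
        for i in rng.permutation(n):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Flip roughly 15% of the spins of a stored pattern, then let the dynamics retrieve it.
noisy = patterns[0] * np.where(rng.random(N) < 0.15, -1, 1)
retrieved = recall(J, noisy)
print("overlap with stored pattern:", retrieved @ patterns[0] / N)  # close to 1.0
```

With these sizes the corrupted pattern relaxes back to the stored one; loading the network with many more patterns pushes it past its storage capacity, into the spin-glass regime analyzed with the statistical-physics methods described in [1, 2].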
1. Hertz, J., Krogh, A., and Palmer, R.G.: Introduction to the Theory of Neural Computation (Addison-Wesley, Redwood City, 1991)
2. Engel, A. and Van den Broeck, C.: Statistical Mechanics of Learning (Cambridge University Press, Cambridge, 2001)
3. Opper, M. and Kinzel, W.: Statistical Mechanics of Generalization, in Models of Neural Networks III, ed. by E. Domany, J.L. van Hemmen, and K. Schulten, 151-209 (Springer, Heidelberg, 1995)
4. Weigend, A.S. and Gershenfeld, N.S.: Time Series Prediction, Santa Fe (Addison-Wesley, 1994)
5. Kinzel, W. and Reents, G.: Physics by Computer (Springer, 1998)
6. Priel, A. and Kanter, I.: Robust chaos generation by a perceptron, Europhys. Lett. 51, 244-250 (2000)
7. Freking, A., Kinzel, W., and Kanter, I.: Phys. Rev. E (2002)
8. Kanter, I., Kinzel, W., and Kanter, E.: Europhys. Lett. 57, 141-147 (2002)