Abstract
Combinatorial optimization problems, beyond their interest in applied research, play a crucial role in fundamental issues of theoretical computer science because of their inherent computational complexity. Here we use them as a test bed on which to gauge the many perspectives and problems offered by neural networks.
Optimization problems for quadratic functions of many Boolean variables which are, in a technical sense to be made precise, as difficult as they can be, are conveniently dealt with by neural networks; this contributes to the interest of such dynamical systems, since the parameters controlling their evolution can be assigned in such a way that the function to be minimized is precisely a Lyapunov function of the dynamics. Such an evolution will, in general, stop in a local minimum of this Lyapunov function rather than in one of the global minima one is searching for. This motivates endowing the dynamics of a neural network with a stochastic transition rule whose stationary distribution is strongly peaked around the global minima.
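The Lyapunov-function mapping described above can be made concrete with a minimal sketch (not taken from the paper): a symmetric network whose asynchronous threshold updates can only decrease a quadratic Boolean cost f, so the deterministic dynamics must halt, in general in a local minimum. The instance `Q`, `c` below is a hypothetical toy example.

```python
# Minimize f(x) = sum_{i<j} Q[i][j]*x[i]*x[j] + sum_i c[i]*x[i],  x in {0,1}^n.
# A network with weights -Q and thresholds c has f as a Lyapunov function:
# every accepted flip strictly decreases f, so the asynchronous dynamics
# terminates -- in general at a *local* minimum of f.

# Hypothetical toy instance (Q symmetric, zero diagonal)
Q = [[0, 2, -3],
     [2, 0,  1],
     [-3, 1,  0]]
c = [1, -1, 1]

def cost(Q, c, x):
    n = len(x)
    return (sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(i + 1, n))
            + sum(c[i] * x[i] for i in range(n)))

def descend(Q, c, x):
    """Deterministic asynchronous updates; halts in a local minimum of f."""
    n = len(x)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            # change in f if x[i] were switched from 0 to 1, given the others
            field = sum(Q[i][j] * x[j] for j in range(n) if j != i) + c[i]
            if field < 0 and x[i] == 0:      # turning unit i on lowers f
                x[i], changed = 1, True
            elif field > 0 and x[i] == 1:    # turning unit i off lowers f
                x[i], changed = 0, True
    return x
```

On this particular instance, descent from `x = [0, 0, 0]` happens to stop in a global minimum; in general only local minimality is guaranteed, which is exactly the limitation motivating the stochastic rule.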
Here we discuss several problems related to the dynamics of both deterministic and stochastic networks, with an emphasis on quantitatively assessing their computational capabilities.
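As one concrete reading of the stochastic side, the following is a minimal heat-bath (Glauber-type) sketch, again not taken from the paper: at fixed temperature T the single-flip chain has stationary distribution proportional to exp(-f(x)/T), which becomes strongly peaked around the global minima as T decreases; slowly lowering T gives a simulated-annealing style search. The instance `Q`, `c` and the cooling schedule are hypothetical choices for illustration.

```python
import math
import random

# Same form of cost as before: f(x) = sum_{i<j} Q[i][j]*x[i]*x[j] + sum_i c[i]*x[i].
Q = [[0, 2, -3],
     [2, 0,  1],
     [-3, 1,  0]]
c = [1, -1, 1]

def anneal(Q, c, x, T0=2.0, alpha=0.95, sweeps=200, seed=0):
    """Heat-bath single-unit updates while the temperature decreases.

    At fixed T the chain is reversible with stationary distribution
    proportional to exp(-f(x)/T); as T -> 0 that distribution becomes
    strongly peaked around the global minima of f.
    """
    rng = random.Random(seed)
    n = len(x)
    T = T0
    for _ in range(sweeps):
        for i in range(n):
            # f(x with x_i = 1) - f(x with x_i = 0), given the other units
            field = sum(Q[i][j] * x[j] for j in range(n) if j != i) + c[i]
            # P(x_i = 1 | rest); clamp the exponent to avoid overflow at low T
            p_on = 1.0 / (1.0 + math.exp(min(field / T, 700.0)))
            x[i] = 1 if rng.random() < p_on else 0
        T *= alpha   # geometric cooling schedule (a common heuristic choice)
    return x
```

On this toy instance the frozen chain ends in one of the two minimum-cost configurations, [0, 1, 0] or [1, 0, 1]; no schedule guarantees this in general, which is one of the quantitative questions at issue.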
© 1990 Springer-Verlag
Apolloni, B., Bertoni, A., Campadelli, P., de Falco, D. (1990). Neural networks: Deterministic and stochastic dynamics. In: Lima, R., Streit, L., Vilela Mendes, R. (eds) Dynamics and Stochastic Processes Theory and Applications. Lecture Notes in Physics, vol 355. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-52347-2_21
Print ISBN: 978-3-540-52347-5
Online ISBN: 978-3-540-46969-8