Synchronous Boltzmann Machines and Gibbs Fields: Learning Algorithms

  • Robert Azencott
Conference paper
Part of the NATO ASI Series (volume 68)

Abstract

Boltzmann machines are stochastic networks of formal neurons coupled by a quadratic energy function. Hinton, Sejnowski and Ackley, who introduced them as pattern classifiers that learn, proposed a learning algorithm for the asynchronous machine. Here we study the synchronous machine, in which all neurons are updated simultaneously; we compute its equilibrium energy and propose a synchronous learning algorithm based on the delayed average coactivity of pairs of connected neurons. We then generalize the Boltzmann machine paradigm to much wider classes of interactions and energies, allowing multiple interactions of arbitrary order, and propose a learning algorithm for these generalized machines using the theory of Gibbs fields and parameter estimation for such fields. We give quasi-convergence results for all these algorithms within the framework of stochastic algorithms theory. The links sketched here between generalized Boltzmann machines and Markov field models provide the groundwork for designing generalized Boltzmann machines capable of performing efficient low-level vision tasks; these Boltzmann vision modules are described in a forthcoming paper.
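To fix ideas, the synchronous dynamics and the delayed-coactivity statistic mentioned above can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the function names, the sigmoid acceptance rule at temperature T, and the Monte Carlo averaging scheme are all assumptions chosen for concreteness; the paper's actual equilibrium analysis and learning rule are given in the full text.

```python
import numpy as np

def synchronous_update(state, W, b, T, rng):
    """One synchronous sweep: every neuron is resampled at the same time
    from the sigmoid of its local field (illustrative convention)."""
    field = W @ state + b                      # local fields of all neurons at once
    p_on = 1.0 / (1.0 + np.exp(-field / T))    # firing probabilities at temperature T
    return (rng.random(state.shape) < p_on).astype(float)

def delayed_coactivity(W, b, T, n_steps, rng):
    """Monte Carlo estimate of the delayed average coactivity
    <s_i(t) * s_j(t+1)> along a synchronous trajectory."""
    n = b.shape[0]
    state = (rng.random(n) < 0.5).astype(float)  # random initial configuration
    acc = np.zeros((n, n))
    for _ in range(n_steps):
        nxt = synchronous_update(state, W, b, T, rng)
        acc += np.outer(state, nxt)              # product of state at t and t+1
        state = nxt
    return acc / n_steps
```

In the spirit of the Hinton-Sejnowski-Ackley rule, a synchronous learning step would then move each weight w_ij proportionally to the difference between this delayed coactivity estimated with the training patterns clamped and the same statistic estimated in free-running equilibrium; the precise form used here is developed in the paper.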

Bibliography

  1. [A] R. AZENCOTT, Markov fields and low-level vision tasks, Proc. Int. Cong. Appl. Math., ICIAM, Paris (1987).
  2. R. AZENCOTT, Gibbs fields, simulated annealing, and low-level vision tasks, Proc. Congress on Pattern Recognition AFCET-INRIA, Antibes (1987).
  3. R. AZENCOTT, General Boltzmann machines with multiple interactions (to appear in IEEE PAMI, 1990).
  4. R. AZENCOTT, Parameter estimation for synchronous Markov fields (to appear, 1990).
  5. [B.M.P.] A. BENVENISTE, M. MÉTIVIER, P. PRIOURET, Algorithmes stochastiques, Masson, Paris (1988).
  6. [C] B. CHALMOND, Image restoration using an estimated Markov model, IEEE PAMI (1988).
  7. [G] D. and S. GEMAN, Gibbs fields, simulated annealing, and Bayesian reconstruction of images, IEEE PAMI (1984).
  8. [G.G.] S. GEMAN and C. GRAFFIGNE, Gibbs fields and image segmentation, Proc. Int. Cong. Math. (1987).
  9. [H.S.A.] G. HINTON, T. SEJNOWSKI, D.H. ACKLEY, Boltzmann machines: constraint satisfaction networks that learn, Technical Report, Carnegie Mellon Univ. (1984).
  10. [L] W.A. LITTLE, The existence of persistent states, Math. Biosci. 12 (1974).
  11. [P] P. PERETTO, Collective properties of neural networks, preprint (1984).
  12. [T] A. TROUVÉ, Parallelization of simulated annealing, C.R. Acad. Sci. (1988) and preprint (1988).
  13. [Y] L. YOUNES, Parameter estimation for Gibbs fields, Ann. Inst. H. Poincaré (1988) and doctoral thesis, Univ. Paris-Sud (1988).

Copyright information

© Springer-Verlag Berlin Heidelberg 1990

Authors and Affiliations

  • Robert Azencott
  1. École Normale Supérieure, Paris Cedex 05, France
  2. Université Paris-Sud, France
