Bernoulli Mixture Model of Experts for Supervised Pattern Classification

  • N. Elhor
  • R. Bertrand
  • D. Hamad
Conference paper


Artificial neural networks have been applied to hard problems in many engineering domains, thanks to their capability as universal function approximators [4]. However, when these networks are used in their standard form, as 'black-box' models, their performance is often inferior to that of dedicated statistical solutions. Performance can be greatly improved by introducing prior knowledge into the network architecture. If the problem at hand has an obvious decomposition, it may be possible to design a suitable architecture by hand; unfortunately, this is not always possible.
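A natural alternative, suggested by the title, is a mixture of experts [5, 6] whose experts are trained under a Bernoulli (cross-entropy) error measure [3]: a gating network learns the decomposition from data rather than it being fixed by hand. The sketch below (Python with NumPy) shows one plausible forward pass under this reading; the linear experts, the softmax gate, and all names (moe_predict, expert_W, gate_W) are illustrative assumptions, not the paper's actual model.

  # Minimal sketch of a Bernoulli mixture-of-experts forward pass.
  # Assumptions (not from the paper): linear experts with sigmoid
  # outputs, a linear gating network with softmax outputs.
  import numpy as np

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  def softmax(z):
      z = z - z.max(axis=1, keepdims=True)  # numerical stability
      e = np.exp(z)
      return e / e.sum(axis=1, keepdims=True)

  def moe_predict(X, expert_W, gate_W):
      """Combine K Bernoulli experts: p(y=1|x) = sum_k g_k(x) p_k(y=1|x)."""
      P = sigmoid(X @ expert_W)   # (n, K) per-expert probabilities
      G = softmax(X @ gate_W)     # (n, K) gating coefficients
      return (G * P).sum(axis=1)  # (n,) mixture probability

  def bernoulli_nll(y, p, eps=1e-12):
      """Negative log-likelihood (cross-entropy) under the Bernoulli model."""
      p = np.clip(p, eps, 1.0 - eps)
      return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

  # Toy usage: two experts on 2-D inputs, random (untrained) weights.
  rng = np.random.default_rng(0)
  X = rng.normal(size=(100, 2))
  y = (X[:, 0] > 0).astype(float)
  expert_W = rng.normal(size=(2, 2))  # one weight column per expert
  gate_W = rng.normal(size=(2, 2))
  print("NLL:", bernoulli_nll(y, moe_predict(X, expert_W, gate_W)))

Minimizing this Bernoulli negative log-likelihood (e.g., by gradient descent over expert_W and gate_W) is what lets the gate discover a soft decomposition of the input space.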


Keywords: Mixture Model · Gaussian Mixture Model · Output Neuron · Synaptic Weight · Decision Boundary




References

  [1] C. M. Bishop. Neural Networks for Pattern Recognition. Clarendon Press, Oxford, 1995.
  [2] J. S. Bridle. Probabilistic interpretation of feed-forward classification network outputs, with relationships to statistical pattern recognition. In F. Fogelman-Soulié and J. Hérault, editors, Neuro-Computing: Algorithms, Architectures and Applications, NATO ASI Series, Vol. F68, pages 227–236, 1990.
  [3] M. Y. Chow, A. Menozzi, J. Teeter, and J. P. Thrower. Bernoulli error measure approach to train feedforward artificial neural networks for classification problems. In IEEE Int. Conf. Neural Networks, volume 1, pages 44–49, Orlando, Florida, USA, 1994.
  [4] S. Haykin. Neural Networks: A Comprehensive Foundation. IEEE Computer Society Press, 1994.
  [5] R. A. Jacobs and M. I. Jordan. A competitive modular connectionist architecture. In R. P. Lippmann, J. E. Moody, and D. S. Touretzky, editors, Advances in Neural Information Processing Systems 3, pages 767–773, San Mateo, CA, 1991. Morgan Kaufmann.
  [6] R. A. Jacobs, M. I. Jordan, S. J. Nowlan, and G. E. Hinton. Adaptive mixtures of local experts. Neural Computation, 3:79–87, 1991.

Copyright information

© Springer-Verlag Wien 1998

Authors and Affiliations

  • N. Elhor (1)
  • R. Bertrand (1)
  • D. Hamad (1)

  1. Centre d'Automatique de Lille, Bâtiment P2, Université des Sciences et Technologies de Lille, Villeneuve d'Ascq Cedex, France
