Abstract
This paper studies an alternative to the mixture of experts (ME) model, the localised mixture of experts, in which the experts are linear regressions and the gating network is a Gaussian classifier. The distribution of the regressors can be taken to be Gaussian, so that the joint distribution of inputs and outputs is a Gaussian mixture. This yields a substantial speed-up of the EM algorithm for localised ME. Conversely, Gaussian mixtures with specific constraints can be fitted by maximum likelihood using the standard EM algorithm for mixtures of experts. Several such constrained models are useful in practice, and the corresponding modifications of the EM algorithm are described.
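The correspondence described above can be sketched as follows: fit a Gaussian mixture to the joint data (x, y) by EM, then read off one linear expert per component, since for a jointly Gaussian component the conditional of y given x is itself a linear regression, and the marginal over x gives the Gaussian gate. The synthetic data, number of components, and all parameter names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: two linear regimes with Gaussian inputs (an assumption,
# not the paper's data). Cluster 0: y = 1 + 0.5x; cluster 1: y = -1 + 2x.
n = 400
labels = np.repeat([0, 1], n // 2)
x = np.where(labels == 0, rng.normal(-2, 1, n), rng.normal(2, 1, n))
y = np.where(labels == 0, 1.0 + 0.5 * x, -1.0 + 2.0 * x) + rng.normal(0, 0.3, n)
Z = np.column_stack([x, y])          # joint observations (x, y)

K, d = 2, 2
pi = np.full(K, 1.0 / K)
mu = Z[rng.choice(n, K, replace=False)].copy()
cov = np.array([np.cov(Z.T) for _ in range(K)])

def gauss_pdf(Z, m, S):
    """Multivariate normal density, evaluated row-wise."""
    diff = Z - m
    quad = np.sum(diff @ np.linalg.inv(S) * diff, axis=1)
    return np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** d * np.linalg.det(S))

for _ in range(100):                 # EM for the joint Gaussian mixture
    # E-step: posterior component responsibilities
    dens = np.array([pi[k] * gauss_pdf(Z, mu[k], cov[k]) for k in range(K)]).T
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: closed-form updates on the joint data
    Nk = r.sum(axis=0)
    pi = Nk / n
    mu = (r.T @ Z) / Nk[:, None]
    for k in range(K):
        diff = Z - mu[k]
        cov[k] = (r[:, k, None] * diff).T @ diff / Nk[k]

# Recover the localised ME: component k gives the linear expert
#   E[y | x, k] = a_k + b_k x,  with gate g_k(x) proportional to pi_k N(x; mu_xk, S_xxk)
for k in range(K):
    b = cov[k][1, 0] / cov[k][0, 0]
    a = mu[k][1] - b * mu[k][0]
    print(f"expert {k}: y ~ {a:.2f} + {b:.2f} x")
```

The design point this illustrates is the one the abstract makes: because the joint model is a plain Gaussian mixture, every EM update is in closed form, instead of the iterative inner optimisation that fitting the gating network of a generic ME would require.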
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Bouchard, G. (2003). Localised Mixtures of Experts for Mixture of Regressions. In: Schader, M., Gaul, W., Vichi, M. (eds) Between Data Science and Applied Data Analysis. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-18991-3_18
DOI: https://doi.org/10.1007/978-3-642-18991-3_18
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40354-8
Online ISBN: 978-3-642-18991-3