
An MCMC Based EM Algorithm for Mixtures of Gaussian Processes

  • Di Wu
  • Ziyi Chen
  • Jinwen Ma
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9377)

Abstract

The mixture of Gaussian processes (MGP) is a powerful statistical learning model for regression and prediction, and the EM algorithm is an effective method for its parameter learning or estimation. However, the feasible EM algorithms for MGPs are only approximations of the real EM algorithm, since the Q-function cannot be computed efficiently in this situation. To overcome this problem, we propose an MCMC based EM algorithm for MGPs in which the Q-function is instead estimated on a set of simulated samples via the Markov chain Monte Carlo (MCMC) method. Experiments on both synthetic and real-world datasets demonstrate that the proposed MCMC based EM algorithm is more effective than the other three EM algorithms for MGPs.
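
To make the E- and M-steps concrete, below is a minimal sketch in Python/NumPy of a Monte Carlo EM loop of this kind for a mixture of Gaussian processes on 1-D inputs: the E-step simulates the latent component indicators by Gibbs sampling, and the M-step maximizes the Q-function estimated by averaging over the simulated samples. The function names, the squared-exponential kernel with fixed signal and noise variances, and the grid-search hyperparameter update are assumptions of this illustration, not details taken from the paper.

# Illustrative Monte Carlo EM sketch for a mixture of Gaussian processes
# (1-D inputs, squared-exponential kernel). This is NOT the paper's exact
# algorithm; kernel choice, fixed variances and the grid-search M-step are
# simplifications made for illustration.
import numpy as np

rng = np.random.default_rng(0)


def sqexp(xa, xb, ls, sv):
    """Squared-exponential kernel matrix between two 1-D input vectors."""
    return sv * np.exp(-0.5 * ((xa[:, None] - xb[None, :]) / ls) ** 2)


def gp_predictive_logpdf(x_tr, y_tr, x_star, y_star, ls, sv, nv):
    """Log density of y_star under the GP posterior predictive at x_star."""
    if len(x_tr) == 0:                          # empty component: fall back to the GP prior
        var = sv + nv
        return -0.5 * (np.log(2 * np.pi * var) + y_star ** 2 / var)
    K = sqexp(x_tr, x_tr, ls, sv) + nv * np.eye(len(x_tr))
    k_star = sqexp(x_tr, np.array([x_star]), ls, sv).ravel()
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_tr))
    v = np.linalg.solve(L, k_star)
    mean, var = k_star @ alpha, sv + nv - v @ v
    return -0.5 * (np.log(2 * np.pi * var) + (y_star - mean) ** 2 / var)


def gp_log_marginal(x, y, ls, sv, nv):
    """Log marginal likelihood of y under a zero-mean GP with an SE kernel."""
    if len(x) == 0:
        return 0.0
    K = sqexp(x, x, ls, sv) + nv * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(y) * np.log(2 * np.pi)


def mcmc_em_mgp(x, y, n_components=2, n_iter=10, burn_in=5, n_samples=5,
                ls_grid=(0.3, 1.0, 3.0), sv=1.0, nv=0.1):
    """Monte Carlo EM for an MGP: Gibbs E-step, sample-averaged Q-function in the M-step."""
    n = len(x)
    z = rng.integers(n_components, size=n)       # random initial assignments
    weights = np.full(n_components, 1.0 / n_components)
    ls = np.full(n_components, 1.0)              # per-component length scales
    for _ in range(n_iter):
        # E-step: simulate the latent indicators z by Gibbs sampling.
        samples = []
        for sweep in range(burn_in + n_samples):
            for i in rng.permutation(n):
                logp = np.empty(n_components)
                for k in range(n_components):
                    mask = (z == k)
                    mask[i] = False              # leave point i out of its own component
                    logp[k] = np.log(weights[k]) + gp_predictive_logpdf(
                        x[mask], y[mask], x[i], y[i], ls[k], sv, nv)
                p = np.exp(logp - logp.max())
                z[i] = rng.choice(n_components, p=p / p.sum())
            if sweep >= burn_in:                 # keep post-burn-in samples only
                samples.append(z.copy())
        samples = np.array(samples)
        # M-step: maximize the Q-function estimated on the simulated samples.
        counts = np.array([(samples == k).mean() for k in range(n_components)])
        weights = np.maximum(counts, 1e-12)      # floor avoids log(0) for an empty component
        for k in range(n_components):
            def q_k(length_scale):               # Monte Carlo estimate of Q for component k
                return np.mean([gp_log_marginal(x[s == k], y[s == k], length_scale, sv, nv)
                                for s in samples])
            ls[k] = max(ls_grid, key=q_k)        # crude grid search instead of gradient ascent
    return weights, ls, z


# Toy usage: two overlapping curves observed on the same inputs.
x = np.concatenate([np.linspace(0, 5, 40), np.linspace(0, 5, 40)])
y = np.concatenate([np.sin(x[:40]), 0.5 * x[40:] - 1.0]) + 0.1 * rng.standard_normal(80)
weights, length_scales, assignments = mcmc_em_mgp(x, y)

Replacing the grid search with gradient-based maximization of the sample-averaged log marginal likelihood, and lengthening the burn-in, would move this sketch closer to a practical implementation.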

Keywords

Mixture of Gaussian processes, EM algorithm, Classification, Multimodality, Prediction



Copyright information

© Springer International Publishing Switzerland 2015

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  • Di Wu (1)
  • Ziyi Chen (1)
  • Jinwen Ma (1)

  1. Department of Information Science, School of Mathematical Sciences and LMAM, Peking University, Beijing, China
