An Effective Model Selection Criterion for Mixtures of Gaussian Processes
The Mixture of Gaussian Processes (MGP) is a powerful statistical learning framework in machine learning. Learning an MGP on a given dataset requires solving the model selection problem, i.e., determining the number C of actual GP components in the mixture. However, current learning algorithms for MGPs cannot solve this problem effectively. In this paper, we propose an effective model selection criterion for MGPs, called the Synchronously Balancing (SB) criterion. Experimental results demonstrate that the SB criterion is feasible and even outperforms two classical criteria, AIC and BIC, for model selection on MGPs. Moreover, we find that there exists a feasible interval of the penalty coefficient within which model selection is correct.
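The SB criterion itself is defined in the body of the paper; as background for the comparison the abstract mentions, the two classical criteria take the familiar penalized log-likelihood form AIC = -2·logL + 2d and BIC = -2·logL + d·ln(N), minimized over candidate component counts C. Below is a minimal sketch of that selection loop; the log-likelihoods, parameter counts, and sample size are illustrative numbers, not results from the paper.

```python
import math

def aic(log_lik, n_params):
    """Akaike information criterion: -2*logL + 2*d (smaller is better)."""
    return -2.0 * log_lik + 2.0 * n_params

def bic(log_lik, n_params, n_samples):
    """Bayesian information criterion: -2*logL + d*ln(N)."""
    return -2.0 * log_lik + n_params * math.log(n_samples)

def select_num_components(log_liks, param_counts, score):
    """Return the 1-based candidate C whose fitted model minimizes the score."""
    scores = [score(ll, d) for ll, d in zip(log_liks, param_counts)]
    return min(range(len(scores)), key=scores.__getitem__) + 1

# Illustrative numbers only: log-likelihoods of MGP fits for C = 1, 2, 3,
# with the free-parameter count d growing as components are added.
log_liks = [-250.0, -180.0, -178.0]
param_counts = [3, 7, 11]
n = 200  # sample size

best_c = select_num_components(log_liks, param_counts,
                               lambda ll, d: bic(ll, d, n))
print(best_c)  # 2: the BIC penalty rejects the barely-better C = 3 fit
```

Note the trade-off the abstract's "penalty coefficient" refers to: a larger penalty on d pushes selection toward smaller C, so correct selection depends on the penalty falling in a feasible range.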
Keywords: Mixture of Gaussian processes · Model selection · EM algorithm · Parameter learning · Likelihood
Open Access. This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.