Abstract
A further investigation is made on an adaptive local factor analysis algorithm derived from Bayesian Ying-Yang (BYY) harmony learning, which performs parameter learning with automatic determination of both the number of components and the number of factors in each component. A comparative study has been conducted on simulated data sets and several real-world data sets. The algorithm is compared not only with a recent approach called Incremental Mixture of Factor Analysers (IMoFA) but also with the conventional two-stage implementation of maximum likelihood (ML) plus model selection, namely, running the EM algorithm for parameter learning on a series of candidate models and selecting the best candidate by AIC, CAIC, or BIC. Experiments show that IMoFA and ML-BIC outperform ML-AIC and ML-CAIC, while BYY harmony learning considerably outperforms both IMoFA and ML-BIC. Furthermore, the BYY learning algorithm has been applied to the popular MNIST database for handwritten digit recognition with promising performance.
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Shi, L., Xu, L. (2006). Local Factor Analysis with Automatic Model Selection: A Comparative Study and Digits Recognition Application. In: Kollias, S., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4132. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840930_27
Print ISBN: 978-3-540-38871-5
Online ISBN: 978-3-540-38873-9