Abstract
This paper proposes a general approach named Expectation-MiniMax (EMM) for clustering analysis when the number of clusters is unknown. It replaces the contrast function of the Expectation-Maximization (EM) algorithm with an approximate one that carries a designable error term. By adaptively minimizing this error term while maximizing the approximate contrast function, the EMM automatically penalizes all rivals during competitive learning. Consequently, the EMM not only subsumes the Rival Penalized Competitive Learning algorithm (Xu et al. 1993) and its Type A form (Xu 1997), together with newly developed variants, but also provides an alternative way to optimize the EM contrast function with at least two advantages: (1) faster learning of the model parameters, and (2) automatic model-complexity selection. We present the general learning procedure of the EMM and demonstrate its outstanding performance in comparison with the EM.
This work was supported by the Faculty Research Grant of Hong Kong Baptist University (Project No: FRG/02/03/I-06).
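As a point of reference for the competitive-learning mechanism the abstract describes, the sketch below illustrates the Rival Penalized Competitive Learning rule (Xu et al. 1993) that the EMM subsumes, not the EMM procedure itself: for each sample, the winning centre is pulled toward the sample while the second winner (the rival) is pushed slightly away, so surplus seed centres drift out of the data region and the cluster number is selected automatically. The function name `rpcl`, the learning rates, and the epoch count are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def rpcl(X, k_max=10, alpha_winner=0.05, alpha_rival=0.002,
         n_epochs=50, seed=0):
    """Minimal sketch of Rival Penalized Competitive Learning (Xu et al. 1993).

    Starts with k_max seed centres (more than the expected cluster number).
    Per sample: attract the winner, slightly repel the rival, so surplus
    centres are driven away from the data.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = X[rng.choice(n, size=k_max, replace=False)].copy()  # initial centres
    wins = np.ones(k_max)                                    # "conscience" counts

    for _ in range(n_epochs):
        for x in X[rng.permutation(n)]:
            # relative-winning-frequency (conscience) weighted distances
            gamma = wins / wins.sum()
            dist = gamma * np.sum((W - x) ** 2, axis=1)
            order = np.argsort(dist)
            c, r = order[0], order[1]            # winner and rival
            W[c] += alpha_winner * (x - W[c])    # move winner toward sample
            W[r] -= alpha_rival * (x - W[r])     # de-learn (penalize) rival
            wins[c] += 1

    return W
```

After training, the centres that remain inside the data region indicate the selected cluster number; centres that were repeatedly penalized end up far from all samples and can be discarded.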
References
H. Akaike, “A New Look at the Statistical Model Identification”, IEEE Transactions on Automatic Control, Vol. AC-19, pp. 716–723, 1974.
Y.M. Cheung, “k*-means—A Generalized k-means Clustering Algorithm with Unknown Cluster Number”, Proceedings of Third International Conference on Intelligent Data Engineering and Automated Learning (IDEAL’02), pp. 307–317, 2002.
A.P. Dempster, N.M. Laird and D.B. Rubin, “Maximum Likelihood from Incomplete Data via the EM Algorithm”, Journal of the Royal Statistical Society, Series B, Vol. 39, pp. 1–38, 1977.
J.B. MacQueen, “Some Methods for Classification and Analysis of Multivariate Observations”, Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, Berkeley: University of California Press, pp. 281–297, 1967.
G.J. McLachlan and K.E. Basford, “Mixture Models: Inference and Applications to Clustering”, Marcel Dekker, 1988.
G. Schwarz, “Estimating the Dimension of a Model”, The Annals of Statistics, Vol. 6, No. 2, pp. 461–464, 1978.
L. Xu, “A Unified Learning Scheme: Bayesian-Kullback Ying-Yang Machine”, Advances in Neural Information Processing Systems, Vol. 8, pp. 444–450, 1996.
L. Xu, “Bayesian Ying-Yang Machine, Clustering and Number of Clusters”, Pattern Recognition Letters, Vol. 18, No. 11–13, pp. 1167–1178, 1997.
L. Xu, A. Krzyżak and E. Oja, “Rival Penalized Competitive Learning for Clustering Analysis, RBF Net, and Curve Detection”, IEEE Transactions on Neural Networks, Vol. 4, pp. 636–648, 1993.