A New Classifier Combination Scheme Using Clustering Ensemble
Abstract
The combination of multiple classifiers has been shown to increase classification accuracy in many application domains. In addition, the use of cluster analysis techniques in supervised classification tasks has shown that they can enhance the quality of the classification results, since clusters provide supplementary constraints that may improve the generalization capability of the classifiers. In this paper we introduce a new classifier combination scheme based on the Decision Templates Combiner. The proposed scheme uses the same concept of representing the classifiers' decisions as a vector in an intermediate feature space, and builds more representative decision templates by using clustering ensembles. An experimental evaluation was carried out on several synthetic and real datasets. The results show that the proposed scheme increases classification accuracy over the Decision Templates Combiner and other classical classifier combination methods.
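For context, the sketch below illustrates the classical Decision Templates combiner that the proposed scheme extends: the soft outputs of all base classifiers for a sample form its decision profile, the per-class mean of the training profiles gives one template per class, and a new sample is assigned to the class whose template is closest to its profile. This is a minimal illustrative sketch in NumPy under those assumptions; the function names are hypothetical, and the clustering-ensemble step that the paper uses to build more representative templates is not shown.

```python
# Minimal, illustrative sketch of the classical Decision Templates combiner.
# The clustering-ensemble refinement proposed in the paper is NOT implemented here.
import numpy as np

def build_decision_templates(profiles, labels, n_classes):
    """profiles: array of shape (n_samples, n_classifiers, n_classes) with each
    base classifier's soft output per sample (the decision profiles).
    Returns one template per class: the mean profile of that class's samples."""
    return np.stack([profiles[labels == c].mean(axis=0) for c in range(n_classes)])

def classify(profile, templates):
    """Assign the class whose template is closest to the sample's decision
    profile (squared Euclidean distance between the two matrices)."""
    dists = ((templates - profile) ** 2).sum(axis=(1, 2))
    return int(np.argmin(dists))

# Toy usage: 3 base classifiers, 2 classes, 6 labelled training samples.
rng = np.random.default_rng(0)
train_profiles = rng.dirichlet(np.ones(2), size=(6, 3))   # soft outputs per classifier
train_labels = np.array([0, 0, 0, 1, 1, 1])
templates = build_decision_templates(train_profiles, train_labels, n_classes=2)
test_profile = rng.dirichlet(np.ones(2), size=3)
print(classify(test_profile, templates))
```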
Keywords
Classifier Combination, Decision Templates, Clustering Ensemble