A Combine-Correct-Combine Scheme for Optimizing Dissimilarity-Based Classifiers

  • Sang-Woon Kim
  • Robert P. W. Duin
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5856)


Recently, to increase the classification accuracy of dissimilarity-based classifiers (DBCs), Kim and Duin [5] proposed a method that simultaneously employs fusion strategies in representing features (the representation step) and in designing classifiers (the generalization step). With these multiple fusion strategies, however, the resulting dissimilarity matrix is sometimes indefinite, causing problems for traditional pattern recognition tools after the matrix is embedded in a vector space. To overcome this problem, we study a new scheme, named combine-correct-combine (CCC), which additionally employs a Euclidean correction procedure between the two steps. In the CCC scheme, we first combine dissimilarity matrices obtained with different measures into a new dissimilarity representation using a representation combining strategy. Next, we correct the combined dissimilarity matrix using a pseudo-Euclidean embedding algorithm to improve its internal consistency. Finally, we again apply classifier combining strategies to the refined dissimilarity matrix to achieve improved classification for a given data set. Our experimental results on well-known benchmark databases demonstrate that the CCC mechanism works well and achieves further improvements in classification accuracy compared with the previous multiple fusion approaches. In particular, the highest accuracies are obtained when the refined representation is classified with the trained combiners.
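The middle "correct" step of the pipeline can be illustrated with a short sketch. The following is a minimal numpy illustration, not the authors' implementation: it averages normalized dissimilarity matrices as a stand-in for the representation combining strategy, and applies one common Euclidean correction (double-centering the squared dissimilarities, clipping the negative pseudo-Euclidean eigenvalues, and rebuilding distances from the resulting embedding). The function names and the averaging rule are illustrative assumptions; other corrections (e.g., those surveyed in [3]) could be substituted.

```python
import numpy as np

def combine_dissimilarities(matrices, weights=None):
    """Average several dissimilarity matrices (normalized to a comparable
    range) into one combined representation. Illustrative combining rule."""
    matrices = [m / m.max() for m in matrices]
    return np.average(matrices, axis=0, weights=weights)

def euclidean_correction(D):
    """Make a symmetric, zero-diagonal dissimilarity matrix Euclidean by
    clipping the negative part of its pseudo-Euclidean spectrum."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    G = -0.5 * J @ (D ** 2) @ J               # Gram matrix; may be indefinite
    w, V = np.linalg.eigh(G)                  # negative w signal non-Euclideanity
    w = np.clip(w, 0.0, None)                 # drop the negative eigenvalues
    X = V * np.sqrt(w)                        # Euclidean embedding coordinates
    sq = np.sum(X ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.sqrt(np.clip(D2, 0.0, None))    # Euclidean by construction
```

The corrected matrix can then be fed to ordinary vector-space classifiers and classifier combiners, since its associated Gram matrix is positive semi-definite.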


Keywords: Negative eigenvalue, Dissimilarity measure, Face database, Fusion strategy, Dissimilarity matrix


References

  1. Borg, I., Groenen, P.: Modern Multidimensional Scaling: Theory and Applications. Springer, New York (1997)
  2. Congalton, R.G.: A review of assessing the accuracy of classifications of remotely sensed data. Remote Sensing of Environment 37, 35–46 (1991)
  3. Duin, R.P.W., Pekalska, E., Harol, A., Lee, W.: On Euclidean corrections for non-Euclidean dissimilarities. In: da Vitoria Lobo, N., Kasparis, T., Roli, F., Kwok, J.T., Georgiopoulos, M., Anagnostopoulos, G.C., Loog, M. (eds.) S+SSPR 2008. LNCS, vol. 5342, pp. 551–561. Springer, Heidelberg (2008)
  4. Haasdonk, B., Burkhardt, H.: Invariant kernels for pattern analysis and machine learning. Machine Learning 68, 35–61 (2007)
  5. Kim, S.-W., Duin, R.P.W.: On optimizing dissimilarity-based classifier using multi-level fusion strategies. Journal of The Institute of Electronics Engineers of Korea 45-CI(5), 15–24 (2008) (in Korean); a preliminary version of this paper was presented at the 20th Canadian Conference on Artificial Intelligence, Montreal, Canada. LNCS (LNAI), vol. 4509, pp. 110–121 (2007)
  6. Kittler, J., Hatef, M., Duin, R.P.W., Matas, J.: On combining classifiers. IEEE Trans. Pattern Anal. and Machine Intell. 20(3), 226–239 (1998)
  7. Kuncheva, L.I.: Combining Pattern Classifiers - Methods and Algorithms. John Wiley & Sons, New Jersey (2004)
  8. Munoz, A., de Diego, I.M.: From indefinite to positive semi-definite matrices. In: Yeung, D.-Y., Kwok, J.T., Fred, A., Roli, F., de Ridder, D. (eds.) SSPR 2006 and SPR 2006. LNCS, vol. 4109, pp. 764–772. Springer, Heidelberg (2006)
  9. Pekalska, E., Duin, R.P.W.: The Dissimilarity Representation for Pattern Recognition: Foundations and Applications. World Scientific Publishing, Singapore (2005)
  10. Pekalska, E., Harol, A., Duin, R.P.W., Spillmann, B., Bunke, H.: Non-Euclidean or non-metric measures can be informative. In: Yeung, D.-Y., Kwok, J.T., Fred, A., Roli, F., de Ridder, D. (eds.) SSPR 2006 and SPR 2006. LNCS, vol. 4109, pp. 871–880. Springer, Heidelberg (2006)
  11. Todorovski, L., Dzeroski, S.: Combining classifiers with meta decision trees. Machine Learning 50(3), 223–249 (2003)
  12. Wilson, C.L., Garris, M.D.: Handprinted Character Database 3. Technical report, National Institute of Standards and Technology, Gaithersburg, Maryland (1992)
  13. Zhou, Z.-H., Tang, W.: Clusterer ensemble. Knowledge-Based Systems 19, 77–83 (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Sang-Woon Kim (1)
  • Robert P. W. Duin (2)
  1. Dept. of Computer Science and Engineering, Myongji University, Yongin, South Korea
  2. Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology, The Netherlands
