An Empirical Comparison of Kernel-Based and Dissimilarity-Based Feature Spaces

  • Sang-Woon Kim
  • Robert P. W. Duin
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6218)

Abstract

The aim of this paper is to answer the question: what is the difference between dissimilarity-based classifications (DBCs) and other kernel-based classifications (KBCs)? In DBCs [11], classifiers among the classes are not based on the feature measurements of individual objects, but rather on a suitable dissimilarity measure between them. In KBCs [15], on the other hand, classifiers are designed in a high-dimensional feature space obtained by transforming the original input feature space through a kernel, such as a Mercer kernel. The difference between the two approaches can thus be summarized as follows: the distance kernel of DBCs represents the discriminative information in a relative manner, i.e., through pairwise dissimilarity relations between two objects, while the mapping kernel of KBCs represents the discriminative information uniformly, in a fixed way, for all objects. In this paper, we report on an empirical evaluation of classifiers built in the two different representation spaces: the dissimilarity space and the kernel space. Our experimental results, obtained with well-known benchmark databases, demonstrate that when the kernel parameters have not been appropriately chosen, DBCs always achieve better results than KBCs in terms of classification accuracy.
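The contrast between the two representation spaces can be sketched in a few lines of code. The following is a minimal illustration (with invented toy data, not from the paper's experiments): in the DBC view, each object is represented by its vector of dissimilarities to a set of prototypes, yielding an ordinary feature matrix; in the KBC view, a fixed Mercer kernel (here an RBF kernel with an assumed width sigma) induces a Gram matrix of inner products in the transformed space.

```python
import numpy as np

# Toy two-class data: 2-D points (illustrative only, not the paper's benchmarks).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)),
               rng.normal(3.0, 1.0, (20, 2))])

def pairwise_dist(A, B):
    """Euclidean dissimilarity matrix with entries d(a_i, b_j)."""
    return np.sqrt(((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

# DBC view: represent each object by its distances to a prototype set R,
# giving a |X| x |R| feature matrix -- the "dissimilarity space".
R = X[::5]                # a small prototype subset (every 5th object)
D = pairwise_dist(X, R)   # rows are the new feature vectors

# KBC view: a fixed RBF (Mercer) kernel maps every object the same way;
# the Gram matrix K holds the inner products in the induced space.
sigma = 1.0
K = np.exp(-pairwise_dist(X, X) ** 2 / (2 * sigma ** 2))

print(D.shape)  # (40, 8)  -- dissimilarity-space representation
print(K.shape)  # (40, 40) -- kernel (Gram) matrix
```

Any classifier that works on ordinary feature vectors can then be trained on the rows of `D`, while `K` is consumed by kernel machines such as SVMs; the kernel's discriminative power hinges on choosing `sigma` well, which is exactly the sensitivity the paper's experiments probe.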

Keywords

kernel-based classifications (KBCs) · dissimilarity-based classifications (DBCs) · representation spaces · classification accuracies

References

  1. Balachander, T., Kothari, R.: Kernel based subspace pattern recognition. In: Proc. of Int’l Joint Conference on Neural Networks, Washington DC, USA, vol. 5, pp. 3119–3122 (1999)
  2. Balcan, M.-F., Blum, A., Vempala, S.: Kernels as features: On kernels, margins, and low-dimensional mappings. Machine Learning 65, 79–94 (2006)
  3. Baudat, G., Anouar, F.: Generalized discriminant analysis using a kernel approach. Neural Comput. 12(10), 2385–2404 (2000)
  4. Chen, B., Yuan, L., Liu, H., Bao, Z.: Kernel subclass discriminant analysis. Neurocomputing 71, 455–458 (2007)
  5. Goldfarb, L.: A unified approach to pattern recognition. Pattern Recognition 17, 575–582 (1984)
  6. Haasdonk, B.: Feature space interpretation of SVMs with indefinite kernels. IEEE Trans. Pattern Anal. and Machine Intell. 25(5), 482–492 (2005)
  7. Hardoon, D.R., Szedmak, S., Shawe-Taylor, J.: Canonical correlation analysis: An overview with application to learning methods. Neural Comput. 16, 2639–2664 (2004)
  8. Kim, S.-W., Oommen, B.J.: On using prototype reduction schemes to optimize dissimilarity-based classification. Pattern Recognition 40, 2946–2957 (2007)
  9. Neuhaus, M., Bunke, H.: Edit distance-based kernel functions for structural pattern classification. Pattern Recognition 39, 1852–1863 (2006)
  10. Paclik, P., Novovicova, J., Somol, P., Pudil, P.: Road sign classification using Laplace kernel classifier. Pattern Recognition Lett. 21(13-14), 1165–1173 (2000)
  11. Pekalska, E., Duin, R.P.W.: The Dissimilarity Representation for Pattern Recognition: Foundations and Applications. World Scientific Publishing, Singapore (2005)
  12. Pekalska, E., Duin, R.P.W.: Beyond traditional kernels: Classification in two dissimilarity-based representation spaces. IEEE Trans. Sys., Man, and Cybern. C 38(6), 727–744 (2008)
  13. Schölkopf, B., Smola, A.J., Müller, K.-R.: Nonlinear component analysis as a kernel eigenvalue problem. Neural Comput. 10, 1299–1319 (1998)
  14. Sebastian, T.B., Klein, P.N., Kimia, B.B.: Recognition of shapes by editing shock graphs. In: Proc. of 8th IEEE Int’l Conf. on Computer Vision, Vancouver, Canada, pp. 755–762 (2001)
  15. Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge (2004)
  16. Tsagaroulis, T., Hamza, A.B.: Kernel locally linear embedding algorithm for quality control. In: Sobh, T., Elleithy, K., Mahmood, A., Karim, M.A. (eds.) Novel Algorithms and Techniques in Telecommunications, Automation and Industrial Electronics, pp. 1–6. Springer, Heidelberg (2008)
  17. Wang, J., Lee, J., Zhang, C.: Kernel Trick Embedded Gaussian Mixture Model. In: Gavaldá, R., Jantke, K.P., Takimoto, E. (eds.) ALT 2003. LNCS (LNAI), vol. 2842, pp. 159–174. Springer, Heidelberg (2003)
  18. Wilson, C.L., Garris, M.D.: Handprinted Character Database 3. Technical report, National Institute of Standards and Technology, Gaithersburg, Maryland (1992)

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Sang-Woon Kim (1)
  • Robert P. W. Duin (2)
  1. Dept. of Computer Science and Engineering, Myongji University, Yongin, South Korea
  2. Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology, The Netherlands
