Sparsification of Indefinite Learning Models

  • Frank-Michael Schleif
  • Christoph Raab
  • Peter Tino
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11004)

Abstract

The recently proposed Kreĭn space Support Vector Machine (KSVM) is an efficient classifier for indefinite learning problems, but it produces a non-sparse decision function. This very dense decision function prevents practical applications because its out-of-sample extension is costly. In this paper we provide a post-processing technique that sparsifies the decision function obtained by a Kreĭn space SVM and variants thereof. We evaluate the influence of different levels of sparsity and employ a Nyström approach to address large-scale problems. Experiments show that our algorithm performs comparably to the non-sparse Kreĭn space Support Vector Machine while having substantially lower costs, so that large-scale problems can also be processed.
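The abstract does not spell out the sparsification step, so the following NumPy sketch is a hedged illustration only: it approximates a dense kernel expansion f(x) = Σᵢ αᵢ k(x, xᵢ) by greedily selecting a small subset of training points in the style of orthogonal matching pursuit. The function name sparsify_decision_function, its arguments, and the choice of matching pursuit are assumptions made for this sketch, not the authors' exact procedure.

```python
import numpy as np

def sparsify_decision_function(K, alpha, n_nonzero):
    """Hypothetical sketch: approximate a dense kernel expansion f = K @ alpha
    using only `n_nonzero` coefficients, via a greedy orthogonal-matching-pursuit
    style selection over the columns of the (possibly indefinite) kernel matrix K.

    K         : (n, n) symmetric similarity/kernel matrix (may be indefinite)
    alpha     : (n,) dense coefficient vector of the trained model
    n_nonzero : number of expansion points to keep in the sparse model
    """
    target = K @ alpha                    # dense decision values on the training set
    residual = target.copy()
    selected = []
    beta = np.zeros_like(alpha, dtype=float)

    for _ in range(n_nonzero):
        # pick the kernel column most correlated with the current residual
        scores = np.abs(K.T @ residual)
        scores[selected] = -np.inf        # do not pick the same column twice
        j = int(np.argmax(scores))
        selected.append(j)

        # refit the coefficients of all selected columns by least squares
        sub = K[:, selected]
        coef, *_ = np.linalg.lstsq(sub, target, rcond=None)
        beta[:] = 0.0
        beta[selected] = coef
        residual = target - sub @ coef

    return beta, selected

# Usage sketch: keep 50 expansion points and evaluate sparsely out of sample
# beta, idx = sparsify_decision_function(K_train, alpha_dense, n_nonzero=50)
# f_test = K_test[:, idx] @ beta[idx]
```

For genuinely large data sets, the kernel matrix K would itself be replaced by a Nyström approximation, in line with the large-scale strategy mentioned in the abstract.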

Keywords

Non-positive kernel · Kreĭn space · Sparse model

Notes

Acknowledgment

We would like to thank Gaelle Bonnet-Loosli for providing support with the Kreĭn space SVM.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Frank-Michael Schleif (1, 2)
  • Christoph Raab (1)
  • Peter Tino (2)
  1. Department of Computer Science, University of Applied Science Würzburg-Schweinfurt, Würzburg, Germany
  2. School of Computer Science, University of Birmingham, Birmingham, UK
