Fast Approximation of Support Vector Kernel Expansions, and an Interpretation of Clustering as Approximation in Feature Spaces

  • Bernhard Schölkopf
  • Phil Knirsch
  • Alex Smola
  • Chris Burges
Part of the Informatik aktuell book series (INFORMAT)

Abstract

Kernel-based learning methods provide their solutions as expansions in terms of a kernel. We consider the problem of reducing the computational complexity of evaluating these expansions by approximating them using fewer terms. As a by-product, we point out a connection between clustering and approximation in reproducing kernel Hilbert spaces generated by a particular class of kernels.
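To make the abstract's idea concrete: for a Gaussian kernel, minimising the feature-space distance between the full expansion Σᵢ αᵢΦ(xᵢ) and a single term βΦ(z) leads to a fixed-point update in which z becomes a kernel-weighted mean of the input points, which is where the connection to clustering becomes visible. The following is a minimal sketch of that kind of iteration; the function names, the initialisation, and the stopping rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    # k(a, b) = exp(-||a - b||^2 / (2 sigma^2)), for all rows of A against all rows of B
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def reduced_set_point(X, alpha, sigma, n_iter=100, tol=1e-8):
    # Approximate Psi = sum_i alpha_i Phi(x_i) by a single term beta * Phi(z).
    z = np.average(X, axis=0, weights=np.abs(alpha))  # heuristic initialisation
    for _ in range(n_iter):
        w = alpha * gaussian_kernel(X, z[None, :], sigma).ravel()  # alpha_i * k(x_i, z)
        # Kernel-weighted mean of the inputs: the step that resembles a clustering update.
        # (With mixed-sign coefficients the denominator can become small; a careful
        # implementation would guard against that.)
        z_new = w @ X / w.sum()
        if np.linalg.norm(z_new - z) < tol:
            z = z_new
            break
        z = z_new
    # Optimal coefficient for the one-term approximation; k(z, z) = 1 for the Gaussian kernel.
    beta = alpha @ gaussian_kernel(X, z[None, :], sigma).ravel()
    return z, beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 2))   # stand-in "support vectors"
    alpha = rng.normal(size=50)    # stand-in expansion coefficients
    z, beta = reduced_set_point(X, alpha, sigma=1.0)
    print("reduced-set point:", z, "coefficient:", beta)
```

Repeating this construction on the residual yields an expansion with fewer terms, which is the sense in which evaluation cost is reduced.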

Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Bernhard Schölkopf (1)
  • Phil Knirsch (2)
  • Alex Smola (1)
  • Chris Burges (2)
  1. GMD FIRST, Berlin, Germany
  2. Bell Labs, Lucent Technologies, Holmdel, USA