Abstract
We present a novel algorithm for sparse online greedy kernel-based nonlinear regression. The algorithm improves on current approaches to kernel-based regression in two respects. First, it operates online: at each time step it observes a single new input sample, performs an update, and discards the sample. Second, the solution it maintains is extremely sparse. Sparsity is achieved by an explicit greedy sparsification process that admits a new input sample into the kernel representation only if its feature-space image is linearly independent of the images of previously admitted samples. We show that the algorithm implements a form of gradient ascent, and we demonstrate its scaling and noise-tolerance properties on three benchmark regression problems.
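The greedy sparsification test described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the Gaussian kernel, the threshold `nu`, and the incremental block-inverse update of the dictionary kernel matrix are assumptions chosen for the sketch. A sample is admitted only when the residual of projecting its feature-space image onto the span of the dictionary's images exceeds the threshold.

```python
import numpy as np

def gaussian_kernel(x, y, width=1.0):
    # RBF kernel; the width value is an illustrative choice
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * width ** 2))

def sparsify(samples, nu=1e-3, kernel=gaussian_kernel):
    """Greedy online sparsification: admit a sample only if its
    feature-space image is (approximately) linearly independent of
    the images of previously admitted samples."""
    dictionary = []
    K_inv = None  # inverse of the kernel matrix over the dictionary
    for x in samples:
        if not dictionary:
            dictionary.append(x)
            K_inv = np.array([[1.0 / kernel(x, x)]])
            continue
        k_vec = np.array([kernel(d, x) for d in dictionary])
        a = K_inv @ k_vec
        # residual of projecting phi(x) onto span{phi(d_i)}
        delta = kernel(x, x) - k_vec @ a
        if delta > nu:
            # admit x; update K_inv incrementally via the block-inverse formula
            n = len(dictionary)
            K_inv_new = np.zeros((n + 1, n + 1))
            K_inv_new[:n, :n] = K_inv + np.outer(a, a) / delta
            K_inv_new[:n, n] = -a / delta
            K_inv_new[n, :n] = -a / delta
            K_inv_new[n, n] = 1.0 / delta
            K_inv = K_inv_new
            dictionary.append(x)
    return dictionary
```

In this sketch a repeated (or nearly dependent) input yields `delta` near zero and is discarded after the update, so the stored dictionary stays small regardless of the stream length.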
The research of R. M. was supported by the fund for promotion of research at the Technion and by the Ollendorff center.
© 2002 Springer-Verlag Berlin Heidelberg
Engel, Y., Mannor, S., Meir, R. (2002). Sparse Online Greedy Support Vector Regression. In: Elomaa, T., Mannila, H., Toivonen, H. (eds) Machine Learning: ECML 2002. ECML 2002. Lecture Notes in Computer Science(), vol 2430. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36755-1_8
DOI: https://doi.org/10.1007/3-540-36755-1_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44036-9
Online ISBN: 978-3-540-36755-0
eBook Packages: Springer Book Archive