Abstract
This paper describes an on-line method for building ε-insensitive support vector machines for regression as described in [12]. The method extends the approach developed in [1] for building incremental support vector machines for classification. Machines obtained with this approach are equivalent to those obtained by exact methods such as quadratic programming, but they are built more quickly and support the incremental addition of new points, the removal of existing points, and the update of target values for existing data. This development opens the application of SVM regression to areas such as on-line prediction of time series or generalization of value functions in reinforcement learning.
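To make the setting concrete, the sketch below shows the ε-insensitive loss that such machines minimize, together with a toy online kernel regressor in the stochastic-gradient style of Kivinen et al. [7]. This is an illustration only, not the paper's exact incremental algorithm (which maintains equivalence with the quadratic-programming solution); all class and parameter names here are invented for the example.

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Zero loss inside the eps-tube, linear outside it."""
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

def rbf(x1, x2, gamma=50.0):
    """Gaussian kernel on scalars."""
    return np.exp(-gamma * (x1 - x2) ** 2)

class OnlineKernelSVR:
    """Stochastic-gradient sketch of online kernel regression
    (NORMA-style, after Kivinen et al.); illustrative only."""

    def __init__(self, eps=0.1, lr=0.1, lam=0.01, gamma=50.0):
        self.eps, self.lr, self.lam, self.gamma = eps, lr, lam, gamma
        self.xs, self.alphas = [], []   # support vectors and coefficients

    def predict(self, x):
        return sum(a * rbf(xi, x, self.gamma)
                   for a, xi in zip(self.alphas, self.xs))

    def partial_fit(self, x, y):
        err = self.predict(x) - y
        # Shrink existing coefficients (regularization term).
        self.alphas = [(1 - self.lr * self.lam) * a for a in self.alphas]
        # Outside the eps-tube: the point becomes a support vector.
        if abs(err) > self.eps:
            self.xs.append(x)
            self.alphas.append(-self.lr * np.sign(err))
```

Note the characteristic sparsity of SVM regression: points whose prediction error stays inside the ε-tube contribute nothing and are never stored, whereas the exact incremental method of the paper additionally keeps the stored coefficients consistent with the Karush-Kuhn-Tucker conditions at every step.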
Keywords
- Support Vector Machine
- Support Vector
- Support Vector Regression
- Online Learning
- Neural Information Processing System
References
G. Cauwenberghs and T. Poggio. Incremental and decremental support vector machine learning. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13, pages 409–415. MIT Press, 2001.
N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines. Cambridge University Press, 2000.
C. Domeniconi and D. Gunopulos. Incremental support vector machine construction. In N. Cercone, T. Lin, and X. Wu, editors, Proceedings of the 2001 IEEE Intl. Conference on Data Mining, pages 589–592. IEEE Computer Society, 2001.
S. Dumais, J. Platt, D. Heckerman, and M. Sahami. Inductive learning algorithms and representations for text categorization. In 7th International Conference on Information and Knowledge Management, ACM-CIKM98, pages 148–155, 1998.
C. Gentile. A new approximate maximal margin classification algorithm. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13, pages 500–506. MIT Press, 2001.
T. Graepel, R. Herbrich, and R. Williamson. From margin to sparsity. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13, pages 210–216. MIT Press, 2001.
J. Kivinen, A. Smola, and R. Williamson. Online learning with kernels. In T. G. Dietterich, S. Becker, and Z. Ghahramani, editors, Advances in Neural Information Processing Systems 14. MIT Press, 2002.
M. Martin. On-line support vector machine for function approximation. Technical report, Universitat Politècnica de Catalunya, Forthcoming.
E. Osuna, R. Freund, and F. Girosi. Training support vector machines: an application to face detection. In International Conference on Computer Vision and Pattern Recognition, CVPR97, pages 130–136, 1997.
A. Smola and B. Schölkopf. A tutorial on support vector regression. Technical Report NC2-TR-1998-030, NeuroCOLT2, 1998.
R. Sutton and A. Barto. Reinforcement Learning. MIT Press, 1998.
V. Vapnik. The nature of statistical learning theory. Springer Verlag, 1995.
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Martin, M. (2002). On-Line Support Vector Machine Regression. In: Elomaa, T., Mannila, H., Toivonen, H. (eds) Machine Learning: ECML 2002. ECML 2002. Lecture Notes in Computer Science, vol 2430. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36755-1_24
DOI: https://doi.org/10.1007/3-540-36755-1_24
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44036-9
Online ISBN: 978-3-540-36755-0