Abstract
Random sampling techniques have been developed for combinatorial optimization problems. In this note, we report an application of one such technique to training support vector machines (more precisely, primal-form maximal-margin classifiers) that solve two-group classification problems with hyperplane classifiers. Through this research, we aim (i) to design efficient support vector machine training algorithms with theoretical guarantees, and (ii) to develop systematic and efficient methods for finding "outliers", i.e., examples having an inherent error.
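The abstract does not spell out the algorithm, but the random sampling techniques it refers to follow a well-known sample-and-reweight pattern from combinatorial optimization: repeatedly train on a small weighted sample, then boost the weight of every example the resulting classifier violates. The sketch below illustrates that pattern in Python under stated assumptions; it is not the paper's algorithm. In particular, `solve_subproblem` is a crude hinge-loss subgradient stand-in for an exact maximal-margin solver, restricted to 2D, and all names and parameters are illustrative.

```python
import random

def solve_subproblem(sample, epochs=200, lr=0.1, lam=0.01):
    """Crude hinge-loss subgradient descent on a small 2D sample.
    A stand-in for an exact maximal-margin (hard-margin SVM) solver."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in sample:
            if y * (w[0] * x[0] + w[1] * x[1] + b) < 1:
                # example violates the margin: step toward satisfying it
                w = [wi + lr * (y * xi - lam * wi) for wi, xi in zip(w, x)]
                b += lr * y
            else:
                # only the regularization term contributes
                w = [wi * (1 - lr * lam) for wi in w]
    return w, b

def violators(data, w, b):
    """Indices of examples whose functional margin is below 1."""
    return [i for i, (x, y) in enumerate(data)
            if y * (w[0] * x[0] + w[1] * x[1] + b) < 1]

def sampling_train(data, sample_size=6, max_rounds=50, seed=0):
    """Sample-and-reweight loop: solve a small weighted subproblem,
    then double the weight of every example the solution violates,
    so hard examples are sampled more often in later rounds."""
    rng = random.Random(seed)
    weights = [1.0] * len(data)
    w, b = [0.0, 0.0], 0.0
    for _ in range(max_rounds):
        sample = rng.choices(data, weights=weights, k=sample_size)
        w, b = solve_subproblem(sample)
        bad = violators(data, w, b)
        if not bad:
            break              # current solution satisfies every example
        for i in bad:
            weights[i] *= 2.0  # violated examples become more likely
    return w, b
```

The doubling step is what makes the scheme efficient in theory: examples that are never support vectors stay lightly weighted, while the few decisive examples quickly dominate the sampling distribution. Persistently violated examples whose weights explode are also natural "outlier" candidates in the sense of aim (ii) above.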
This work was started when the first and third authors visited the Centre de Recerca Matemàtica, Spain.
Supported in part by EU ESPRIT IST-1999-14186 (ALCOM-FT), EU EP27150 (NeuroCOLT II), Spanish Government PB98-0937-C04 (FRESCO), and CIRIT 1997SGR-00366.
Supported in part by a Grant-in-Aid (C-13650444) from the Ministry of Education, Science, Sports and Culture of Japan.
Supported in part by a Grant-in-Aid for Scientific Research on Priority Areas “Discovery Science” from the Ministry of Education, Science, Sports and Culture of Japan.
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Balcázar, J., Dai, Y., Watanabe, O. (2001). A Random Sampling Technique for Training Support Vector Machines. In: Abe, N., Khardon, R., Zeugmann, T. (eds) Algorithmic Learning Theory. ALT 2001. Lecture Notes in Computer Science, vol 2225. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45583-3_11
Print ISBN: 978-3-540-42875-6
Online ISBN: 978-3-540-45583-7