Abstract
Support vector machines have recently attracted considerable attention in the machine learning and optimization communities for their remarkable generalization ability. An open problem, however, is the selection of an optimal kernel matrix for regression problems. Recently, a method for computing the optimal kernel matrix for pattern classification using semidefinite programming was introduced [7]. In this paper we extend this approach to the regression setting. Preliminary experimental results are presented in which the optimal kernel matrix for support vector machine regression is recovered.
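The full semidefinite programming formulation of [7] is beyond an abstract-length sketch, but one well-known special case illustrates the idea: restrict the kernel to a nonnegative combination of fixed candidate kernels and maximize its alignment with the label matrix yyᵀ under a trace constraint. Because each candidate kernel is positive semidefinite and the weights are nonnegative, the semidefinite constraint is automatically satisfied and the problem reduces to a linear program. The sketch below is a toy illustration in that spirit; the data, the RBF candidate kernels, and the bandwidths are invented for the example and are not taken from the paper.

```python
# Toy sketch (not the authors' formulation) of kernel-matrix learning in
# the spirit of [7]: K = sum_i mu_i K_i with mu_i >= 0, maximizing the
# alignment <K, y y^T> subject to trace(K) = n. With fixed PSD candidate
# kernels the semidefinite constraint is inactive, leaving a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 2))                 # illustrative inputs
y = np.sign(X[:, 0] + 0.1 * rng.standard_normal(30))  # illustrative labels

def rbf(X, gamma):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

# Candidate kernels at several bandwidths (an assumption for this sketch).
Ks = [rbf(X, g) for g in (0.1, 1.0, 10.0)]
target = np.outer(y, y)

c = np.array([np.sum(K * target) for K in Ks])   # alignment of each K_i
t = np.array([np.trace(K) for K in Ks])          # trace of each K_i

# linprog minimizes, so negate the objective; the equality constraint
# normalizes the total trace of the learned kernel.
res = linprog(-c, A_eq=t[None, :], b_eq=[len(y)], bounds=(0, None))
mu = res.x
K_opt = sum(m * K for m, K in zip(mu, Ks))
print("weights:", mu, "alignment:", np.sum(K_opt * target))
```

Since the objective and constraint are both linear in the weights, the LP places all trace budget on the candidate kernel with the best alignment-per-trace ratio; the general SDP in [7] searches a richer kernel family and requires an SDP solver.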
References
Bazaraa M. S., Sherali H. D. and Shetty C. M. (1993), “Nonlinear Programming: Theory and Algorithms,” John Wiley & Sons, New York.
Bertsekas D. P. (1999), “Nonlinear Programming,” Athena Scientific, Belmont, Massachusetts.
Burges C. J. C. (1998), “A Tutorial on Support Vector Machines for Pattern Recognition,” Data Mining and Knowledge Discovery Vol. 2(2), 121–167.
Chapelle O., Vapnik V., Bousquet O. and Mukherjee S. (2002), “Choosing Multiple Parameters for Support Vector Machines,” Machine Learning Vol. 46(1/3), 131–159.
Cristianini N. and Shawe-Taylor J. (2000), “An Introduction to Support Vector Machines,” Cambridge University Press, Cambridge, UK.
Cristianini N., Shawe-Taylor J., Kandola J. and Elisseeff A. (2001), “On Kernel-Target Alignment,” In Advances in Neural Information Processing Systems, Cambridge, MA: MIT Press.
Lanckriet G., Cristianini N., Bartlett P., El-Ghaoui L. and Jordan M. I. (2002), “Learning the Kernel Matrix with Semi-Definite Programming,” Technical Report, Department of Electrical Engineering and Computer Sciences, University of California, Berkeley.
Pardalos P. M. and Wolkowicz H. (Eds) (1998), Topics in Semidefinite and Interior-Point Methods, Fields Institute Communications Series, Vol. 18, American Mathematical Society.
Ramana M. and Pardalos P. M. (1996), “Semidefinite Programming,” In Interior Point Methods of Mathematical Programming, T. Terlaky ed., Kluwer Academic Publishers, pp. 369–398.
Schölkopf B. and Smola A. J. (2002), “Learning with Kernels,” MIT Press, Cambridge, Massachusetts.
Smola A. and Schölkopf B. (1998), “A Tutorial on Support Vector Regression,” Statistics and Computing, Invited paper, in press.
Sturm J. F. (1999), “Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones,” Optimization Methods and Software Vol. 11–12, 625–653.
Vandenberghe L. and Boyd S. (1996), “Semidefinite Programming,” SIAM Review Vol. 38(1).
Vapnik V. (1982), “Estimation of Dependences Based on Empirical Data,” Springer Verlag.
Vapnik V. (1995), “The Nature of Statistical Learning Theory,” Springer Verlag.
Copyright information
© 2004 Kluwer Academic Publishers
Cite this paper
Trafalis, T.B., Malyscheff, A.M. (2004). Optimal Selection of the Regression Kernel Matrix with Semidefinite Programming. In: Floudas, C.A., Pardalos, P. (eds) Frontiers in Global Optimization. Nonconvex Optimization and Its Applications, vol 74. Springer, Boston, MA. https://doi.org/10.1007/978-1-4613-0251-3_31
Print ISBN: 978-1-4613-7961-4
Online ISBN: 978-1-4613-0251-3