Efficient Optimization of the Parameters of LS-SVM for Regression versus Cross-Validation Error

  • Conference paper

Artificial Neural Networks – ICANN 2009

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5769)

Abstract

Least Squares Support Vector Machines (LS-SVM) are the state of the art in kernel methods for regression and function approximation. In recent years, these models have been successfully applied to time series modelling and prediction. A key issue for the good performance of an LS-SVM model is the choice of values for both the kernel parameters and its hyperparameters, so as to avoid overfitting the underlying system to be modelled. In this paper, an efficient method for evaluating the cross-validation error of LS-SVM models is revisited, and the expressions for its partial derivatives are presented in order to improve the parameter optimization procedure. Heuristics for setting initial values of both the kernel parameters and the regularization factor are also given. Finally, we conduct experiments on a time series example using several parameter optimization methods for LS-SVM models. The results show that the proposed partial derivatives and heuristics can improve performance with respect to both execution time and the quality of the optimized model.
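
To make the tuning loop concrete, here is a minimal sketch in Python, an illustration under stated assumptions rather than the authors' implementation: it fits an RBF-kernel LS-SVM, evaluates the exact leave-one-out (LOO) cross-validation error via the standard fast-LOO identity for the LS-SVM linear system (the i-th LOO residual equals alpha_i / (C^-1)_ii, where C is the bordered system matrix), and hands that error to a general-purpose optimizer. The derivative-free optimizer and the synthetic data are stand-ins; the paper instead uses analytic partial derivatives of the cross-validation error and the Mackey-Glass series.

    # Sketch: fast LOO error for an RBF-kernel LS-SVM, minimized over
    # (log gamma, log sigma2). Illustrative; the analytic derivatives
    # presented in the paper are replaced here by a derivative-free search.
    import numpy as np
    from scipy.optimize import minimize

    def rbf_kernel(X, sigma2):
        # K[i, j] = exp(-||x_i - x_j||^2 / sigma2)
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-np.maximum(d2, 0.0) / sigma2)

    def loo_mse(theta, X, y):
        gamma, sigma2 = np.exp(theta)  # log space keeps both positive
        n = len(y)
        # Bordered LS-SVM system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
        C = np.zeros((n + 1, n + 1))
        C[0, 1:] = 1.0
        C[1:, 0] = 1.0
        C[1:, 1:] = rbf_kernel(X, sigma2) + np.eye(n) / gamma
        Cinv = np.linalg.inv(C)
        alpha = (Cinv @ np.concatenate(([0.0], y)))[1:]
        # Fast-LOO identity: i-th LOO residual = alpha_i / (C^-1)_ii,
        # reading the diagonal over the alpha block.
        return np.mean((alpha / np.diag(Cinv)[1:]) ** 2)

    # Toy usage on synthetic data (a hypothetical stand-in for Mackey-Glass).
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(120, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(120)
    res = minimize(loo_mse, x0=np.log([10.0, 1.0]), args=(X, y),
                   method="Nelder-Mead")
    print("gamma, sigma2 =", np.exp(res.x), " LOO MSE =", res.fun)

A single inversion of C per candidate hyperparameter pair yields all n LOO residuals at once, which is what makes repeated evaluation inside an optimization loop affordable; the paper's contribution is to pair this evaluation with exact partial derivatives and sensible starting values so that fewer, cheaper iterations are needed.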

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rubio, G., Pomares, H., Rojas, I., Herrera, L.J., Guillén, A. (2009). Efficient Optimization of the Parameters of LS-SVM for Regression versus Cross-Validation Error. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds) Artificial Neural Networks – ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, vol 5769. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04277-5_41

  • DOI: https://doi.org/10.1007/978-3-642-04277-5_41

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04276-8

  • Online ISBN: 978-3-642-04277-5

  • eBook Packages: Computer Science, Computer Science (R0)
