Quadratic Optimization Fine Tuning for the Learning Phase of SVM

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3563)

Abstract

This paper presents a study of the quadratic optimization problem (QP) underlying the learning process of Support Vector Machines (SVM). Starting from the Karush-Kuhn-Tucker (KKT) optimality conditions, we present implementation strategies for the SVM-QP following two classical approaches: i) active-set methods, in both primal and dual spaces, and ii) interior-point methods. We also present the general extension to large-scale applications, which consists in decomposing the QP problem into smaller subproblems. In the same spirit, we discuss some considerations for initializing the general learning process. Finally, we compare the performance of the optimization strategies on well-known benchmark databases.
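
The full text is not shown in this preview, but the QP the abstract refers to is the standard soft-margin SVM dual. As a minimal sketch, written in conventional notation that is assumed here rather than taken from the paper (Lagrange multipliers alpha_i, class labels y_i = ±1, kernel K, box constraint C, and ell training examples), the problem reads

\[
\max_{\alpha \in \mathbb{R}^{\ell}} \;\; \sum_{i=1}^{\ell} \alpha_i \;-\; \frac{1}{2} \sum_{i=1}^{\ell} \sum_{j=1}^{\ell} \alpha_i \alpha_j \, y_i y_j \, K(x_i, x_j)
\qquad \text{s.t.} \quad \sum_{i=1}^{\ell} \alpha_i y_i = 0, \quad 0 \le \alpha_i \le C,
\]

and the KKT optimality conditions that an active-set or interior-point solver must satisfy at the solution, with \( f(x) = \sum_{j=1}^{\ell} \alpha_j y_j K(x_j, x) + b \), are

\[
\alpha_i = 0 \;\Rightarrow\; y_i f(x_i) \ge 1, \qquad
0 < \alpha_i < C \;\Rightarrow\; y_i f(x_i) = 1, \qquad
\alpha_i = C \;\Rightarrow\; y_i f(x_i) \le 1 .
\]

Decomposition methods of the kind mentioned in the abstract exploit these conditions by repeatedly optimizing a small working set of multipliers that violate them while holding the remaining variables fixed.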

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

González-Mendoza, M., Hernández-Gress, N., Titli, A. (2005). Quadratic Optimization Fine Tuning for the Learning Phase of SVM. In: Ramos, F.F., Larios Rosillo, V., Unger, H. (eds) Advanced Distributed Systems. ISSADS 2005. Lecture Notes in Computer Science, vol 3563. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11533962_31

  • DOI: https://doi.org/10.1007/11533962_31

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28063-7

  • Online ISBN: 978-3-540-31674-9

  • eBook Packages: Computer Science (R0)
