Nested Sequential Minimal Optimization for Support Vector Machines

  • Alessandro Ghio
  • Davide Anguita
  • Luca Oneto
  • Sandro Ridella
  • Carlotta Schatten
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7553)


We propose in this work a nested version of the well-known Sequential Minimal Optimization (SMO) algorithm, able to handle working sets of larger cardinality for solving Support Vector Machine (SVM) learning problems. Contrary to several other proposals in the literature, no new procedures or numerical QP optimizers need to be implemented, since our proposal exploits the conventional SMO method at its core. Preliminary tests on benchmark datasets demonstrate the effectiveness of the presented method.
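The decomposition idea the abstract describes — an outer loop that selects a working set larger than two variables, whose subproblem is then solved by conventional two-variable SMO with all other multipliers held fixed — can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual algorithm: the working-set selection rule (most-violating KKT scores), the inner pair-selection heuristic (random second index), and all parameter names (`q`, `outer_iters`, `inner_passes`) are assumptions.

```python
import numpy as np

def nested_smo(X, y, C=1.0, q=4, outer_iters=50, inner_passes=10, tol=1e-4, seed=0):
    """Sketch of a nested decomposition: the outer loop picks a working set of
    q variables; the inner solver is plain two-variable SMO restricted to it.
    Hypothetical simplification for illustration, linear kernel only."""
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                          # linear kernel Gram matrix
    alpha = np.zeros(n)
    b = 0.0

    def f(i):                            # decision value on training point i
        return (alpha * y) @ K[:, i] + b

    for _ in range(outer_iters):
        E = np.array([f(i) - y[i] for i in range(n)])
        # KKT violation scores over the full problem
        viol = np.where((y * E < -tol) & (alpha < C), -y * E, 0.0) \
             + np.where((y * E > tol) & (alpha > 0), y * E, 0.0)
        if viol.max() <= tol:
            break                        # all KKT conditions satisfied
        ws = np.argsort(-viol)[:q]       # working set: q most-violating indices
        # inner solver: conventional SMO on pairs drawn from the working set
        for _ in range(inner_passes):
            changed = False
            for i in ws:
                Ei = f(i) - y[i]
                if not ((y[i] * Ei < -tol and alpha[i] < C) or
                        (y[i] * Ei > tol and alpha[i] > 0)):
                    continue
                j = int(rng.choice([k for k in ws if k != i]))
                Ej = f(j) - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                if y[i] != y[j]:         # box constraints for the pair
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L >= H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-7:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # standard SMO bias update
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed = True
            if not changed:
                break
    w = (alpha * y) @ X                  # recover primal weights (linear kernel)
    return w, b

# toy, linearly separable data (assumed for illustration)
X = np.array([[2., 2.], [3., 3.], [-2., -2.], [-3., -3.]])
y = np.array([1., 1., -1., -1.])
w, b = nested_smo(X, y, C=1.0, q=4)
pred = np.sign(X @ w + b)
```

The key design point the abstract emphasizes is visible here: the inner solver is unmodified two-variable SMO, so enlarging the working set requires no new QP machinery — only the outer selection loop changes.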


Keywords: Support Vector Machine · Convex Constrained Quadratic Programming · Sequential Minimal Optimization




References

  1.
  2. Chang, C.C., Lin, C.J.: LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology 2, 27:1–27:27 (2011)
  3. Fan, R., Chang, K., Hsieh, C., Wang, X., Lin, C.: LIBLINEAR: A library for large linear classification. The Journal of Machine Learning Research 9, 1871–1874 (2008)
  4. Fan, R., Chen, P., Lin, C.: Working set selection using second order information for training support vector machines. The Journal of Machine Learning Research 6, 1889–1918 (2005)
  5. Hsu, C., Chang, C., Lin, C.: A practical guide to support vector classification (2003)
  6. Joachims, T.: Making large-scale SVM learning practical. In: Advances in Kernel Methods (1999)
  7. Keerthi, S., Shevade, S., Bhattacharyya, C., Murthy, K.: Improvements to Platt's SMO algorithm for SVM classifier design. Neural Computation 13(3), 637–649 (2001)
  8. Larochelle, H., Erhan, D., Courville, A., Bergstra, J., Bengio, Y.: An empirical evaluation of deep architectures on problems with many factors of variation. In: Proceedings of the International Conference on Machine Learning, pp. 473–480 (2007)
  9. Lin, Y., Hsieh, J., Wu, H., Jeng, J.: Three-parameter sequential minimal optimization for support vector machines. Neurocomputing 74(17), 3467–3475 (2011)
  10. Munder, S., Gavrila, D.: An experimental study on pedestrian classification. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(11), 1863–1868 (2006)
  11. Osuna, E., Freund, R., Girosi, F.: An improved training algorithm for support vector machines. In: Proceedings of the Workshop Neural Networks for Signal Processing (1997)
  12. Platt, J.: Sequential minimal optimization: A fast algorithm for training support vector machines. In: Advances in Kernel Methods – Support Vector Learning, vol. 208, pp. 1–21 (1998)
  13. Platt, J.: Using analytic QP and sparseness to speed training of support vector machines. In: Advances in Neural Information Processing Systems, pp. 557–563 (1999)
  14. Shawe-Taylor, J., Sun, S.: A review of optimization methodologies in support vector machines. Neurocomputing 74(17), 3609–3618 (2011)
  15. Vapnik, V.: Statistical learning theory. Wiley, New York (1998)
  16. Webb, S., Caverlee, J., Pu, C.: Introducing the Webb Spam Corpus: Using email spam to identify web spam automatically. In: Proceedings of the Conference on Email and Anti-Spam (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Alessandro Ghio (1)
  • Davide Anguita (1)
  • Luca Oneto (1)
  • Sandro Ridella (1)
  • Carlotta Schatten (1)

  1. DITEN, University of Genova, Genova, Italy
