Non-interior-point smoothing Newton method for CP revisited and its application to support vector machines

Original Research

Abstract

Non-interior-point smoothing Newton methods (SNMs) for optimization have been studied for over three decades and are a popular approach for solving small- and medium-scale complementarity problems (CPs) as well as many other optimization problems. The main purpose of this paper is to revisit the SNM and show that its Hessian matrix becomes increasingly ill-conditioned as the smoothing parameter approaches zero, which is why the practical use of SNMs for large-scale CPs remains limited. To tackle this, we design a new smoothing method, called the accelerated preconditioned smoothing method (APSM), for the efficient solution of regularized support vector machines in machine learning. With the help of a suitable preconditioner, we correct the ill-conditioning of the smoothing Hessian matrix, so that the associated smoothing Hessian equation can be solved in a few iterations by iterative methods from numerical linear algebra. Two acceleration techniques are designed to further reduce computation time. Finally, we present numerical experiments that support our theoretical guarantees and verify the accelerated convergence attained by APSM. The results show that APSM is faster than a state-of-the-art algorithm without reducing classification accuracy.
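The abstract's central computational idea is that the smoothing Newton (Hessian) equation becomes ill-conditioned as the smoothing parameter approaches zero, and that a preconditioner lets a Krylov method solve it in few iterations. The following Python sketch illustrates this on a small linear complementarity problem; it is illustrative only, not the authors' APSM. It assumes the classical Chen–Harker–Kanzow–Smale (CHKS) smoothing function, and a simple Jacobi (diagonal) preconditioner stands in for the stronger preconditioner developed in the paper; the function names (`chks`, `smoothing_newton_step`) are ours.

```python
# Illustrative sketch (not the authors' APSM): one smoothing Newton step for
# the linear complementarity problem
#     find z >= 0 with w = M z + q >= 0 and z^T w = 0,
# via the CHKS smoothing function
#     phi_mu(a, b) = a + b - sqrt((a - b)^2 + 4 mu^2),
# which tends to 2 * min(a, b) as mu -> 0.  The Newton equation is solved by
# GMRES, optionally with a Jacobi (diagonal) preconditioner as a stand-in for
# the paper's preconditioner.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres


def chks(a, b, mu):
    """Componentwise CHKS smoothing; equals 2 * min(a, b) in the limit mu -> 0."""
    return a + b - np.sqrt((a - b) ** 2 + 4.0 * mu**2)


def smoothing_newton_step(M, q, z, mu, precondition=True):
    """Solve the smoothing Newton equation J dz = -Phi(z) by GMRES.

    Phi(z) = chks(z, Mz + q, mu) has Jacobian
        J = diag(1 - d) + diag(1 + d) M,
        d_i = (z - w)_i / sqrt((z - w)_i^2 + 4 mu^2),
    and the entries of d approach +/- 1 as mu -> 0, degrading conditioning.
    """
    w = M @ z + q
    d = (z - w) / np.sqrt((z - w) ** 2 + 4.0 * mu**2)
    J = np.diag(1.0 - d) + (1.0 + d)[:, None] * M
    rhs = -chks(z, w, mu)

    P = None
    if precondition:
        inv_diag = 1.0 / np.diag(J)  # Jacobi preconditioner: apply diag(J)^{-1}
        P = LinearOperator(J.shape, matvec=lambda v: inv_diag * v)

    iters = 0

    def count(_):
        nonlocal iters
        iters += 1

    dz, info = gmres(J, rhs, M=P, callback=count)
    return dz, iters


rng = np.random.default_rng(0)
n = 400
A = rng.standard_normal((n, n)) / np.sqrt(n)
M = A @ A.T + 0.05 * np.eye(n)  # symmetric positive definite -> monotone LCP
q = rng.standard_normal(n)
z0 = np.ones(n)

for mu in (1e-1, 1e-4, 1e-8):
    _, it_plain = smoothing_newton_step(M, q, z0, mu, precondition=False)
    _, it_prec = smoothing_newton_step(M, q, z0, mu, precondition=True)
    print(f"mu = {mu:.0e}: GMRES iterations {it_plain} (plain) vs {it_prec} (Jacobi)")
```

Even this crude diagonal preconditioner typically equilibrates the rows whose scaling degenerates as the smoothing parameter shrinks; the paper's point is that a purpose-built preconditioner does this far more effectively for the SVM-induced systems.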

Keywords

Non-interior-point smoothing method · Complementarity problem · Preconditioner · Line search · Support vector machines

Mathematics Subject Classification

90C33 · 65K10


Copyright information

© Korean Society for Informatics and Computational Applied Mathematics 2019

Authors and Affiliations

  1. Department of Applied Mathematics, School of Science, Chongqing University of Technology, Chongqing, People's Republic of China