
A Class of Derivative-Free CG Projection Methods for Nonsmooth Equations with an Application to the LASSO Problem

  • Min Sun
  • Maoying Tian
Original Paper

Abstract

In this paper, based on a modified Gram–Schmidt (MGS) process, we propose a class of derivative-free conjugate gradient (CG) projection methods for nonsmooth equations with convex constraints. Two attractive features of the new class of methods are that (1) the generated direction contains a free vector, which can be chosen as any vector for which the denominator of the direction is nonzero; and (2) it adopts a new line search that reduces the computational cost. The new class of methods includes many efficient iterative methods for the studied problem as special cases. When the underlying mapping is monotone, we establish the global convergence and convergence rate of the methods. Finally, preliminary numerical results on the LASSO problem show that the new class of methods is promising compared with some existing ones.
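The problem treated here is to find \(x \in C\) with \(F(x) = 0\), where \(C\) is a nonempty closed convex set and the (possibly nonsmooth) mapping \(F\) is monotone, i.e. \(\langle F(x)-F(y),\,x-y\rangle \ge 0\) for all \(x,y\). To illustrate the overall structure of derivative-free projection methods of this type, the following is a minimal sketch of the classical hyperplane-projection framework on which such methods are built. The placeholder direction d = -F(x), the names `projection_method` and `proj_C`, and all parameter values are illustrative assumptions; the sketch does not reproduce the paper's MGS-based CG direction or its new line search.

```python
import numpy as np

def projection_method(F, proj_C, x0, beta=1.0, rho=0.5, sigma=1e-4,
                      tol=1e-6, max_iter=1000):
    """Hyperplane-projection framework for monotone constrained equations
    F(x) = 0, x in C.  The search direction below is the plain residual
    -F(x); it is only a stand-in for the MGS-based CG direction proposed
    in the paper."""
    x = proj_C(np.asarray(x0, dtype=float))
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x, k
        d = -Fx                                   # placeholder direction
        # Backtracking (derivative-free) line search: find t such that
        #   -F(x + t d)^T d >= sigma * t * ||d||^2
        t = beta
        while True:
            z = x + t * d
            Fz = F(z)
            if -(Fz @ d) >= sigma * t * (d @ d) or t < 1e-12:
                break
            t *= rho
        if np.linalg.norm(Fz) <= tol:
            return z, k
        # Project x onto the hyperplane separating x from the solution
        # set, then project back onto the feasible set C.
        xi = (Fz @ (x - z)) / (Fz @ Fz)
        x = proj_C(x - xi * Fz)
    return x, max_iter

# Toy usage: F(x) = x + sin(x) is monotone; C is the nonnegative orthant.
if __name__ == "__main__":
    F = lambda x: x + np.sin(x)
    proj = lambda x: np.maximum(x, 0.0)
    sol, iters = projection_method(F, proj, x0=np.ones(5))
    print(iters, sol)
```

In the class of methods proposed in the paper, the placeholder residual direction is replaced by the MGS-based CG direction containing a free vector, and the backtracking rule above is replaced by the new, cheaper line search; the projection step onto C is what allows the iterates to remain feasible without derivative information.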

Keywords

Monotone constrained equations · Derivative-free method · Global convergence · The LASSO problem

Mathematics Subject Classification

90C25 · 90C30

Notes

Funding

This research was partially supported by the National Natural Science Foundation of China (Nos. 11671228, 11601475) and the Natural Science Foundation of Shandong Province (No. ZR2016AL05).


Copyright information

© Iranian Mathematical Society 2019

Authors and Affiliations

  1. School of Mathematics and Statistics, Zaozhuang University, Shandong, People's Republic of China
  2. School of Management, Qufu Normal University, Shandong, People's Republic of China
  3. Department of Physiology, Shandong Coal Mining Health School, Shandong, People's Republic of China
