
An Efficient Gradient Method with Approximately Optimal Stepsize Based on Tensor Model for Unconstrained Optimization

  • Zexian Liu
  • Hongwei Liu

Abstract

A new type of stepsize, recently introduced by Liu et al. (Optimization 67(3):427–440, 2018), is called the approximately optimal stepsize and is very efficient for gradient methods. Interestingly, all gradient methods can be regarded as gradient methods with approximately optimal stepsizes. In this paper, we present an efficient gradient method with approximately optimal stepsize based on a tensor model for unconstrained optimization. In the proposed method, if the objective function is neither close to a minimizer nor close to a quadratic function on the line segment between the current and latest iterates, then a tensor model is exploited to generate the approximately optimal stepsize for the gradient method. Otherwise, quadratic approximation models are constructed to generate the approximately optimal stepsizes. The global convergence of the proposed method is established under weak conditions. Numerical results indicate that the proposed method is very promising.
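To make the idea of an approximately optimal stepsize concrete, the sketch below illustrates the simplest case described in the abstract: the stepsize is chosen to minimize a quadratic approximation model along the negative gradient, with the curvature approximated by the Barzilai–Borwein scalar. This is a minimal illustration under those assumptions, not the tensor-model algorithm of the paper; the function names are hypothetical.

```python
import numpy as np

def grad_method_approx_opt_step(grad, x0, max_iter=500, tol=1e-6):
    """Gradient method with an approximately optimal stepsize.

    Each stepsize alpha_k approximately minimizes a quadratic model
    phi(alpha) = f(x_k) - alpha*||g_k||^2 + 0.5*alpha^2 * g_k^T B_k g_k
    along -g_k, where B_k is approximated by the BB scalar
    (y^T s / s^T s) * I.  With that choice, the minimizer of phi reduces
    to the classical BB stepsize s^T s / s^T y.  Hypothetical sketch,
    NOT the tensor-model method proposed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-2  # conservative initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # Approximately optimal stepsize from the quadratic model;
        # fall back to a safe value if curvature is not positive.
        alpha = (s @ s) / sy if sy > 1e-12 else 1e-2
        x, g = x_new, g_new
    return x

# Usage: minimize the strictly convex quadratic f(x) = 0.5 x^T A x
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_star = grad_method_approx_opt_step(grad, np.array([1.0, 1.0, 1.0]))
```

For a strictly convex quadratic the quadratic model is exact along the search direction, which is why the resulting BB-type stepsize performs well in that setting; the paper's contribution is to replace the quadratic model with a tensor model when the objective is far from quadratic behavior.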

Keywords

Gradient method · Approximately optimal stepsize · Quadratic model · Tensor model · Global convergence

Mathematics Subject Classification

90C06 · 65K

Notes

Acknowledgements

We would like to thank Professors W. W. Hager and H. C. Zhang for their C code of CG_DESCENT, and Professor Y. H. Dai for his help with the numerical experiments. This research is supported by the National Natural Science Foundation of China (No. 11461021), the Shaanxi Science Foundation (No. 2017JM1014), the Guangxi Science Foundation (Nos. 2018GXNSFBA281180, 2017GXNSFBA198031), a Project of the Guangxi Education Department (Grant 2017KY0648), and Scientific Research Projects of Hezhou University (Nos. 2014YBZK06, 2016HZXYSX03).

References

  1. Cauchy, A.: Méthode générale pour la résolution des systèmes d'équations simultanées. Comp. Rend. Sci. Paris 25, 46–89 (1847)
  2. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
  3. Asmundis, R.D., Serafino, D.D., Riccio, F., et al.: On spectral properties of steepest descent methods. IMA J. Numer. Anal. 33(4), 1416–1435 (2013)
  4. Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13, 321–326 (1993)
  5. Dai, Y.H., Liao, L.Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22(1), 1–10 (2002)
  6. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997)
  7. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal. 23, 707–716 (1986)
  8. Biglari, F., Solimanpur, M.: Scaling on the spectral gradient method. J. Optim. Theory Appl. 158(2), 626–635 (2013)
  9. Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22, 103–109 (2002)
  10. Dai, Y.H., Hager, W.W., Schittkowski, K., et al.: The cyclic Barzilai–Borwein method for unconstrained optimization. IMA J. Numer. Anal. 26(3), 604–627 (2006)
  11. Xiao, Y.H., Wang, Q.Y., Wang, D., et al.: Notes on the Dai–Yuan–Yuan modified spectral gradient method. J. Comput. Appl. Math. 234(10), 2986–2992 (2010)
  12. Nosratipour, H., Fard, O.S., Borzabadi, A.H.: An adaptive nonmonotone global Barzilai–Borwein gradient method for unconstrained optimization. Optimization 66(4), 641–655 (2017)
  13. Miladinović, M., Stanimirović, P., Miljković, S.: Scalar correction method for solving large scale unconstrained minimization problems. J. Optim. Theory Appl. 151(2), 304–320 (2011)
  14. Liu, Z.X., Liu, H.W., Dong, X.L.: An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization 67(3), 427–440 (2018)
  15. Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78(1), 21–39 (2018)
  16. Liu, Z.X., Liu, H.W.: Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J. Comput. Appl. Math. 328, 400–413 (2018)
  17. Liu, H.W., Liu, Z.X., Dong, X.L.: A new adaptive Barzilai and Borwein method for unconstrained optimization. Optim. Lett. 12(4), 845–873 (2018)
  18. Dai, Y.H., Kou, C.X.: A Barzilai–Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)
  19. Yuan, G.L., Meng, Z.H., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168, 129–152 (2016)
  20. Liu, Z.X., Liu, H.W.: An efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. https://doi.org/10.1007/s10957-018-1393-3 (2018)
  21. Yuan, G.L., Wei, Z.X., Zhao, Q.M.: A modified Polak–Ribière–Polyak conjugate gradient algorithm for large-scale optimization problems. IIE Trans. 46, 397–413 (2014)
  22. Yuan, G.L., Wei, Z.X., Lu, X.W.: Global convergence of BFGS and PRP methods under a modified weak Wolfe–Powell line search. Appl. Math. Model. 47, 811–825 (2017)
  23. Figueiredo, M.A.T., Nowak, R.D., Wright, S.J.: Gradient projection for sparse reconstruction: application to compressed sensing and other inverse problems. IEEE J. Sel. Top. Signal Process. 1(4), 586–597 (2007)
  24. Wright, S.J., Nowak, R.D., Figueiredo, M.A.T.: Sparse reconstruction by separable approximation. IEEE Trans. Signal Process. 57(7), 3373–3376 (2009)
  25. Huang, Y.K., Liu, H.W.: Smoothing projected Barzilai–Borwein method for constrained non-Lipschitz optimization. Comput. Optim. Appl. 63(3), 671–698 (2016)
  26. Liu, H.W., Li, X.L.: Modified subspace Barzilai–Borwein gradient method for non-negative matrix factorization. Comput. Optim. Appl. 55(1), 173–196 (2013)
  27. Huang, Y.K., Liu, H.W., Zhou, S.: An efficient monotone projected Barzilai–Borwein method for nonnegative matrix factorization. Appl. Math. Lett. 45, 12–17 (2015)
  28. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)
  29. Schnabel, R.B., Chow, T.: Tensor methods for unconstrained optimization using second derivatives. SIAM J. Optim. 1(3), 293–315 (1991)
  30. Chow, T., Eskow, E., Schnabel, R.: Algorithm 738: a software package for unconstrained optimization using tensor methods. ACM Trans. Math. Softw. 20(4), 518–530 (1994)
  31. Bouaricha, A.: Tensor methods for large, sparse unconstrained optimization. SIAM J. Optim. 7(3), 732–756 (1997)
  32. Yuan, Y.X., Sun, W.Y.: Theory and Methods of Optimization. Science Press of China, Beijing (1999)
  33. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1), 15–35 (2001)
  34. Toint, P.L.: An assessment of nonmonotone linesearch techniques for unconstrained optimization. SIAM J. Sci. Comput. 17(3), 725–739 (1996)
  35. Zhang, H.C., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)
  36. Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods on convex sets. SIAM J. Optim. 10(4), 1196–1211 (2000)
  37. Hager, W.W., Zhang, H.C.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
  38. Andrei, N.: Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)
  39. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
  40. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)
  41. Gould, N.I.M., Orban, D., Toint, P.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
  42. Hager, W.W., Zhang, H.C.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
  43. Yuan, G.L., Zhou, S., Wang, B.P., et al.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)
  44. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102(1), 147–167 (1999)
  45. Yuan, G.L., Wei, Z.X.: Convergence analysis of a modified BFGS method on convex minimizations. Comput. Optim. Appl. 47, 237–255 (2010)

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. School of Mathematics and Statistics, Xidian University, Xi'an, People's Republic of China
  2. School of Mathematics and Computer Science, Hezhou University, Hezhou, People's Republic of China
