
An Efficient Gradient Method with Approximately Optimal Stepsize Based on Tensor Model for Unconstrained Optimization

Journal of Optimization Theory and Applications 181, 608–633 (2019)

Abstract

A new type of stepsize, the approximately optimal stepsize, was recently introduced by Liu et al. (Optimization 67(3):427–440, 2018) and is very efficient for gradient methods. Interestingly, all gradient methods can be regarded as gradient methods with approximately optimal stepsizes. In this paper, we present an efficient gradient method with approximately optimal stepsize based on a tensor model for unconstrained optimization. In the proposed method, if the objective function is neither close to a minimizer nor well approximated by a quadratic function on the line segment between the current and previous iterates, a tensor model is exploited to generate the approximately optimal stepsize; otherwise, quadratic approximation models are constructed to generate the approximately optimal stepsize. The global convergence of the proposed method is established under weak conditions. Numerical results indicate that the proposed method is very promising.
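
To fix ideas: for a strictly convex quadratic f(x) = (1/2) x^T A x - b^T x, the exact optimal stepsize of steepest descent, i.e., the minimizer of f(x_k - alpha g_k) over alpha, is alpha_k = (g_k^T g_k) / (g_k^T A g_k); an approximately optimal stepsize instead minimizes an approximation model of f along -g_k. The Python sketch below illustrates only this model-switching idea and is not the authors' algorithm: the tensor-model stepsize is replaced by a conservative Barzilai–Borwein (BB2) step, the quadratic-model stepsize by a BB1 step, the quadratic-closeness test and the parameter mu_tol are plausible stand-ins, and the paper's nonmonotone line search safeguards are omitted.

```python
import numpy as np

def aos_gradient_method(f, grad, x0, tol=1e-6, max_iter=10000,
                        mu_tol=0.1, alpha_min=1e-10, alpha_max=1e10):
    """Gradient method with a model-based 'approximately optimal' stepsize.

    Illustrative reconstruction only, NOT the algorithm of the paper:
    the tensor-model stepsize is replaced by a BB2 step, the
    quadratic-model stepsize by a BB1 step, and line search is omitted.
    """
    x = np.asarray(x0, dtype=float)
    g, fx = grad(x), f(x)
    alpha = 1.0 / max(np.linalg.norm(g, np.inf), 1.0)   # crude initial stepsize
    for k in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        x_new = x - alpha * g                           # gradient step
        f_new, g_new = f(x_new), grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 0:
            # mu = 0 exactly when f is quadratic on the segment [x, x_new]:
            # then f(x) = f(x_new) - g_new^T s + (1/2) s^T y.
            mu = abs(2.0 * (fx - f_new + g_new @ s) / sy - 1.0)
            if mu <= mu_tol:
                alpha = (s @ s) / sy   # BB1 step: quadratic model trusted
            else:
                alpha = sy / (y @ y)   # BB2 step: stand-in for the tensor-model stepsize
        else:
            alpha = min(1.0, 1.0 / max(np.linalg.norm(g_new, np.inf), tol))
        alpha = min(max(alpha, alpha_min), alpha_max)   # safeguard the stepsize
        x, g, fx = x_new, g_new, f_new
    return x, fx, k

# Example: minimize an ill-conditioned convex quadratic.
A = np.diag(np.linspace(1.0, 100.0, 100))
b = np.ones(100)
x, fx, iters = aos_gradient_method(lambda x: 0.5 * x @ A @ x - b @ x,
                                   lambda x: A @ x - b, np.zeros(100))
```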


References

1. Cauchy, A.: Méthode générale pour la résolution des systèmes d'équations simultanées. Comp. Rend. Sci. Paris 25, 46–89 (1847)
2. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
3. Asmundis, R.D., Serafino, D.D., Riccio, F., et al.: On spectral properties of steepest descent methods. IMA J. Numer. Anal. 33(4), 1416–1435 (2013)
4. Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13, 321–326 (1993)
5. Dai, Y.H., Liao, L.Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22(1), 1–10 (2002)
6. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997)
7. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal. 23, 707–716 (1986)
8. Biglari, F., Solimanpur, M.: Scaling on the spectral gradient method. J. Optim. Theory Appl. 158(2), 626–635 (2013)
9. Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22, 103–109 (2002)
10. Dai, Y.H., Hager, W.W., Schittkowski, K., et al.: The cyclic Barzilai–Borwein method for unconstrained optimization. IMA J. Numer. Anal. 26(3), 604–627 (2006)
11. Xiao, Y.H., Wang, Q.Y., Wang, D., et al.: Notes on the Dai–Yuan–Yuan modified spectral gradient method. J. Comput. Appl. Math. 234(10), 2986–2992 (2010)
12. Nosratipour, H., Fard, O.S., Borzabadi, A.H.: An adaptive nonmonotone global Barzilai–Borwein gradient method for unconstrained optimization. Optimization 66(4), 641–655 (2017)
13. Miladinović, M., Stanimirović, P., Miljković, S.: Scalar correction method for solving large scale unconstrained minimization problems. J. Optim. Theory Appl. 151(2), 304–320 (2011)
14. Liu, Z.X., Liu, H.W., Dong, X.L.: An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization 67(3), 427–440 (2018)
15. Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78(1), 21–39 (2018)
16. Liu, Z.X., Liu, H.W.: Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J. Comput. Appl. Math. 328, 400–413 (2018)
17. Liu, H.W., Liu, Z.X., Dong, X.L.: A new adaptive Barzilai and Borwein method for unconstrained optimization. Optim. Lett. 12(4), 845–873 (2018)
18. Dai, Y.H., Kou, C.X.: A Barzilai–Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)
19. Yuan, G.L., Meng, Z.H., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168, 129–152 (2016)
20. Liu, Z.X., Liu, H.W.: An efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. (2018). https://doi.org/10.1007/s10957-018-1393-3
21. Yuan, G.L., Wei, Z.X., Zhao, Q.M.: A modified Polak–Ribière–Polyak conjugate gradient algorithm for large-scale optimization problems. IIE Trans. 46, 397–413 (2014)
22. Yuan, G.L., Wei, Z.X., Lu, X.W.: Global convergence of BFGS and PRP methods under a modified weak Wolfe–Powell line search. Appl. Math. Model. 47, 811–825 (2017)
23. Figueiredo, M.A.T., Nowak, R.D., Wright, S.J.: Gradient projection for sparse reconstruction: application to compressed sensing and other inverse problems. IEEE J. Sel. Top. Signal Process. 1(4), 586–597 (2007)
24. Wright, S.J., Nowak, R.D., Figueiredo, M.A.T.: Sparse reconstruction by separable approximation. IEEE Trans. Signal Process. 57(7), 3373–3376 (2009)
25. Huang, Y.K., Liu, H.W.: Smoothing projected Barzilai–Borwein method for constrained non-Lipschitz optimization. Comput. Optim. Appl. 63(3), 671–698 (2016)
26. Liu, H.W., Li, X.L.: Modified subspace Barzilai–Borwein gradient method for non-negative matrix factorization. Comput. Optim. Appl. 55(1), 173–196 (2013)
27. Huang, Y.K., Liu, H.W., Zhou, S.: An efficient monotone projected Barzilai–Borwein method for nonnegative matrix factorization. Appl. Math. Lett. 45, 12–17 (2015)
28. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)
29. Schnabel, R.B., Chow, T.: Tensor methods for unconstrained optimization using second derivatives. SIAM J. Optim. 1(3), 293–315 (1991)
30. Chow, T., Eskow, E., Schnabel, R.: Algorithm 738: a software package for unconstrained optimization using tensor methods. ACM Trans. Math. Softw. 20(4), 518–530 (1994)
31. Bouaricha, A.: Tensor methods for large, sparse unconstrained optimization. SIAM J. Optim. 7(3), 732–756 (1997)
32. Yuan, Y.X., Sun, W.Y.: Theory and Methods of Optimization. Science Press of China, Beijing (1999)
33. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1), 15–35 (2001)
34. Toint, P.L.: An assessment of nonmonotone linesearch techniques for unconstrained optimization. SIAM J. Sci. Comput. 17(3), 725–739 (1996)
35. Zhang, H.C., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)
36. Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods for convex sets. SIAM J. Optim. 10(4), 1196–1211 (2000)
37. Hager, W.W., Zhang, H.C.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
38. Andrei, N.: Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)
39. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
40. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)
41. Gould, N.I.M., Orban, D., Toint, P.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
42. Hager, W.W., Zhang, H.C.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
43. Yuan, G.L., Zhou, S., Wang, B.P., et al.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)
44. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102(1), 147–167 (1999)
45. Yuan, G.L., Wei, Z.X.: Convergence analysis of a modified BFGS method on convex minimizations. Comput. Optim. Appl. 47, 237–255 (2010)


Acknowledgements

We would like to thank Professors W. W. Hager and H. Zhang for their C code of CG_DESCENT, and Professor Y. H. Dai for his help with the numerical experiments. This research is supported by the National Natural Science Foundation of China (No. 11461021), the Shaanxi Science Foundation (No. 2017JM1014), the Guangxi Science Foundation (Nos. 2018GXNSFBA281180, 2017GXNSFBA198031), a Project of the Guangxi Education Department (Grant 2017KY0648), and Scientific Research Projects of Hezhou University (Nos. 2014YBZK06, 2016HZXYSX03).

Corresponding author

Correspondence to Hongwei Liu.

Additional information

Communicated by Guoyin Li.



Cite this article

Liu, Z., Liu, H. An Efficient Gradient Method with Approximately Optimal Stepsize Based on Tensor Model for Unconstrained Optimization. J Optim Theory Appl 181, 608–633 (2019). https://doi.org/10.1007/s10957-019-01475-1

