Structured Two-Point Stepsize Gradient Methods for Nonlinear Least Squares

  • Hassan Mohammad
  • Mohammed Yusuf Waziri
Abstract

In this paper, we present two choices of structured spectral gradient methods for solving nonlinear least squares problems. In the proposed methods, the scalar multiple of the identity matrix that approximates the inverse Hessian is obtained by imposing the structured quasi-Newton condition. Moreover, we propose a simple strategy for choosing the structured scalar when the search direction has negative curvature. Using a nonmonotone line search with a quadratic-interpolation backtracking technique, we prove that the proposed methods are globally convergent under suitable conditions. Numerical experiments show that the methods are competitive with some recently developed methods.
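The abstract describes the overall scheme: a spectral (two-point stepsize) gradient iteration for the least-squares objective, a safeguard for negative curvature, and a nonmonotone backtracking line search. The sketch below illustrates that structure under stated assumptions: since the paper's structured scalar formula is not given in the abstract, the classical Barzilai–Borwein scalar stands in for it, a simple clamp to a large value stands in for the paper's negative-curvature strategy, and plain step-halving replaces quadratic-interpolation backtracking. The function names and parameters are illustrative only, not the authors' implementation.

```python
import numpy as np

def spectral_gradient_nls(r, J, x0, max_iter=500, tol=1e-8,
                          M=10, gamma=1e-4, lam_min=1e-10, lam_max=1e10):
    """Spectral gradient sketch for min f(x) = 0.5 * ||r(x)||^2.

    r : residual function, J : Jacobian of r.
    The classical BB scalar replaces the paper's structured scalar,
    and simple halving replaces quadratic-interpolation backtracking.
    """
    x = np.asarray(x0, dtype=float)
    g = J(x).T @ r(x)            # gradient of f(x) = 0.5 * ||r(x)||^2
    lam = 1.0                    # spectral scalar approximating the inverse Hessian
    f_hist = [0.5 * r(x) @ r(x)]
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -lam * g             # scalar-times-identity quasi-Newton direction
        # Nonmonotone (Grippo-Lampariello-Lucidi type) backtracking:
        # accept sufficient decrease relative to the max of the last M values.
        f_ref = max(f_hist[-M:])
        alpha = 1.0
        while True:
            x_new = x + alpha * d
            f_new = 0.5 * r(x_new) @ r(x_new)
            if f_new <= f_ref + gamma * alpha * (g @ d) or alpha < 1e-12:
                break
            alpha *= 0.5         # stand-in for quadratic interpolation
        g_new = J(x_new).T @ r(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB scalar; on negative curvature (sy <= 0) fall back to the safeguard.
        lam = (s @ s) / sy if sy > 0 else lam_max
        lam = min(max(lam, lam_min), lam_max)
        x, g = x_new, g_new
        f_hist.append(f_new)
    return x
```

On a small zero-residual linear test problem this sketch recovers the least-squares solution; the structured scalars proposed in the paper would replace the BB update while the surrounding loop stays the same.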

Keywords

Nonlinear least squares problems · Spectral gradient method · Nonmonotone line search · Global convergence

Mathematics Subject Classification

49M37 · 65K05 · 90C56

Notes

Acknowledgements

We thank Professor Sandra Augusta Santos of the University of Campinas (UNICAMP), SP, Brazil, for providing useful comments and suggestions that helped us improve an earlier version of this manuscript. We are also grateful for the suggestions of the anonymous reviewer and the Editor-in-Chief, which led us to improve the presentation of the paper. We are indebted to Hasan Shahid, a graduate student at Brown University, USA, for his careful reading and helpful comments. This research was conducted during a one-year stay of the first author as a sandwich Ph.D. student at the University of Campinas (UNICAMP), SP, Brazil, supported by the Tertiary Education Trust Fund (TETFund) Academic Staff Training and Development (AST&D) intervention.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Mathematical Sciences, Faculty of Physical Sciences, Bayero University, Kano, Nigeria
  2. Department of Applied Mathematics, Institute of Mathematics, Statistics and Scientific Computing, University of Campinas, Campinas, Brazil