
Part of the book series: Applied Optimization (APOP, volume 82)

Abstract

Low-storage quasi-Newton algorithms for large-scale nonlinear least-squares problems are considered, with "better" modified Hessian approximations defined implicitly in terms of a set of vector pairs. The modification technique replaces one vector of each pair, namely the difference in the gradients of the objective function, by a superior choice obtained in various ways. These replacement vectors introduce information about the true Hessian of the objective function by exploiting the Jacobian matrix of the residual vector of the problem. The proposed technique also relies on a new safeguarded scheme for enforcing the positive definiteness of the Hessian approximations. It is shown, in particular, that this technique enhances the quality of the limited-memory (L-)BFGS Hessian, preserves the simple form of the L-BFGS algorithm, and improves its performance substantially.
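To make the modification concrete, the following is a minimal sketch (in Python/NumPy) of one plausible instance of the idea, not the author's exact algorithm. For f(x) = (1/2)||r(x)||^2, the usual gradient difference y = g_{k+1} - g_k in each stored pair is replaced by a Gauss-Newton surrogate y_hat = J^T J s built from the Jacobian J of the residual, and a simple curvature test falls back to the standard pair when the surrogate curvature is too small. The function names, the particular surrogate, and the threshold tau are illustrative assumptions, as is the fallback rule standing in for the paper's safeguarded scheme.

```python
import numpy as np

def modified_pair(s, y, J_new, tau=0.2):
    """Build the (s, y_hat) pair stored in the L-BFGS memory (illustrative sketch).

    s     : step x_{k+1} - x_k
    y     : gradient difference g_{k+1} - g_k
    J_new : Jacobian of the residual r at x_{k+1}
    tau   : safeguard threshold (an assumption, not from the paper)
    """
    y_gn = J_new.T @ (J_new @ s)        # Gauss-Newton curvature along the step
    if s @ y_gn >= tau * abs(s @ y):    # accept the surrogate only if its curvature is safely positive
        return s, y_gn
    return s, y                         # otherwise fall back to the standard secant pair

def lbfgs_direction(g, pairs):
    """Two-loop recursion: returns -H_k g using the stored (s, y) pairs."""
    q = -np.asarray(g, dtype=float).copy()
    alphas = []
    for s, y in reversed(pairs):        # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append((rho, a))
        q -= a * y
    if pairs:                           # standard initial scaling H_0 = gamma * I
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), (rho, a) in zip(pairs, reversed(alphas)):   # oldest pair first
        b = rho * (y @ q)
        q += (a - b) * s
    return q
```

With a Wolfe line search the fallback pair satisfies s^T y > 0, so every stored pair keeps the implicitly defined Hessian approximation positive definite; the list `pairs` would hold only the m most recent pairs, as in the usual L-BFGS memory.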





Copyright information

© 2003 Kluwer Academic Publishers B.V.

About this chapter

Cite this chapter

Al-Baali, M. (2003). Quasi-Newton Algorithms for Large-Scale Nonlinear Least-Squares. In: Di Pillo, G., Murli, A. (eds) High Performance Algorithms and Software for Nonlinear Optimization. Applied Optimization, vol 82. Springer, Boston, MA. https://doi.org/10.1007/978-1-4613-0241-4_1


  • DOI: https://doi.org/10.1007/978-1-4613-0241-4_1

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4613-7956-0

  • Online ISBN: 978-1-4613-0241-4

