Damped Techniques for the Limited Memory BFGS Method for Large-Scale Optimization
This paper aims to extend a certain damped technique, suitable for the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, to the limited memory BFGS method for large-scale unconstrained optimization. It is shown that the proposed technique maintains the global convergence property of the limited memory BFGS method on uniformly convex functions. Some numerical results are described to illustrate the important role of the damped technique. Since this technique safely enforces the positive definiteness of the BFGS update for any value of the steplength, we also consider enforcing only the first Wolfe–Powell condition on the steplength. Then, as in the backtracking framework, only one gradient evaluation is performed on each iteration. It is reported that the proposed damped methods work much better than the limited memory BFGS method in several cases.
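To illustrate the kind of damping the abstract refers to, the sketch below implements a Powell-style damped update of the curvature pair, which replaces the gradient difference y with a convex combination of y and Bs so that the curvature condition sᵀy > 0 holds regardless of the steplength. This is a minimal sketch, not the paper's exact method: the function name `damped_pair`, the threshold sigma = 0.2, and the use of the identity as the reference matrix B in the demo are assumptions for illustration.

```python
import numpy as np

def damped_pair(s, y, B_dot_s, sigma=0.2):
    """Powell-style damping of the (s, y) curvature pair.

    s        : step x_{k+1} - x_k
    y        : gradient difference g_{k+1} - g_k
    B_dot_s  : product B_k s of the current Hessian approximation with s
    sigma    : damping threshold in (0, 1); 0.2 is Powell's classic choice

    Returns y_hat = theta * y + (1 - theta) * B_k s, chosen so that
    s^T y_hat >= sigma * s^T B_k s > 0, which keeps the BFGS update
    positive definite for any steplength.
    """
    sBs = s @ B_dot_s
    sy = s @ y
    if sy >= sigma * sBs:
        theta = 1.0                                  # curvature already safe
    else:
        theta = (1.0 - sigma) * sBs / (sBs - sy)     # damp toward B_k s
    return theta * y + (1.0 - theta) * B_dot_s

if __name__ == "__main__":
    s = np.array([1.0, 0.0])
    y = np.array([-1.0, 0.0])       # negative curvature: s^T y < 0
    y_hat = damped_pair(s, y, s)    # take B_k = I for the demo, so B_k s = s
    print(s @ y_hat)                # positive, approximately sigma * s^T B s
```

In a limited memory setting, each stored pair (s, y_hat) produced this way satisfies the curvature condition, so the two-loop recursion remains well defined even when the line search is relaxed to the first Wolfe–Powell condition alone.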
Keywords: Large-scale optimization · Limited memory BFGS method · Damped technique · Line search framework