Generating Conjugate Directions Using Limited Second Derivatives
Gradients of objective functions in very many variables can be evaluated cheaply by the reverse, or adjoint, mode of automatic differentiation. Arbitrary Hessian-vector products can be evaluated simultaneously with only a constant increase in complexity. These limited second derivatives can be used within a truncated Newton code or to improve nonlinear conjugate gradient codes in three respects: stepsize prediction, search-direction conjugacy, and restart criteria. We report preliminary results with an experimental conjugate gradient implementation.
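To make the mechanism concrete, the following is a minimal sketch, not the authors' implementation, of how reverse-mode differentiation supplies the gradient and a forward-over-reverse sweep supplies Hessian-vector products at constant relative cost, and how the products g.d and d.Hd drive stepsize prediction, a Polak-Ribière direction update, and an illustrative conjugacy-based restart test. The objective f, the restart threshold, and all identifiers are assumptions for illustration; JAX is used here purely for brevity.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Illustrative smooth, strictly convex objective in many variables.
    return jnp.sum(jnp.log(jnp.cosh(x))) + 0.5 * jnp.dot(x, x)

grad_f = jax.grad(f)  # reverse (adjoint) mode: gradient at a small constant multiple of the cost of f

def hvp(x, v):
    # Forward-over-reverse sweep: H(x) @ v without ever forming H,
    # again at only a constant-factor increase in complexity.
    return jax.jvp(grad_f, (x,), (v,))[1]

def pr_step(x, g, d):
    # Stepsize prediction: the minimizer of the local quadratic model
    # along d is alpha = -(g . d) / (d . H d); in a practical code this
    # predicted step would seed a safeguarded line search.
    Hd = hvp(x, d)
    alpha = -jnp.dot(g, d) / jnp.dot(d, Hd)
    x_new = x + alpha * d
    g_new = grad_f(x_new)
    # Polak-Ribiere coefficient for the next search direction.
    beta = jnp.dot(g_new, g_new - g) / jnp.dot(g, g)
    d_new = -g_new + beta * d
    # Restart criterion (illustrative, hypothetical threshold): if the new
    # direction has lost conjugacy with the old one, fall back to steepest descent.
    if jnp.abs(jnp.dot(d_new, Hd)) > 0.5 * jnp.linalg.norm(d_new) * jnp.linalg.norm(Hd):
        d_new = -g_new
    return x_new, g_new, d_new

x = jnp.full(1000, 3.0)
g = grad_f(x)
d = -g
for _ in range(20):
    x, g, d = pr_step(x, g, d)
print("final gradient norm:", float(jnp.linalg.norm(g)))
```

The key design point the abstract exploits is that hvp never materializes the Hessian: a single extra directional sweep through the differentiated code yields H(x) @ v, so exact curvature information along the search direction is available at roughly the cost of one additional gradient evaluation.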