In Chap. 6, multidimensional optimization methods were considered in which the search for the minimizer is carried out along a set of conjugate directions. An important feature of some of these methods (e.g., the Fletcher–Reeves and Powell methods) is that explicit expressions for the second derivatives of f(x) are not required. Another class of methods that do not require explicit expressions for the second derivatives is the class of quasi-Newton methods, which are sometimes referred to as variable metric methods.
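To illustrate the idea, the following sketch implements the BFGS method, a representative quasi-Newton algorithm: the inverse Hessian is never computed explicitly but is instead approximated from successive gradient differences, so only function and gradient evaluations are required. This is a minimal illustration (with a simple backtracking line search and the Rosenbrock function as a test problem), not the specific algorithmic variants developed in this chapter.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimize f by the BFGS quasi-Newton method.

    Only gradient evaluations are needed; an approximation H of the
    inverse Hessian is built up from successive steps s and gradient
    changes y, so no second derivatives are ever computed.
    """
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                     # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                    # quasi-Newton descent direction
        # Backtracking line search enforcing the Armijo condition
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        s = alpha * d                 # step taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                 # change in gradient
        sy = s @ y
        if sy > 1e-12:                # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            # BFGS rank-two update of the inverse-Hessian approximation
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Test problem: the Rosenbrock function, whose minimizer is (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = bfgs(f, grad, np.array([-1.2, 1.0]))
```

Note that H remains symmetric and positive definite as long as the curvature condition sᵀy > 0 holds, which guarantees that d = −Hg is a descent direction.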
Keywords: Line Search · Descent Direction · Gradient Evaluation · BFGS Method · Real Symmetric Matrix