Conjugate Gradient and Quasi-Newton
Our discussion of Newton’s method has highlighted both its strengths and its weaknesses. Related algorithms such as scoring and Gauss-Newton exploit special features of the objective function f(x) to overcome the defects of Newton’s method. We now consider algorithms that apply to generic functions f(x). These algorithms also operate by locally approximating f(x) by a strictly convex quadratic function. Indeed, the guiding philosophy behind many modern optimization algorithms is to see which techniques work well on quadratic functions and then to adapt the best of these techniques to generic functions.
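To make this philosophy concrete, here is a minimal sketch of the linear conjugate gradient method applied to a strictly convex quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite, for which minimization is equivalent to solving Ax = b. The function name, tolerance, and test matrix below are illustrative choices, not taken from the text.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
    """Minimize f(x) = 0.5 x^T A x - b^T x (i.e., solve A x = b)
    for symmetric positive definite A by conjugate gradients."""
    x = x0.astype(float)
    r = b - A @ x               # residual = negative gradient of f at x
    p = r.copy()                # first search direction: steepest descent
    rs_old = r @ r
    for _ in range(max_iter or len(b)):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        # New direction is the residual plus a multiple of the old
        # direction, chosen so successive directions are A-conjugate.
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Illustrative 2x2 positive definite example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
```

In exact arithmetic the method terminates in at most n steps for an n-dimensional quadratic; applied to a generic f(x), the same conjugacy and line-search ideas reappear in nonlinear conjugate gradient and quasi-Newton schemes.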
Keywords: Conjugate Gradient; Line Search; Conjugate Gradient Method; Trust Region; Positive Definite Matrix