Abstract
In this chapter we shall consider a number of minimization methods which require the evaluation of the derivatives of the function as well as the function values themselves. Much of the theory surrounding such methods is strictly applicable only to quadratic functions (see Section 3.3), but fortunately many objective functions can be well approximated by quadratics in the neighbourhood of the minimum. We begin with an account of the simplest, and possibly the oldest, procedure of this type: steepest descent.
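As a rough illustration of the idea (not the author's own presentation), steepest descent repeatedly moves in the direction of the negative gradient. The sketch below applies it to a quadratic f(x) = ½xᵀAx − bᵀx, where the gradient is Ax − b and an exact line search has the closed-form step gᵀg / gᵀAg; the matrix A and vector b are made-up example data.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-8, max_iter=1000):
    """Minimise f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A, using steepest descent with exact line search."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = A @ x - b                      # gradient of f at x
        if np.linalg.norm(g) < tol:        # stop when gradient ~ 0
            break
        alpha = (g @ g) / (g @ (A @ g))    # exact minimiser along -g
        x = x - alpha * g
    return x

# Example: the minimum of f satisfies A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = steepest_descent(A, b, np.zeros(2))
```

For a quadratic the iterates converge to the solution of Ax = b, though the convergence can be slow when A is ill-conditioned, which motivates the more sophisticated gradient methods treated later in the chapter.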
Copyright information
© 1987 B. S. Everitt
About this chapter
Cite this chapter
Everitt, B.S. (1987). Gradient methods. In: Introduction to Optimization Methods and their Application in Statistics. Springer, Dordrecht. https://doi.org/10.1007/978-94-009-3153-4_3
Print ISBN: 978-94-010-7917-4
Online ISBN: 978-94-009-3153-4