Abstract
Over the last 40 years many powerful direct search algorithms have been developed for the unconstrained minimization of general functions. These algorithms require an initial estimate of the optimum point, denoted by \(\mathbf{x}^0\). With this estimate as the starting point, the algorithm generates a sequence of estimates \(\mathbf{x}^0\), \(\mathbf{x}^1\), \(\mathbf{x}^2\), \(\dots\), by successively searching directly from each point in a direction of descent to determine the next point. The process is terminated either if no further progress is made, or if a point \(\mathbf{x}^k\) is reached (for smooth functions) at which the first necessary condition for a minimum, i.e. \(\nabla f(\mathbf{x}) = \mathbf{0}\), is sufficiently accurately satisfied, in which case \(\mathbf{x}^* \cong \mathbf{x}^k\). It is usually, although not always, required that the function value at the new iterate \(\mathbf{x}^{k+1}\) be lower than that at \(\mathbf{x}^k\).
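As a rough illustration of this generic scheme, the sketch below implements one common instance: steepest descent with a backtracking (Armijo) line search, terminating once \(\|\nabla f(\mathbf{x}^k)\|\) falls below a tolerance. The descent direction, step-length rule, and test function here are illustrative assumptions, not the specific methods prescribed in the chapter.

```python
import numpy as np

def line_search_descent(f, grad_f, x0, tol=1e-6, max_iter=1000):
    """From x^0, generate x^1, x^2, ... by stepping along a descent
    direction until the first necessary condition ||grad f(x^k)|| <= tol
    is sufficiently accurately satisfied."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= tol:       # first necessary condition met
            return x, k
        d = -g                             # steepest descent direction (illustrative choice)
        # Backtracking (Armijo) line search: shrink the step until
        # sufficient decrease is achieved, so f(x^{k+1}) < f(x^k).
        lam, c, rho = 1.0, 1e-4, 0.5
        while f(x + lam * d) > f(x) + c * lam * g.dot(d):
            lam *= rho
        x = x + lam * d
    return x, max_iter

# Usage on a simple convex quadratic f(x) = x1^2 + 10 x2^2 (hypothetical test case)
f = lambda x: x[0]**2 + 10.0 * x[1]**2
grad_f = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
x_star, iters = line_search_descent(f, grad_f, x0=[3.0, -2.0])
print(x_star, iters)   # x_star is approximately [0, 0]
```

The Armijo condition in the inner loop enforces the usual requirement noted above, namely that the function value at each new iterate be lower than at the previous one.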
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this chapter
Cite this chapter
Snyman, J.A., Wilke, D.N. (2018). LINE SEARCH DESCENT METHODS FOR UNCONSTRAINED MINIMIZATION. In: Practical Mathematical Optimization. Springer Optimization and Its Applications, vol 133. Springer, Cham. https://doi.org/10.1007/978-3-319-77586-9_2
DOI: https://doi.org/10.1007/978-3-319-77586-9_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-77585-2
Online ISBN: 978-3-319-77586-9
eBook Packages: Mathematics and Statistics (R0)