Abstract
The steepest descent method is one of the oldest and best-known search techniques for minimizing unconstrained multivariable optimization problems, and it has played an important role in the development of more advanced optimization algorithms. It is a first-order iterative algorithm, using only gradient information, whose convergence is linear on quadratic functions. Taking steps in the direction of the negative gradient of the function at the current point in order to find a local minimum is the procedure known as gradient descent.
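The iteration described in the abstract can be sketched briefly. The following is a minimal illustration, not the book's own code (the book works in R; this sketch is in Python with NumPy), applied to a convex quadratic f(x) = ½xᵀAx − bᵀx, for which the gradient is Ax − b and the exact line-search step length has the closed form α = gᵀg / gᵀAg. The function name and test matrix are hypothetical choices for the example.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x by steepest descent
    with exact line search (A symmetric positive definite)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = A @ x - b                    # gradient at the current point
        if np.linalg.norm(g) < tol:      # stop when the gradient vanishes
            break
        alpha = (g @ g) / (g @ (A @ g))  # exact step length for a quadratic
        x = x - alpha * g                # step along the negative gradient
    return x

# Hypothetical test problem: a 2x2 symmetric positive definite system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = steepest_descent(A, b, np.zeros(2))
```

On a quadratic the minimizer satisfies Ax = b, so the iterate can be checked against a direct solve; the linear convergence rate mentioned in the abstract depends on the condition number of A.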
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
Cite this chapter
Mishra, S.K., Ram, B. (2019). Steepest Descent Method. In: Introduction to Unconstrained Optimization with R. Springer, Singapore. https://doi.org/10.1007/978-981-15-0894-3_6
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-0893-6
Online ISBN: 978-981-15-0894-3
eBook Packages: Mathematics and Statistics (R0)