Steepest Descent Method
The steepest descent method is one of the oldest and best-known search techniques for minimizing unconstrained multivariable optimization problems, and it has played an important role in the development of more advanced optimization algorithms. It is a first-order iterative algorithm that uses only gradient information, and its convergence is linear for quadratic functions. The procedure of repeatedly stepping in the direction of the negative gradient of the function at the current point, so as to approach a local minimum, is also called gradient descent.
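The iteration described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation; the function names, the fixed step size, and the example quadratic are all chosen here for demonstration.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a function by repeatedly stepping along its negative gradient.

    grad     -- callable returning the gradient at a point
    x0       -- starting point
    step     -- fixed step size (illustrative; in practice often chosen by line search)
    tol      -- stop when the gradient norm falls below this threshold
    max_iter -- safety cap on the number of iterations
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g  # move against the gradient
    return x

# Example quadratic: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimized at (1, -3)
grad_f = lambda v: np.array([2.0 * (v[0] - 1.0), 4.0 * (v[1] + 3.0)])
x_min = steepest_descent(grad_f, [0.0, 0.0])
```

On this quadratic the iterates contract toward the minimizer by a constant factor per step, which is the linear convergence rate mentioned above.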