A stochastic steepest-descent algorithm

Abstract

A stochastic steepest-descent algorithm for function minimization under noisy observations is presented. Function evaluation is done by performing a number of random experiments on a suitable probability space. The number of experiments performed at a point generated by the algorithm reflects a balance between the conflicting requirements of accuracy and computational complexity. The algorithm uses an adaptive precision scheme to determine the number of random experiments at a point; this number tends to increase whenever a stationary point is approached and to decrease otherwise. Two rules determine the number of random experiments at a point: one, in the inner loop of the algorithm, uses the magnitude of the observed gradient of the function to be minimized; the other, in the outer loop, uses a measure of accumulated errors in function evaluations at past points generated by the algorithm. Once a stochastic approximation of the function to be minimized is obtained at a point, the algorithm generates the next point by using the deterministic steepest-descent methods of Armijo and Polak (Refs. 3 and 4). Convergence of the algorithm to stationary points is demonstrated under suitable assumptions.
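
The abstract gives only an outline of the algorithm; the precise inner- and outer-loop rules are developed in the body of the paper. The Python sketch below is therefore a rough illustration under stated assumptions, not the paper's algorithm: the sample-size doubling rule, the noise heuristic noise_scale/sqrt(n), and every parameter name (n0, n_max, grad_tol, noise_scale) are hypothetical choices introduced here, and the outer-loop rule based on accumulated function-evaluation errors is omitted. What the sketch preserves is the general shape described above: function and gradient values are estimated by averaging repeated random experiments, the number of experiments grows when the observed gradient is small (i.e., near an apparent stationary point), and each iterate moves along the negative observed gradient with an Armijo-type stepsize (Refs. 3 and 4).

    import numpy as np

    def armijo_stepsize(f_est, x, g, s=1.0, alpha=0.5, beta=0.5, max_halvings=30):
        # Armijo backtracking rule (Ref. 3) along the steepest-descent
        # direction d = -g, applied to the current noisy function estimator.
        fx = f_est(x)
        t = s
        for _ in range(max_halvings):
            if f_est(x - t * g) <= fx - alpha * t * g.dot(g):
                return t
            t *= beta
        return t

    def stochastic_steepest_descent(sample_f, sample_grad, x0,
                                    n0=10, n_max=100_000,
                                    grad_tol=1e-3, noise_scale=1.0,
                                    max_iters=200):
        # sample_f(x, n)    : average of n noisy observations of f at x
        # sample_grad(x, n) : average of n noisy observations of grad f at x
        # The sample size n is doubled (a stand-in for the paper's inner-loop
        # rule) whenever the observed gradient is small relative to the
        # estimated sampling noise ~ noise_scale / sqrt(n), so precision
        # increases near apparent stationary points and stays low elsewhere.
        x = np.asarray(x0, dtype=float)
        n = n0
        for _ in range(max_iters):
            g = sample_grad(x, n)
            while np.linalg.norm(g) < noise_scale / np.sqrt(n) and n < n_max:
                n = min(2 * n, n_max)
                g = sample_grad(x, n)          # re-estimate at higher precision
            if np.linalg.norm(g) < grad_tol and n >= n_max:
                return x                       # approximately stationary
            f_est = lambda y, n=n: sample_f(y, n)   # freeze current precision
            t = armijo_stepsize(f_est, x, g)
            x = x - t * g                      # steepest-descent step
        return x

    # Hypothetical usage: a noisy quadratic whose n-sample average is simulated
    # by shrinking the noise like 1/sqrt(n).
    rng = np.random.default_rng(0)
    sample_f = lambda x, n: float(np.sum((x - 1.0) ** 2) + rng.normal(scale=0.1) / np.sqrt(n))
    sample_grad = lambda x, n: 2.0 * (x - 1.0) + rng.normal(scale=0.1, size=x.shape) / np.sqrt(n)
    print(stochastic_steepest_descent(sample_f, sample_grad, np.array([5.0])))  # ~ [1.0]

In the paper itself, the outer-loop rule based on accumulated errors in past function evaluations replaces the fixed thresholds used above; the sketch keeps only the inner-loop idea of tying precision to the magnitude of the observed gradient.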

References

1. Klessig, R., and Polak, E., An Adaptive Precision Gradient Method for Optimal Control, SIAM Journal on Control and Optimization, Vol. 11, pp. 80–93, 1973.
2. Wardi, Y., Adaptive-Precision Steepest-Descent Optimization Algorithms (to appear).
3. Armijo, L., Minimization of Functions Having Lipschitz Continuous First Partial Derivatives, Pacific Journal of Mathematics, Vol. 16, pp. 1–3, 1966.
4. Polak, E., Computational Methods in Optimization: A Unified Approach, Academic Press, New York, New York, 1971.
5. Kushner, H. J., Stochastic Approximation Algorithms for the Local Optimization of Functions with Nonunique Stationary Points, IEEE Transactions on Automatic Control, Vol. AC-17, pp. 646–654, 1972.
6. Kushner, H. J., and Gavin, T., Stochastic Approximation Type Methods for Constrained Systems: Algorithms and Numerical Results, IEEE Transactions on Automatic Control, Vol. AC-19, pp. 349–357, 1974.
7. Kushner, H. J., Stochastic Approximation Algorithms for Constrained Optimization Problems, Annals of Statistics, Vol. 2, pp. 713–723, 1974.
8. Kiefer, J., and Wolfowitz, J., Stochastic Estimation of the Maximum of a Regression Function, Annals of Mathematical Statistics, Vol. 23, pp. 462–466, 1952.

Communicated by D. Q. Mayne

Cite this article

Wardi, Y. A stochastic steepest-descent algorithm. J Optim Theory Appl 59, 307–323 (1988). https://doi.org/10.1007/BF00938315

Key Words

  • steepest-descent algorithm
  • Armijo stepsize
  • adaptive precision schemes
  • stochastic algorithms