Subgradient-Type Methods with Space Dilation

  • Naum Z. Shor
Part of the Nonconvex Optimization and Its Applications book series (NOIA, volume 24)


The analysis of subgradient methods has shown that improvements in the stepsize rules alone cannot, in general, significantly accelerate convergence if at each iteration the algorithm moves in the direction opposite to the subgradient. Indeed, slow convergence is due to the fact that the subgradient is often almost perpendicular to the direction towards the minimum. In such circumstances the reduction of the distance to the minimum is much smaller than the stepsize, and therefore the stepsizes cannot diminish too rapidly if we want to guarantee convergence to a minimum.
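A hedged illustration of this effect (not taken from the book): plain subgradient descent on the ill-conditioned convex function f(x, y) = |x| + 10|y|, whose minimizer is the origin. Near the "ravine" y = 0 the subgradient points almost perpendicular to the direction towards the minimizer, so the distance to the minimum shrinks far more slowly than the stepsize. The function, the stepsize rule h_k = 1/k, and the iteration count are illustrative choices, not the author's.

```python
import math

def f(x, y):
    # Ill-conditioned convex test function with minimizer (0, 0).
    return abs(x) + 10.0 * abs(y)

def sgn(t):
    return (t > 0) - (t < 0)

def subgradient(x, y):
    # One valid subgradient of f at (x, y); sgn(0) is taken as 0.
    return (sgn(x), 10.0 * sgn(y))

def subgradient_descent(x, y, iters=1000):
    for k in range(1, iters + 1):
        gx, gy = subgradient(x, y)
        norm = math.hypot(gx, gy)
        if norm == 0:
            break  # reached the minimizer exactly
        h = 1.0 / k           # classical divergent-series stepsize h_k = 1/k
        x -= h * gx / norm    # step opposite to the normalized subgradient
        y -= h * gy / norm
    return x, y

x, y = subgradient_descent(5.0, 1.0)
print(f(x, y))  # still far from the optimal value 0 after 1000 iterations
```

The x-coordinate decreases only by roughly h_k / sqrt(101) per step while y oscillates across the ravine, so after 1000 iterations the objective has dropped from 15 to only about 4, which is the slow behaviour that motivates space-dilation methods.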





Copyright information

© Springer Science+Business Media Dordrecht 1998

Authors and Affiliations

  • Naum Z. Shor
    1. V.M. Glushkov Institute of Cybernetics, Ukrainian National Academy of Sciences, Kiev, Ukraine
