Abstract
Analysis of subgradient methods has shown that, in general, improvements in the stepsize rules alone cannot significantly accelerate convergence if at each iteration the algorithm moves in the direction opposite to the subgradient. Indeed, slow convergence is due to the fact that the subgradient is often almost perpendicular to the direction toward the minimum. In such circumstances the reduction of the distance to the minimum is much smaller than the stepsize, and therefore the stepsizes cannot diminish too rapidly if convergence to a minimum is to be guaranteed.
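To make the near-perpendicularity concrete, here is a minimal sketch (not from the chapter) of the basic normalized subgradient method x_{k+1} = x_k - h_k g_k/||g_k|| on an ill-conditioned convex function; the test function f(x) = |x_1| + M|x_2| and the divergent-series stepsize h_k = 1/k are illustrative assumptions chosen so the subgradient forms a large angle with the direction toward the minimizer x* = 0.

```python
import numpy as np

# Illustrative (assumed) test problem: f(x) = |x1| + M*|x2|, minimizer x* = 0.
# Large M elongates the level sets, so the subgradient is nearly
# perpendicular to the direction toward the minimum.
M = 10.0

def f(x):
    return abs(x[0]) + M * abs(x[1])

def subgrad(x):
    # One valid subgradient of f at x (taking sign(0) = 0).
    return np.array([np.sign(x[0]), M * np.sign(x[1])])

x = np.array([1.0, 1.0])
for k in range(1, 30):
    g = subgrad(x)
    step_dir = -g / np.linalg.norm(g)          # normalized subgradient step
    to_min = -x / np.linalg.norm(x)            # unit direction toward x* = 0
    cos_angle = step_dir @ to_min              # near 0 => nearly perpendicular
    h = 1.0 / k                                # divergent-series stepsize h_k = 1/k
    x = x + h * step_dir

print("final point:", x)
print("cos(angle) between step and direction to minimum:", cos_angle)
```

With M = 10 the cosine stays on the order of 1/M along much of the trajectory, so each step reduces the distance to the minimum by only a small fraction of the stepsize; this is exactly why the stepsizes h_k must satisfy a slowly diverging sum (here sum of 1/k) to guarantee convergence.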