Abstract
In this chapter we analyze the convergence of the mirror descent algorithm in the presence of computational errors. We show that the algorithm generates a good approximate solution if the computational errors are bounded from above by a small positive constant. Moreover, for a known level of computational error, we determine what kind of approximate solution can be obtained and how many iterations are needed to obtain it.
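As a concrete illustration of the setting the abstract describes, the following is a minimal sketch of entropic mirror descent on the probability simplex, where each subgradient evaluation is perturbed by a vector of norm at most a fixed constant to model a bounded computational error. The function name, the step-size schedule, and the error model here are illustrative assumptions, not the chapter's own construction or notation.

```python
import numpy as np

def mirror_descent_simplex(subgrad, x0, step_size, n_iters, error=0.0, seed=0):
    """Entropic mirror descent on the probability simplex.

    Each subgradient is perturbed by a random vector of Euclidean
    norm `error`, modeling a bounded computational error.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(n_iters):
        g = subgrad(x)
        if error > 0:
            noise = rng.standard_normal(x.size)
            g = g + error * noise / np.linalg.norm(noise)
        # Multiplicative update = mirror step with the entropy Bregman distance
        w = x * np.exp(-step_size(t) * g)
        x = w / w.sum()
    return x

# Hypothetical example: minimize the linear function f(x) = <c, x> over the
# simplex; the minimizer is the vertex with the smallest coefficient of c.
c = np.array([0.3, 0.1, 0.5])
x_final = mirror_descent_simplex(lambda x: c, np.ones(3) / 3,
                                 step_size=lambda t: 1.0 / np.sqrt(t + 1),
                                 n_iters=500, error=0.01)
```

With the error bound set to a small constant (here 0.01), the iterates still concentrate mass on the optimal vertex, matching the qualitative claim of the abstract that small bounded errors yield a good approximate solution.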
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this chapter
Zaslavski, A.J. (2016). The Mirror Descent Algorithm. In: Numerical Optimization with Computational Errors. Springer Optimization and Its Applications, vol 108. Springer, Cham. https://doi.org/10.1007/978-3-319-30921-7_3
Print ISBN: 978-3-319-30920-0
Online ISBN: 978-3-319-30921-7