Abstract
In this chapter we analyze the convergence of a projected gradient algorithm with a smooth objective function in the presence of computational errors. We show that the algorithm generates a good approximate solution provided that the computational errors are bounded from above by a small positive constant. Moreover, for a known bound on the computational error, we determine what approximate solution can be obtained and how many iterations are needed to obtain it.
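The setting described in the abstract can be illustrated with a minimal sketch, not the chapter's exact scheme or constants: projected gradient descent for a smooth objective (here a least-squares function, chosen for illustration) over a convex set (the nonnegative orthant), where each gradient evaluation is perturbed by an artificial error vector of norm at most a constant delta, modeling bounded computational errors.

```python
import numpy as np

def projected_gradient(A, b, delta=0.0, steps=200, seed=0):
    """Minimize f(x) = 0.5*||Ax - b||^2 over x >= 0 by projected gradient
    descent, with each gradient perturbed by a vector of norm <= delta.
    The matrix A, the set, and delta are illustrative assumptions, not
    taken from the chapter."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    x = np.zeros(n)
    for _ in range(steps):
        grad = A.T @ (A @ x - b)
        # bounded computational error: a random vector scaled to norm delta
        e = rng.standard_normal(n)
        e *= delta / max(np.linalg.norm(e), 1e-12)
        # gradient step with error, then projection onto the nonnegative orthant
        x = np.maximum(x - (grad + e) / L, 0.0)
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 4.0])
x_exact = projected_gradient(A, b, delta=0.0)    # error-free iterates
x_noisy = projected_gradient(A, b, delta=1e-3)   # bounded-error iterates
```

With delta = 0 the iterates converge to the unconstrained minimizer (which here lies in the orthant), while with a small delta they settle in a neighborhood of it whose radius is controlled by delta, which is the qualitative behavior the chapter quantifies.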
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this chapter
Zaslavski, A.J. (2016). Gradient Algorithm with a Smooth Objective Function. In: Numerical Optimization with Computational Errors. Springer Optimization and Its Applications, vol 108. Springer, Cham. https://doi.org/10.1007/978-3-319-30921-7_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-30920-0
Online ISBN: 978-3-319-30921-7
eBook Packages: Mathematics and Statistics (R0)