Abstract
In this chapter we analyze the convergence of a projected gradient algorithm with a smooth objective function in the presence of computational errors. The problem is specified by an objective function and a set of feasible points. Each iteration of the algorithm consists of two steps: the first computes a gradient of the objective function, and the second computes a projection onto the feasible set. Each of these two steps is subject to a computational error, and in general the two errors are different.
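The two-step iteration described above can be sketched in code. The following is a minimal illustration, not the chapter's analysis: the gradient-evaluation error and the projection error are modeled here as bounded random perturbations (the bounds `delta_g` and `delta_p`, the test function, and the box feasible set are all illustrative assumptions).

```python
import numpy as np

def noisy_projected_gradient(grad, project, x0, step, n_iter,
                             delta_g=0.0, delta_p=0.0, rng=None):
    """Projected gradient method where each iteration admits two errors:
    delta_g bounds the error in the gradient evaluation (step 1) and
    delta_p bounds the error in the projection (step 2).
    Both errors are modeled as bounded random perturbations."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        # Step 1: gradient of the objective, computed with error <= delta_g.
        g = grad(x) + delta_g * rng.uniform(-1.0, 1.0, size=x.shape)
        # Step 2: projection onto the feasible set, with error <= delta_p.
        x = project(x - step * g) + delta_p * rng.uniform(-1.0, 1.0, size=x.shape)
    return x

# Illustrative problem: minimize f(x) = ||x - c||^2 over the box [0, 1]^2.
# The exact minimizer is the projection of c onto the box, i.e. (1, 0).
c = np.array([2.0, -1.0])
grad = lambda x: 2.0 * (x - c)
project = lambda x: np.clip(x, 0.0, 1.0)
x_star = noisy_projected_gradient(grad, project, np.zeros(2),
                                  step=0.1, n_iter=200,
                                  delta_g=1e-3, delta_p=1e-3)
```

With small error bounds the iterates settle into a neighborhood of the true minimizer whose radius is controlled by the errors, which is the qualitative behavior the chapter quantifies.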
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this chapter
Zaslavski, A.J. (2020). Gradient Algorithm with a Smooth Objective Function. In: Convex Optimization with Computational Errors. Springer Optimization and Its Applications, vol 155. Springer, Cham. https://doi.org/10.1007/978-3-030-37822-6_4
Print ISBN: 978-3-030-37821-9
Online ISBN: 978-3-030-37822-6
eBook Packages: Mathematics and Statistics (R0)