Gradient Algorithm with a Smooth Objective Function

  • Chapter
Convex Optimization with Computational Errors

Part of the book series: Springer Optimization and Its Applications ((SOIA,volume 155))

Abstract

In this chapter we analyze the convergence of a projected gradient algorithm with a smooth objective function in the presence of computational errors. The problem is specified by an objective function and a set of feasible points. Each iteration of the algorithm consists of two steps: the first computes a gradient of the objective function, and the second computes a projection onto the feasible set. Each of these steps is subject to a computational error, and in general the two errors are different.
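The two-step iteration described above can be sketched as follows. This is a minimal illustration, not the book's exact scheme: the step size, the error model (perturbations drawn uniformly in norm-bounded balls), and the function names `projected_gradient`, `delta_g`, and `delta_p` are all assumptions made for the example.

```python
import numpy as np

def projected_gradient(grad, project, x0, step, n_iters,
                       delta_g=0.0, delta_p=0.0, rng=None):
    """Projected gradient method with simulated computational errors.

    Each iteration performs the two steps from the text:
      1) evaluate the gradient (with an error of norm at most delta_g),
      2) project onto the feasible set (with an error of norm at most delta_p).
    grad: callable returning the gradient at x.
    project: callable returning the projection onto the feasible set.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad(x)
        if delta_g > 0:  # perturb the computed gradient inside a delta_g-ball
            e = rng.normal(size=x.shape)
            g = g + delta_g * e / np.linalg.norm(e)
        y = project(x - step * g)
        if delta_p > 0:  # perturb the computed projection inside a delta_p-ball
            e = rng.normal(size=x.shape)
            y = y + delta_p * e / np.linalg.norm(e)
        x = y
    return x

# Example: minimize ||x - c||^2 over the box [0, 1]^2 (solution: clip c to the box).
c = np.array([2.0, -1.0])
x = projected_gradient(grad=lambda x: 2 * (x - c),
                       project=lambda z: np.clip(z, 0.0, 1.0),
                       x0=np.zeros(2), step=0.4, n_iters=100)
# With zero errors the iterates reach the exact minimizer [1, 0];
# with delta_g, delta_p > 0 they only reach a neighborhood whose size
# depends on the error bounds.
```

With both error parameters set to zero this reduces to the classical projected gradient method; the point of the chapter's analysis is to quantify how close to the optimum the iterates get when the errors are nonzero.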




Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Zaslavski, A.J. (2020). Gradient Algorithm with a Smooth Objective Function. In: Convex Optimization with Computational Errors. Springer Optimization and Its Applications, vol 155. Springer, Cham. https://doi.org/10.1007/978-3-030-37822-6_4
