Modified Fejér sequences and applications

Computational Optimization and Applications

Abstract

In this note, we propose and study the notion of modified Fejér sequences in a Hilbert space setting. This property has been used to prove the ergodic convergence of proximal incremental subgradient methods. Here we show that it in fact provides a unifying framework for deriving convergence rates on objective function values for several optimization algorithms. In particular, our results apply to the forward–backward splitting algorithm, the incremental subgradient proximal algorithm, and the Douglas–Rachford splitting method, recovering and generalizing known results.
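
As a quick orientation, the following sketch is ours and is not taken verbatim from the paper. A sequence \((x_n)\) in a Hilbert space \(\mathcal{H}\) is Fejér monotone with respect to a nonempty set \(S\) if

\[ \|x_{n+1}-z\| \le \|x_n-z\| \quad \text{for every } z \in S. \]

A "modified" inequality of the kind that couples the decrease of distances with the objective gap, and hence yields rates on function values, typically has the form (the indexing and constants below are illustrative, not the paper's exact definition)

\[ a_n\bigl(F(x_n)-F(z)\bigr) + \|x_{n+1}-z\|^2 \le \|x_n-z\|^2 + \varepsilon_n, \qquad a_n>0, \quad \sum_{n} \varepsilon_n < \infty. \]

Summing and telescoping then bounds \(\sum_{k\le n} a_k\bigl(F(x_k)-F(z)\bigr)\) by \(\|x_0-z\|^2 + \sum_k \varepsilon_k\), giving rates of order \(1/\sum_{k\le n} a_k\) for (ergodic averages of) the objective values. For reference, the forward–backward iteration named above, applied to minimizing \(f+g\) with \(f\) smooth, is \(x_{n+1} = \operatorname{prox}_{\gamma_n g}\bigl(x_n - \gamma_n \nabla f(x_n)\bigr)\).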

Author information

Correspondence to Silvia Villa.

Additional information

This material is based upon work supported by the Center for Brains, Minds and Machines (CBMM), funded by NSF STC Award CCF-1231216. The work by D. X. Zhou described in this paper is supported by a grant from the NSFC/RGC Joint Research Scheme [RGC Project No. N_CityU120/14 and NSFC Project No. 11461161006]. L. R. acknowledges the financial support of the Italian Ministry of Education, University and Research (FIRB Project RBFR12M3AC), and S. V. acknowledges the support of INDAM-GNAMPA, Project 2017: “Algoritmi di ottimizzazione ad equazioni di evoluzione ereditarie”.

Cite this article

Lin, J., Rosasco, L., Villa, S. et al. Modified Fejér sequences and applications. Comput Optim Appl 71, 95–113 (2018). https://doi.org/10.1007/s10589-017-9962-1
