Abstract
In this note, we propose and study the notion of modified Fejér sequences. Within a Hilbert space setting, this property has been used to prove ergodic convergence of proximal incremental subgradient methods. Here we show that it in fact provides a unifying framework for deriving convergence rates for the objective function values of several optimization algorithms. In particular, our results apply to the forward–backward splitting algorithm, the incremental subgradient proximal algorithm, and the Douglas–Rachford splitting method, both recovering and generalizing known results.
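To fix ideas, the forward–backward splitting algorithm mentioned above alternates a gradient (forward) step on a smooth term with a proximal (backward) step on a nonsmooth term. The following is a minimal sketch on a toy lasso problem; the matrix `A`, vector `b`, regularization weight `lam`, and step size are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def forward_backward(A, b, lam, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by forward-backward splitting."""
    t = 1.0 / np.linalg.norm(A, 2) ** 2        # step size 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)               # forward (gradient) step on the smooth term
        x = soft_threshold(x - t * grad, t * lam)  # backward (proximal) step on the l1 term
    return x

# Toy data: a sparse ground truth observed through a random design with noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)
x_hat = forward_backward(A, b, lam=0.1)
```

With a step size below the inverse Lipschitz constant of the gradient, each iteration decreases the composite objective, which is the setting in which the rates discussed in the paper apply.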
Additional information
This material is based upon work supported by the Center for Brains, Minds and Machines (CBMM), funded by NSF STC Award CCF-1231216. The work by D. X. Zhou described in this paper is supported by a Grant from the NSFC/RGC Joint Research Scheme [RGC Project No. N_CityU120/14 and NSFC Project No. 11461161006]. L. R. acknowledges the financial support of the Italian Ministry of Education, University and Research FIRB Project RBFR12M3AC, and S. V. the support of INDAM-GNAMPA, Project 2017: “Algoritmi di ottimizzazione ad equazioni di evoluzione ereditarie”.
Lin, J., Rosasco, L., Villa, S. et al. Modified Fejér sequences and applications. Comput Optim Appl 71, 95–113 (2018). https://doi.org/10.1007/s10589-017-9962-1