
Random Minibatch Subgradient Algorithms for Convex Problems with Functional Constraints

  • Angelia Nedić
  • Ion Necoara

Abstract

In this paper we consider non-smooth convex optimization problems whose feasible set is a (possibly infinite) intersection of constraint sets. In contrast to the classical approach, where the constraints are usually represented as an intersection of simple sets that are easy to project onto, here each constraint set is given as the level set of a convex, but not necessarily differentiable, function. For this setting we propose iterative subgradient algorithms with random minibatch feasibility updates. At each iteration, our algorithms first take a subgradient step aimed at minimizing only the objective function and then a subsequent subgradient step minimizing the feasibility violation of the observed minibatch of constraints. The feasibility updates are performed based on either parallel or sequential random observations of several constraint components. We analyze the convergence behavior of the proposed algorithms for the case when the objective function is strongly convex with bounded subgradients, while the functional constraints are endowed with a bounded first-order black-box oracle. For a diminishing stepsize, we prove sublinear convergence rates for the expected distances of the weighted averages of the iterates from the constraint set, as well as for the expected suboptimality of the function values along the weighted averages. These convergence rates are known to be optimal for subgradient methods on this class of problems. Moreover, the rates depend explicitly on the minibatch size and show when minibatching helps a subgradient scheme with random feasibility updates.
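
Purely as an illustration of the scheme summarized above, the sketch below combines a subgradient step on the objective with a Polyak-type subgradient step that reduces the violation of a randomly sampled minibatch of constraints, followed by stepsize-weighted averaging of the iterates. The function names (f_subgrad, g_list, g_subgrad_list), the stepsize constant c, and the toy problem at the end are assumptions made for this example only; they do not reproduce the paper's exact updates or its sequential variant.

    import numpy as np

    def minibatch_subgradient(f_subgrad, g_list, g_subgrad_list, x0,
                              n_iters=2000, batch_size=5, c=1.0, rng=None):
        """Illustrative minibatch subgradient sketch (hypothetical interface).

        f_subgrad(x)         -> a subgradient of the objective f at x
        g_list[j](x)         -> value of constraint function g_j (feasible iff <= 0)
        g_subgrad_list[j](x) -> a subgradient of g_j at x
        """
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float)
        x_avg, w_sum = np.zeros_like(x), 0.0
        m = len(g_list)
        for k in range(1, n_iters + 1):
            alpha = c / k                     # diminishing stepsize (strongly convex case)
            # 1) objective step: move along a negative subgradient of f
            x = x - alpha * np.asarray(f_subgrad(x), dtype=float)
            # 2) feasibility step on a random minibatch of constraints (parallel variant):
            #    a Polyak-type step reducing the violation max(g_j(x), 0) of each sampled g_j
            batch = rng.choice(m, size=min(batch_size, m), replace=False)
            step = np.zeros_like(x)
            for j in batch:
                viol = max(g_list[j](x), 0.0)
                if viol > 0.0:
                    d = np.asarray(g_subgrad_list[j](x), dtype=float)
                    step += (viol / max(np.dot(d, d), 1e-12)) * d
            x = x - step / len(batch)
            # stepsize-weighted running average of the iterates
            x_avg = (w_sum * x_avg + alpha * x) / (w_sum + alpha)
            w_sum += alpha
        return x_avg

    # Toy usage: minimize ||x - a||^2 subject to three linear constraints B[j] @ x <= c_vec[j]
    a = np.array([2.0, 3.0])
    B = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    c_vec = np.array([1.0, 1.0, 1.5])
    x_approx = minibatch_subgradient(
        f_subgrad=lambda x: 2.0 * (x - a),
        g_list=[lambda x, j=j: B[j] @ x - c_vec[j] for j in range(3)],
        g_subgrad_list=[lambda x, j=j: B[j] for j in range(3)],
        x0=np.zeros(2), batch_size=2)

The parallel variant shown here averages the feasibility corrections over the sampled minibatch; a sequential variant would instead apply the corrections one constraint at a time.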

Keywords

Convex minimization · Functional constraints · Subgradient algorithms · Random minibatch projection algorithms · Convergence rates

Acknowledgements

This research was supported by the National Science Foundation under CAREER Grant CMMI 07-42538 and by the Executive Agency for Higher Education, Research and Innovation Funding (UEFISCDI), Romania, PNIII-P4-PCE-2016-0731, project ScaleFreeNet, No. 39/2017.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. School of Electrical, Computer and Energy Engineering, Arizona State University, Tempe, USA
  2. Department of Automatic Control and Systems Engineering, University Politehnica Bucharest, Bucharest, Romania
