
Computational Optimization and Applications, Volume 57, Issue 2, pp 339–363

On the O(1/t) convergence rate of the projection and contraction methods for variational inequalities with Lipschitz continuous monotone operators

  • Xingju Cai
  • Guoyong Gu
  • Bingsheng He

Abstract

Nemirovski’s analysis (SIAM J. Optim. 15:229–251, 2005) indicates that the extragradient method has an O(1/t) convergence rate for variational inequalities with Lipschitz continuous monotone operators. For the same class of problems, a family of Fejér monotone projection and contraction methods has been developed over the last few decades. Until now, only convergence results have been available for these projection and contraction methods, although numerical experiments indicate that they consistently outperform the extragradient method. The reason is that the former benefit from an ‘optimal’ step size in the contraction sense. In this paper, we prove the O(1/t) convergence rate under a unified conceptual framework, which includes the projection and contraction methods as special cases and thus completes the convergence theory of the existing projection and contraction methods. Preliminary numerical results demonstrate that the projection and contraction methods converge about twice as fast as the extragradient method.
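The two algorithms compared in the abstract can be sketched side by side on a toy problem. The following is an illustrative sketch, not the paper's exact formulation: it applies the extragradient method (predict with F(x), correct with F(ỹ), both projected) and a basic projection-and-contraction step (same predictor, then a relaxed move along a contraction direction with the ‘optimal’ step size α) to a small monotone variational inequality over the nonnegative orthant. The matrix A, vector b, step size τ, and relaxation factor γ are all assumed for the example.

```python
# Hedged sketch: extragradient vs. a projection-and-contraction (PC) step
# on a toy monotone VI over the nonnegative orthant, i.e. the LCP:
#   x >= 0,  Ax + b >= 0,  x'(Ax + b) = 0.
# A, b, tau, gamma are illustrative choices, not from the paper.
import numpy as np

A = np.array([[4.0, -1.0], [-1.0, 3.0]])   # positive definite => F monotone
b = np.array([-1.0, -2.0])
F = lambda x: A @ x + b                    # affine operator, Lipschitz with L = ||A||_2
P = lambda x: np.maximum(x, 0.0)           # projection onto the orthant x >= 0
L = np.linalg.norm(A, 2)                   # spectral norm = Lipschitz constant
tau = 0.9 / L                              # step size tau < 1/L

def extragradient(x, iters=500):
    for _ in range(iters):
        y = P(x - tau * F(x))              # prediction step
        x = P(x - tau * F(y))              # correction step uses F at the predictor
    return x

def projection_contraction(x, iters=500, gamma=1.9):
    for _ in range(iters):
        y = P(x - tau * F(x))              # same predictor as extragradient
        d = (x - y) - tau * (F(x) - F(y))  # contraction direction
        alpha = (x - y) @ d / (d @ d)      # 'optimal' step in the contraction sense
        x = x - gamma * alpha * d          # relaxed correction, gamma in (0, 2)
    return x

x_eg = extragradient(np.zeros(2))
x_pc = projection_contraction(np.zeros(2))
# Both iterations approach the solution x* = A^{-1}(-b), which happens to be
# interior to the orthant for this particular choice of A and b.
```

The key structural difference is visible in the correction step: extragradient re-projects with a fixed step size τ, while the PC step scales its move by a computed α, which is the source of the speedup the numerical experiments report.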

Keywords

Variational inequality · Projection and contraction method · Convergence rate


Acknowledgements

The authors thank X.-L. Fu, M. Li, M. Tao and X.-M. Yuan for the discussion and valuable suggestions.

References

  1. Bertsekas, D.P., Tsitsiklis, J.N.: Parallel and Distributed Computation, Numerical Methods. Prentice-Hall, Englewood Cliffs (1989)
  2. Blum, E., Oettli, W.: Mathematische Optimierung: Grundlagen und Verfahren. Ökonometrie und Unternehmensforschung. Springer, Berlin (1975)
  3. Facchinei, F., Pang, J.S.: Finite-Dimensional Variational Inequalities and Complementarity Problems, Vols. I and II. Springer Series in Operations Research. Springer, New York (2003)
  4. Harker, P.T., Pang, J.S.: A damped-Newton method for the linear complementarity problem. Lect. Appl. Math. 26, 265–284 (1990)
  5. He, B.S.: A class of projection and contraction methods for monotone variational inequalities. Appl. Math. Optim. 35, 69–76 (1997)
  6. He, B.S., Liao, L.-Z.: Improvements of some projection methods for monotone nonlinear variational inequalities. J. Optim. Theory Appl. 112, 111–128 (2002)
  7. He, B.S., Xu, M.-H.: A general framework of contraction methods for monotone variational inequalities. Pac. J. Optim. 4, 195–212 (2008)
  8. He, B.S., Yuan, X.M., Zhang, J.J.Z.: Comparison of two kinds of prediction-correction methods for monotone variational inequalities. Comput. Optim. Appl. 27, 247–267 (2004)
  9. He, B.S., Liao, L.-Z., Wang, X.: Proximal-like contraction methods for monotone variational inequalities in a unified framework I: effective quadruplet and primary methods. Comput. Optim. Appl. 51, 649–679 (2012)
  10. He, B.S., Liao, L.-Z., Wang, X.: Proximal-like contraction methods for monotone variational inequalities in a unified framework II: general methods and numerical experiments. Comput. Optim. Appl. 51, 681–708 (2012)
  11. Howard, A.G.: Large margin, transformation learning. PhD Thesis, Graduate School of Arts and Science, Columbia University (2009)
  12. Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Ekon. Mat. Metod. 12, 747–756 (1976)
  13. Khobotov, E.N.: Modification of the extragradient method for solving variational inequalities and certain optimization problems. USSR Comput. Math. Math. Phys. 27, 120–127 (1987)
  14. Lacoste-Julien, S.: Discriminative machine learning with structure. PhD Thesis, Computer Science, University of California, Berkeley (2009)
  15. Nemirovski, A.: Prox-method with rate of convergence O(1/t) for variational inequality with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J. Optim. 15, 229–251 (2005)
  16. Pan, Y.: A game theoretical approach to constrained OSNR optimization problems in optical network. PhD Thesis, Electrical and Computer Engineering, University of Toronto (2009)
  17. Pan, Y., Pavel, L.: Games with coupled propagated constraints in optical networks with multi-link topologies. Automatica 45, 871–880 (2009)
  18. Sha, F.: Large margin training of acoustic models for speech recognition. PhD Thesis, Computer and Information Science, University of Pennsylvania (2007)
  19. Solodov, M.V., Tseng, P.: Modified projection-type methods for monotone variational inequalities. SIAM J. Control Optim. 34, 1814–1830 (1996)
  20. Sun, D.: A class of iterative methods for solving nonlinear projection equations. J. Optim. Theory Appl. 91, 123–140 (1996)
  21. Taji, K., Fukushima, M., Ibaraki, I.: A globally convergent Newton method for solving strongly monotone variational inequalities. Math. Program. 58, 369–383 (1993)
  22. Taskar, B., Lacoste-Julien, S., Jordan, M.I.: Structured prediction, dual extragradient and Bregman projections. J. Mach. Learn. Res. 7, 1627–1653 (2006)
  23. Taskar, B., Lacoste-Julien, S., Jordan, M.I.: Structured prediction via extragradient method. In: Weiss, Y., Schoelkopf, B., Platt, J. (eds.) Advances in Neural Information Processing Systems (NIPS), vol. 18 (2006)
  24. Tseng, P.: On accelerated proximal gradient methods for convex-concave optimization. Department of Mathematics, University of Washington, Seattle, WA 98195, USA (2008)
  25. Xue, G.L., Ye, Y.Y.: An efficient algorithm for minimizing a sum of Euclidean norms with applications. SIAM J. Optim. 7, 1017–1036 (1997)

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  1. Department of Mathematics, Nanjing University, Nanjing, P.R. China
