A Class of Globally Convergent Algorithms for Pseudomonotone Variational Inequalities
We describe a fairly broad class of algorithms for solving variational inequalities whose global convergence is based on the strategy of generating a hyperplane that separates the current iterate from the solution set. The methods are shown to converge under very mild assumptions: the problem mapping is only assumed to be continuous and pseudomonotone with respect to at least one solution. A strategy for obtaining a (super)linear rate of convergence is also discussed. The algorithms in this class differ in the tools used to construct the separating hyperplane. Our general scheme subsumes an extragradient-type projection method, a globally and locally superlinearly convergent Josephy-Newton-type method, a certain minimization-based method, and a splitting technique.
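To make the general scheme concrete, the following is a minimal sketch of the extragradient-type projection member of the class, written for illustration only. The function names (`solve_vi_hyperplane`, `project`) and the specific line-search constants are hypothetical; the sketch assumes, as in the abstract, that the mapping `F` is continuous and pseudomonotone with respect to at least one solution, and that projection onto the feasible set is easy to compute.

```python
import numpy as np

def solve_vi_hyperplane(F, project, x0, sigma=0.5, beta=0.5,
                        tol=1e-8, max_iter=5000):
    """Illustrative hyperplane-projection method for VI(F, C):
    find x* in C with <F(x*), x - x*> >= 0 for all x in C.

    Each iteration takes an extragradient-type trial step, runs an
    Armijo-type line search to locate a point z whose value F(z)
    defines a hyperplane separating the current iterate from the
    solution set, then projects onto that hyperplane and back onto C.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Natural residual: zero exactly at a solution of the VI.
        r = x - project(x - F(x))
        if np.linalg.norm(r) < tol:
            break
        # Line search: shrink t until <F(x - t r), r> >= sigma ||r||^2,
        # which is guaranteed to hold for small t since <F(x), r> >= ||r||^2.
        t = 1.0
        while np.dot(F(x - t * r), r) < sigma * np.dot(r, r):
            t *= beta
        z = x - t * r
        g = F(z)
        # g defines the separating hyperplane {y : <g, y - z> = 0};
        # project x onto it, then back onto the feasible set C.
        x = project(x - (np.dot(g, x - z) / np.dot(g, g)) * g)
    return x
```

As a usage example, one can apply the sketch to a small linear complementarity problem (a VI over the nonnegative orthant with an affine, monotone mapping), for which the method converges to the unique solution.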
Keywords: Variational Inequality · Complementarity Problem · Global Convergence · Linear Complementarity Problem · SIAM Journal
- N.H. Josephy. Newton's method for generalized equations. Technical Summary Report 1965, Mathematics Research Center, University of Wisconsin, Madison, Wisconsin, 1979.
- M.V. Solodov and B.F. Svaiter. A globally convergent inexact Newton method for systems of monotone equations. In M. Fukushima and L. Qi, editors, Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, pages 355–369. Kluwer Academic Publishers, 1999.