Conditional Extragradient Algorithms for Solving Constrained Variational Inequalities

Conditional extragradient algorithms for variational inequalities

In this paper, we generalize the classical extragradient algorithm for solving variational inequality problems by utilizing nonzero normal vectors of the feasible set. In particular, conceptual algorithms are proposed with two different linesearches. We then establish convergence results for these algorithms under mild assumptions. Our study suggests that nonzero normal vectors may significantly improve convergence if chosen appropriately.
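
For reference, the classical (Korpelevich) extragradient iteration that the conditional algorithms above generalize can be sketched as follows. This is a minimal baseline sketch, not the paper's method: the box feasible set, the affine test operator, and all parameter values are illustrative assumptions.

    import numpy as np

    def project_box(x, lo, hi):
        # Euclidean projection onto the box [lo, hi]^n, standing in for P_C.
        return np.clip(x, lo, hi)

    def extragradient(F, x0, lo, hi, step=0.1, tol=1e-8, max_iter=1000):
        # Classical two-projection extragradient iteration:
        #   y_k     = P_C(x_k - step * F(x_k))   (predictor step)
        #   x_{k+1} = P_C(x_k - step * F(y_k))   (corrector step, F evaluated at y_k)
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            y = project_box(x - step * F(x), lo, hi)
            x_new = project_box(x - step * F(y), lo, hi)
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    # Toy monotone affine operator F(x) = A x + b on the box [0, 5]^2.
    A = np.array([[1.0, 2.0], [-2.0, 1.0]])
    b = np.array([-1.0, 1.0])
    print(extragradient(lambda x: A @ x + b, x0=np.zeros(2), lo=0.0, hi=5.0))

The conditional algorithms of the paper modify this baseline by incorporating suitably chosen nonzero normal vectors of the feasible set into the projected directions; that modification is not reflected in the sketch.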

An extragradient algorithm for solving general nonconvex variational inequalities

Applied Mathematics Letters, 2010

In this work, we suggest and analyze an extragradient method for solving general nonconvex variational inequalities using the projection operator technique. We prove that convergence of the extragradient method requires only pseudomonotonicity, which is a weaker condition than monotonicity. In this sense, our result can be viewed as an improvement and refinement of previously known results. Our method of proof is very simple compared with other techniques.
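
The projection technique referred to here rests on the standard fixed-point characterization of variational inequalities, stated below for a closed convex set K (the paper's nonconvex setting replaces convexity with a suitable regularity assumption on the set, which we do not reproduce):

\[
u^{*}\in K,\ \ \langle F(u^{*}),\,v-u^{*}\rangle \ge 0\ \ \forall v\in K
\qquad\Longleftrightarrow\qquad
u^{*}=P_{K}\bigl(u^{*}-\rho F(u^{*})\bigr)\ \ \text{for any fixed }\rho>0.
\]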

Extragradient methods for solving nonconvex variational inequalities

Journal of Computational and Applied Mathematics, 2011

In this paper, we introduce and consider a new class of variational inequalities, which are called the nonconvex variational inequalities. Using the projection technique, we suggest and analyze an extragradient method for solving the nonconvex variational inequalities. We show that the extragradient method is equivalent to an implicit iterative method, the convergence of which requires only pseudomonotonicity, a weaker condition than monotonicity. This clearly improves on the previously known result. Our method of proof is very simple compared with other techniques.
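
For clarity, the monotonicity notions contrasted in the last two abstracts are the standard ones: F is monotone on K if \langle F(u)-F(v),\,u-v\rangle \ge 0 for all u,v\in K, and pseudomonotone if

\[
\langle F(v),\,u-v\rangle \ge 0 \quad\Longrightarrow\quad \langle F(u),\,u-v\rangle \ge 0
\qquad\text{for all } u,v\in K,
\]

so every monotone operator is pseudomonotone, but not conversely.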

A new modified subgradient extragradient method for solving variational inequalities

Applicable Analysis, 2019

The goal of this note is to introduce a new modified subgradient extragradient algorithm for solving variational inequalities in Hilbert spaces. A strong convergence result for the algorithm is proved without knowledge of the Lipschitz constant of the operator. Several numerical experiments for the proposed algorithm are presented.
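
For context, the unmodified subgradient extragradient method of Censor, Gibali, and Reich replaces the second projection onto the feasible set C with a projection onto a constructed half-space, which is available in closed form:

\[
y_{n}=P_{C}\bigl(x_{n}-\tau F(x_{n})\bigr),\qquad
T_{n}=\bigl\{w\in H:\ \langle x_{n}-\tau F(x_{n})-y_{n},\,w-y_{n}\rangle\le 0\bigr\},\qquad
x_{n+1}=P_{T_{n}}\bigl(x_{n}-\tau F(y_{n})\bigr).
\]

The paper's modification, in particular how the step size is chosen without knowledge of the Lipschitz constant, is not reproduced here; one common device in related methods (an assumption on our part, not a statement about this paper) is an adaptive rule such as \tau_{n+1}=\min\{\tau_{n},\ \mu\|x_{n}-y_{n}\|/\|F(x_{n})-F(y_{n})\|\} with \mu\in(0,1).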

Projection algorithms for solving a system of general variational inequalities

Nonlinear Analysis: Theory, Methods & Applications, 2009

In this paper, we introduce and consider a new system of general variational inequalities involving four different operators. Using the projection operator technique, we suggest and analyze some new explicit iterative methods for this system of variational inequalities. We also analyze the convergence of the new iterative methods under certain mild conditions. Since this new system includes the system of variational inequalities involving three operators, variational inequalities, and related optimization problems as special cases, the results obtained in this paper continue to hold for these problems. Our results can be viewed as a refinement and improvement of previously known results for variational inequalities.
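
As an illustration of the projection technique on a simpler special case (a two-operator system rather than the paper's four-operator one, so this is not the system studied in the paper), consider finding (x^{*},y^{*})\in C\times C such that

\[
\langle \rho T_{1}(y^{*})+x^{*}-y^{*},\,x-x^{*}\rangle \ge 0\ \ \forall x\in C,
\qquad
\langle \eta T_{2}(x^{*})+y^{*}-x^{*},\,x-y^{*}\rangle \ge 0\ \ \forall x\in C.
\]

This system is equivalent to the coupled projection equations x^{*}=P_{C}\bigl(y^{*}-\rho T_{1}(y^{*})\bigr) and y^{*}=P_{C}\bigl(x^{*}-\eta T_{2}(x^{*})\bigr), and alternating these two equations yields an explicit iterative method of the kind analyzed in the paper.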

Accelerated Modified Tseng’s Extragradient Method for Solving Variational Inequality Problems in Hilbert Spaces

Axioms

The aim of this paper is to propose a new iterative algorithm to approximate the solution of a variational inequality problem in real Hilbert spaces. A strong convergence result for this problem is established under certain mild conditions. Our proposed method requires the computation of only one projection onto the feasible set in each iteration. Some numerical examples are presented to illustrate that our proposed method performs better than some comparable known methods for solving variational inequality problems.
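
The one-projection property mentioned in the abstract is characteristic of Tseng's extragradient (forward-backward-forward) step; a typical inertial "accelerated" variant of that step reads as follows (our illustration of the general pattern, not necessarily the exact iterates of the paper):

\[
w_{n}=x_{n}+\theta_{n}(x_{n}-x_{n-1}),\qquad
y_{n}=P_{C}\bigl(w_{n}-\lambda_{n}F(w_{n})\bigr),\qquad
x_{n+1}=y_{n}-\lambda_{n}\bigl(F(y_{n})-F(w_{n})\bigr),
\]

so only the computation of y_{n} requires a projection onto the feasible set C.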

Outer-Inner Approximation Projection Methods for Multivalued Variational Inequalities

Acta Mathematica Vietnamica, 2016

In this paper, we present new projection methods for solving multivalued variational inequalities on a given nonlinear convex feasible domain. The first is an extension of the extragradient method to multivalued variational inequalities under the asymptotic optimality condition, but it requires certain Lipschitz continuity conditions. To avoid this requirement, we propose linesearch procedures commonly used in variational inequalities, obtaining an approximate linesearch method for solving multivalued variational inequalities. Next, based on a family of nonempty closed convex subsets of R^n and linesearch techniques, we give inner approximation projection algorithms for solving multivalued variational inequalities, and the convergence of the algorithms is established under mild assumptions.
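
One standard Armijo-type linesearch of the kind referred to here, stated for a single-valued operator F (the multivalued case in the paper works with selections from F(x), a detail we omit), is the Solodov-Svaiter rule: with residual r(x)=x-P_{C}\bigl(x-F(x)\bigr) and parameters \gamma,\sigma\in(0,1), find the smallest integer m\ge 0 such that

\[
\bigl\langle F\bigl(x_{k}-\gamma^{m}r(x_{k})\bigr),\,r(x_{k})\bigr\rangle\ \ge\ \sigma\,\|r(x_{k})\|^{2}.
\]

The next iterate is then obtained by projecting x_{k} onto the hyperplane through the trial point y_{k}=x_{k}-\gamma^{m}r(x_{k}) with normal F(y_{k}), followed by a projection onto C.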

New extragradient-type methods for general variational inequalities

Journal of Mathematical Analysis and Applications, 2003

In this paper, we consider and analyze a new class of extragradient-type methods for solving general variational inequalities. The modified methods converge for pseudomonotone operators, which is a weaker condition than monotonicity. Our proof of convergence is very simple compared with other methods. The proposed methods include several new and known methods as special cases. Our results present a significant improvement of previously known methods for solving variational inequalities and related optimization problems.
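
In this setting, a general variational inequality is the problem of finding u\in H with g(u)\in K such that

\[
\langle Tu,\ g(v)-g(u)\rangle\ \ge\ 0\qquad\text{for all } v\in H \text{ with } g(v)\in K,
\]

which, for closed convex K, is equivalent to the fixed-point equation g(u)=P_{K}\bigl(g(u)-\rho Tu\bigr), \rho>0. Extragradient-type methods of the kind considered here iterate predictor-corrector versions of this equation; one representative pattern (our illustration, not necessarily the exact iterates of the paper) is g(w_{n})=P_{K}\bigl(g(u_{n})-\rho Tu_{n}\bigr) followed by g(u_{n+1})=P_{K}\bigl(g(u_{n})-\rho Tw_{n}\bigr).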

A descent method for nonsmooth variational inequalities via regularization

2008

In this paper, we propose a descent method for solving variational inequality problems where the underlying operator is nonsmooth, locally Lipschitz, and monotone over a closed, convex feasible set. The idea is to combine a descent method for variational inequality problems whose operators are nonsmooth, locally Lipschitz, and strongly monotone with the Tikhonov-Browder regularization technique. Finally, numerical results are presented and discussed.
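
The regularization step can be summarized as follows: for a monotone operator F and a sequence \varepsilon_{k}\downarrow 0, each subproblem is the variational inequality governed by the strongly monotone operator

\[
F_{\varepsilon_{k}}(x)\;=\;F(x)+\varepsilon_{k}\,x,
\]

which the descent method for strongly monotone nonsmooth operators can handle; under standard conditions the solutions of the regularized problems converge, as \varepsilon_{k}\to 0, to a solution of the original problem (for Tikhonov-Browder regularization, to its minimum-norm solution).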

Extragradient algorithms extended to equilibrium problems

Optimization, 2008

We make use of the auxiliary problem principle to develop iterative algorithms for solving equilibrium problems. The first is an extension of the extragradient algorithm to equilibrium problems. In this algorithm the equilibrium bifunction is not required to satisfy any monotonicity property, but it must satisfy a certain Lipschitz-type condition. To avoid this requirement, we propose linesearch procedures commonly used in variational inequalities to obtain projection-type algorithms for solving equilibrium problems. Applications to mixed variational inequalities are discussed. A special class of equilibrium problems is investigated and some preliminary computational results are reported.
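
In its basic form, the extragradient extension obtained from the auxiliary problem principle solves two strongly convex subproblems per iteration (with f the equilibrium bifunction, C the feasible set, and \lambda>0 a regularization parameter):

\[
y_{k}=\arg\min_{y\in C}\Bigl\{\lambda f(x_{k},y)+\tfrac{1}{2}\|y-x_{k}\|^{2}\Bigr\},\qquad
x_{k+1}=\arg\min_{y\in C}\Bigl\{\lambda f(y_{k},y)+\tfrac{1}{2}\|y-x_{k}\|^{2}\Bigr\},
\]

which reduces to the classical extragradient projections when f(x,y)=\langle F(x),\,y-x\rangle.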