Convergence Analysis On Stochastic Extragradient Methods For Stochastic Variational Inequalities

Posted on: 2021-04-22
Degree: Master
Type: Thesis
Country: China
Candidate: S X Zhang
Full Text: PDF
GTID: 2370330626960628
Subject: Operational Research and Cybernetics

Abstract/Summary:
Variational inequalities are widely used in nonlinear optimization, mathematical economics, and control theory, and have attracted the attention of many scholars. With continued research, the algorithms and theory for solving deterministic variational inequalities have matured. In the real world, however, there are uncontrollable factors, such as weather and demand, which we model as random variables. These random variables have a nonnegligible impact on our decisions, yet the algorithms for deterministic variational inequalities are no longer applicable, so new methods are needed. In this thesis, we mainly study the mini-batch stochastic Bregman extragradient method for solving stochastic variational inequalities, and a stochastic extragradient method with Lipschitz line search for solving the stochastic saddle point problem.

We first review the research status of variational inequalities at home and abroad, and discuss the advantages and shortcomings of existing algorithms. In the second chapter, we present basic theoretical background, including monotonicity and continuity of operators and the Bregman distance; we also introduce the Bregman projection operator and its properties, which play an important role in this thesis. In the third chapter, we analyze the convergence of the mini-batch stochastic Bregman extragradient method. After establishing several key lemmas, we obtain the following conclusions. First, for pseudomonotone stochastic variational inequalities, we give convergence results under a Lipschitz continuity assumption on the operator. Second, for monotone stochastic variational inequalities whose operator is Hölder continuous, we prove that the mini-batch stochastic Bregman extragradient method with diminishing step sizes achieves a sharper convergence rate. As an application, we then apply the mini-batch stochastic Bregman extragradient method to the stochastic saddle point problem and give the corresponding convergence analysis. Finally, we solve the stochastic saddle point problem with a modified stochastic extragradient method whose step size is obtained by a Lipschitz line search, and we prove that this method achieves linear convergence.
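To illustrate the kind of scheme the abstract describes, the following is a minimal sketch of a mini-batch stochastic extragradient iteration, specialized to the Euclidean case (where the Bregman projection reduces to the ordinary Euclidean projection). The operator `F`, the ball constraint, the noise model, and the fixed step size are all illustrative assumptions, not details taken from the thesis; the thesis itself covers general Bregman distances, diminishing step sizes, and a Lipschitz line-search variant.

```python
import numpy as np

def project_ball(z, radius=1.0):
    # Euclidean projection onto a ball: the special case of the Bregman
    # projection with Bregman function 0.5 * ||.||^2 (assumed feasible set).
    n = np.linalg.norm(z)
    return z if n <= radius else z * (radius / n)

def minibatch_operator(F, x, batch_size, rng, noise=0.1):
    # Mini-batch estimate of the VI operator: average of noisy samples
    # of F(x), mimicking access to a stochastic oracle.
    samples = [F(x) + noise * rng.standard_normal(x.shape)
               for _ in range(batch_size)]
    return np.mean(samples, axis=0)

def stochastic_extragradient(F, x0, step, iters, batch_size, rng):
    x = x0.copy()
    for _ in range(iters):
        g = minibatch_operator(F, x, batch_size, rng)
        y = project_ball(x - step * g)    # extrapolation (leading) step
        h = minibatch_operator(F, y, batch_size, rng)
        x = project_ball(x - step * h)    # update step, evaluated at y
    return x

# Toy monotone example: the bilinear saddle point min_u max_v u*v,
# written as the VI operator F(u, v) = (v, -u); its solution is (0, 0).
F = lambda z: np.array([z[1], -z[0]])
rng = np.random.default_rng(0)
sol = stochastic_extragradient(F, np.array([0.8, -0.6]), step=0.1,
                               iters=2000, batch_size=8, rng=rng)
print(np.linalg.norm(sol))  # residual norm near the solution (0, 0)
```

The extrapolation step is what distinguishes the extragradient method from plain projected gradient iterations: on bilinear saddle point problems like the toy example, the plain method diverges while the extragradient update contracts toward the solution, with mini-batching reducing the variance of the stochastic oracle.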
Keywords/Search Tags:stochastic variational inequality, mini-batch stochastic Bregman extragradient method, Lipschitz line-search, convergence analysis