
Strong Convergence For Some Iterative Methods Of Nonlinear Optimization Problems

Posted on: 2015-01-05
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y M Wang
Full Text: PDF
GTID: 1260330428475580
Subject: Applied Mathematics
Abstract/Summary:
In this dissertation, we mainly study some nonlinear optimization problems, namely the minimization problem of the sum of two functions, the zero problem of maximal monotone operators, and the split feasibility problem. Moreover, we also investigate the Krasnoselskii-Mann (KM) iteration for demicontractive mappings. To solve these problems, we modify existing algorithms and prove the strong convergence of the modified algorithms. This dissertation consists of five chapters.

In Chapter 1, we introduce the research background and some basic concepts and properties. We first review the development of the zero problem of maximal monotone operators. Additionally, some fixed point theory and some basic iterations are recalled: the Mann iteration, the Halpern iteration, and the Ishikawa iteration (their classical forms are collected below). Finally, we collect some useful properties of monotone operators, the subdifferential, nonexpansive mappings, and the resolvent of a monotone operator.

In Chapter 2, we mainly investigate the minimization problem of the sum of two functions. In 2005, Combettes and Wajs proved that the weak convergence of the proximal-gradient algorithm, a method for solving this minimization problem, can be guaranteed when certain conditions are satisfied. However, strong convergence may fail when the underlying space H is infinite-dimensional. We therefore modify proximal-gradient algorithms to guarantee strong convergence. We first introduce the regularized iteration and the Halpern iteration of the proximal-gradient algorithm; when the parameter sequences satisfy appropriate conditions and the gradient ∇f satisfies a Lipschitz condition, both iterations converge strongly to a minimizer of the problem (an illustrative numerical sketch of a Halpern-type proximal-gradient step is given below). Moreover, we deal with two modified proximal-gradient algorithms with multiple parameters, one with errors and the other without errors, where {α_n} ⊂ (0,1), {λ_n} ⊂ (0,2), and {β_n} ⊂ (−1,1); in particular, we study the case in which β_n can be negative. Using two different methods, we prove the strong convergence of the two algorithms. For the constrained convex minimization problem, which is a special case of minimizing the sum of two functions, we propose a regularized algorithm; when its parameters satisfy appropriate conditions, we prove the norm convergence of the algorithm.

Chapter 3 is devoted to solving the zero problem of maximal monotone operators. To this end, we investigate the proximal point algorithm and a multi-parameter proximal point algorithm under different accuracy criteria on the error sequence. For the zero problem with a single monotone operator, we first deal with the weak convergence of the proximal point algorithm. Since Güler proved that strong convergence of the proximal point algorithm can fail when H is infinite-dimensional, we discuss two multi-parameter algorithms when studying its strong convergence. Under two different accuracy criteria on the error sequence, these two multi-parameter algorithms converge strongly to a solution of the problem. To solve the zero problem for two monotone operators, we propose a proximal point algorithm for two monotone operators; under two different accuracy criteria on the error sequence, we prove its strong convergence.
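For reference, the basic iterations recalled in Chapter 1, together with the classical proximal-gradient step of Chapter 2 and one common inexact form of the proximal point step of Chapter 3, are the following; here T is a nonexpansive mapping on H, u is a fixed anchor point, A is a maximal monotone operator, and e_n is an error term. These are the standard forms from the literature, not the dissertation's modified multi-parameter schemes.

```latex
% Classical iterations for a nonexpansive mapping T : H -> H and anchor u in H.
\[
\begin{aligned}
&\text{Mann:}    && x_{n+1} = (1-\alpha_n)\,x_n + \alpha_n T x_n,\\
&\text{Halpern:} && x_{n+1} = \alpha_n u + (1-\alpha_n)\,T x_n,\\
&\text{Ishikawa:}&& y_n = (1-\beta_n)\,x_n + \beta_n T x_n,\quad
                    x_{n+1} = (1-\alpha_n)\,x_n + \alpha_n T y_n,\\
&\text{proximal-gradient:} &&
    x_{n+1} = \operatorname{prox}_{\lambda_n g}\bigl(x_n - \lambda_n \nabla f(x_n)\bigr),\\
&\text{inexact proximal point:} &&
    x_{n+1} = J^{A}_{\lambda_n}(x_n) + e_n,\qquad
    J^{A}_{\lambda} := (I + \lambda A)^{-1}.
\end{aligned}
\]
```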
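As a concrete illustration of the shape of a Halpern-type modification of the proximal-gradient method, the following Python sketch applies the anchored step x_{n+1} = α_n u + (1 − α_n) prox_{λg}(x_n − λ∇f(x_n)) to a toy ℓ1-regularized least-squares problem. The problem data, the parameter choice α_n = 1/(n+2), and the function names are illustrative assumptions, not the dissertation's exact algorithms or convergence conditions.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal mapping of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def halpern_proximal_gradient(A, b, mu, u, n_iter=500):
    """Halpern-anchored proximal-gradient iteration for
    min_x 0.5*||Ax - b||^2 + mu*||x||_1 (an illustrative sketch)."""
    L = np.linalg.norm(A.T @ A, 2)          # Lipschitz constant of grad f
    lam = 1.0 / L                           # step size lambda in (0, 2/L)
    x = np.zeros(A.shape[1])
    for n in range(n_iter):
        alpha = 1.0 / (n + 2)               # alpha_n -> 0, sum alpha_n = inf
        grad = A.T @ (A @ x - b)            # grad f(x) for f = 0.5*||Ax - b||^2
        Tx = soft_threshold(x - lam * grad, lam * mu)  # proximal-gradient step
        x = alpha * u + (1.0 - alpha) * Tx  # Halpern anchoring toward u
    return x

# Usage: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = halpern_proximal_gradient(A, b, mu=0.1, u=np.zeros(100))
```

In finite dimensions weak and norm convergence coincide, so this sketch only illustrates the structure of the anchored iteration, not the infinite-dimensional phenomenon the dissertation addresses.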
Chapter 4 is devoted to solving the split feasibility problem and to studying the KM iteration for demicontractive mappings. We deal with the strong convergence of a modified CQ algorithm for solving the split feasibility problem; strong convergence is guaranteed when appropriate conditions are satisfied (the classical forms of the KM and CQ iterations are recorded below). Additionally, for the KM iteration of demicontractive mappings, we obtain the strong convergence of this algorithm.

In Chapter 5, we summarize the results of this dissertation and discuss directions for future research.
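For reference, the classical KM iteration and Byrne's classical CQ algorithm for the split feasibility problem (find x ∈ C with Ax ∈ Q) take the following standard forms; the dissertation's modified CQ algorithm is a variant of the latter and is not reproduced in this abstract.

```latex
% Krasnoselskii--Mann (KM) iteration for a mapping T:
\[
x_{n+1} = (1-\alpha_n)\,x_n + \alpha_n T x_n, \qquad \{\alpha_n\}\subset(0,1).
\]
% Byrne's classical CQ algorithm for the split feasibility problem
% (P_C and P_Q are the metric projections onto C and Q):
\[
x_{n+1} = P_C\bigl(x_n - \gamma\,A^{*}(I - P_Q)A\,x_n\bigr),
\qquad 0 < \gamma < \tfrac{2}{\|A\|^{2}}.
\]
```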
Keywords/Search Tags: maximal monotone operator, proximal point algorithm, proximal-gradient algorithm, strong convergence, CQ algorithm