
Research On Numerical Methods Of Separable Programming And Matrix Equations

Posted on: 2020-09-08
Degree: Doctor
Type: Dissertation
Country: China
Candidate: M Sun
Full Text: PDF
GTID: 1360330572472403
Subject: Operational Research and Cybernetics
Abstract/Summary:
Many practical applications in image processing, compressed sensing, and machine learning can be reformulated as separable convex programming problems with linear constraints. Periodic matrix equations arise in many engineering problems, such as communication systems theory, control theory, power systems, and signal processing. In this thesis, we design iteration methods for solving separable convex programming problems and the generalized periodic Sylvester matrix equations, discuss their convergence, and carry out numerical experiments to demonstrate the efficiency of the proposed methods. The thesis comprises eight chapters.

Chapter 1 introduces the mathematical models of separable convex programming with linear constraints and of the generalized periodic Sylvester matrix equations, and surveys the current development of iteration methods for these two problems. Some necessary concepts, properties, theorems, and conclusions are also provided.

In Chapter 2, we consider convex programming with linear constraints. Solving its dual problem by the gradient ascent method yields the classic augmented Lagrangian method. In this chapter, to reduce the computational load and improve the convergence rate, we modify the method in two respects and propose an accelerated proximal augmented Lagrangian method: (1) a proximal term with an indefinite proximal matrix is added to the primal variable’s subproblem; (2) the new iterate is a convex combination of the output of the primal variable’s subproblem and the current iterate. Under mild assumptions, we establish its worst-case O(1/t^2) convergence rate in the iterate sense, where t is a positive integer. Finally, numerical results show that the new method is feasible and efficient for solving compressive sensing problems.

In Chapter 3, we consider separable two-block convex programming with linear constraints, for which the alternating direction method (ADM) is a benchmark solver. Relaxing the iterative schemes of the ADM’s second
primal variable and the dual variable with a relaxation factor α ∈ (0, 2) yields the generalized ADM (GADM), which often performs better than the classic ADM when α ∈ (1, 2). In this chapter, we propose a symmetric version of the GADM; its prominent feature is that all subproblems are accelerated by the relaxation factor, which may take any value in the interval [1, +∞). Under some standard assumptions, we establish the convergence results of the proposed method, including global convergence and the worst-case O(1/t) convergence rate in both the ergodic and iterate senses. Numerical experiments on decoding a sparse signal arising in compressed sensing illustrate the efficiency of the new method and indicate that, for suitable values of the relaxation factor, it is superior to the proximal alternating direction method.

In Chapter 4, we further consider separable two-block convex programming with linear constraints. Because it updates the Lagrangian multiplier twice per iteration, the symmetric alternating direction method of multipliers (S-ADMM) often performs better than other ADM-type methods. In practical applications, proximal terms with positive definite proximal matrices are often added to its subproblems, and it is well known that a large proximal parameter often results in the “too-small-step-size” phenomenon. In this chapter, we generalize the proximal matrix from positive definite to indefinite and propose a new S-ADMM with an indefinite proximal matrix. Without any additional assumptions, we prove the global convergence of the new method and analyze its worst-case O(1/t) convergence rate in the ergodic sense via the iteration complexity. Finally, some numerical results illustrate that the indefinite proximal matrix can enhance the efficiency of the new method.

In Chapter 5, we consider separable multi-block convex programming with linear constraints. Based on the classic alternating direction method, in this chapter
we propose two proximal splitting methods for such problems. The first proximal splitting method fully exploits the structure of these problems and adopts a fully parallel updating rule; its global convergence and worst-case O(1/t) convergence rate in the ergodic sense are proved under the condition that the involved functions are strongly convex. The second proximal splitting method updates the first primal variable and the last m−1 primal variables in an alternating way, and updates the last m−1 primal variables in parallel; its global convergence is guaranteed under the weaker condition that the involved functions are merely convex. Furthermore, the worst-case O(1/t) convergence rate of the second method is established in both the ergodic and iterate senses. Finally, numerical results on stable principal component pursuit, together with some numerical comparisons, testify to the accuracy and speed of the second method.

In Chapter 6, we further consider separable multi-block convex programming with linear constraints and propose a proximal alternating direction method of multipliers with partially parallel splitting, which has the following nice properties: (1) the restrictions imposed on the proximal parameters are substantially relaxed; (2) the relaxation parameter γ is attached only to the update formula of the dual variable λ. For the resulting method, we establish global convergence and a worst-case O(1/t) convergence rate in the ergodic sense, where t is the iteration counter. Finally, three numerical examples illustrate the theoretical results obtained.

Chapter 7 is devoted to designing two conjugate gradient methods for the least Frobenius norm solution of the generalized periodic Sylvester matrix equations. When the studied problem is consistent, the first conjugate gradient method finds a solution within finitely many iterative steps in the absence of round-off
errors; furthermore, its least Frobenius norm solution can be obtained with a special kind of initial matrices. When the studied problem is inconsistent, the second conjugate gradient method with a special kind of initial matrices finds the least squares solution with the least Frobenius norm within finitely many iterative steps in the absence of round-off errors. Finally, several numerical examples validate the performance of the proposed methods.

Chapter 8 briefly summarizes the research content of the thesis and points out prospects for future work.
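The classic augmented Lagrangian method that Chapter 2 takes as its starting point (before the proximal and acceleration modifications) alternates a primal subproblem with a gradient-ascent step on the dual. A minimal sketch on a toy equality-constrained quadratic program, min ½‖x‖² s.t. Ax = b; the function name, model, and parameter values are illustrative choices, not the thesis's actual test problems:

```python
import numpy as np

def augmented_lagrangian(A, b, beta=1.0, iters=200):
    """Classic augmented Lagrangian method for min 0.5*||x||^2 s.t. Ax = b.

    The x-subproblem argmin_x 0.5*||x||^2 + lam^T(Ax - b) + 0.5*beta*||Ax - b||^2
    has the closed form below; the multiplier lam is updated by gradient
    ascent on the dual problem with step size beta.
    """
    m, n = A.shape
    lam = np.zeros(m)
    M = np.eye(n) + beta * A.T @ A
    for _ in range(iters):
        x = np.linalg.solve(M, beta * A.T @ b - A.T @ lam)  # primal subproblem
        lam = lam + beta * (A @ x - b)                      # dual ascent step
    return x, lam
```

For A = [[1, 1]], b = [2], the iterates converge to the minimum-norm solution x* = (1, 1) with multiplier λ* = −1, matching the KKT conditions x* + Aᵀλ* = 0, Ax* = b.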
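The two-block alternating direction method underlying Chapters 3 and 4 alternates two closed-form subproblems with a multiplier update. A minimal sketch, assuming the standard lasso reformulation min ½‖Ax−b‖² + λ‖z‖₁ s.t. x − z = 0 as the two-block model; this is the plain ADM without the relaxation and indefinite-proximal variants the thesis actually studies, and all names are illustrative:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def admm_lasso(A, b, lam=1.0, rho=1.0, iters=300):
    """Two-block ADM for min 0.5*||Ax-b||^2 + lam*||z||_1  s.t.  x - z = 0.

    x-update: ridge-type least squares (closed form);
    z-update: soft thresholding;
    u-update: scaled dual (multiplier) ascent.
    """
    n = A.shape[1]
    x = z = u = np.zeros(n)
    M = A.T @ A + rho * np.eye(n)
    for _ in range(iters):
        x = np.linalg.solve(M, A.T @ b + rho * (z - u))  # first block
        z = soft_threshold(x + u, lam / rho)             # second block
        u = u + x - z                                    # multiplier update
    return x, z
```

With A = I the lasso solution is the componentwise shrinkage of b, which makes the sketch easy to check: for b = (3, 0.5, −2) and λ = 1 the iterates approach (2, 0, −1).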
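Chapter 7's conjugate gradient methods work on matrix equations under the Frobenius (trace) inner product. As a simplified stand-in for the generalized periodic Sylvester equations, the following sketch applies a CGLS-type iteration (conjugate gradient on the normal equations) to a single Sylvester equation AX + XB = C; the helper name and test matrices are assumptions for illustration, not the thesis's algorithms:

```python
import numpy as np

def cgls_sylvester(A, B, C, iters=100, tol=1e-12):
    """CGLS-type conjugate gradient for AX + XB = C.

    Treats L(X) = AX + XB as a linear operator on matrices with the
    Frobenius inner product <Y, Z> = trace(Y^T Z); the adjoint is
    L*(R) = A^T R + R B^T. Starting from X0 = 0, the iteration converges
    to the least-squares solution of minimum Frobenius norm within
    finitely many steps in exact arithmetic.
    """
    L = lambda X: A @ X + X @ B          # forward operator
    Lt = lambda R: A.T @ R + R @ B.T     # adjoint operator
    X = np.zeros_like(C)
    R = C - L(X)                         # residual in equation space
    S = Lt(R)                            # negative gradient of 0.5*||L(X)-C||_F^2
    P = S.copy()
    gamma = np.sum(S * S)
    for _ in range(iters):
        Q = L(P)
        alpha = gamma / np.sum(Q * Q)
        X += alpha * P
        R -= alpha * Q
        S = Lt(R)
        gamma_new = np.sum(S * S)
        if gamma_new < tol:              # gradient small: stop
            break
        P = S + (gamma_new / gamma) * P  # new conjugate direction
        gamma = gamma_new
    return X
```

Since the unknown lives in a 2×2 space in the test below, exact arithmetic would terminate in at most four steps; the spectra of A and −B are disjoint, so the solution is unique and the iterates recover it.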
Keywords/Search Tags: separable programming, alternating direction method, global convergence, convergence rate, principal component analysis, conjugate gradient method, generalized periodic Sylvester matrix equations, finite convergence