
Research On ADMM For Nonconvex And Nonsmooth Optimization Problems

Posted on: 2023-09-14
Degree: Master
Type: Thesis
Country: China
Candidate: L N Xia
Full Text: PDF
GTID: 2530306836465734
Subject: Mathematics

Abstract/Summary:
Nonconvex and nonsmooth optimization problems arise widely in practical engineering applications such as aerospace, earth science, engineering technology, and machine learning. The alternating direction method of multipliers (ADMM) is commonly used to solve nonconvex and nonsmooth optimization problems with block structure. In the iterations of the algorithm, the Bregman distance can effectively simplify the computation of the subproblems, and the inertial technique can reduce the number of iterations. In this thesis, we study ADMM for two classes of nonconvex and nonsmooth optimization problems.

First, we consider a class of nonconvex and nonsmooth optimization problems in which the objective function is the sum of two separable functions and one nonseparable function. A nonconvex three-block inertial Bregman alternating direction method of multipliers is proposed by combining the inertial technique with ADMM: an inertial term and a Bregman distance are added to the subproblems. Under suitable conditions, the global convergence and strong convergence of the algorithm are established. Experimental results show that the algorithm is effective.

Second, we consider a class of nonconvex multi-block optimization problems in which the objective function is the sum of several functions. Research on ADMM for nonconvex two-block optimization problems has gradually matured, but research on nonconvex multi-block problems remains relatively scarce. A proximal inertial Bregman alternating direction method of multipliers is proposed that can effectively solve this kind of multi-block problem. The global convergence of the algorithm is proved under suitable conditions, including an estimate of the admissible region of two relaxation factors. When the augmented Lagrangian function satisfies the Kurdyka-Lojasiewicz property, the algorithm has strong convergence. In addition, our analysis is based on the monotone decrease of a merit function instead of that of the augmented Lagrangian function. Finally, numerical experiments on compressive sensing, signal recovery, and robust principal component analysis verify the effectiveness of the algorithm.
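To illustrate the kind of iteration discussed above, the following is a minimal sketch of ADMM with an inertial extrapolation step, applied to a simple convex LASSO instance rather than the nonconvex multi-block setting of the thesis; the function name and the parameter choices (inertial weight `alpha`, penalty `rho`) are illustrative assumptions, not the thesis's algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_admm_lasso(A, b, lam, rho=1.0, alpha=0.3, iters=300):
    """Sketch: min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z,
    with an inertial extrapolation z_hat = z + alpha*(z - z_prev)
    fed into the x-update (alpha = 0 recovers plain ADMM)."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    z_prev = z.copy()
    AtA, Atb = A.T @ A, A.T @ b
    # Factor (A^T A + rho*I) once; reused every iteration.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(iters):
        # Inertial extrapolation before the x-subproblem.
        z_hat = z + alpha * (z - z_prev)
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z_hat - u)))
        z_prev = z.copy()
        # z-subproblem: soft-thresholding.
        z = soft_threshold(x + u, lam / rho)
        # Scaled dual (multiplier) update.
        u = u + x - z
    return x, z
```

Because the x-subproblem here is quadratic, a single Cholesky factorization is reused across iterations; in the Bregman variants studied in the thesis, a suitably chosen Bregman distance plays an analogous role in simplifying the subproblems.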
Keywords/Search Tags:Nonconvex and nonsmooth problems, Kurdyka-Lojasiewicz property, Bregman distance, Convergence