
Convergence Analysis Of General Gradient Systems And Proximal Algorithms With Extrapolation

Posted on: 2018-08-27
Degree: Doctor
Type: Dissertation
Country: China
Candidate: B Wen
Full Text: PDF
GTID: 1310330536481302
Subject: Basic mathematics
Abstract/Summary:
In this thesis, we first study the convergence behavior of a class of second-order gradient systems. Then, based on this system, we consider the convergence properties and convergence rates of some proximal algorithms with extrapolation. The details are as follows:

1. We study the convergence behavior of a class of second-order gradient systems and the relationship between such systems and the proximal gradient algorithm with extrapolation. Using the Łojasiewicz inequality, we establish the convergence of the solution of this system for a class of nonconvex potential functions, under the assumption that the dissipative terms vanish slowly enough. Moreover, we show that the length of this solution trajectory is finite. We then discuss the relationship between the second-order gradient system and some proximal gradient algorithms with extrapolation. (An illustrative form of such a system is displayed after this summary.)

2. We consider the proximal gradient algorithm with extrapolation for solving a class of nonconvex nonsmooth minimization problems. Under an error bound condition (a representative formulation is displayed below), we establish the R-linear convergence of the iterate sequence and the objective value sequence generated by the algorithm, under the assumption that the supremum of the extrapolation coefficients is below a certain threshold. In addition, we show that the threshold reduces to 1 when solving convex problems, and that the fast iterative shrinkage-thresholding algorithm (FISTA) with fixed restart is a special case of our algorithm. As a consequence, when the error bound condition holds for the objective, both the iterate sequence and the function value sequence generated by FISTA with fixed restart are R-linearly convergent.

3. We consider the proximal gradient algorithm with extrapolation for solving a class of convex problems. We first show that, for a large class of extrapolation parameters including those chosen in FISTA, the successive changes of the iterates go to 0. Moreover, based on the Łojasiewicz inequality, we establish the convergence of the iterates generated by the proximal gradient algorithm with extrapolation under additional assumptions on the extrapolation coefficients. In particular, we prove that the length of the iterate trajectory is finite. (The core extrapolated proximal step common to parts 2 and 3 is sketched in code below.)

4. We study the proximal difference-of-convex algorithm (DCA) with extrapolation for solving a class of difference-of-convex (DC) optimization problems. We prove that, for a large class of extrapolation parameters including those chosen in FISTA with fixed restart, any cluster point of the sequence generated by the proximal DCA with extrapolation is a stationary point of the DC optimization problem. Moreover, using the Kurdyka-Łojasiewicz inequality, we establish the global convergence of the proximal DCA with extrapolation and analyze its convergence rate under an additional assumption on the objective. Finally, our numerical experiments on two difference-of-convex regularized least squares models demonstrate the efficiency of our algorithm. (A code sketch of this algorithm is also given below.)
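For reference, the kind of second-order gradient system studied in part 1 is typically a damped gradient flow whose dissipative term vanishes as t grows. The display below is an illustrative instance together with the smooth Łojasiewicz inequality used in such analyses; the abstract does not specify the damping coefficient gamma(t), so this should be read as a representative form rather than the thesis's exact formulation:

    \ddot{x}(t) + \gamma(t)\,\dot{x}(t) + \nabla f(x(t)) = 0,
        \qquad \gamma(t) \to 0 \ \text{slowly enough},

    |f(x) - f(\bar{x})|^{\theta} \le c\,\|\nabla f(x)\|
        \quad \text{for all } x \text{ near a critical point } \bar{x},
        \ \text{with } \theta \in [1/2, 1),\ c > 0.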
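The error bound condition invoked in part 2 is, in analyses of this type, usually a Luo-Tseng-type error bound stated through the proximal residual. The formulation below is that standard condition for an objective F = f + g with solution set \bar{X}, and is offered as an assumed reading of the abstract rather than a quotation from the thesis: for each \zeta \ge \inf F there exist \tau, \varepsilon > 0 with

    \operatorname{dist}(x, \bar{X})
        \le \tau\,\bigl\| \operatorname{prox}_{g}\bigl(x - \nabla f(x)\bigr) - x \bigr\|
    \quad \text{whenever } F(x) \le \zeta
        \text{ and } \bigl\| \operatorname{prox}_{g}\bigl(x - \nabla f(x)\bigr) - x \bigr\| \le \varepsilon.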
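The algorithms in parts 2 and 3 share a single building block: a proximal gradient step taken at an extrapolated point. The Python sketch below is a minimal illustration, assuming an objective F = f + g with f smooth (oracle grad_f, Lipschitz constant L) and g prox-friendly (oracle prox_g(v, s) returning the proximal map of s*g at v); the function name, the oracle interfaces, and the FISTA coefficient schedule with fixed restart are illustrative assumptions, not the thesis's exact choices.

    import numpy as np

    def proximal_gradient_extrapolation(x0, grad_f, prox_g, L,
                                        max_iter=1000, restart_every=None,
                                        tol=1e-9):
        """Proximal gradient method with FISTA-style extrapolation (sketch).

        Minimizes F(x) = f(x) + g(x), with f smooth (L-Lipschitz gradient)
        and g admitting a cheap proximal map. The coefficient schedule
        beta_k = (t_{k-1} - 1)/t_k and the optional fixed restart are the
        standard FISTA recipes, assumed here for illustration.
        """
        x_prev = x0.copy()
        x = x0.copy()
        t_prev, t = 1.0, 1.0
        for k in range(max_iter):
            if restart_every is not None and k % restart_every == 0:
                t_prev, t = 1.0, 1.0              # fixed restart: reset momentum
            beta = (t_prev - 1.0) / t             # extrapolation coefficient
            y = x + beta * (x - x_prev)           # extrapolated point
            x_next = prox_g(y - grad_f(y) / L, 1.0 / L)  # proximal gradient step
            if np.linalg.norm(x_next - x) <= tol * max(1.0, np.linalg.norm(x)):
                return x_next                     # successive change small enough
            x_prev, x = x, x_next
            t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        return x

As a usage illustration (again with hypothetical data), l1-regularized least squares fits this template with grad_f(x) = A^T(Ax - b) and prox_g the soft-thresholding operator:

    # Hypothetical example: F(x) = 0.5*||Ax - b||^2 + lam*||x||_1
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100)); b = rng.standard_normal(40); lam = 0.1
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of grad f
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
    x_sol = proximal_gradient_extrapolation(np.zeros(100), grad_f, prox_g, L,
                                            restart_every=50)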
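Part 4's proximal DCA with extrapolation modifies the same template: the concave part -P2 is linearized at the current iterate via a subgradient of P2 before the proximal step is taken at the extrapolated point. The sketch below assumes the decomposition F = f + P1 - P2 with f smooth (L-Lipschitz gradient), P1 convex and prox-friendly, and P2 convex with a computable subgradient; the oracle names and the coefficient schedule are, as before, illustrative assumptions.

    import numpy as np

    def proximal_dca_extrapolation(x0, grad_f, prox_P1, subgrad_P2, L,
                                   max_iter=1000, restart_every=200, tol=1e-9):
        """Proximal difference-of-convex algorithm with extrapolation (sketch).

        Minimizes F(x) = f(x) + P1(x) - P2(x): at each step, the concave
        part -P2 is linearized at x using a subgradient, then a proximal
        gradient step is taken at the extrapolated point y.
        """
        x_prev = x0.copy()
        x = x0.copy()
        t_prev, t = 1.0, 1.0
        for k in range(max_iter):
            if restart_every is not None and k % restart_every == 0:
                t_prev, t = 1.0, 1.0              # fixed restart: reset momentum
            beta = (t_prev - 1.0) / t
            y = x + beta * (x - x_prev)           # extrapolated point
            xi = subgrad_P2(x)                    # linearize the concave part at x
            x_next = prox_P1(y - (grad_f(y) - xi) / L, 1.0 / L)
            if np.linalg.norm(x_next - x) <= tol * max(1.0, np.linalg.norm(x)):
                return x_next
            x_prev, x = x, x_next
            t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        return x

For instance, the l1-2 regularizer lam*(||x||_1 - ||x||_2) is a common DC regularizer fitting this template with P1 = lam*||.||_1 and P2 = lam*||.||_2; whether this matches the two test models used in the thesis is not stated in the abstract.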
Keywords/Search Tags: gradient system, proximal algorithm with extrapolation, convergence, nonconvex problem, Łojasiewicz inequality, error bound condition