
Modified Algorithm Based On Dai-Kou Three-term Conjugate Gradient Method

Posted on: 2020-10-14  Degree: Master  Type: Thesis
Country: China  Candidate: X Zhang  Full Text: PDF
GTID: 2370330572491887  Subject: Operational Research and Cybernetics
Abstract/Summary:
The nonlinear conjugate gradient method is widely used because of its low storage requirements and fast computation. To obtain better theoretical and numerical results, this thesis proposes three new conjugate gradient methods, built on the modified memoryless BFGS method, the Dai-Kou (DK) method, and the spectral conjugate gradient method.

First, inspired by a modified three-term conjugate gradient method proposed by Dai and Kou, and combining it with a modified secant condition, a new three-term conjugate gradient method (MTDK for short) is given. On the theoretical side, the method is proved to be globally convergent for uniformly convex functions under an improved Wolfe line search. On the numerical side, the new method is superior to the DK method.

Second, inspired by the approximation idea of the DK method, the direction of the conjugate gradient method is combined with the results of Chapter 2, and a new two-term conjugate gradient method (MDK for short) is presented. Theoretically, the method is proved to be globally convergent for uniformly convex functions under the improved Wolfe line search, and a truncated form of the method (MDK+ for short) is given that is globally convergent for general functions. The numerical results show that the MDK method is better than the DK method, and the MDK+ method is slightly better than the DK method.

Third, inspired by the idea of the spectral conjugate gradient method and combined with the results of Chapter 3, a new two-term spectral conjugate gradient method (MDKS for short) is proposed. Theoretically, the method is proved to be globally convergent for uniformly convex functions under the improved Wolfe line search, and a truncated form (MDKS+ for short) is given that is globally convergent for general functions. The numerical results indicate that both the MDKS method and the MDKS+ method are slightly better than the DK method.
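To make the ingredients of the abstract concrete, the following is a minimal sketch of a generic three-term conjugate gradient iteration with an inexact Wolfe line search. It does not reproduce the thesis's MTDK/MDK/MDKS formulas (which are not given here); instead it uses a standard Hestenes-Stiefel-type three-term direction, chosen because it satisfies the sufficient descent property d_k^T g_k = -||g_k||^2 by construction, and a plain (not "improved") weak Wolfe line search. All function and parameter names are illustrative.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/doubling search for a step satisfying the weak Wolfe conditions.

    Note: a plain Wolfe search, not the improved Wolfe search used in the thesis.
    """
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, gxd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gxd:
            hi = alpha                                    # Armijo fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gxd:
            lo = alpha                                    # curvature fails: grow
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Three-term CG with direction d = -g + beta*d_prev - theta*y,
    beta = g^T y / (d^T y), theta = g^T d / (d^T y), which gives
    d^T g = -||g||^2 (sufficient descent) identically."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                     # gradient difference y_k
        dy = d @ y
        if abs(dy) > 1e-12:
            beta = (g_new @ y) / dy
            theta = (g_new @ d) / dy
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new                    # restart with steepest descent
        x, g = x_new, g_new
    return x
```

For example, on the strictly convex quadratic f(x) = 0.5 x^T A x - b^T x with A = diag(1, 10) and b = (1, 1), the iteration converges to the unique minimizer x* = A^{-1} b = (1, 0.1).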
Keywords/Search Tags: Nonlinear conjugate gradient methods, Sufficient descent property, Global convergence, Improved Wolfe line search