
Two Classes of Modified Dai-Kou Conjugate Gradient Methods

Posted on: 2022-09-13
Degree: Master
Type: Thesis
Country: China
Candidate: Y Li
Full Text: PDF
GTID: 2480306530959589
Subject: Computational Mathematics
Abstract/Summary:
Conjugate gradient methods are widely used in many practical fields, such as engineering problems and financial models, because of their simple structure and low storage requirements. However, there are still challenging problems in the study of conjugate gradient methods, such as the descent property of conjugate gradient methods, global convergence for non-convex functions, and the optimal selection of Dai-Liao type conjugate parameters. Studying the theoretical properties and numerical behavior of conjugate gradient methods therefore has important theoretical significance and practical application value. The conjugacy condition, sufficient descent, and a minimized condition number of the search direction matrix are three important factors that accelerate the iteration in practical computation. Consequently, in order to obtain efficient and robust conjugate gradient methods, this thesis considers the sufficient descent property, global convergence, and computational efficiency of conjugate gradient methods. On the basis of existing nonlinear conjugate gradient methods, two classes of conjugate gradient methods with the sufficient descent property and good computational performance are proposed.

In Chapter 1, we introduce the research background of nonlinear conjugate gradient methods and some preliminaries, including two important assumptions and one important lemma for conjugate gradient methods. Finally, we outline the main work of this thesis.

In Chapter 2, we review relevant research on nonlinear conjugate gradient methods, mainly classical conjugate gradient methods and spectral conjugate gradient methods.

In Chapter 3, building on Andrei's research on the scaling parameter in the self-scaling memoryless BFGS method, we propose a way to determine the scaling parameter in the Dai-Kou (DK) family of conjugate gradient methods. First, the search direction matrix of the DK method is symmetrized. Second, the distance between the eigenvalues of the search direction matrix is minimized. Third, an adaptive choice of the scaling parameter is obtained from the relationship between the determinant and the eigenvalues, which yields the DDK method and its truncated form, the DDK+ method. We prove that both methods generate sufficient descent directions, and that under the strong Wolfe line search the DDK method is strongly convergent for uniformly convex functions, while the DDK+ method is globally convergent for general functions. Numerical experiments based on ratio comparisons and performance profiles show that the DDK+ method is effective.

In Chapter 4, based on the idea of using spectral parameters to make conjugate gradient methods satisfy sufficient descent, the DDK method of Chapter 3 is modified with spectral parameters to obtain the MDDK method. We prove that the MDDK method has the sufficient descent property independently of the line search, and that under the strong Wolfe line search it is strongly convergent for uniformly convex functions. Numerical results show that the MDDK method outperforms the compared methods.
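For reference, the objects named above can be sketched as follows. This is a reconstruction from the standard literature on the Dai-Kou family and on Wolfe line searches, not a restatement of the thesis's DDK/DDK+ formulas, which the abstract does not give; the scaling parameter tau_k and the constants c, delta, sigma are generic.

% Dai-Kou family direction with scaling parameter tau_k,
% where s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}
\[
  d_k = -g_k + \beta_k(\tau_k)\, d_{k-1}, \qquad
  \beta_k(\tau_k) = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}
    - \Bigl(\tau_k + \frac{\|y_{k-1}\|^2}{s_{k-1}^{\top} y_{k-1}}
    - \frac{s_{k-1}^{\top} y_{k-1}}{\|s_{k-1}\|^2}\Bigr)
      \frac{g_k^{\top} s_{k-1}}{d_{k-1}^{\top} y_{k-1}} .
\]
% Sufficient descent: there is a constant c > 0, independent of k, with
\[
  g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k .
\]
% Strong Wolfe line search: the step alpha_k satisfies, with 0 < delta < sigma < 1,
\[
  f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k, \qquad
  \bigl|g(x_k + \alpha_k d_k)^{\top} d_k\bigr| \le \sigma\,\bigl|g_k^{\top} d_k\bigr| .
\]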
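To make the iteration concrete, here is a minimal Python sketch of a Dai-Kou-type conjugate gradient loop with a fixed scaling parameter tau. It is an illustration under stated assumptions, not the thesis's DDK or MDDK method: the thesis chooses tau adaptively and adds a truncation, neither of which is reproduced here. The strong Wolfe step is delegated to scipy.optimize.line_search.

import numpy as np
from scipy.optimize import line_search

def dk_cg(f, grad, x0, tau=1.0, tol=1e-6, max_iter=1000):
    """Dai-Kou-type CG loop with a fixed scaling parameter tau.

    Illustrative only: the DDK/DDK+ methods of the thesis determine
    tau adaptively from the search direction matrix.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe step; c2 = 0.1 is the usual choice for CG methods.
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:
            break  # line search failed to find an acceptable step
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dy = d @ y  # positive under the Wolfe curvature condition
        # DK family beta with scaling parameter tau (see the display above).
        beta = (g_new @ y) / dy \
            - (tau + (y @ y) / (s @ y) - (s @ y) / (s @ s)) * (g_new @ s) / dy
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: restart along steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example usage: minimize the 5-dimensional Rosenbrock function.
from scipy.optimize import rosen, rosen_der
x_star = dk_cg(rosen, rosen_der, np.zeros(5))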
Keywords/Search Tags: Conjugate gradient methods, Strong Wolfe line search, Spectral parameters, Sufficient descent, Global convergence