
Global Convergence Of A Type Of Modified Conjugate Gradient Method

Posted on: 2009-05-10
Degree: Master
Type: Thesis
Country: China
Candidate: Y R Tao
Full Text: PDF
GTID: 2120360245467692
Subject: Operational Research and Cybernetics
Abstract/Summary:
The conjugate gradient method is one of the most useful methods for large-scale unconstrained optimization. Recently, Hager-Zhang and Dai-Liao proposed several effective conjugate gradient methods. To obtain methods with better numerical performance, a type of modified conjugate gradient method is proposed. Under mild conditions, the proposed methods all satisfy the sufficient descent property. The thesis is organized as follows.

Chapter 1 reviews the development of conjugate gradient methods and introduces some related conjugate gradient methods.

In Chapter 2, based on the update parameter β_k^N of Hager-Zhang, a new conjugate gradient method is proposed, and its global convergence for strongly convex functions is proved under the strong Wolfe-Powell conditions. Two related hybrid conjugate gradient methods are then proposed, and their global convergence for general nonlinear functions is proved under the strong Wolfe-Powell conditions.

In Chapter 3, based on the update parameter β_k^N of Hager-Zhang and the update parameter β_k^{DL2} of Dai-Liao, another new conjugate gradient method is proposed, and its global convergence for strongly convex functions is proved under the weak Wolfe-Powell conditions. A related hybrid conjugate gradient method is then proposed, and its global convergence for general nonlinear functions is proved under the weak Wolfe-Powell conditions.

In Chapter 4, numerical results are reported; they indicate that the new methods are efficient.
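For context, a nonlinear conjugate gradient method generates iterates x_{k+1} = x_k + α_k d_k along directions d_0 = -g_0, d_{k+1} = -g_{k+1} + β_k d_k. The abstract does not give the thesis's modified parameters or the β_k^{DL2} variant, so the following is only a sketch of the well-known base formulas from Hager-Zhang (2005) and Dai-Liao (2001) and of the line-search conditions named above:

\[
\beta_k^{N} = \frac{1}{d_k^{\top} y_k}\Bigl(y_k - 2 d_k \frac{\|y_k\|^2}{d_k^{\top} y_k}\Bigr)^{\!\top} g_{k+1},
\qquad
\beta_k^{DL} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k} - t\,\frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \quad t > 0,
\]
where g_k = ∇f(x_k), y_k = g_{k+1} - g_k, and s_k = x_{k+1} - x_k. The weak Wolfe-Powell conditions require, for constants 0 < δ < σ < 1,
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{\top} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma g_k^{\top} d_k,
\]
and the strong version replaces the second inequality with |g(x_k + α_k d_k)^{\top} d_k| \le -σ g_k^{\top} d_k. The sufficient descent property means there is a constant c > 0 with g_k^{\top} d_k \le -c \|g_k\|^2 for all k.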
Keywords/Search Tags:unconstrained optimization, conjugate gradient method, sufficient descent condition, line search, global convergence