The conjugate gradient method is one of the most useful methods for large-scale unconstrained optimization. Recently, Hager-Zhang and Dai-Liao proposed some effective conjugate gradient methods. To obtain methods with better numerical performance, a type of modified conjugate gradient method is proposed. Under mild conditions, all of our methods satisfy the sufficient descent property.

The thesis is organized as follows. Chapter 1 reviews the development of conjugate gradient methods and introduces some related conjugate gradient methods.

In Chapter 2, based on the update parameter β_k^N of Hager-Zhang, a new conjugate gradient method is proposed. Under the strong Wolfe-Powell conditions, its global convergence for strongly convex functions is proved. Then, two related hybrid conjugate gradient methods are also proposed, and their global convergence for general nonlinear functions is proved under the strong Wolfe-Powell conditions.

In Chapter 3, based on the update parameter β_k^N of Hager-Zhang and the update parameter β_k^DL2 of Dai-Liao, another new conjugate gradient method is proposed. Under the weak Wolfe-Powell conditions, its global convergence for strongly convex functions is proved. Then, one related hybrid conjugate gradient method is also proposed, and its global convergence for general nonlinear functions is proved under the weak Wolfe-Powell conditions.

In Chapter 4, numerical results are reported. From a numerical point of view, the new methods are efficient.
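For orientation, the following is a minimal sketch of a nonlinear conjugate gradient iteration using the standard Hager-Zhang update parameter β_k^N. The test problem, backtracking line search, and tolerances are illustrative assumptions, not the thesis's modified methods or its Wolfe-Powell line searches.

```python
import numpy as np

def hager_zhang_beta(g_new, g_old, d):
    """Hager-Zhang update: beta = (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y),
    where y = g_new - g_old."""
    y = g_new - g_old
    dy = d @ y
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

def cg_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Nonlinear CG with the Hager-Zhang beta and a simple Armijo
    backtracking line search (a stand-in for Wolfe-Powell conditions)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: halve the step until sufficient decrease.
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = hager_zhang_beta(g_new, g, d)
        d = -g_new + beta * d  # new search direction
        x, g = x_new, g_new
    return x

# Example: a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_minimize(f, grad, np.zeros(2))
```

For strongly convex objectives such as this quadratic, the Hager-Zhang direction is known to satisfy a sufficient descent condition independently of the line search, which is why the simple Armijo rule suffices here.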