We mainly study a new type of hybrid conjugate gradient method for unconstrained optimization, which not only guarantees that d_k is a sufficient descent direction, but also performs well from a numerical point of view. The thesis is organized as follows:

In Chapter 1, we review the development of conjugate gradient methods and briefly introduce two typical types of them.

In Chapter 2, we propose a new parameter formula β_k^D and study its properties. We then propose a new type of hybrid conjugate gradient method based on β_k^D, and prove its global convergence under some mild conditions.

In Chapter 3, we aim to modify the hybrid conjugate gradient methods proposed in Chapter 2. Wei et al. (2004) proposed a class of new quasi-Newton equations B_k s_{k-1} = y_k^* = y_{k-1} + A_k s_{k-1}, where A_k is some matrix. Based on the expression for y_k^*, we give two modified hybrid conjugate gradient methods. The modified methods are globally convergent under some mild conditions.

In Chapter 4, we report the numerical results and explain the advantages of the new methods from a numerical point of view.
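To make the structure of such a method concrete, the following is a minimal sketch of a generic hybrid nonlinear conjugate gradient iteration. Note that the β used here is the classical hybrid choice max(0, min(β_PR, β_FR)) combining the Polak-Ribière and Fletcher-Reeves formulas; the thesis's own parameter β_k^D is not specified in this abstract, so this is only an illustration of the general scheme, not the proposed method.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic hybrid nonlinear conjugate gradient method (illustrative).

    Uses beta = max(0, min(beta_PR, beta_FR)), a standard hybrid formula,
    as a stand-in for the thesis's beta_k^D, which is not given here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking line search enforcing a sufficient-decrease condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = g_new.dot(g_new) / g.dot(g)       # Fletcher-Reeves
        beta_pr = g_new.dot(g_new - g) / g.dot(g)   # Polak-Ribiere
        beta = max(0.0, min(beta_pr, beta_fr))      # hybrid parameter
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:       # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

The safeguard restart keeps every search direction a descent direction, which is the property the sufficient-descent condition mentioned above is designed to guarantee without restarts.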