
The Improvement Of Nonlinear Conjugate Gradient Methods

Posted on: 2010-05-14
Degree: Master
Type: Thesis
Country: China
Candidate: M Li
Full Text: PDF
GTID: 2190330338982214
Subject: Applied Mathematics
Abstract/Summary:
Nonlinear conjugate gradient methods are efficient for solving unconstrained optimization problems. They are particularly attractive for large-scale problems because of their low storage requirements and good convergence theory. However, most traditional conjugate gradient methods may fail to be descent methods when an inexact line search is used. Even for methods that do provide descent directions for the objective function, the descent property strongly relies on the line search employed. In this thesis, we modify some well-known conjugate gradient methods to improve their convergence properties.

In Chapter 2, we improve the recently developed HZ conjugate gradient method. We introduce a parameter into the HZ method and propose a modified HZ (MHZ) method. We show that if the parameter is appropriately chosen, the MHZ method provides sufficient descent directions for the objective function, a property that is independent of the line search used. The MHZ method includes the HZ method as the special case in which the parameter is set to 2. In addition, if an exact line search is used, the MHZ method reduces to the well-known HS method. We show that the MHZ method with the Goldstein or Wolfe line search converges to the unique minimizer when it is applied to a uniformly convex function. We then introduce a cautious rule into the MHZ method and propose a cautious MHZ method, and we show that the cautious MHZ method with the Armijo line search is globally convergent even for nonconvex minimization.

In Chapter 3, we improve the well-known DY method and propose a modified DY (MDY) method. The MDY method enjoys the property that it provides descent directions for the objective function; this property is likewise independent of the line search used. In addition, if an exact line search is used, the MDY method reduces to the standard DY method. Based on the MDY method, we design a cautious rule to obtain a cautious MDY method.
Under mild conditions, we establish the global convergence of the cautious MDY method. In Chapter 4, we propose a cautious modified CD method and establish its global convergence. We also conduct extensive numerical experiments to test the proposed methods and compare their performance with some existing conjugate gradient methods. The results show that the proposed methods outperform the existing ones.
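To make the ideas above concrete, the following Python sketch shows a nonlinear conjugate gradient iteration with a parametrized HZ-family direction. This is only an illustrative sketch under our own assumptions: the function names (`mhz_cg`, `f`, `grad`), the Armijo backtracking constants, and the restart safeguard are ours, not taken from the thesis; the formula uses `theta = 2` to recover the standard HZ update, matching the abstract's remark that the parameter value 2 gives the HZ method as a special case.

```python
import numpy as np

def mhz_cg(f, grad, x0, theta=2.0, tol=1e-8, max_iter=1000):
    """Illustrative nonlinear CG with a parametrized HZ-family beta.

    theta = 2 recovers the standard HZ update; other values of the
    parameter are meant to suggest the modification discussed above.
    (Hypothetical sketch, not the thesis's exact algorithm.)
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                         # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search (an inexact line search;
        # the descent safeguard below does not depend on this choice).
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d.dot(y)
        if abs(dy) < 1e-12:        # safeguard against division by zero
            beta = 0.0
        else:
            beta = (y - theta * d * y.dot(y) / dy).dot(g_new) / dy
        d = -g_new + beta * d
        # Cautious restart: fall back to steepest descent if the new
        # direction is not a descent direction.
        if g_new.dot(d) >= 0.0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = mhz_cg(f, grad, np.zeros(2))
```

On a uniformly (here strictly) convex quadratic like this, the iterates converge to the unique minimizer, in line with the convergence result stated for the MHZ method above.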
Keywords/Search Tags:Unconstrained optimization problem, conjugate gradient method, descent direction, global convergence