Optimization theory and methods have been applied to many practical problems. Unconstrained optimization is the most basic part of optimization theory and methods, and a number of effective methods for solving unconstrained optimization problems now exist. Because the conjugate gradient method has low storage requirements and is easy to program, and because it can effectively solve large-scale unconstrained optimization problems, it remains an active research topic. To remedy the deficiencies of some classical conjugate gradient methods, in either their theoretical properties or their numerical performance, several modified conjugate gradient methods are proposed in this paper.

In the first chapter, basic background on the unconstrained optimization problem is introduced, together with some classical two-term and three-term conjugate gradient methods.

In the second chapter, concepts of convergence and some common line search methods for computing the step length are introduced.

In the third chapter, a new modified PRP conjugate gradient method is proposed. This method exploits both gradient value and function value information. Its search direction possesses the sufficient descent property independently of any line search method. Under either the weak Wolfe-Powell line search or the Armijo line search, the method is globally convergent for general nonconvex objective functions; when the objective function is uniformly convex, the method with either line search converges linearly. Numerical results indicate that the new method is effective and competitive for solving large-scale unconstrained optimization problems.

In the fourth chapter, a new modified HS conjugate gradient method is proposed. This method also exploits both gradient value and function value information. Its search direction is sufficiently descent under the weak Wolfe-Powell line search, and global convergence is established for nonconvex objective functions. Numerical results show that this modified method is effective for solving unconstrained optimization problems and that its numerical performance is competitive.

In the fifth chapter, a new modified three-term LS conjugate gradient method is proposed. This three-term method contains not only gradient value information but also function value information. Its search direction satisfies the sufficient descent condition without relying on the line search method used to compute the step length. Global convergence under the weak Wolfe-Powell line search is proved for general nonconvex objective functions. Numerical experiments indicate that this modified method can effectively solve unconstrained optimization problems and that its numerical performance is competitive.
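For reference, all of the methods summarized above fit the standard conjugate gradient framework

\[ x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \]

where \(g_k = \nabla f(x_k)\) and \(\alpha_k > 0\) is the step length. Writing \(y_k = g_{k+1} - g_k\), the classical parameters whose modifications are studied in the third through fifth chapters are (standard forms only; the modified formulas themselves appear in the corresponding chapters)

\[ \beta_k^{PRP} = \frac{g_{k+1}^{T} y_k}{\|g_k\|^{2}}, \qquad \beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad \beta_k^{LS} = \frac{g_{k+1}^{T} y_k}{-g_k^{T} d_k}. \]

The sufficient descent property mentioned above means \(g_k^{T} d_k \le -c\,\|g_k\|^{2}\) for some constant \(c > 0\) and all \(k\), and the weak Wolfe-Powell line search chooses \(\alpha_k\) so that

\[ f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{T} d_k, \qquad g(x_k + \alpha_k d_k)^{T} d_k \ge \sigma g_k^{T} d_k, \]

with \(0 < \delta < \sigma < 1\); the Armijo line search uses only the first inequality, with \(\alpha_k\) chosen by backtracking.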
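As a concrete illustration of this framework, the sketch below implements the classical PRP method with an Armijo backtracking line search. It is a minimal toy implementation under our own assumptions (the function names, parameter values, and descent-restart safeguard are ours for illustration); it is not the modified method of the third chapter, whose search direction is sufficiently descent by construction and needs no such safeguard.

```python
import numpy as np

def prp_armijo(f, grad, x0, tol=1e-6, max_iter=5000, delta=1e-4, rho=0.5):
    """Classical PRP conjugate gradient method with Armijo backtracking.

    A generic textbook sketch for illustration; NOT the modified
    method proposed in the third chapter of this paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        gtd = g @ d
        if gtd >= 0:
            # Safeguard restart: plain PRP does not guarantee a descent
            # direction under the Armijo rule, so fall back to -g.
            d, gtd = -g, -(g @ g)
        # Armijo rule: shrink alpha until
        # f(x + a*d) <= f(x) + delta * a * g^T d holds.
        alpha, fx = 1.0, f(x)
        while alpha > 1e-12 and f(x + alpha * d) > fx + delta * alpha * gtd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (g @ g)  # PRP parameter
        d = -g_new + beta * d                 # two-term CG direction
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard start point.
f = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
    200.0 * (x[1] - x[0]**2),
])
print(prp_armijo(f, grad, [-1.2, 1.0]))  # approaches the minimizer (1, 1)
```

The restart safeguard is needed here precisely because the classical PRP direction need not be a descent direction under the Armijo rule; removing this kind of deficiency is what motivates the modified directions proposed in this paper, which are sufficiently descent regardless of the line search.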