The conjugate gradient method is one of the most effective methods for large-scale unconstrained optimization problems because of its fast convergence and because it requires neither the computation nor the storage of the Hessian matrix or its inverse. In this paper, three adaptive selections of the parameter in the DL (Dai-Liao) conjugate gradient method are proposed by minimizing the condition number of the search direction matrix, and a class of three-term conjugate gradient methods with the sufficient descent property is constructed via the idea of approximation.

Chapter 1 first introduces the research background, the iterative scheme, commonly used line searches, and the research status of nonlinear conjugate gradient methods; it then describes two methods for processing numerical experimental data, and finally outlines the main work of this paper.

Chapter 2 builds on the study by Saman Babaie-Kafaki and Reza Ghanbari of the singular values of the search direction matrix of the DL conjugate gradient method: three adaptive parameters are obtained by minimizing the condition number of this matrix. To make the search direction satisfy the sufficient descent property, the three parameters are truncated, giving three adaptive parameter selections for the DL method; in analogy with the DK+ method, three DL+ conjugate gradient methods are also proposed. It is proved that the three DL methods and the three DL+ methods all satisfy the sufficient descent property. Under the standard Wolfe line search, the three DL methods are strongly convergent for uniformly convex functions, and the three DL+ methods are globally convergent for general functions. Finally, numerical experiments show that two of the three DL+ methods are slightly superior to the DK+ method in computational efficiency.

Chapter 3 starts from the NHS three-term conjugate gradient method proposed by Li and, using the idea of approximation, proposes a class of three-term conjugate gradient methods with scaled parameters (the GNHS methods). By introducing a parameter z_k, the GNHS methods are further extended to the GGNHS conjugate gradient methods; applying a truncation similar to that of the NHS+ method then yields the GGNHS+ conjugate gradient methods. Both the GGNHS and GGNHS+ methods satisfy the sufficient descent property. Under the standard Wolfe line search, the GGNHS methods are strongly convergent for uniformly convex functions, and the GGNHS+ methods are globally convergent for general functions. Numerical experiments show that the GGNHS+ methods are slightly better than the DK+ method in computational efficiency.
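To make the DL iteration concrete, the following is a minimal sketch of a Dai-Liao type conjugate gradient loop under a Wolfe line search. The adaptive parameter t_k = ||y_k|| / ||s_k|| is one of the choices discussed by Babaie-Kafaki and Ghanbari and is used here purely as an illustrative assumption; the function name dl_cg, the safeguards, and the Rosenbrock test problem are illustrative choices, not the selections derived in this thesis.

```python
# Minimal sketch of a Dai-Liao (DL) type conjugate gradient iteration with a
# Wolfe line search. The adaptive t_k below is an assumed placeholder; the
# thesis obtains its own selections by minimizing the condition number of the
# search direction matrix, and those formulas are not reproduced here.
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def dl_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                      # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # SciPy's line_search enforces the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                       # line search failed: safeguarded step
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g             # s_k = x_{k+1}-x_k, y_k = g_{k+1}-g_k
        dy = d @ y                              # positive under the Wolfe conditions
        if dy <= 0.0:
            d = -g_new                          # curvature safeguard: restart
        else:
            t = np.linalg.norm(y) / np.linalg.norm(s)   # assumed adaptive t_k
            beta = (g_new @ y - t * (g_new @ s)) / dy   # DL formula for beta_k
            d = -g_new + beta * d
            if g_new @ d >= 0.0:                # fallback to guarantee descent
                d = -g_new
        x, g = x_new, g_new
    return x

x_star = dl_cg(rosen, rosen_der, np.full(10, -1.0))  # small Rosenbrock test
```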
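The exact NHS, GNHS, and GGNHS directions are not stated in the abstract. As a generic illustration of why a third term yields sufficient descent, the sketch below uses the classical HS-based three-term construction, for which g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds identically, independently of the line search; it should be read as a hedged stand-in under that assumption, not as the formulas of Chapter 3.

```python
# Generic HS-based three-term direction, shown only to illustrate how a third
# term forces sufficient descent: substituting below gives
# g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 for any step size. The NHS, GNHS, and
# GGNHS directions studied in this thesis are variants whose exact formulas
# are not given in the abstract.
import numpy as np

def three_term_direction(g_new, d, y):
    dy = d @ y                        # d_k^T y_k, positive under Wolfe line search
    beta_hs = (g_new @ y) / dy        # Hestenes-Stiefel beta_k
    theta = (g_new @ d) / dy          # coefficient of the extra y_k term
    return -g_new + beta_hs * d - theta * y
```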