As the basis of optimization theory, unconstrained optimization theory and methods are widely used in many fields. With the advent of the big-data era, the dimensionality of optimization problems has increased sharply. In this setting, the conjugate gradient method, with its simple iteration scheme and low storage requirements, has clear advantages for solving large-scale problems. This thesis reviews the classical conjugate gradient methods and existing nonlinear conjugate gradient algorithms, summarizes ideas for improving them, and proposes two new hybrid conjugate gradient algorithms.

First, an improved hybrid conjugate parameter β_k^1 is proposed, and the corresponding algorithm framework is given. The resulting hybrid conjugate gradient algorithm automatically satisfies the descent property, and its global convergence is established under the standard Wolfe inexact line search criterion. Numerical experiments on this descent hybrid conjugate gradient algorithm show that it is effective.

Second, based on the improved hybrid formula, the conjugate parameter is further modified, and a hybrid conjugate gradient method with conjugate parameter β_k^2 is proposed. This algorithm satisfies the sufficient descent condition, and the global convergence of this sufficient-descent hybrid conjugate gradient algorithm is proved under the standard Wolfe inexact line search criterion. Numerical comparisons with the JXZ method and the DY method show that the proposed algorithm is feasible.
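To make the framework described above concrete, the sketch below implements a generic nonlinear conjugate gradient iteration x_{k+1} = x_k + α_k d_k with direction update d_{k+1} = -g_{k+1} + β_{k+1} d_k, where α_k comes from a Wolfe inexact line search. Since the abstract does not give the explicit formulas for β_k^1 and β_k^2, the hybrid parameter used here, β = max(0, min(β_HS, β_DY)), is a standard illustrative choice standing in for the thesis's parameters; the function name hybrid_cg, its tolerances, and the restart logic are assumptions for this sketch, not the thesis's algorithm.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear conjugate gradient method with a hybrid
    conjugate parameter and a Wolfe inexact line search.

    The parameter beta = max(0, min(beta_HS, beta_DY)) is an
    illustrative hybrid; it stands in for the thesis's beta_k^1
    and beta_k^2, which the abstract does not state.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Inexact line search (SciPy enforces the strong Wolfe conditions).
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                    # search failed: restart downhill
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-12:               # safeguard: fall back to -g
            beta = 0.0
        else:
            beta_hs = (g_new @ y) / denom        # Hestenes-Stiefel parameter
            beta_dy = (g_new @ g_new) / denom    # Dai-Yuan parameter
            beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d                # hybrid CG direction update
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard start point.
print(hybrid_cg(rosen, rosen_der, [-1.2, 1.0]))  # converges near [1., 1.]
```

Truncating β at zero acts as an automatic restart whenever the hybrid value would be negative, which is one common way such hybrid schemes retain the descent behavior the abstract attributes to the proposed methods.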