
Several Hybrid Conjugate Gradient Methods And Their Global Convergence

Posted on: 2019-04-25    Degree: Master    Type: Thesis
Country: China    Candidate: H T Wang    Full Text: PDF
GTID: 2370330599456317    Subject: Mathematics
Abstract/Summary:
For large-scale unconstrained optimization problems, the conjugate gradient method has become one of the main solution techniques because of its low storage requirements and algorithmic simplicity. It avoids both the slow convergence of the steepest descent method and the heavy computational cost of Newton's method, and has attracted increasing attention from researchers. A great deal of research has been done on conjugate gradient methods, producing many achievements as well as some shortcomings. In this paper, building on earlier work, we improve the hybrid conjugate gradient method and propose three hybrid nonlinear conjugate gradient methods.

At present, the conjugate gradient method is not only one of the most useful methods for solving large-scale linear systems, but also one of the most effective algorithms for large-scale nonlinear optimization problems. Its main variants include the classical conjugate gradient method, modified conjugate gradient methods, hybrid conjugate gradient methods, spectral conjugate gradient methods, and three-term conjugate gradient methods.

Firstly, we propose a new hybrid formula for the parameter β_k and thereby obtain a new class of hybrid conjugate gradient methods that uses the exact line search rule. We prove the descent property and global convergence of the new algorithm under suitable assumptions.

Secondly, based on the improved hybrid formulas β_k^Mmix and β_k^RMIL, proposed by Jia and Rivaie respectively, a hybrid conjugate gradient method with the sufficient descent property is proposed, and we prove the convergence of the new algorithm under the Goldstein line search.

Finally, inspired by the NVPRP* and DHS methods, we extend this idea to β_k^RMIL and obtain the formula β_k^NEW. A new hybrid conjugate gradient method combined with the JMJ method is then proposed, and we prove its descent property and global convergence under the Wolfe line search.
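To make the general scheme concrete, the following is a minimal sketch of a hybrid nonlinear conjugate gradient method: at each step the iterate is updated by x_{k+1} = x_k + α_k d_k with direction d_k = -g_k + β_k d_{k-1}. The thesis's own formulas (β_k^Mmix, β_k^NEW, etc.) are not reproduced in this abstract, so the sketch uses the classical PRP/FR hybrid β_k = max(0, min(β_PRP, β_FR)) with a backtracking (Armijo) line search purely as an illustration; the function name `hybrid_cg` and all parameter defaults are hypothetical.

```python
# Illustrative hybrid nonlinear conjugate gradient method (NOT the
# thesis's algorithm): classical PRP/FR hybrid beta with a capped
# backtracking (Armijo) line search and a steepest-descent restart.
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search for the step size alpha
        alpha, c, rho = 1.0, 1e-4, 0.5
        for _ in range(50):
            if f(x + alpha * d) <= f(x) + c * alpha * g.dot(d):
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Hybrid beta: clip PRP by FR, truncate at zero
        beta_fr = g_new.dot(g_new) / g.dot(g)
        beta_prp = g_new.dot(g_new - g) / g.dot(g)
        beta = max(0.0, min(beta_prp, beta_fr))
        d = -g_new + beta * d               # new search direction
        if g_new.dot(d) >= 0:               # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
quad = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
quad_grad = lambda x: A.dot(x) - b
x_star = hybrid_cg(quad, quad_grad, [0.0, 0.0])
```

Truncating β_k at zero and restarting with the negative gradient whenever d_k fails to be a descent direction are standard safeguards that the convergence analyses of hybrid methods typically rely on.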
Keywords/Search Tags: unconstrained optimization, hybrid conjugate gradient method, sufficient descent property, global convergence