
Two Classes Of Conjugate Gradient Methods For Large-scale Unconstrained Optimization

Posted on: 2021-03-13
Degree: Master
Type: Thesis
Country: China
Candidate: L Wang
Full Text: PDF
GTID: 2370330623484261
Subject: Operational Research and Cybernetics
Abstract/Summary:
As the foundation of research on optimization problems, the theory and algorithms of unconstrained optimization are widely used in many fields of real life. With the advent of the big data era, the dimensions of optimization problems have increased dramatically, which has created room for the development of conjugate gradient methods, as they are well suited to high-dimensional problems. In this thesis, a new spectral conjugate gradient method and a hybrid conjugate gradient method based on a modified secant equation are proposed for large-scale unconstrained optimization.

Firstly, based on the idea of the approximate optimal stepsize and the DY method, a new spectral conjugate gradient method is proposed. The memoryless BFGS formula is embedded in the algorithm to reduce computational and storage costs. The search direction generated by the developed method is a sufficient descent direction. Under suitable assumptions, the method is globally convergent for general functions. Moreover, numerical experiments indicate that our method is competitive for large-scale optimization problems.

Secondly, a new hybrid conjugate gradient method is presented using a convex combination of the HS method and the DY method. The hybridization parameter is computed from the Newton direction and a modified secant equation. Under mild conditions, the method is globally convergent for general functions. Numerical results show that the proposed method is effective, especially for large-scale optimization problems.
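To illustrate the hybridization idea, the sketch below implements a conjugate gradient iteration whose update parameter is a convex combination of the HS and DY formulas, beta = theta * beta_HS + (1 - theta) * beta_DY. Note that the fixed parameter `theta`, the Armijo backtracking line search, and the steepest-descent restart safeguard are simplifying assumptions for this sketch; the thesis instead computes the hybridization parameter from the Newton direction and a modified secant equation.

```python
import numpy as np

def hybrid_hs_dy_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    """Hybrid HS-DY conjugate gradient method (illustrative sketch).

    theta is a fixed hybridization parameter here; the thesis derives it
    from the Newton direction and a modified secant equation instead.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (assumption: the thesis may use
        # a different line search, e.g. Wolfe conditions)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                     # gradient difference
        dty = d.dot(y)
        if abs(dty) < 1e-12:
            d = -g_new                    # safeguard: restart
        else:
            beta_hs = g_new.dot(y) / dty          # Hestenes-Stiefel
            beta_dy = g_new.dot(g_new) / dty      # Dai-Yuan
            beta = theta * beta_hs + (1 - theta) * beta_dy
            d = -g_new + beta * d
            if g_new.dot(d) >= 0.0:
                d = -g_new                # ensure a descent direction
        x, g = x_new, g_new
    return x

# usage: minimize a small convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
sol = hybrid_hs_dy_cg(f, grad, np.array([5.0, -3.0]))
```

On this quadratic the minimizer satisfies A x = b, so the iterate returned by the sketch should agree with the linear solve to within the gradient tolerance.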
Keywords/Search Tags: Unconstrained optimization, Spectral conjugate gradient methods, Hybrid conjugate gradient methods, Global convergence