
The Improvement Of Two Classes Of Unconstrained Optimization Algorithms

Posted on: 2015-02-08    Degree: Master    Type: Thesis
Country: China    Candidate: J Chen    Full Text: PDF
GTID: 2180330431978878    Subject: Applied Mathematics
Abstract/Summary:
Unconstrained optimization is the basis of nonlinear programming. This thesis presents two classes of improved algorithms for solving unconstrained optimization problems: one is a Hooke-Jeeves algorithm that uses no derivatives; the other is a family of conjugate gradient algorithms that use derivatives.

Many derivative-free algorithms are available for unconstrained optimization, for example the cyclic coordinate algorithm, the Hooke-Jeeves algorithm, and the Rosenbrock algorithm. However, in the standard Hooke-Jeeves algorithm with discrete steps (HJADS), the objective function value may increase at the acceleration step. In Chapter 2, a modified HJADS (MHJADS) is proposed that guarantees the objective function value does not increase at the acceleration step. Numerical results show that MHJADS is more efficient than HJADS because it requires far fewer function evaluations.

Among the many derivative-based algorithms for unconstrained optimization, conjugate gradient algorithms are an important class. They are very efficient for solving large-scale problems because of their simple iterative form and low storage requirements. The Liu-Storey conjugate gradient algorithm (LSCGA) is not globally convergent under the standard Armijo line search condition. Tang C.M. et al. proved the global convergence of LSCGA with an Armijo-type line search. In Chapter 3, motivated by the idea of Tang C.M. et al., we present a new Armijo-type line search (ATLS) technique for a modified LS conjugate gradient algorithm (MLSCGA), and prove the sufficient descent property and the global convergence of MLSCGA with the ATLS. Numerical results show that the algorithm given in this chapter is efficient.

The HS conjugate gradient algorithm (HSCGA) has poor theoretical properties. Wei Z.X. et al. proposed a modified HS conjugate gradient algorithm (MHSCGA) whose conjugate gradient parameter is nonnegative under the Wolfe line search conditions, and proved the sufficient descent property and the global convergence of MHSCGA under the strong Wolfe line search conditions. In Chapter 4, motivated by the idea of Wei Z.X. et al., we present a new modified HS conjugate gradient algorithm (MHS*CGA) whose conjugate gradient parameter is nonnegative without depending on any line search conditions. MHS*CGA possesses the sufficient descent property without depending on any line search conditions and is globally convergent under the Wolfe line search conditions. Numerical results show that the algorithm given in this chapter is efficient.
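To make the Chapter 2 safeguard concrete, the following is a minimal Python sketch of a Hooke-Jeeves method with discrete steps in which the acceleration (pattern) point is accepted only when it does not increase the objective. All names, parameters, and the loop structure are illustrative assumptions, not the thesis's exact MHJADS.

```python
import numpy as np

def mhjads(f, x0, h=0.5, shrink=0.5, tol=1e-8, max_eval=5000):
    """Hooke-Jeeves with discrete steps; the acceleration step
    never increases f. Illustrative sketch only."""
    n_eval = 0

    def fev(x):
        nonlocal n_eval
        n_eval += 1
        return f(x)

    def explore(base, fbase, step):
        """Coordinate-wise exploratory moves of length `step`."""
        x, fx = base.copy(), fbase
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                ft = fev(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        return x, fx

    x = np.asarray(x0, dtype=float)
    fx = fev(x)
    while h > tol and n_eval < max_eval:
        y, fy = explore(x, fx, h)
        if fy < fx:
            # Acceleration step: move through y along the pattern
            # direction, but keep the point only if f does not increase.
            z = y + (y - x)
            fz = fev(z)
            if fz <= fy:          # safeguard: non-increase guaranteed
                x, fx = z, fz
            else:                 # reject the pattern point
                x, fx = y, fy
        else:
            h *= shrink           # exploration failed: refine the step
    return x, fx, n_eval
```

For example, `mhjads(lambda v: (v[0] - 1.0)**2 + 10.0 * (v[1] + 2.0)**2, [0.0, 0.0])` converges to roughly (1, -2) using only function values, no derivatives.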
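The Liu-Storey iteration discussed in Chapter 3 can be sketched as follows. A plain backtracking Armijo rule stands in for the thesis's ATLS, whose precise form is what actually yields the convergence guarantee; the names and constants below are assumptions for illustration.

```python
import numpy as np

def ls_cg(f, grad, x0, c=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Liu-Storey conjugate gradient with a backtracking Armijo
    line search. Sketch only; not the thesis's MLSCGA/ATLS."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gd = g @ d
        if gd >= 0:               # safeguard: restart with steepest descent
            d, gd = -g, -(g @ g)
        # Backtracking Armijo: accept t with f(x + t d) <= f(x) + c t g^T d.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + c * t * gd and t > 1e-12:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # Liu-Storey parameter:
        # beta = g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k).
        beta = (g_new @ (g_new - g)) / (-gd)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

The restart to the steepest-descent direction is a common practical safeguard; with the thesis's ATLS the sufficient descent property is proved instead of enforced this way.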
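For Chapter 4, the sketch below shows a Hestenes-Stiefel iteration in which the parameter is simply truncated at zero, one standard way to obtain a nonnegative beta independently of the line search. This is not the thesis's MHS*CGA, which achieves nonnegativity through a modified formula rather than truncation. SciPy's line_search (strong Wolfe conditions) supplies the step size.

```python
import numpy as np
from scipy.optimize import line_search   # strong Wolfe line search

def hs_plus_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """HS conjugate gradient with beta truncated at zero.
    Illustrative stand-in for a nonnegative-parameter HS method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:         # Wolfe search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-3
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g             # gradient difference y_k
        denom = d @ y             # d_k^T y_k (> 0 under Wolfe conditions)
        # HS parameter with nonnegativity safeguard:
        # beta = max(0, g_{k+1}^T y_k / d_k^T y_k).
        beta = max(0.0, (g_new @ y) / denom) if denom > 1e-16 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

Truncating at zero is the same device used in PRP+: it trades some conjugacy information for guaranteed nonnegativity, a property the thesis instead builds directly into the parameter formula.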
Keywords/Search Tags: unconstrained optimization, the method of Hooke-Jeeves, discrete step, acceleration step, conjugate gradient method, Liu-Storey conjugate gradient method, Hestenes-Stiefel conjugate gradient method, sufficient descent property, global convergence