
Research On The Super-Memory Methods And GLP Gradient Projection Method For Non-Linear Programming

Posted on: 2005-02-09
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Q Y Sun
Full Text: PDF
GTID: 1100360122496913
Subject: Operational Research and Cybernetics
Abstract/Summary:
Optimization methods are an important part of operations research, with wide applications in many fields such as natural science, social science, industrial production, engineering design, and modern management. Many practical problems can be reduced to optimization problems, and the key to investigating such problems is to design efficient algorithms for their solution.

This PhD thesis studies super-memory gradient methods for optimization problems. First, a new class of three-term memory gradient methods for unconstrained optimization is presented; the global convergence properties of the new methods are discussed, and numerical results show that they are efficient. Second, the new super-memory gradient method is generalized to constrained optimization by using projection matrices, such as the Rosen projection matrix, the generalized projection matrix, and the GLP (Goldstein-Levitin-Polyak) projection. Global convergence properties of these methods are discussed, and numerical results again show that they are efficient.

The thesis makes six main contributions.

1. A new class of three-term memory gradient methods with an Armijo-like step size rule for unconstrained optimization is presented. Global convergence is established without assuming that the sequence of iterates is bounded. Moreover, it is shown that when the objective function is pseudo-convex (quasi-convex), the new method has strong convergence results. By combining the quasi-Newton method with the new method, the quasi-Newton method is modified to be globally convergent. Numerical results show that the new algorithms are efficient.

2. Using a projection matrix, a new three-term memory gradient projection method for nonlinear programming with linear or nonlinear inequality constraints is presented. The global convergence properties of the new method are discussed.
Combining conjugate gradient scalars with the new method, three new classes of three-term memory gradient projection methods with conjugate gradient scalars are presented. Numerical results illustrate that the new methods are effective.

3. Using a generalized projection matrix, a new three-term memory gradient projection method for nonlinear programming with nonlinear equality and inequality constraints is presented. Its global convergence properties are discussed, and numerical results illustrate that the method is effective.

4. Using a generalized projection matrix, a three-term memory gradient projection method with an arbitrary initial point for nonlinear programming with nonlinear inequality constraints is presented. Its global convergence properties are discussed, and numerical results illustrate that the method is effective.

5. A generalized memory gradient projection method for convex constrained optimization is presented by using the Goldstein-Levitin-Polyak (GLP) projection. Global convergence is established with an exact step size rule and without assuming that the sequence of iterates is bounded. Numerical results show that the algorithm is efficient.

6. A modified GLP gradient projection descent method for convex constrained optimization is presented. Global convergence is established without assuming that the sequence of iterates is bounded; moreover, the new method is shown to have strong convergence results. Numerical results illustrate that the method is effective.
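To give a concrete picture of the kind of method discussed in contribution 1, the following is a minimal illustrative sketch of a three-term memory gradient iteration with an Armijo-type backtracking line search. The mixing weights for the two previous directions and the safeguard are simple choices made here for illustration only; the thesis's specific update rules and convergence conditions are not reproduced.

```python
import numpy as np

def armijo_step(f, x, d, g, beta=0.5, sigma=1e-4, max_backtracks=50):
    """Backtracking Armijo rule: find t with f(x + t d) <= f(x) + sigma*t*(g.d)."""
    t = 1.0
    fx = f(x)
    slope = g @ d  # directional derivative; negative for a descent direction
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * slope:
            return t
        t *= beta
    return t

def memory_gradient(f, grad, x0, iters=500, tol=1e-8, eta=0.1):
    """Three-term memory gradient sketch: the search direction combines the
    current negative gradient with the two previous directions.
    The weights b1, b2 below are an illustrative bounded choice, not the
    thesis's exact rule."""
    x = np.asarray(x0, dtype=float)
    d_prev1 = np.zeros_like(x)  # d_{k-1}
    d_prev2 = np.zeros_like(x)  # d_{k-2}
    for _ in range(iters):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # bounded memory weights so that -g dominates and d stays a descent direction
        b1 = eta * gnorm / np.linalg.norm(d_prev1) if d_prev1.any() else 0.0
        b2 = eta * gnorm / np.linalg.norm(d_prev2) if d_prev2.any() else 0.0
        d = -g + b1 * d_prev1 + b2 * d_prev2
        if g @ d > -1e-12 * gnorm**2:  # safeguard: fall back to steepest descent
            d = -g
        t = armijo_step(f, x, d, g)
        x = x + t * d
        d_prev2, d_prev1 = d_prev1, d
    return x
```

With `eta = 0.1` the directional derivative satisfies `g @ d <= -(1 - 2*eta)*||g||^2 < 0`, so every direction is a descent direction and the Armijo search terminates; this is the kind of property the convergence analysis relies on.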
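The GLP projection methods of contributions 5 and 6 build on the basic gradient projection iteration x_{k+1} = P_C(x_k - a_k * grad f(x_k)), where P_C is the Euclidean projection onto the convex feasible set C. The sketch below illustrates this iteration for the special case where C is a box, with a fixed step size; the convex set, the step size rule, and the stopping test are illustrative assumptions, not the methods analyzed in the thesis.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def glp_gradient_projection(grad, x0, lo, hi, step=0.1, iters=1000, tol=1e-10):
    """Goldstein-Levitin-Polyak style gradient projection sketch:
        x_{k+1} = P_C(x_k - step * grad f(x_k)),  C a box here.
    A fixed step size is used for illustration; the thesis studies more
    general step size rules and convex sets."""
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(iters):
        x_new = project_box(x - step * grad(x), lo, hi)
        if np.linalg.norm(x_new - x) < tol:  # x is (numerically) a fixed point
            return x_new
        x = x_new
    return x
```

A fixed point of this iteration is exactly a point satisfying the first-order optimality condition for minimizing f over C, which is why the projected-gradient residual ||P_C(x - step*grad(x)) - x|| serves as a natural stopping measure.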
Keywords/Search Tags:Unconstrained optimization, constrained optimization, super-memory gradient method, projection, convergence, numerical experiment.