
The Gradient Projection Method For Linearly Constrained Optimization Problems

Posted on: 2008-12-11
Degree: Master
Type: Thesis
Country: China
Candidate: H M Gao
Full Text: PDF
GTID: 2190360212487981
Subject: Applied Mathematics
Abstract/Summary:
Optimization is a subject with strong applications. With the development of computer technology and the demands of practical problems, large-scale optimization has attracted more and more attention, and fast, efficient algorithms have become a very active topic in recent research. Optimization contains two branches: unconstrained optimization, the classical problem of finding the extremum of a function, and constrained optimization. Since optimization is a central problem in operations research, many researchers have proposed improvements and new algorithms within the framework of classical optimization theory, and many results have been obtained, such as quadratic programming, sequential quadratic programming, the penalty function method, the trust region method, the constrained DFP method, and the feasible direction method.

Finding an effective descent direction is a key step in an optimization algorithm, and the gradient projection method is one way to obtain such a direction. It has been shown that, at a boundary point of a linearly constrained problem, the projection of a feasible direction is again feasible, and the projection of the negative gradient is a descent direction. In the 1960s Rosen proposed the basic idea of the gradient projection method, and many researchers then tried to establish its convergence; most of them obtained convergence only after modifying the algorithm, and only recently has the convergence of Rosen's original algorithm been proved. The gradient projection method generalizes the steepest descent method to constrained problems, so it does not converge faster than steepest descent. To overcome this, many researchers have carried well-developed unconstrained optimization techniques over to Rosen's gradient projection framework. The conjugate gradient method is one successful example: its computation is simple, the structure of the algorithm is clear, and it has good convergence properties. Rosen's gradient projection method thus provides an easy way to obtain descent directions.

In this paper some new algorithms for linearly constrained optimization are proposed by combining the idea of the conjugate gradient method with Rosen's gradient projection, so that methods developed for unconstrained optimization can be applied to linearly constrained problems.
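For a linear equality constrained problem min f(x) subject to Ax = b, the projection step described above can be sketched as follows; this is only an illustration of the general idea, not the algorithms proposed in the thesis, and the function name, step size, and tolerance are assumptions made here for the example.

```python
import numpy as np

def projected_gradient_step(x, grad, A, alpha=0.1, tol=1e-10):
    """One Rosen-style gradient projection step for linear equality
    constraints A x = b (sketch; assumes A has full row rank).

    P = I - A^T (A A^T)^{-1} A projects onto the null space of A,
    so moving along d = -P grad(x) keeps A x = b satisfied."""
    g = grad(x)
    P = np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)
    d = -P @ g                       # projected negative gradient
    if np.linalg.norm(d) < tol:      # projected gradient vanishes: stationary point
        return x, True
    return x + alpha * d, False

# Example: minimize ||x||^2 subject to x1 + x2 = 1.
A = np.array([[1.0, 1.0]])
x = np.array([1.0, 0.0])             # feasible starting point
for _ in range(200):
    x, done = projected_gradient_step(x, lambda z: 2.0 * z, A)
    if done:
        break
# x converges to [0.5, 0.5]
```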
Chapter 1 introduces the development of optimization and some common optimality conditions used to identify optimal solutions, and reviews several methods of constrained optimization. In Chapter 2 we consider linear equality constrained optimization and propose a new algorithm that combines a recent hybrid conjugate gradient method (the HS-DY conjugate gradient method) with Rosen's gradient projection method; its convergence is proved under the Wolfe line search. In Chapter 3 we combine a descent algorithm with Rosen's gradient projection, propose a new algorithm for linear equality constrained optimization together with a combined variant of it, prove their convergence under the Wolfe line search, and report numerical experiments. In Chapter 4 we propose a gradient projection algorithm framework for linear inequality constrained optimization and report numerical experiments, but its convergence still needs further study.
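The abstract does not state the exact update formulas. A common HS-DY hybrid takes the conjugate gradient parameter as beta = max(0, min(beta_HS, beta_DY)); under that assumption, the sketch below (with illustrative names) shows how such a direction can be combined with the projection onto the null space of the linear constraints.

```python
import numpy as np

def hsdy_projected_direction(g, g_prev, d_prev, P):
    """Projected hybrid HS-DY conjugate gradient direction (sketch).

    Assumes the common hybrid rule beta = max(0, min(beta_HS, beta_DY));
    the thesis may use a different combination. P is the projector onto
    the null space of the (active) linear constraints, so the search
    direction satisfies A d = 0 and the iterates stay feasible."""
    y = g - g_prev
    denom = d_prev @ y
    if abs(denom) < 1e-12:           # degenerate denominator: restart with steepest descent
        return -P @ g
    beta_hs = (g @ y) / denom        # Hestenes-Stiefel parameter
    beta_dy = (g @ g) / denom        # Dai-Yuan parameter
    beta = max(0.0, min(beta_hs, beta_dy))
    return P @ (-g + beta * d_prev)  # project the CG direction to keep feasibility
```

The step length along such a direction would then be chosen by a (strong) Wolfe line search, which is the setting in which the abstract reports the convergence proofs.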
Keywords/Search Tags: constrained optimization, conjugate gradient method, gradient projection method, (strong) Wolfe line search, global convergence