
Memory Gradient Algorithm

Posted on: 2011-03-31    Degree: Master    Type: Thesis
Country: China    Candidate: S Zhu    Full Text: PDF
GTID: 2190360308971816    Subject: Applied Mathematics
Abstract/Summary:
Optimization methods are an important part of operations research, with wide applications in many fields such as natural science, social science, industrial production, engineering design, and modern management. With the rapid development of computer science and the urgent needs of practical problems in recent years, large-scale optimization problems have drawn more and more attention. Many practical problems can be reduced to optimization problems, and the key to investigating them is to design efficient algorithms for their solution. As a main approach to solving large-scale optimization problems, the memory gradient method has received special attention. This thesis mainly studies the theoretical properties of memory gradient methods. The main results obtained in this dissertation may be summarized as follows:

1. The first chapter introduces nonlinear memory gradient methods for optimization problems, the background of this dissertation, and its main results.

2. In the second chapter, a new class of memory gradient methods for unconstrained optimization problems is developed by constructing a new βk. Global convergence under the Armijo line search rule is proved, and numerical results show that the new method is efficient.

3. In the third chapter, a new assumption on the βk in the line search of paper [1] is proposed, and a new admissible range for it is identified. To ensure that the search direction is a sufficient descent direction for the objective function, a new class of memory gradient algorithms is proposed. After removing the boundedness assumption on the sequence of iterates and under a generalized Armijo step-size search, the global convergence of the algorithm is discussed, and modified memory gradient methods of FR, PR, and HS type are given. Numerical experiments comparing the new algorithm with the FR, PR, and HS conjugate gradient methods under the Armijo line search, and with the super-memory gradient method of [1], show that the new algorithm is more stable and more effective.
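The abstract does not give the specific βk constructed in the thesis, so the sketch below is only a generic illustration of a memory gradient iteration with an Armijo (backtracking) line search: the direction is d_k = -g_k + βk·d_{k-1}, and the textbook choice βk = ρ‖g_k‖/‖d_{k-1}‖ with ρ in (0, 1) is used to keep d_k a descent direction. The function name memory_gradient, the parameters beta_scale, rho, sigma, and the Rosenbrock test problem are all illustrative assumptions, not taken from the thesis.

    import numpy as np

    def memory_gradient(f, grad, x0, rho=0.5, sigma=1e-4, beta_scale=0.5,
                        tol=1e-6, max_iter=1000):
        """Generic memory gradient sketch with an Armijo line search.

        Direction: d_k = -g_k + beta_k * d_{k-1}, with
        beta_k = beta_scale * ||g_k|| / ||d_{k-1}|| (beta_scale in (0, 1)),
        which guarantees g_k^T d_k <= -(1 - beta_scale) ||g_k||^2 < 0.
        This beta_k is a common textbook choice, not the one from the thesis.
        """
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                          # first step: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Armijo backtracking: shrink alpha until
            # f(x + alpha d) <= f(x) + sigma * alpha * g^T d.
            alpha, fx, slope = 1.0, f(x), g @ d
            while f(x + alpha * d) > fx + sigma * alpha * slope:
                alpha *= rho
            x = x + alpha * d
            g_new = grad(x)
            beta = beta_scale * np.linalg.norm(g_new) / max(np.linalg.norm(d), 1e-12)
            d = -g_new + beta * d       # memory term reuses the previous direction
            g = g_new
        return x

    # Illustrative usage: minimize the Rosenbrock function.
    if __name__ == "__main__":
        f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                                   200 * (x[1] - x[0]**2)])
        print(memory_gradient(f, grad, np.array([-1.2, 1.0])))

Because the memory term reuses the previous direction, only two vectors (g and d) need to be stored per iteration, which is why methods of this type are suited to large-scale problems.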
Keywords/Search Tags: Gradient method, Memory gradient method, Line search, Linear convergence rate