
Unconstrained Nonlinear Conjugate Gradient Method

Posted on: 2009-01-23
Degree: Master
Type: Thesis
Country: China
Candidate: Z R Zhang
Full Text: PDF
GTID: 2190330335453255
Subject: Applied Mathematics
Abstract/Summary:
The conjugate gradient method is an attractive method in optimization because of its simple structure, low computational cost, and small storage requirements; it does not require solving a linear system of equations to construct the search direction, and it has the quadratic termination property. It is therefore especially widely used for solving large-scale unconstrained optimization problems. The results of this thesis can be summarized in three aspects:

(1) A new conjugate gradient method for unconstrained optimization is presented. It extends the admissible range of the iterative parameter, which may even be negative, under the strong Wolfe inexact line search. Global convergence is proved under the condition that the objective function is continuously differentiable.

(2) A range of values for the parameter is given that ensures the conjugate direction is a sufficient descent direction, and a new conjugate gradient method based on it is presented. At the same time, the objective function decreases faster and the Wolfe step-size rule is further improved; in particular, when the initial point is far from the exact solution, the decrease is faster than under the original Wolfe step-size rule. The algorithm needs only a small amount of memory and has a good convergence rate.

(3) For the case where the objective function is non-convex, a combined algorithm for solving nonlinear equations is presented. It combines the chaos optimization method with the nonlinear conjugate gradient method and has effective convergence properties: the chaotic component helps the conjugate gradient method escape local minima, so that the global minimum can ultimately be found. Global convergence of the combined algorithm is proved under certain conditions.
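The thesis itself does not include code. As a rough illustration of the ingredients discussed above, here is a minimal Python sketch of a nonlinear conjugate gradient iteration with a Wolfe line search. This is not the thesis's proposed method: the PRP+ update, the bisection-based weak Wolfe search, and the constants `c1` and `c2` are all standard illustrative choices.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=50):
    """Bisection search for a step length satisfying the weak Wolfe
    conditions: sufficient decrease (Armijo) plus the curvature condition."""
    phi0, dphi0 = f(x), grad(x) @ d      # dphi0 < 0 for a descent direction
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) > phi0 + c1 * alpha * dphi0:
            hi = alpha                   # Armijo fails: step too long
        elif grad(x + alpha * d) @ d < c2 * dphi0:
            lo = alpha                   # curvature fails: step too short
        else:
            return alpha                 # both Wolfe conditions hold
        alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
    return alpha

def cg_prp(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the PRP+ parameter and a Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ (non-negative)
        d = -g_new + beta * d
        if g_new @ d >= 0:               # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Demo on a convex quadratic f(x) = x'Ax/2 - b'x; the minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_prp(f, grad, np.zeros(2))
print(x_star)   # close to np.linalg.solve(A, b) = [0.2, 0.4]
```

Note that no linear system is solved anywhere: the search direction is built purely from gradients and the previous direction, which is what keeps the storage cost at a few vectors.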
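The details of the combined algorithm in item (3) are in the thesis itself; a generic sketch of the same idea, under the assumption that the chaotic component is a logistic-map sequence used to scatter restart points for a local conjugate gradient run, might look as follows. The function `chaos_cg`, the seed value, and the test function are all hypothetical choices for illustration, and SciPy's Polak-Ribiere CG stands in for the local solver.

```python
import numpy as np
from scipy.optimize import minimize

def chaos_cg(f, grad, lo, hi, n_starts=20, seed=0.7):
    """Hybrid global search sketch: a logistic-map chaotic sequence spreads
    start points over [lo, hi]; a local nonlinear CG run (SciPy, method='CG')
    refines each one, and the best local result is kept. Escaping a local
    minimum here means simply restarting from a chaotically chosen point."""
    t = seed
    best = None
    for _ in range(n_starts):
        t = 4.0 * t * (1.0 - t)          # logistic map, chaotic on (0, 1)
        x0 = lo + (hi - lo) * t          # map the chaos variable into the box
        res = minimize(f, np.atleast_1d(x0), jac=grad, method='CG')
        if best is None or res.fun < best.fun:
            best = res
    return best

# Asymmetric double well: local minimum near x = +0.98, global near x = -1.02.
f = lambda x: float((x[0]**2 - 1.0)**2 + 0.2 * x[0])
grad = lambda x: np.array([4.0 * x[0] * (x[0]**2 - 1.0) + 0.2])
best = chaos_cg(f, grad, -2.0, 2.0)
print(best.x)   # near -1.02, the global minimizer; plain CG from a bad
                # start would stop in the shallower well near +0.98
```

The chaotic sequence is deterministic yet non-repeating, which is the usual motivation for using it instead of random restarts when reproducibility matters.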
Keywords/Search Tags: conjugate gradient method, Wolfe step size rule, combined optimization algorithm, global convergence