
Nonlinear Conjugate Gradient Methods And Their Convergence

Posted on: 2013-02-17
Degree: Master
Type: Thesis
Country: China
Candidate: J L Li
Full Text: PDF
GTID: 2230330371474168
Subject: Computational Mathematics

Abstract/Summary:
Nonlinear conjugate gradient methods are often used to solve large-scale unconstrained optimization problems because of their simple structure and low memory requirements. In this thesis, we construct several efficient nonlinear conjugate gradient methods and investigate their global convergence properties and practical computational efficiency.

In Chapter 1, we first recall some well-known nonlinear conjugate gradient methods and review recent progress on this topic, and then present the necessary preliminaries.

In Chapter 2, based on the scaled BFGS method, we propose a new three-term PRP method. An important feature of this method is that it produces a sufficient descent search direction, independent of the line search used and of the convexity of the objective function. The method exploits second-order information of the objective function and therefore converges faster. Moreover, we prove that the proposed method converges globally for nonconvex minimization under an inexact line search.

In Chapter 3, we present a new globalization technique for nonlinear conjugate gradient methods for nonconvex optimization. This technique makes full use of the information in the previous search directions and adopts a simple but efficient truncation to guarantee the global convergence of a large class of nonlinear conjugate gradient methods for nonconvex problems, even when a relatively weak Armijo or Wolfe line search is used. Extensive numerical results show that the new globalization technique is very efficient in practical computation.
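For readers unfamiliar with the basic framework the thesis builds on, the following is a minimal Python sketch of the classical PRP nonlinear conjugate gradient method with a backtracking Armijo line search and a PRP+ truncation. It is not the thesis's three-term PRP method or its new globalization technique; the test function, parameter values, and descent-restart safeguard are assumptions chosen only for illustration.

```python
# Illustrative PRP conjugate gradient sketch (not the thesis's proposed method).
import numpy as np


def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f with the Polak-Ribiere-Polyak conjugate gradient method."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search: f(x + t d) <= f(x) + c1 * t * g^T d
        t, c1, rho = 1.0, 1e-4, 0.5
        fx, gtd = f(x), g @ d
        while f(x + t * d) > fx + c1 * t * gtd:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP parameter: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2,
        # truncated at zero (PRP+) as a simple nonnegativity safeguard.
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Assumed example: minimize the convex quadratic f(x) = 1/2 x^T A x - b^T x.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print("approximate minimizer:", prp_cg(f, grad, np.zeros(2)))
```

The three-term methods and the truncation-based globalization studied in Chapters 2 and 3 modify the direction update above so that sufficient descent and global convergence hold under weaker line-search conditions.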
Keywords/Search Tags: Conjugate gradient method, PRP method, nonconvex optimization, global convergence