
A Class of Conjugate Gradient Methods and Their Convergence

Posted on: 2010-04-18
Degree: Doctor
Type: Dissertation
Country: China
Candidate: M J Ma
Full Text: PDF
GTID: 1100360272496798
Subject: Basic mathematics
Abstract/Summary:
In this paper we study conjugate gradient methods for unconstrained optimization and their global convergence. Under monotone and nonmonotone line searches, combined with a generalized quasi-Newton condition, we propose a class of conjugate gradient methods. Building on the monotone and nonmonotone inexact line searches, we also present a new line search together with the corresponding conjugate gradient methods.

Consider the general unconstrained optimization problem
$\min_{x \in \mathbb{R}^n} f(x)$,
where $f(x)$ is a continuously differentiable nonlinear function.

A conjugate gradient method usually has the form
$x_{k+1} = x_k + \alpha_k d_k$,  (1)
$d_k = -g_k$ for $k = 1$, and $d_k = -g_k + \beta_k d_{k-1}$ for $k \ge 2$,  (2)
where $d_k$ is the search direction, $\beta_k$ is a scalar, $g_k = \nabla f(x_k)$, and the step size $\alpha_k$ along the search direction is chosen by one of the following rules.

Monotone inexact (Wolfe) line search:
$f(x_k + \alpha_k d_k) \le f(x_k) + \lambda \alpha_k g_k^T d_k$  (3)
and
$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k$.  (4)

Nonmonotone inexact line search:
$f(x_k + \alpha_k d_k) \le \max_{0 \le j \le m(k)} f(x_{k-j}) + \lambda \alpha_k g_k^T d_k$  (5)
and
$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k$,  (6)
where $0 < \lambda < \sigma < 1$.

Two standard conditions are involved. We say the search direction $d_k$ satisfies the descent condition if
$g_k^T d_k < 0$.  (7)
We say the sufficient descent condition holds if there exists a constant $c > 0$ such that each search direction $d_k$ satisfies
$g_k^T d_k \le -c \|g_k\|^2$.  (8)

In this paper we make the following assumptions:
(H1) the level set $\{x \in \mathbb{R}^n : f(x) \le f(x_1)\}$ is bounded, where $x_1$ is the initial point;
(H2) in a neighborhood $U$ of the level set, $f$ is continuously differentiable and its gradient $g$ is Lipschitz continuous, i.e., there exists a constant $L > 0$ such that $\|g(x) - g(y)\| \le L \|x - y\|$ for all $x, y \in U$.

The main results of this paper are as follows.

Ⅰ. In Chapter 2, using the ideas of [13] and [53], we propose a new form of $\beta_k$ based on the parameter
$\theta_k = (?)$,  (9)
where
$(?)$.  (10)
With $\beta_k = \theta_k \beta_k^{HS}$, where $\beta_k^{HS} = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1})$ with $y_{k-1} = g_k - g_{k-1}$ is the Hestenes-Stiefel parameter, we give the convergence analysis and numerical examples.

Lemma 1 [58]. Suppose that (H1) and (H2) hold with Lipschitz constant $L(x)$, that $0 < \lambda < \sigma < 1$, and that $d$ is a descent direction at the point $x$. Then the Wolfe condition
$(?)$  (11)
is satisfied for $\alpha \in [0, \alpha_{\max}(x)]$, where
$(?)$.  (12)

Lemma 2 [8]. Suppose that (H1) and (H2) hold and that $f$ is convex. Let $x_k$ be generated by (1)-(2), let $\alpha_k$ satisfy (3)-(4), and let $d_k$ be a descent direction. Then there exists an $m$ such that
$(?)$.  (13)

Lemma 3. $\|x_k - x_{k+1}\| \to 0$ as $k \to \infty$.

Lemma 4 [39]. Suppose that (H1) and (H2) hold and that $f$ is uniformly convex. Let $x_k$ be generated by (1)-(2), let $\alpha_k$ satisfy (3)-(4), and let $d_k$ be a descent direction. If
$(?)$,  (14)
then
$(?)$.  (15)

Theorem 1. Suppose that (H1) and (H2) hold and that $f$ is uniformly convex. Let $x_k$ be generated by (1)-(2), let $\alpha_k$ satisfy (3)-(4), let $d_k$ be a descent direction, and let $\beta_k = \theta_k \beta_k^{HS}$. Then $\lim_{k \to \infty} \|g_k\| = 0$.

Ⅱ. Combining the ideas of [48] and [23] with the spectral gradient and a generalized quasi-Newton condition, we propose new forms of $\beta_k$ and $d_k$:
$\beta_k = (?)$,  (16)
$d_k = (?)$.  (17)
We also prove the convergence of the resulting methods.

Lemma 5. Let $\alpha_k$ be obtained by the strong Wolfe line search, and suppose that Assumptions (H1) and (H2) and the descent condition hold. Then
$(?)$.  (18)

Lemma 6. Consider the conjugate gradient method in the form of (1) and (17), where $\alpha_k$ is obtained by the strong Wolfe line search. Suppose that Assumptions (H1) and (H2) and the descent condition hold. Then the following so-called Zoutendijk condition holds:
$\sum_{k \ge 1} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < \infty$.  (19)

Lemma 7. Under the assumptions of Lemma 6, either
$\liminf_{k \to \infty} \|g_k\| = 0$  (20)
or
$\sum_{k \ge 1} \frac{\|g_k\|^4}{\|d_k\|^2} < \infty$  (21)
holds.

Corollary 1. Under the assumptions of Lemma 6, if
$\sum_{k \ge 1} \frac{1}{\|d_k\|^2} = \infty$,  (22)
then
$\liminf_{k \to \infty} \|g_k\| = 0$.  (23)

Theorem 2. Consider the conjugate gradient method in the form of (1) and (17), where $\alpha_k$ is obtained by the strong Wolfe line search. Suppose that Assumptions (H1) and (H2) and the descent condition hold. Then
$\liminf_{k \to \infty} \|g_k\| = 0$.  (24)
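For concreteness, the following Python sketch implements the generic scheme (1)-(2) with $\beta_k = \theta_k \beta_k^{HS}$ under a strong Wolfe line search. It only illustrates the framework analyzed above: the scaling $\theta_k$ of (9)-(10) is not reproduced in this abstract, so the placeholder theta = 1.0 (plain Hestenes-Stiefel) is used; SciPy's generic line_search supplies a strong Wolfe step, and the function name cg_hs and all parameter values are illustrative rather than the thesis's method.

    import numpy as np
    from scipy.optimize import line_search

    def cg_hs(f, grad, x, lam=1e-4, sigma=0.1, tol=1e-6, max_iter=1000):
        g = grad(x)
        d = -g                                    # d_1 = -g_1, cf. (2)
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            # Strong Wolfe step: conditions (3)-(4) with 0 < lam < sigma < 1.
            alpha = line_search(f, grad, x, d, gfk=g, c1=lam, c2=sigma)[0]
            if alpha is None:                     # line search failed: restart
                d = -g
                continue
            x_new = x + alpha * d                 # iteration (1)
            g_new = grad(x_new)
            y = g_new - g
            theta = 1.0                           # placeholder for (9)-(10)
            beta = theta * (g_new @ y) / (d @ y)  # beta_k = theta_k * beta_k^HS
            d_new = -g_new + beta * d             # iteration (2)
            if g_new @ d_new >= 0:                # enforce descent condition (7)
                d_new = -g_new
            x, g, d = x_new, g_new, d_new
        return x

    # Example: minimize the strictly convex quadratic f(x) = x'Ax/2 - b'x.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    x_min = cg_hs(lambda x: 0.5 * x @ A @ x - b @ x,
                  lambda x: A @ x - b, np.zeros(2))

The safeguard that resets $d_k$ to $-g_k$ whenever the descent condition (7) fails is a common practical device, not part of the theory stated above.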
Ⅲ. Under the nonmonotone line search of [50], we give the convergence analysis of the $\beta_k$ and $d_k$ proposed in Chapters 2 and 3. For Chapter 2 we have
$\beta_k = (?)$,  (25)
where
$(?)$,  (26)
and for Chapter 3 we have
$\beta_k = (?)$,  (27)
$d_k = (?)$.  (28)

Lemma 8. Suppose that the nonmonotone line search conditions (5)-(6) and the descent condition (7) hold. Then the sequence $f(x_{l(k)})$ is monotonically decreasing, where $l(k) = \max\{ i \mid 0 \le k - i \le m(k),\ f(x_i) = \max_{0 \le j \le m(k)} f(x_{k-j}) \}$.

Lemma 9. Assume that (H1) holds and that $\alpha_k$ satisfies the line search (5)-(6) and the descent condition (7). Then $(?)$ holds.

Lemma 10. Assume that (H1) and (H2) hold and that $\alpha_k$ satisfies the line search (5)-(6). Then there exists a constant $c > 0$ such that $\alpha_k \ge c \, \frac{-g_k^T d_k}{\|d_k\|^2}$; indeed, (6) gives $(g_{k+1} - g_k)^T d_k \ge (\sigma - 1) g_k^T d_k$, while (H2) bounds the left-hand side by $L \alpha_k \|d_k\|^2$.

Theorem 3. Consider the conjugate gradient method in the form of (1) and (25), where $\alpha_k$ is obtained by the nonmonotone line search (5)-(6). Suppose that Assumptions (H1) and (H2) and the descent condition (7) hold. Then
$\liminf_{k \to \infty} \|g_k\| = 0$.  (29)

For the forms of $\beta_k$ and $d_k$ in Chapter 3 we have lemmas similar to Lemmas 8, 9, and 10; for simplicity we give only the main theorem.

Theorem 4. Consider the conjugate gradient method in the form of (1) and (28), where $\alpha_k$ is obtained by the nonmonotone line search (5)-(6). Suppose that Assumptions (H1) and (H2) and the descent condition (7) hold. If $f(x)$ is uniformly convex, then
$\lim_{k \to \infty} \|g_k\| = 0$.  (30)

Ⅳ. Finally, combining the monotone and nonmonotone line searches, we propose a new line search of the form
$f(x_k + \alpha_k d_k) \le (1 - \lambda) f(x_k) + \lambda \max_{0 \le j \le m(k)} f(x_{k-j}) + \rho \alpha_k g_k^T d_k$  (31)
and
$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k$,  (32)
where $\lambda \in [0, 1]$ and $\rho$ is the sufficient-decrease constant. When $\lambda = 0$ it reduces to the general Wolfe line search, and when $\lambda = 1$ to the nonmonotone line search.
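The exact statement of (31) is garbled in the source; the stated endpoint behavior ($\lambda = 0$ monotone, $\lambda = 1$ nonmonotone) suggests the convex combination of the two reference values written above, and the sketch below implements the sufficient-decrease test under that assumption. The helper accept_step, the window length, and the constant rho are illustrative names, not the thesis's notation.

    from collections import deque

    def accept_step(f_new, f_hist, g_dot_d, alpha, lam=0.5, rho=1e-4):
        # f_hist holds the recent values f(x_{k-m(k)}), ..., f(x_k),
        # with f_hist[-1] = f(x_k); g_dot_d = g_k^T d_k < 0.
        reference = (1.0 - lam) * f_hist[-1] + lam * max(f_hist)
        # Sufficient-decrease part of (31); the curvature part (32) is the
        # same as in (4)/(6) and is checked separately.
        return f_new <= reference + rho * alpha * g_dot_d

    # Usage: keep a sliding window of the last m(k)+1 objective values.
    f_hist = deque(maxlen=5)          # window size 5 is illustrative
    f_hist.append(10.0)               # f(x_1)
    ok = accept_step(f_new=9.5, f_hist=f_hist, g_dot_d=-2.0, alpha=0.25)
    # lam = 0 reduces to the monotone Armijo test (3);
    # lam = 1 reduces to the nonmonotone test (5).

Intermediate values of $\lambda$ interpolate between the two acceptance rules, so a single parameter controls how much nonmonotonicity the line search tolerates.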
Keywords/Search Tags: Global optimization, Spectral gradient, Conjugate gradient, Nonmonotone, Inexact line search