
Some Computing Methods For Unconstrained Optimization Problems And Nonlinear Equations

Posted on: 2017-02-04    Degree: Doctor    Type: Dissertation
Country: China    Candidate: J K Liu    Full Text: PDF
GTID: 1310330503482801    Subject: Operational Research and Cybernetics
Abstract/Summary:
The purpose of this thesis is to study nonlinear conjugate gradient methods, spectral gradient methods, and their variants for solving large-scale unconstrained optimization problems, and then to extend them to derivative-free projection methods for solving nonlinear monotone equations. We establish the global convergence of these methods theoretically and test them on a large number of problems; preliminary numerical results show that the methods are effective and stable.

Chapter 1 reviews the background and existing research on these problems and introduces the motivation and main work of the thesis. In Chapters 2-3 we study, from different perspectives, a hybrid LS-DY conjugate gradient method and a three-term HS conjugate gradient method for unconstrained optimization problems, denoted the HLSDY method and the TMHS method, respectively. An important property of these methods is that the search direction of the HLSDY method not only satisfies the Dai-Liao (D-L) conjugacy condition but also accords with the Newton direction, while the search direction of the TMHS method satisfies the traditional conjugacy condition and the sufficient descent property. These properties hold independently of the line search. Moreover, under the exact line search, the HLSDY and TMHS methods reduce to the traditional LS and HS methods, respectively.

The search direction of a nonlinear conjugate gradient method strongly influences both the convergence theory and the numerical performance. For this reason, we incorporate the well-known Powell restart criterion into the HLSDY method. Under the strong Wolfe line search, we prove that the HLSDY method generates a sufficient descent direction at each iteration and is globally convergent for unconstrained optimization. Under appropriate conditions, we also prove that the TMHS method is globally convergent under the standard Wolfe line search. Moreover, we test both methods on a large number of unconstrained optimization problems from the CUTEr library; the reported numerical results show that the HLSDY and TMHS methods are effective.

In Chapter 4 we propose a modified spectral gradient method. An attractive property of this method is that it always generates sufficient descent directions, independently of any line search. Under the Armijo line search, we prove its global convergence for unconstrained optimization problems. Moreover, we apply the method to impulse noise removal within a two-phase scheme: in the first phase, noise candidates are detected by an adaptive median filter; in the second phase, the proposed method restores the detected pixels by minimizing a nonconvex objective function.

In Chapter 5, based on a modified HS method, we first propose a new three-term conjugate gradient method for large-scale unconstrained optimization problems. Its search direction satisfies the D-L conjugacy condition and accords with that of the memoryless BFGS quasi-Newton method. Then, using the projection technique proposed by Solodov and Svaiter, we extend this method to a three-term derivative-free projection method for large-scale nonlinear monotone equations, denoted the TTDFP method. Under appropriate conditions, we prove the global convergence and R-linear convergence rate of the TTDFP method.
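For reference, the conditions named above can be written in the standard notation of the conjugate gradient literature (the notation below is the standard one, not quoted from the thesis). With iterates x_{k+1} = x_k + alpha_k d_k, gradients g_k = grad f(x_k), and differences s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k:

```latex
% CG search direction
d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0
% Dai--Liao (D-L) conjugacy condition, with parameter t \ge 0
d_{k+1}^{\top} y_k = -t \, g_{k+1}^{\top} s_k
% Sufficient descent condition, for some constant c > 0
g_k^{\top} d_k \le -c \, \| g_k \|^2
% Strong Wolfe line search conditions, with 0 < \delta < \sigma < 1
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \, \alpha_k \, g_k^{\top} d_k,
\qquad \bigl| g(x_k + \alpha_k d_k)^{\top} d_k \bigr| \le \sigma \, \bigl| g_k^{\top} d_k \bigr|
```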
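The hyperplane-projection technique of Solodov and Svaiter that underlies the TTDFP method can be sketched as follows. This is a minimal illustrative implementation of the generic framework, not the thesis's three-term method: the search direction here is simply -F(x_k), the line search rule is one common derivative-free variant, and all names and parameter values are our own choices.

```python
import numpy as np

def hyperplane_projection_solve(F, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Generic Solodov-Svaiter hyperplane-projection method for a
    monotone system F(x) = 0, with the simplified direction d_k = -F(x_k)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x, k
        d = -Fx  # the thesis builds three-term CG-type directions instead
        # Derivative-free line search: shrink t until
        # -F(x + t*d)^T d >= sigma * t * ||d||^2 (a common acceptance rule).
        t = 1.0
        z = x + t * d
        Fz = F(z)
        while -(Fz @ d) < sigma * t * (d @ d) and t > 1e-12:
            t *= rho
            z = x + t * d
            Fz = F(z)
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}; by
        # monotonicity it separates x from the solution set.
        denom = Fz @ Fz
        if denom == 0.0:   # F(z) = 0, so z solves the system exactly
            return z, k
        x = x - ((Fz @ (x - z)) / denom) * Fz
    return x, max_iter

# Example: F(x) = x + sin(x) is monotone with unique root x = 0.
root, iters = hyperplane_projection_solve(lambda x: x + np.sin(x), np.ones(5))
```

Only evaluations of F are used, with no derivatives and no linear algebra beyond inner products, which is what makes this framework attractive for large-scale monotone equations.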
In Chapter 6, based on the stability of the traditional DY method and the effectiveness of the multivariate spectral gradient method, we propose a derivative-free multivariate spectral DY-type projection method for solving large-scale nonlinear equations, denoted the MSDYP method. Under appropriate conditions, we prove the global convergence and R-linear convergence rate of the MSDYP method for nonlinear monotone equations with convex constraints.

In Chapter 7, for the ℓ1 regularization problem in compressive sensing, we propose a modified derivative-free projection method based on the CGD method. Under appropriate conditions, we prove the global convergence of the proposed method, and we report supporting numerical results.
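For context, the ℓ1 regularization problem in compressive sensing is usually posed in the following standard form (A is the sensing matrix, b the observation, and τ > 0 the regularization parameter; this is the textbook formulation, not quoted from the thesis):

```latex
\min_{x \in \mathbb{R}^n} \; \tau \, \| x \|_1 + \tfrac{1}{2} \, \| Ax - b \|_2^2
```

Such problems can be reformulated as monotone systems of equations with convex constraints, which is what makes derivative-free projection methods applicable to them.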
Finally, the contents of the thesis are briefly summarized, and some open problems are posed for future research.

Keywords/Search Tags: Unconstrained optimization, Nonlinear equations, Nonlinear conjugate gradient method, Derivative-free projection method, Global convergence