
Limited Memory BFGS Algorithm Based On Parameter Adaptive Scaling And Modified Secant Equation

Posted on: 2022-01-22
Degree: Master
Type: Thesis
Country: China
Candidate: X L Yin
Full Text: PDF
GTID: 2480306530959709
Subject: Operational Research and Cybernetics
Abstract/Summary:
The limited memory BFGS (Broyden-Fletcher-Goldfarb-Shanno) method is mainly used to solve large-scale unconstrained optimization problems and is one of the most effective quasi-Newton methods. By storing only a small number of vector pairs, it overcomes the main shortcoming of the quasi-Newton method (which must store a full approximation matrix) while maintaining good convergence properties. Taking into account the curvature information of the objective function, the global convergence and the computational efficiency of the methods, and combining the characteristics of initial scaling and the secant condition, this thesis builds on the Barzilai-Borwein step factor, diagonal preconditioners, the Dai-Liao (DL) conjugacy condition, and the modified secant equations proposed by Biglari, Kou, Khoshgam, et al. It proposes two limited memory BFGS methods with adaptive selection of the initial scaling parameter and one limited memory BFGS method based on a modified secant equation.

The initial scaling method only needs to scale the initial matrix H_0 in each iteration, and it is simple and effective for problems whose curvature changes smoothly. Following the selection of a suitable parameter in the Dai-Kou subspace minimization conjugate gradient method (SMCG), this thesis linearly combines the two types of Barzilai-Borwein step factors. Following Andrei's adjustment of the third term in the standard BFGS update formula, which reduces the large eigenvalues of the approximation to the Hessian of the objective function, it combines a diagonal preconditioner with the DL conjugacy condition to scale BFGS, thereby introducing two positive scaling factors for the inverse Hessian approximation in the limited memory BFGS method. Global convergence of these methods is established on uniformly convex problems. Numerical experiments show that the two limited memory BFGS methods with adaptive selection of the initial scaling parameter have certain advantages over the standard limited memory BFGS method.

To make full use of both the function value and the gradient information of the objective function in improving the standard secant equation, this thesis combines the high-order tensor model proposed by Biglari et al. with the modified secant conditions proposed by Kou and by Khoshgam, and proposes a class of limited memory BFGS methods with a parameter. It is proved that the method is globally convergent for uniformly convex functions under the Wolfe line search, and that the convergence rate is R-linear. Numerical results show that the modified limited memory BFGS method outperforms the standard limited memory BFGS method.
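To make the role of the initial scaling parameter concrete, the following is a minimal sketch of the standard limited memory BFGS two-loop recursion with the classical Barzilai-Borwein-type initial scaling gamma = s^T y / y^T y. This is the baseline scheme the thesis modifies; the adaptive parameter selections proposed in the thesis are not reproduced here.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion for the limited memory BFGS search direction.

    s_list, y_list hold the m most recent pairs s_i = x_{i+1} - x_i and
    y_i = g_{i+1} - g_i. The initial matrix is the scaled identity
    gamma * I with gamma = s^T y / y^T y (the standard BB-type initial
    scaling; adaptive choices of this parameter are the subject of the
    thesis and are not reproduced here).
    """
    q = grad.astype(float).copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Apply the scaled initial inverse Hessian approximation gamma * I.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = s.dot(y) / y.dot(y)  # initial scaling parameter
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * y.dot(r)
        r += (a - b) * s
    return -r  # quasi-Newton descent direction -H_k * g_k
```

With an Armijo backtracking line search this recursion drives a small strictly convex quadratic to its minimizer in a handful of iterations, which is the behavior the limited memory update is designed to preserve while storing only vector pairs.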
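The modified secant equations mentioned above replace y_k with a corrected vector that also carries function value information. As an illustration, here is a sketch of one classical member of this family, the Wei-Li-Qi-type correction with u = s; the Biglari, Kou and Khoshgam variants used in the thesis differ in their correction terms and are not reproduced here. The correction term vanishes on quadratics, so the modified equation reduces to the standard secant equation exactly where the standard one is already exact.

```python
import numpy as np

def modified_y(f_k, f_next, g_k, g_next, s):
    """Wei-Li-Qi-type modified secant correction (illustrative only).

    Returns (y_hat, theta) with
        theta = 2*(f_k - f_next) + (g_next + g_k)^T s,
        y_hat = y + (theta / s^T s) * s,   y = g_next - g_k.
    On a quadratic, theta = 0 and y_hat coincides with the standard y;
    on general functions, theta injects function-value information
    into the secant condition B_{k+1} s = y_hat.
    """
    y = g_next - g_k
    theta = 2.0 * (f_k - f_next) + (g_next + g_k).dot(s)
    return y + (theta / s.dot(s)) * s, theta
```

The vanishing of theta on quadratics is easy to verify directly: for f(x) = 0.5 x^T A x the two terms of theta cancel exactly, which is the defining consistency property of this family of modified secant equations.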
Keywords/Search Tags: Limited memory BFGS method, Curvature information, Initial scaling, Conjugacy condition, Secant equation, Global convergence