| The limited-memory BFGS (L-BFGS) method is a common method for solving large-scale nonconvex unconstrained optimization problems. In recent years, many scholars have studied this method, and the main research directions fall into two aspects: one is the selection of the initial matrix, and the other is extending correction techniques applicable to the BFGS method to the L-BFGS method. To obtain better numerical and theoretical results, and based on these two ideas, this thesis modifies and generalizes the L-BFGS method and proposes two kinds of L-BFGS methods for solving nonconvex unconstrained optimization problems. The specific work is as follows:

1. Based on the hybrid scaling memoryless BFGS method proposed by Saman, this chapter generalizes that method to a self-scaling limited-memory BFGS method, combines it with the regularization strategy proposed by Liu, and proposes the regularized self-scaling L-BFGS (RL-SBFGS) method. With only a small amount of additional computation, this method improves the computational efficiency of the L-BFGS method on ill-conditioned problems. In theory, this chapter establishes the global and local convergence of the method for nonconvex unconstrained optimization problems under the standard Wolfe line search. Numerical results show that the RL-SBFGS method outperforms the L-BFGS and RL-BFGS methods.

2. Combining the modified secant equations proposed by Dehghani and Fahimeh, a new secant equation is proposed. Based on this secant equation, the L-BFGS method is modified, and the self-scaling parameter from the previous chapter is applied to the L-BFGS method, yielding a self-scaling L-BFGS method based on the modified secant equation (the ML-SBFGS method). The self-scaling parameter reduces the condition number of the quasi-Newton matrix and improves its approximation to the Hessian matrix. This method improves the computational efficiency of the L-BFGS method without requiring additional storage, and under a nonconvexity assumption, the global convergence of the method is established. Numerical experiments show that the ML-SBFGS method is effective. |
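Both proposed methods build on the standard L-BFGS framework. As background only (this is the classical two-loop recursion, not the thesis's RL-SBFGS or ML-SBFGS variants), a minimal sketch of how an L-BFGS search direction is computed from stored curvature pairs, using the common scaled initial matrix H0 = gamma*I with gamma = s'y / y'y — the kind of initial-matrix choice the first research direction above studies:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Classical L-BFGS two-loop recursion.

    Returns d = -H * grad, where H implicitly approximates the inverse
    Hessian from the stored pairs s_k = x_{k+1} - x_k and
    y_k = g_{k+1} - g_k (oldest first in the lists).
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * np.dot(s, q)
        alphas.append(a)          # stored newest-first
        q = q - a * y
    # Scaled initial matrix H0 = gamma * I, gamma = s'y / y'y
    # from the most recent pair (gamma = 1 when memory is empty).
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = np.dot(s, y) / np.dot(y, y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: traverse pairs from oldest to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos),
                              reversed(alphas)):
        b = rho * np.dot(y, r)
        r = r + (a - b) * s
    return -r
```

With an empty memory this reduces to the steepest-descent direction `-grad`; with pairs drawn from a quadratic it reproduces (quasi-)Newton behavior. The function names and list-based interface here are illustrative, not taken from the thesis.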