
A Cubic Regularization Method with Derivative-Free Filter Line Search for Linearly Constrained Optimization Problems and Its Study

Posted on: 2022-12-09
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Y He
Full Text: PDF
GTID: 1480306749483444
Subject: Operational Research and Cybernetics
Abstract/Summary:
In this thesis, a class of new algorithms for solving nonlinear optimization problems with bound constraints or linear inequality constraints is proposed, combining the strategy of adaptive regularization with cubics, the filter line search technique, and derivative-free interpolation models; a theoretical study of the proposed algorithms is also given.

Trust region methods and line search techniques are two classical strategies for establishing global convergence of algorithms for nonlinear optimization problems, while adaptive cubic regularization has been developed more recently as an alternative to the classical trust region method. Its main idea is to minimize a suitable cubic regularization model of the objective function, governed by a regularization parameter, at the current iterate, and to obtain an acceptable step by repeatedly adjusting the parameter of the cubic regularization term. In the adaptive cubic regularization method, adjusting this parameter plays a role similar to adjusting the trust region radius, and the two quantities are roughly reciprocal to each other. The cubic regularization method ensures global convergence and a local superlinear convergence rate. The backtracking line search technique selects an appropriate step size while preserving the strict feasibility of the iterates, and searching along the computed direction to determine the next iterate avoids repeatedly solving the approximate model.

The filter method, originally proposed by Fletcher and Leyffer, guarantees global convergence of algorithms for nonlinear optimization problems. Its main idea is to accept a trial point if it either improves the objective function or reduces the constraint violation. By borrowing the concept of bi-objective (multi-objective) optimization, it avoids the difficulties that can arise from adjusting penalty parameters. A non-monotone technique can be embedded in the framework of the filter backtracking line search; it copes with the high nonlinearity encountered while solving optimization problems, improves the chance of finding a global optimal solution, and speeds up convergence in some ill-conditioned cases.

In optimization theory, the optimality conditions depend on derivative information of the objective function. In practical problems, however, derivative information is often unavailable or too expensive to compute, so derivative-free optimization methods become particularly important. Many derivative-free methods exist at present, but they are mainly used to solve small-scale unconstrained or bound-constrained problems. The derivative-free interpolation model proposed by Powell uses the available information about the objective function near the current iterate to build a quadratic approximation model, and the interpolation radius is updated as the iterate changes. The model is stable under certain conditions. Compared with finite difference models, interpolation models better guarantee local accuracy at the iterate, and they can also be applied when derivatives are available.

The research work of this thesis is summarized as follows.

1. Based on the relationship between the projected gradient and the affine gradient, a derivative-free augmented cubic regularization model for bound-constrained optimization problems is constructed. By computing the Cauchy step of the model, the relation between the cubic regularization model and the criticality measure is obtained, as is the relation between the criticality measure and the norm of the search direction. Under certain conditions, the proposed algorithm is proved to be globally convergent and to maintain a local superlinear convergence rate. Numerical results verify the effectiveness of the proposed algorithm.

2. After transforming a system of nonlinear equations into a bound-constrained nonlinear least squares problem, a derivative-free affine cubic regularization model of the linearized least squares equations is constructed. Unlike the affine scaling of the first part, the affine matrix is built from the first-order necessary conditions of the bound-constrained optimization problem, and the linearized approximation model is proved to achieve sufficient first-order and second-order reduction. Without the switching conditions of classical filter techniques, the global convergence and local superlinear convergence rate of the proposed algorithm are established. Numerical results show the feasibility of the proposed algorithm.

3. A derivative-free affine cubic regularization approximation model is obtained by using a modified Newton step. Via the Lagrangian function, solving the approximate model is shown to be equivalent to solving an affine cubic regularization model with linear equality constraints. The proposed algorithm is proved to achieve sufficient first-order and second-order reduction. Without the switching condition, the convergence of the algorithm is studied and numerical results are given. The filter technique studied in this thesis also provides ideas for further research on multi-objective optimization.
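As a minimal, hypothetical illustration of the two main ingredients above, the Python sketch below implements a bare adaptive-cubic-regularization loop that uses only Cauchy steps of the cubic model, together with a Fletcher–Leyffer-style filter acceptance test. It is not the thesis's algorithm: exact gradients and Hessians are supplied directly (the thesis builds them from derivative-free interpolation models), the bound/linear constraints and the line search are omitted, and the parameter values (`eta`, `gamma`, the `sigma` update factors) are illustrative choices only.

```python
import numpy as np

def cauchy_step(g, H, sigma):
    """Cauchy step of the cubic model
        m(s) = g.T s + 0.5 s.T H s + (sigma/3) ||s||^3,
    i.e. its minimizer along the steepest-descent ray s = -t*g, t >= 0.
    Setting d/dt m(-t*g) = 0 gives the scalar quadratic
        (sigma ||g||^3) t^2 + (g.T H g) t - ||g||^2 = 0,
    whose unique positive root is taken in a cancellation-free form."""
    gn = np.linalg.norm(g)
    a = sigma * gn**3               # t^2 coefficient (> 0 for sigma > 0)
    b = g @ H @ g                   # t coefficient
    disc = np.sqrt(b * b + 4 * a * gn**2)
    t = 2 * gn**2 / (b + disc)      # positive root; b + disc > 0 always
    return -t * g

def arc_minimize(f, grad, hess, x, sigma=1.0, eta=0.1, tol=1e-8, max_iter=500):
    """Bare-bones adaptive cubic regularization loop: accept the step and
    shrink sigma if the actual reduction is at least a fraction eta of the
    reduction predicted by the cubic model; otherwise increase sigma, which
    shortens the next step (much like shrinking a trust region radius)."""
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        s = cauchy_step(g, H, sigma)
        pred = -(g @ s + 0.5 * s @ H @ s + sigma / 3 * np.linalg.norm(s)**3)
        rho = (f(x) - f(x + s)) / pred        # actual vs. predicted reduction
        if rho >= eta:
            x, sigma = x + s, max(1e-8, 0.5 * sigma)   # success: relax sigma
        else:
            sigma *= 2.0                               # failure: regularize more
    return x

def filter_accepts(filt, theta, fval, gamma=1e-5):
    """Fletcher-Leyffer-style filter test: the trial pair
    (theta, fval) = (constraint violation, objective value) is acceptable
    iff it is not dominated, up to a small margin, by any pair in the filter."""
    return all(theta <= (1 - gamma) * th or fval <= fv - gamma * th
               for th, fv in filt)

# Toy unconstrained example: f(x) = x0^2 + 2*x1^2, minimizer at the origin.
f    = lambda x: x[0]**2 + 2 * x[1]**2
grad = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])
hess = lambda x: np.diag([2.0, 4.0])
x_star = arc_minimize(f, grad, hess, np.array([3.0, -2.0]))
```

Note how the reciprocal relationship mentioned in the abstract appears here: doubling `sigma` plays the same damping role as halving a trust region radius, and in the filter test a trial point needs to improve only one of the two criteria (objective or constraint violation) to be accepted.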
Keywords/Search Tags: adaptive regularization with cubics, filter line search, derivative-free interpolation, affine scaling, global convergence, local convergence rate