
Research On Two Classes Of Three-Dimensional Subspace Conjugate Gradient Algorithms

Posted on: 2023-03-11
Degree: Master
Type: Thesis
Country: China
Candidate: J L Yang
Full Text: PDF
GTID: 2530306794477454
Subject: Mathematics
Abstract/Summary:
The conjugate gradient method is widely used to solve unconstrained optimization problems because of its simple iterative form, low computation and storage requirements, and fast convergence. The subspace method simplifies the original problem by minimizing an approximate model of the objective function over a given subspace, which reduces the computation and storage per iteration and makes it well suited to large-scale optimization problems. In recent years, researchers have combined the conjugate gradient method with subspace techniques to develop subspace conjugate gradient algorithms for solving large-scale unconstrained optimization problems.

In this thesis, two classes of three-dimensional subspace conjugate gradient algorithms are studied for solving large-scale unconstrained optimization problems. The subspace method is embedded into the conjugate gradient algorithm, with emphasis on the construction of the three-dimensional subspace, the selection of the approximate model, and the treatment of the embedded parameters. The specific research contents are as follows.

A special three-dimensional subspace for the search direction at the current iterate is constructed from a modified gradient change, the gradient at the current iterate, and the search direction of the previous iteration. In Chapter 3, a class of three-dimensional subspace conjugate gradient algorithms is given by embedding the resulting search direction into the conjugate gradient algorithm. The search direction is determined by minimizing a quadratic approximation model of the objective function over this subspace, and the embedded parameters are estimated using a cosine-square mean and the BBCG method. Global convergence of the algorithm is established for general non-convex objective functions. Finally, the algorithm is applied in numerical experiments to large-scale unconstrained optimization problems and image restoration problems, and the numerical results show that it is robust and efficient.

When the iterate is far from the optimal solution of the function, or the objective function is strongly non-quadratic, an algorithm based on a quadratic approximation model converges slowly. To address this issue, a three-dimensional subspace conjugate gradient algorithm based on a cubic regularization model is discussed in Chapter 4. The regularization parameter in the cubic regularization model is updated by an interpolation function. According to a criterion on the approximate model, the algorithm adaptively selects either the quadratic approximation or the cubic regularization model to approximate the objective function by adjusting the regularization parameter, and the corresponding subspace conjugate gradient algorithm is obtained by minimizing the chosen approximation model over the given subspace. The algorithm is globally convergent for general non-convex objective functions, and numerical experiments show that it is robust and efficient.
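To make the two approximation models concrete, the following minimal LaTeX sketch restates them in standard subspace conjugate gradient notation; the symbols used here (the modified gradient change \bar{y}_k, the current gradient g_{k+1}, the previous search direction d_k, a Hessian approximation B_{k+1}, and a regularization parameter \sigma_{k+1}) are assumptions for illustration and may differ from the notation used in the thesis itself.

\[
\begin{aligned}
&\mathcal{S}_{k+1} = \operatorname{span}\{\bar{y}_k,\ g_{k+1},\ d_k\}, \\
&d_{k+1} = \arg\min_{d \in \mathcal{S}_{k+1}} \; g_{k+1}^{\top} d + \tfrac{1}{2}\, d^{\top} B_{k+1} d
  \quad \text{(quadratic model, Chapter 3)}, \\
&d_{k+1} = \arg\min_{d \in \mathcal{S}_{k+1}} \; g_{k+1}^{\top} d + \tfrac{1}{2}\, d^{\top} B_{k+1} d
  + \tfrac{\sigma_{k+1}}{3}\, \|d\|^{3}
  \quad \text{(cubic regularization model, Chapter 4)}.
\end{aligned}
\]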
Keywords/Search Tags: Unconstrained Optimization, Conjugate Gradient Method, Subspace Method, Approximate Model