
The Research Of Several Methods On Unconstrained Optimization And Tensor Eigenvalue Problems

Posted on: 2021-02-16
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y T Chen
Full Text: PDF
GTID: 1360330623977219
Subject: Operational Research and Cybernetics
Abstract/Summary:
As the basis of optimization, unconstrained optimization problems have wide applications in many fields, such as transportation, industrial and agricultural production, finance, and trade. A large number of researchers in optimization are devoted to designing new methods with strong theoretical properties and good numerical performance. With the rapid development of information technology, the dimensions of variables have increased dramatically and problem structures have become more complex, so it is particularly important to propose efficient methods for solving large-scale unconstrained optimization problems. Meanwhile, many problems can also be described by tensors, i.e., higher-dimensional arrays. As one of the important research directions in tensor theory, the tensor eigenvalue problem is a hot topic in practical applications. In this dissertation, a three-term conjugate gradient method based on the memoryless BFGS update and a new accelerated conjugate gradient method with parameter selection are proposed for large-scale unconstrained optimization. For the tensor eigenvalue problem, a derivative-free spectral conjugate gradient method is presented for the Z-eigenvalue problem of symmetric tensors, and an adaptive trust-region method is proposed for the generalized eigenvalue problem of symmetric tensors. The specific research contents and innovations are as follows:

Firstly, combined with the memoryless BFGS quasi-Newton update, a three-term conjugate gradient method is derived for large-scale unconstrained optimization. The search direction can be regarded as a linear combination of the current negative gradient, the difference between the two most recent iterates, and the difference between the two most recent gradients, and it satisfies both the descent condition and the Dai-Liao conjugacy condition. Under appropriate conditions, global convergence is established, and the numerical results are promising. Since neither the Hessian matrix of the objective function nor its approximation needs to be computed or stored during the execution of the method, it is well suited to large-scale problems.

Secondly, a new accelerated conjugate gradient method is developed for large-scale unconstrained optimization. The step length is modified by the acceleration scheme in a multiplicative manner when certain conditions hold. The generated search direction satisfies both the sufficient descent condition and the Dai-Liao conjugacy condition. Moreover, the parameter values carry more useful information without adding computational cost or storage requirements, which improves the numerical performance. The global convergence of the method is established under proper assumptions. Numerical experiments show that the method is competitive for unconstrained optimization problems, especially when the dimension is very large.

Thirdly, the Z-eigenvalue problem of symmetric tensors is transformed into a system of nonlinear equations. Based on a derivative-free scheme, a spectral conjugate gradient method that requires no gradient information is designed. The advantage of the method is that the Jacobian matrix of the underlying function, or an approximation of it, need not be computed or stored, so the tedious calculations and high memory requirements of the tensor eigenvalue problem are avoided. Under a backtracking line search, global convergence is proved. Preliminary numerical results show the effectiveness and feasibility of the method.
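To make the reformulation in the third part concrete, here is a minimal illustrative sketch (ours, not the dissertation's actual algorithm): for an order-3 symmetric tensor A, a Z-eigenpair A x^2 = λx with ||x|| = 1 is recast as the nonlinear system F(x) = A x^2 - (x^T A x^2) x = 0, which a Barzilai-Borwein-type spectral residual iteration solves using only residual differences, with no Jacobian ever formed. All function names are hypothetical, and the safeguards of the actual method (a backtracking line search in particular) are omitted, so convergence of this toy loop is not guaranteed.

import numpy as np

def txm(A, x):
    # Tensor-vector product A x^{m-1} for an order-3 symmetric tensor A.
    return np.einsum('ijk,j,k->i', A, x, x)

def z_residual(A, x):
    # F(x) = A x^2 - lambda * x, with lambda = x^T (A x^2) on the unit sphere.
    Ax = txm(A, x)
    lam = x @ Ax
    return Ax - lam * x, lam

def df_spectral(A, x0, tol=1e-8, max_iter=1000):
    # Toy derivative-free spectral residual iteration: no Jacobian is formed;
    # the step length sigma is a Barzilai-Borwein quotient built from iterate
    # and residual differences, and each iterate is renormalized to the sphere.
    # The dissertation's method adds a backtracking line search, omitted here.
    x = x0 / np.linalg.norm(x0)
    F, lam = z_residual(A, x)
    sigma = 1.0
    for _ in range(max_iter):
        if np.linalg.norm(F) < tol:
            break
        x_new = x - sigma * F
        x_new /= np.linalg.norm(x_new)
        F_new, lam = z_residual(A, x_new)
        s, y = x_new - x, F_new - F
        sy = s @ y
        sigma = (s @ s) / sy if abs(sy) > 1e-16 else 1.0
        x, F = x_new, F_new
    return lam, x

# Toy run on a random symmetric order-3 tensor.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4, 4))
A = sum(A.transpose(p) for p in
        [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]) / 6
lam, x = df_spectral(A, rng.standard_normal(4))
print('lambda =', lam, ' residual =', np.linalg.norm(z_residual(A, x)[0]))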
Fourthly, by the variational principle, the generalized eigenvalue problem of symmetric tensors is transformed into a homogeneous polynomial optimization problem over the unit sphere. Combining a projection scheme with an adaptive technique, an adaptive trust-region method for the generalized eigenvalue problem of symmetric tensors is proposed. The method not only guarantees the feasibility of each iterate but also updates the trust-region radius automatically; a toy sketch of the projection idea is given after the concluding paragraph below. Global convergence, local quadratic convergence, and a second-order necessary condition of the method are proved, respectively. Numerical comparisons with several existing methods show that the proposed method is efficient.

Finally, we conclude the dissertation with a summary of the main results and outline goals and directions for future research.
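As a companion illustration for the fourth part, the following toy sketch (again ours, not the dissertation's adaptive trust-region method) shows the variational idea: for order-3 symmetric tensors A and B, the stationary points of the scale-invariant ratio f(x) = (A x^3)/(B x^3) on the unit sphere satisfy A x^2 = λ B x^2 with λ = f(x). A plain gradient step followed by renormalization keeps every iterate feasible, which is the role the projection scheme plays in the actual method; the trust-region model, adaptive radius, and convergence guarantees are omitted, and the sketch assumes B x^3 stays positive along the iterates.

import numpy as np

def txm(T, x):
    # T x^2 for an order-3 symmetric tensor T.
    return np.einsum('ijk,j,k->i', T, x, x)

def projected_ascent(A, B, x0, step=0.1, iters=2000, tol=1e-10):
    # Toy gradient ascent on the scale-invariant ratio f(x) = (A x^3)/(B x^3),
    # renormalizing each iterate so feasibility (||x|| = 1) holds throughout.
    # At a stationary point, A x^2 = lambda * B x^2 with lambda = f(x).
    x = x0 / np.linalg.norm(x0)
    lam = 0.0
    for _ in range(iters):
        Ax, Bx = txm(A, x), txm(B, x)
        lam = (x @ Ax) / (x @ Bx)       # current ratio = eigenvalue estimate
        g = Ax - lam * Bx               # proportional to the gradient of f
        if np.linalg.norm(g) < tol:
            break
        x = x + step * g / (x @ Bx)
        x /= np.linalg.norm(x)          # projection back onto the unit sphere
    return lam, x

The renormalization step is the simplest instance of the projection scheme: every trial point is pulled back to the unit sphere, so the spherical constraint never has to be handled inside the model subproblem.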
Keywords/Search Tags: Unconstrained optimization, Conjugate gradient method, Trust-region method, Symmetric tensors, Eigenvalue, Global convergence