In today's society, the amount of data is growing at an unprecedented rate, and ever more large-scale, high-order data sets need to be processed. Tensor approximation is an effective tool for handling such high-order data and has attracted considerable attention from researchers in recent years. Tensor CANDECOMP/PARAFAC (CP) low-rank approximation, one of the most important tensor approximation models, has been widely used in many fields, such as signal processing, image processing, and machine learning. Faced with today's huge data sets, the computational efficiency of existing tensor CP low-rank approximation algorithms needs to be improved, so designing efficient tensor CP low-rank approximation algorithms is both valuable and meaningful. In this thesis, efficient algorithms are proposed for unconstrained tensor CP low-rank approximation and for nonnegative tensor CP low-rank approximation, respectively. The specific research contents are as follows:

(1) For unconstrained tensor CP low-rank approximation, this thesis proposes an optimization algorithm based on the spectral conjugate gradient method. First, the tensor CP low-rank approximation problem is reformulated as a general unconstrained optimization problem. Second, by combining the ideas of the spectral gradient method and nonlinear conjugate gradient methods, an optimization algorithm based on the spectral conjugate gradient method is designed, and its convergence is analyzed. Finally, experiments are carried out on an amino acid data set and on synthetic data sets. The numerical results show that, compared with existing gradient-based optimization algorithms, the proposed algorithm requires less computation time and a lower computational cost while achieving the same accuracy.

(2) Because tensor data are nonnegative in many applications, this thesis also designs an efficient algorithm for tensor CP low-rank approximation with nonnegativity constraints. The nonnegative tensor CP low-rank approximation problem is transformed into N nonnegative least squares subproblems, and a quadratic regularized projection Barzilai-Borwein algorithm is used to solve each of them. In each iteration of a subproblem, a point is first generated by solving a strongly convex quadratic minimization problem, and the projection Barzilai-Borwein method is then applied to update the subproblem's solution. To improve the overall convergence speed of the algorithm, an extrapolation acceleration strategy is applied after each subproblem update. Extensive experiments on simulated and real data sets verify the efficiency of the proposed algorithm. The newly proposed extrapolation strategy is then applied to other nonnegative tensor CP low-rank approximation algorithms under the block coordinate descent framework, and numerical experiments show that it greatly improves their computational efficiency. Finally, a Tucker compression technique is adopted to further improve the computational efficiency of the proposed algorithm.
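To make the unconstrained formulation in (1) concrete, the following minimal numpy sketch writes the CP fitting objective 0.5*||X - [[A_1, ..., A_N]]||_F^2 over the factor matrices, its factor-wise gradients, and an illustrative spectral conjugate gradient direction d = -theta*g + beta*d_prev (a Barzilai-Borwein-type spectral scaling combined with a Fletcher-Reeves-type conjugacy parameter). All function names and the particular choices of theta and beta are assumptions made for illustration, not the exact formulas of the proposed algorithm.

```python
import numpy as np

def khatri_rao(mats):
    """Column-wise Khatri-Rao product of matrices that all have R columns."""
    out = mats[0]
    for M in mats[1:]:
        out = (out[:, None, :] * M[None, :, :]).reshape(-1, M.shape[1])
    return out

def unfold(X, mode):
    """Mode-n unfolding consistent with C-ordered reshaping."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_loss_and_grads(X, factors):
    """f(A_1,...,A_N) = 0.5*||X - [[A_1,...,A_N]]||_F^2 and its factor gradients."""
    N = X.ndim
    grads = []
    for n in range(N):
        K = khatri_rao([factors[m] for m in range(N) if m != n])
        grads.append(factors[n] @ (K.T @ K) - unfold(X, n) @ K)
    K0 = khatri_rao(factors[1:])
    resid = unfold(X, 0) - factors[0] @ K0.T
    return 0.5 * np.sum(resid ** 2), grads

def spectral_cg_direction(g, g_prev, d_prev, s_prev):
    """Illustrative spectral CG direction d = -theta*g + beta*d_prev, where g is the
    flattened concatenation of all factor gradients and s_prev is the previous step.
    theta (BB-type) and beta (Fletcher-Reeves-type) are stand-ins for the thesis's choices."""
    y = g - g_prev
    theta = s_prev.dot(s_prev) / max(s_prev.dot(y), 1e-12)   # spectral (BB) scaling
    beta = g.dot(g) / max(g_prev.dot(g_prev), 1e-12)         # FR-type conjugacy parameter
    return -theta * g + beta * d_prev
```

A full solver would flatten the gradient list into one vector, take a line-search step along d, and iterate; that outer loop, and the convergence analysis in the thesis, are omitted here.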
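For the nonnegative subproblems in (2), each block update takes the form min over B >= 0 of 0.5*||X_(n) - B K_n^T||_F^2, where X_(n) is the mode-n unfolding and K_n is the Khatri-Rao product of the other factors. The sketch below shows a plain projected Barzilai-Borwein iteration for this subproblem and the one-line extrapolation step applied afterwards. The quadratic regularized step that the thesis inserts before the projected BB update, the extrapolation weight schedule, and safeguards such as a nonmonotone line search are omitted; all names are illustrative.

```python
import numpy as np

def projected_bb_nnls(M, K, B0, iters=100, tol=1e-8):
    """Projected Barzilai-Borwein sketch for min_{B>=0} 0.5*||M - B @ K.T||_F^2,
    where M is the mode-n unfolding and K the Khatri-Rao product of the other factors."""
    KtK, MK = K.T @ K, M @ K
    B = np.maximum(B0, 0.0)
    G = B @ KtK - MK                                   # gradient at B
    alpha = 1.0 / max(np.linalg.norm(KtK, 2), 1e-12)   # initial step: 1 / Lipschitz constant
    for _ in range(iters):
        B_new = np.maximum(B - alpha * G, 0.0)         # projected gradient step
        G_new = B_new @ KtK - MK
        s, y = B_new - B, G_new - G
        if np.linalg.norm(s) <= tol * max(1.0, np.linalg.norm(B)):
            return B_new
        sy = np.vdot(s, y)
        alpha = np.vdot(s, s) / sy if sy > 1e-12 else alpha   # BB1 step length
        B, G = B_new, G_new
    return B

def extrapolate(B, B_prev, omega):
    """Extrapolation applied after a block update; the weight schedule follows the thesis."""
    return np.maximum(B + omega * (B - B_prev), 0.0)
```

In the full algorithm, this solver would presumably be called once per mode within a block coordinate descent sweep, with the extrapolated factor carried into the remaining modes' updates; the exact placement and weight update are specified in the thesis.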
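The Tucker compression mentioned at the end of (2) is a standard preprocessing idea: compress the tensor with a truncated higher-order SVD, run the CP solver on the small core, and expand the resulting factors back through the orthonormal Tucker factors as A_n ≈ U_n A_core_n. The numpy sketch below shows only this generic compression step, under assumed names; how the nonnegativity constraints are handled after compression follows the thesis and is not reproduced here.

```python
import numpy as np

def unfold(X, mode):
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def mode_dot(X, U, mode):
    """Multiply tensor X by matrix U along the given mode."""
    Xm = np.moveaxis(X, mode, 0)
    Y = (U @ Xm.reshape(Xm.shape[0], -1)).reshape((U.shape[0],) + Xm.shape[1:])
    return np.moveaxis(Y, 0, mode)

def tucker_compress(X, ranks):
    """Truncated HOSVD: orthonormal factors U_n and the compressed core tensor.
    CP factors computed on the small core expand back as A_n ~= U_n @ A_core_n."""
    Us = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
          for n, r in enumerate(ranks)]
    core = X
    for n, Un in enumerate(Us):
        core = mode_dot(core, Un.T, n)
    return core, Us
```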