
Study On A Fast Algorithm For Solving Large-scale Tensor CP Decomposition

Posted on: 2022-05-10  Degree: Master  Type: Thesis
Country: China  Candidate: X P Mao  Full Text: PDF
GTID: 2480306533995979  Subject: Mathematics
Abstract/Summary:
Tensors are a generalization of matrices and vectors, and they have natural advantages in describing big data. They have been widely used in psychometrics, signal processing, the food industry, chemometrics, linguistics, data mining, machine learning, text analysis, and other fields. At this stage, scholars have shifted their focus from matrix decompositions to tensor decompositions, and designing a new, efficient algorithm for the large-scale tensor CP decomposition has become an important research topic. This dissertation studies a fast algorithm for the large-scale tensor CP decomposition. The main work of this paper is to propose an improved acceleration of Proximal Alternating Least Squares (PALS), together with a rigorous convergence analysis. The acceleration is based on an intuitive observation: the original PALS can be understood as a descent method whose step size is fixed at unity. In view of this, for each subproblem, instead of choosing a unit step size, we compute an optimal step size that minimizes a cost function. Due to the special structure of the tensor CP decomposition, this step size can be computed cheaply and is adaptive; we therefore call our method Self-adaptive Proximal Alternating Least Squares. The method has the following properties. Firstly, it can be regarded as a descent method whose step size is self-adaptive. Secondly, the algorithm is suitable for solving large-scale tensor CP decomposition problems. Thirdly, the proposed method inherits the advantages of PALS; in particular, to a certain extent it can reduce the occurrence of the swamp phenomenon, in which the convergence rate slows down dramatically. Furthermore, under appropriate conditions, the new algorithm is proved to have global convergence and a local linear convergence rate. Finally, numerical results show that for the CP decomposition of data tensors of different orders, dimensions, and noise levels, the proposed algorithm outperforms PALS in terms of both the number of iterations and CPU time.
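The core idea above — treat the proximal ALS update as a descent direction and replace the implicit unit step with a step size that exactly minimizes the quadratic fit term — can be sketched as follows. This is a minimal illustrative sketch, not the thesis's algorithm: the function name `sapals_sweep`, the proximal parameter `mu`, and the choice to minimize only the fit term (rather than the full proximal cost) in the line search are assumptions made for the example.

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: rows indexed by `mode`, columns by the remaining modes."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of A (I x r) and B (J x r)."""
    r = A.shape[1]
    return (A[:, None, :] * B[None, :, :]).reshape(-1, r)

def sapals_sweep(X, factors, mu=1e-3):
    """One sweep of a self-adaptive proximal ALS update (illustrative sketch).

    For each mode n, solve the proximal least-squares subproblem for that
    factor, then, instead of the unit step used by plain PALS, move along the
    direction D = A_new - A_old with the closed-form step size `alpha` that
    minimizes the quadratic fit term.  `mu` is a hypothetical proximal
    parameter; the exact step-size rule in the thesis may differ.
    """
    N = X.ndim
    r = factors[0].shape[1]
    for n in range(N):
        A_old = factors[n]
        Xn = unfold(X, n)
        # Khatri-Rao product of all other factors (first factor varies slowest),
        # so that Xn ~ A_n @ K.T
        others = [factors[m] for m in range(N) if m != n]
        K = others[0]
        for B in others[1:]:
            K = khatri_rao(K, B)
        # Proximal least-squares subproblem:
        #   A_new = argmin_A ||Xn - A K^T||_F^2 + mu ||A - A_old||_F^2
        G = K.T @ K + mu * np.eye(r)
        A_new = np.linalg.solve(G, (Xn @ K + mu * A_old).T).T
        # Self-adaptive step: minimize ||Xn - (A_old + alpha*D) K^T||_F^2
        # in alpha; the minimizer has the closed form <R, P> / <P, P>.
        D = A_new - A_old
        R = Xn - A_old @ K.T          # current residual
        P = D @ K.T                   # change of the model along D
        den = np.vdot(P, P)
        alpha = np.vdot(R, P) / den if den > 0 else 1.0
        factors[n] = A_old + alpha * D
    return factors
```

Because each update minimizes the fit term along the direction `D` starting from the previous iterate, a sweep can never increase the fit residual, which is what makes the step size "self-adaptive" rather than fixed at one.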
Keywords/Search Tags: Tensor Decomposition, Alternating Least Squares Method, Self-adaptive, Łojasiewicz Gradient Inequality, Convergence