
Research On Data Completion Algorithm Based On Low-rank Tensor Ring Decomposition

Posted on: 2022-12-21
Degree: Master
Type: Thesis
Country: China
Candidate: J M Yao
Full Text: PDF
GTID: 2518306764972569
Subject: Automation Technology
Abstract/Summary:
With the continuous development of modern information technology, ubiquitous terminal devices produce massive data with complex structure. Because tensors are well suited to representing multi-order, multi-dimensional data, tensor analysis is widely used. During data collection and transmission, it is inevitable that some elements of the data are lost; incomplete data greatly reduces data quality and hinders subsequent analysis. To address missing entries in tensor data, low-rank tensor completion has attracted much attention. In recent years, tensor ring (TR) decomposition has achieved excellent performance in tensor completion owing to its powerful and general representation ability. However, existing TR-based completion algorithms rely heavily on the initial rank selection and incur high computational overhead and time cost, which limits their practical application. This thesis therefore studies data completion based on low-rank tensor ring decomposition and proposes two completion models, one from the direction of tensor ring decomposition and one from tensor ring rank minimization. The main contributions are as follows:

1. To address the dependence of traditional decomposition-based completion algorithms on initial rank selection, which undermines the stability and validity of the recovered results, a low-rank tensor ring completion algorithm based on factor priors is proposed; it performs tensor ring decomposition and completion simultaneously. The algorithm combines the TR decomposition model with a tensor rank minimization model in a hierarchical tensor decomposition design. In the first layer, the incomplete tensor is represented as a sequence of third-order factors via TR decomposition. In the second layer, the transformed tensor nuclear norm imposes a low-rank constraint on the factors, and two factor-prior strategies, graph regularization and a smoothing constraint, are considered. The algorithm thus exploits both the low-rank structure of the factor space and the prior information: on the one hand, the model adjusts the rank implicitly, making rank selection more robust; on the other hand, it fully uses the prior information in the data to further improve recovery performance. The effectiveness of the algorithm is verified on several visual datasets.

2. To address the low efficiency of existing TR-based completion algorithms caused by their high computational complexity, a smooth tensor ring completion algorithm based on the Laplace function is proposed; it simultaneously exploits the global low-rank structure and the local smooth structure of multidimensional data. The algorithm first introduces the tensor circular unfolding operation and uses the Laplace function to approximate the tensor ring rank. This avoids tedious initial rank tuning, greatly reduces the computational cost in practice, and automatically applies different degrees of shrinkage to the singular values, yielding a better rank approximation. Then, the piecewise smoothness of visual data is exploited through total variation regularization, which provides additional auxiliary information under high missing rates. In addition, an order-raising (augmentation) method for visual data reconstructs the original low-order tensors into higher-order tensors, which helps exploit the local structural information of the data and further improves completion performance. Extensive experiments demonstrate the efficiency of the proposed algorithm.
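As background for both models: the tensor ring format represents an N-th-order tensor by N third-order cores contracted in a closed ring. A minimal NumPy sketch of reconstructing the full tensor from its cores (function and variable names are illustrative, not from the thesis):

```python
import numpy as np

def tr_to_full(cores):
    """Contract tensor ring cores G_k of shape (r_k, n_k, r_{k+1})
    (with r_{N+1} = r_1, closing the ring) into the full tensor."""
    chain = cores[0]                       # (r_1, n_1, r_2)
    for G in cores[1:]:
        # (r_1, ..., r_k) x (r_k, n_k, r_{k+1}) -> (r_1, ..., n_k, r_{k+1})
        chain = np.tensordot(chain, G, axes=([-1], [0]))
    # close the ring: trace over the first and last bond dimensions
    return np.trace(chain, axis1=0, axis2=-1)

# Sanity check: with all TR ranks equal to 1 the format reduces
# to an outer product of the three mode vectors.
a, b, c = np.random.rand(4), np.random.rand(5), np.random.rand(6)
cores = [a.reshape(1, 4, 1), b.reshape(1, 5, 1), c.reshape(1, 6, 1)]
T = tr_to_full(cores)
assert np.allclose(T, np.einsum('i,j,k->ijk', a, b, c))
```

In the first proposed model, each of these third-order cores is the object on which the transformed tensor nuclear norm and the factor priors are imposed.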
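For the second model, two of the named ingredients can be sketched in a few lines: the tensor circular unfolding, and the Laplace function used as a nonconvex rank surrogate on the singular values of each unfolding. The surrogate sum_i (1 − exp(−σ_i/ε)) counts large singular values as roughly 1 and shrinks small ones toward 0, so it approaches the matrix rank as ε → 0. The parameter ε and the function names below are illustrative assumptions, not the thesis's exact formulation:

```python
import numpy as np

def circular_unfold(T, d, k):
    """Circularly permute the modes of T to start at mode d,
    then matricize the first k permuted modes against the rest."""
    N = T.ndim
    perm = [(d + i) % N for i in range(N)]
    Tp = np.transpose(T, perm)
    rows = int(np.prod(Tp.shape[:k]))
    return Tp.reshape(rows, -1)

def laplace_rank_surrogate(M, eps=0.01):
    """Nonconvex rank surrogate sum_i (1 - exp(-sigma_i / eps)):
    large singular values contribute ~1, small ones nearly 0,
    giving the automatic, value-dependent shrinkage described above."""
    s = np.linalg.svd(M, compute_uv=False)
    return float(np.sum(1.0 - np.exp(-s / eps)))
```

For example, on an identity matrix the surrogate is close to the exact rank (`laplace_rank_surrogate(np.eye(3))` is approximately 3), whereas the nuclear norm would conflate rank with singular-value magnitude.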
Keywords/Search Tags: low-rank tensor completion, tensor ring decomposition, transformed tensor nuclear norm, Laplace function, total variation regularization