High-order data, including images, videos, and traffic flow data, has garnered much attention in the field of data processing. However, missing entries may arise due to factors such as improper operations and incomplete data during acquisition or transmission. Recovering high-quality data from incomplete high-order data is an ill-posed inverse problem and stands as a critical challenge in the field of data science. Tensor completion aims to reconstruct the missing entries by leveraging the low-rank prior. The recovery performance largely depends on the chosen low-rank tensor structure, specifically, its sample complexity. Nevertheless, current methods encounter challenges such as insufficient exploration of low-rank information in traditional tensor decompositions and instability in tensor ring (TR) decomposition, leading to suboptimal tensor completion outcomes. Furthermore, in scenarios with few observed samples, existing low-rank tensor completion methods fail to provide effective solutions. This dissertation aims to reduce sample complexity and to investigate flexible and stable low-rank tensor representations as well as tensor completion models enhanced with auxiliary information, developing a series of low-rank tensor approximation methods for analyzing data with missing components:

(1) To address the insufficient exploration of low-rank information in CP and Tucker methods, a novel tensor completion model is introduced that incorporates constraints based on multiple low-rank structures. This model optimizes a linear combination of the CP rank and the Tucker rank, harnessing the advantages of both heterogeneous tensor decomposition mechanisms to flexibly explore high-order data. Compared with traditional approaches relying on a single tensor decomposition (CP or Tucker), the proposed method achieves significant improvements of 10% and 40%, respectively, in PSNR on dynamic magnetic resonance images with a sample ratio (SR) of 5%.
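To make the idea concrete, a minimal schematic of such a multi-structure objective is given below; the weighting parameter, the sampling operator, and the observed-tensor symbol are illustrative notation rather than the exact formulation used in the dissertation, and each rank term would in practice be replaced by a tractable surrogate:

\[
\min_{\mathcal{X}} \;\; \alpha\,\operatorname{rank}_{\mathrm{CP}}(\mathcal{X}) + (1-\alpha)\,\operatorname{rank}_{\mathrm{Tucker}}(\mathcal{X})
\quad \text{s.t.} \quad \mathcal{P}_{\Omega}(\mathcal{X}) = \mathcal{P}_{\Omega}(\mathcal{T}),
\]

where \(\mathcal{T}\) is the partially observed tensor, \(\Omega\) indexes the observed entries, \(\mathcal{P}_{\Omega}\) is the corresponding sampling operator, and \(\alpha \in [0,1]\) balances the two heterogeneous rank measures.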
(2) To tackle the instability arising from TR decomposition, which makes low-rank TR completion sensitive to the choice of rank, this dissertation first proposes a Bayesian low-rank TR completion method for image recovery. At each iteration, the horizontal and frontal slices with zero components are pruned, enabling automatic determination of the TR rank. Second, this dissertation further shows that tensor representations in TR format with predefined bounds on the TR ranks do not form a closed set, which makes the computation of TR approximations unstable. Based on this finding, a non-negative low-rank TR completion method is proposed. Numerical experiments on synthetic data, real-world color images, and the Yale Face dataset demonstrate that the proposed methods outperform state-of-the-art methods, especially in terms of recovery accuracy.

(3) To tackle the “cold start” problem arising from an extremely limited sample size, two low-rank tensor completion models enhanced with auxiliary information are introduced. First, this dissertation incorporates additional auxiliary information, i.e., the local piecewise smoothness of the data, and develops a new optimization model for tensor completion that jointly minimizes the tensor tree rank and the total variation. Compared with methods without smoothness constraints, the reconstruction accuracy improves by nearly 20% on images with severe random scratches. Second, by leveraging auxiliary data, a trainable subspace tensor completion method with theoretical guarantees is presented. This model incorporates auxiliary data by assuming that the tensor to be recovered comprises two low-rank components: one component shares subspace information with the auxiliary data, while the other exists outside this shared subspace (a schematic formulation is sketched at the end of this abstract). Compared with the traditional low-rank Tucker method, the proposed method demonstrates a nearly 50% improvement in the reconstruction accuracy of hyperspectral images (HSI) with SR = 2%.

In conclusion, this dissertation addresses challenges encountered in tensor completion, including inadequate exploration in traditional tensor decompositions and instability in tensor ring decomposition. With the objective of reducing sample complexity, this dissertation first develops a series of low-rank tensor approximation models, exemplified by multi-structure tensor decomposition, Bayesian TR decomposition, and non-negative TR decomposition, enhancing the performance of current tensor completion methods. Second, to address the “cold start” problem in tensor completion, this dissertation proposes a smooth tensor tree completion model and a trainable subspace tensor completion model, effectively lowering the sample complexity and providing new approaches for tackling tensor completion problems with extremely limited samples.
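For the trainable subspace model in contribution (3), a minimal schematic of the two-component assumption is as follows; the subspace symbol, the auxiliary tensor, and the sampling operator below are illustrative notation rather than the exact formulation used in the dissertation, and the rank terms would in practice be replaced by tractable surrogates:

\[
\min_{\mathcal{X}_1,\,\mathcal{X}_2} \;\; \operatorname{rank}(\mathcal{X}_1) + \operatorname{rank}(\mathcal{X}_2)
\quad \text{s.t.} \quad
\mathcal{P}_{\Omega}(\mathcal{X}_1 + \mathcal{X}_2) = \mathcal{P}_{\Omega}(\mathcal{T}), \quad
\mathcal{X}_1 \in \mathcal{S}_{\mathcal{A}}, \quad
\mathcal{X}_2 \in \mathcal{S}_{\mathcal{A}}^{\perp},
\]

where \(\mathcal{T}\) is the partially observed tensor, \(\mathcal{P}_{\Omega}\) samples the observed entries, \(\mathcal{S}_{\mathcal{A}}\) denotes the subspace shared with (trained from) the auxiliary data \(\mathcal{A}\), and \(\mathcal{S}_{\mathcal{A}}^{\perp}\) schematically stands for its complement; the recovered tensor is \(\mathcal{X}_1 + \mathcal{X}_2\), with \(\mathcal{X}_1\) inside the shared subspace and \(\mathcal{X}_2\) outside it.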