
Research On Models And Algorithms For Robust Low-rank Tensor Recovery

Posted on: 2022-08-20 | Degree: Doctor | Type: Dissertation
Country: China | Candidate: D Qiu | Full Text: PDF
GTID: 1488306731483024 | Subject: Mathematics
Abstract/Summary:
Low-rank tensor data, such as hyperspectral images, multispectral images, and video data, play an important role in real-world applications. However, tensor data can be corrupted (for example, by noise or missing values) during acquisition and transmission because of the limitations of imaging equipment, imaging environments, and transmission conditions. Recovering the ground-truth tensor from such degraded observations is therefore an active research topic. To do so, the intrinsic structure of the tensor data should be exploited, that is, prior information about the underlying tensor such as low-rankness, local smoothness, and nonlocal self-similarity. This thesis focuses on how to recover a low-rank tensor by using this prior information.

First, we investigate robust low-rank tensor completion for third-order tensors under several degradations (Gaussian noise, sparse noise, and missing entries), and propose a transformed tensor nuclear norm method that combines the tensor ℓ1 norm with total variation (TV) regularization. The model builds on a recently proposed algebraic framework in which the transformed tensor nuclear norm, defined through suitable unitary transformations, captures a lower transformed multi-rank. The tensor ℓ1 norm detects the sparse noise, while the TV regularization preserves the piecewise smooth structure along the spatial and tubal dimensions and thus retains the edge information of the tensor, so the proposed model recovers the underlying tensor more faithfully. Moreover, a symmetric Gauss-Seidel based alternating direction method of multipliers is developed to solve the resulting model, and its global convergence is established under very mild conditions. Extensive numerical experiments on both hyperspectral image and video datasets demonstrate the superiority of the proposed model over several existing state-of-the-art methods.

Second, for the third-order robust tensor completion problem with mixed degradations, which aims to recover a tensor from partial observations corrupted by Gaussian noise and sparse noise simultaneously, we propose a nonlocal robust low-rank tensor recovery model with nonconvex regularization (NRTRM) that exploits both the global low-rankness and the nonlocal self-similarity of the underlying tensor. NRTRM first extracts similar patch-tubes and stacks them into third-order sub-tensors. A class of nonconvex low-rank penalties and nonconvex sparse penalties is then employed to estimate, respectively, the low-rank component and the sparse corruption of each sub-tensor. Moreover, a proximal alternating linearized minimization algorithm is developed to solve the resulting model in each group, and its convergence is established under very mild conditions. Extensive numerical experiments on both multispectral image and video datasets demonstrate the superior performance of NRTRM in comparison with several state-of-the-art methods.
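To fix ideas, a schematic formulation of the two completion models above is sketched below; the notation and the particular data-fidelity terms are illustrative assumptions rather than the exact formulations from the thesis.

\min_{\mathcal{L},\,\mathcal{S}} \;\|\mathcal{L}\|_{\mathrm{TTNN}} + \lambda_1\|\mathcal{S}\|_1 + \lambda_2\,\mathrm{TV}(\mathcal{L}) + \tfrac{\lambda_3}{2}\,\|P_\Omega(\mathcal{L}+\mathcal{S}-\mathcal{B})\|_F^2,

and, for each extracted patch-group sub-tensor \mathcal{G},

\min_{\mathcal{L},\,\mathcal{S}} \;\Phi(\mathcal{L}) + \lambda\,\Psi(\mathcal{S}) \quad \text{s.t.} \quad P_\Omega(\mathcal{L}+\mathcal{S}) = P_\Omega(\mathcal{G}),

where \|\cdot\|_{\mathrm{TTNN}} is the transformed tensor nuclear norm induced by a chosen unitary transformation, \mathcal{B} is the observed tensor, P_\Omega projects onto the observed entries, \mathrm{TV}(\cdot) is the total variation term, \Phi and \Psi stand for generic nonconvex low-rank and sparsity penalties, and the \lambda's are trade-off parameters. The exact fidelity terms and penalty functions used in the thesis may differ.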
Finally, we propose a combination of the transformed tensor nuclear norm and the tensor ℓ1 norm for the image alignment problem, in which the observed images, stacked into a third-order tensor, are deformed by unknown domain transformations and corrupted by sparse noise such as impulse noise, partial occlusions, and illumination variations. The key advantage of the proposed method is that both the spatial correlation and the variations among the images are captured by the transformed tensor nuclear norm. We show that, when the underlying tensor of correlated images has low multi-rank, an upper error bound for the estimator of the proposed model can be established, and this bound can be better than the previous result. Beyond the proposed convex transformed tensor model, the method is further studied by incorporating nonconvex functions into the transformed tensor nuclear norm and the sparsity norm. Both the convex and nonconvex optimization models are solved by generalized Gauss-Newton algorithms, and the global convergence of the numerical methods for solving the subproblems of the convex and nonconvex models is established under very mild conditions. Extensive numerical experiments on real images with misalignment and sparse corruption demonstrate that the proposed methods outperform several state-of-the-art methods in terms of visual quality and relative error.
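For orientation, a schematic form of the alignment model, again an illustrative sketch rather than the thesis's exact formulation, is

\min_{\mathcal{L},\,\mathcal{S},\,\tau} \;\|\mathcal{L}\|_{\mathrm{TTNN}} + \lambda\,\|\mathcal{S}\|_1 \quad \text{s.t.} \quad \mathcal{D}\circ\tau = \mathcal{L}+\mathcal{S},

where \mathcal{D} stacks the observed misaligned images, \tau collects the unknown domain transformations, and \mathcal{D}\circ\tau denotes the images after applying \tau. Since the constraint is nonlinear in \tau, it is typically handled by successive linearization around the current estimate,

\mathcal{D}\circ\tau^{(k)} + \textstyle\sum_i \mathcal{J}_i\,\Delta\tau_i \;\approx\; \mathcal{L}+\mathcal{S},

with \mathcal{J}_i the Jacobian of the i-th image with respect to its transformation parameters; this linearize-and-solve scheme corresponds to the generalized Gauss-Newton iterations mentioned above. The Jacobian notation here is an assumption made for illustration.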
Keywords/Search Tags:Low-rank tensor recovery, total variation regularization, transformed tensor nuclear norm, nonlocal self-similarity, nonconvex regularization, image alignment, proximal alternating linearized minimization algorithm, proximal Gauss-Seidel method