
L_q Regularization Based Low-Rank Matrix Recovery Algorithms and Their Applications

Posted on: 2021-05-14
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Z Wang
Full Text: PDF
GTID: 1367330611464872
Subject: Statistics
Abstract/Summary:
In the era of big data, humans generate and collect data of ever higher dimension and more complex structure. However, such high-dimensional data are often incomplete or corrupted by noise. Recovering these incomplete data is a hot topic in the fields of machine learning, data mining, signal processing, and computer vision. In recent years, nuclear norm minimization has been the dominant approach to high-dimensional data approximation. However, nuclear norm minimization ignores the differences among the singular values of the target matrix, so its results often deviate from the optimal solution. Besides, the proximal gradient algorithm is a widespread tool for such optimization problems; unfortunately, evaluating its proximal operator requires a full singular value decomposition (SVD) and becomes inefficient for large-scale problems. To address these issues, this thesis studies high-dimensional data approximation in terms of modeling, algorithm design, and algorithm analysis. The main contributions are as follows:

1. We employ the Schatten-2/3 quasi-norm to approximate the rank of a matrix and propose the L2/3 regularization model for low-rank matrix completion. We establish a necessary optimality condition, propose a fixed-point iterative scheme, and analyze its convergence. Extensive experiments on synthetic data, image data, and large-scale recommendation data validate that the proposed algorithm is highly competitive in the field of matrix completion.

2. A flexible model with a novel nonconvex regularizer is proposed. The model not only promotes low-rankness but can also be solved much faster and more accurately. Nesterov's rule and inexact proximal strategies are then adopted to derive a highly efficient algorithm with guaranteed convergence, and its asymptotic convergence rate is analyzed rigorously using the Kurdyka-Łojasiewicz inequality. Furthermore, we apply the proposed optimization model to typical low-rank problems, including matrix completion, robust principal component analysis (RPCA), and tensor completion. Exhaustive empirical studies on data analysis tasks, namely synthetic data analysis, image recovery, personalized recommendation, and background subtraction, indicate that the proposed model outperforms state-of-the-art models in both accuracy and efficiency.

3. We further investigate the basic properties of the TS1 penalty function and propose a highly efficient algorithm for solving low-rank matrix recovery problems. In addition, we theoretically prove that, under certain conditions, the original low-rank matrix recovery problem can be equivalently transformed into the TS1 optimization problem. Finally, extensive experiments on real image data sets show that the proposed algorithm outperforms state-of-the-art methods in both accuracy and efficiency.

4. Observing that the IFISTA algorithm can be further accelerated by a sequence of over-relaxation parameters that do not satisfy Nesterov's rule, we propose an efficient fast thresholding algorithm for image deblurring problems. We theoretically study the convergence of the proposed algorithm and obtain an improved convergence rate. Furthermore, we investigate the local variation of the iterates, which had so far remained unknown for IFISTA-type algorithms. Extensive experiments show that the proposed algorithm is more efficient and robust.
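For context, the proximal operator of the nuclear norm referred to in the abstract is singular value thresholding, whose reliance on a full SVD is exactly the per-iteration bottleneck motivating this thesis. A minimal NumPy sketch (the function name and threshold value are illustrative, not taken from the thesis):

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: the proximal operator of the nuclear
    norm, prox_{tau * ||.||_*}(Y).  The full SVD computed here is the
    per-iteration cost that makes proximal gradient methods slow at scale."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # every singular value is shrunk by the same constant
    return U @ np.diag(s_shrunk) @ Vt
```

Because each singular value is reduced by the same constant tau, large informative singular values are penalized as heavily as small noisy ones; nonconvex penalties such as the Schatten-2/3 quasi-norm or TS1 shrink them unequally, which is the bias the models above aim to remove.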
Keywords/Search Tags:Nonconvex regularization, Proximal operator, Low-rank matrix recovery, Low-rank structure, Robust principal component analysis, Low-rank tensor completion