
Algorithm Research Of Fast Low-rank Matrix And Tensor Recovery

Posted on: 2014-01-09
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y Y Liu
Full Text: PDF
GTID: 1228330398998891
Subject: Pattern Recognition and Intelligent Systems

Abstract/Summary:
With the rapid development of sensor technology, multimedia technology, computer networks, and communication technology, high-dimensional data with complex structures, such as face images, surveillance videos, and multispectral images, are becoming ubiquitous across many areas of science and engineering. These data often suffer from missing entries, loss, or corruption by noise or outliers, which makes low-rank matrix and tensor recovery a central topic in machine learning, data mining, pattern recognition, and computer vision.

In recent years, nuclear norm minimization methods for low-rank matrix and tensor recovery and completion have come into wide use. However, these methods must be solved iteratively and require a singular value decomposition (SVD) or eigenvalue decomposition (EVD) at each iteration, so they suffer from the high computational cost of repeated SVDs or EVDs. Matrix and tensor factorization, another class of approaches for low-rank recovery, is not robust to noise or to misspecification of the given rank. To mitigate the SVD or EVD computation in nuclear norm minimization problems, this thesis studies fast low-rank matrix and tensor recovery and completion in terms of modeling, algorithm design, and algorithm analysis. The author's major contributions are outlined as follows:

1. A fast matrix tri-factorization (FTF) based nuclear norm minimization framework is proposed and applied to three classes of low-rank matrix reconstruction problems: low-rank matrix completion (MC), low-rank and sparse matrix decomposition (LRSD), and low-rank representation (LRR). Motivated by the SVD and by nonnegative matrix factorization, the matrix tri-factorization idea is first introduced into the original nuclear norm minimization models, and three small-scale matrix nuclear norm minimization models for LRR, LRSD, and MC are then presented.
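As a minimal illustration (not code from the thesis), the per-iteration SVD bottleneck that motivates this line of work can be seen in the singular value thresholding operator, the proximal operator of the nuclear norm that standard solvers must evaluate at every iteration:

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: prox of tau * nuclear norm.

    Each call requires a full SVD of X, which is the per-iteration
    cost that factorization-based frameworks aim to avoid.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # shrink singular values toward zero
    return U @ np.diag(s_shrunk) @ Vt

# Toy example: threshold a small random matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))
Y = svt(X, 0.5)
```

Every singular value of `Y` equals the corresponding singular value of `X` reduced by 0.5 and floored at zero; the SVD cost of each such call is what the tri- and bi-factorization frameworks below are designed to sidestep.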
Moreover, three alternating direction method (ADM) based iterative algorithms are developed to solve the proposed models efficiently. Experimental results on a variety of synthetic and real-world datasets validate the efficiency, robustness, and effectiveness of the proposed FTF approach compared with state-of-the-art algorithms.

2. A novel matrix bi-factorization (MBF) nuclear norm regularization framework with a linearized optimization technique is proposed. Under this framework, a matrix bi-factorization linearized low-rank representation model is designed. The proposed MBF method can also address a wide range of low-rank matrix recovery and completion problems, such as robust principal component analysis (RPCA) and low-rank matrix completion (MC). Two concrete linearized proximal alternating optimization algorithms are then developed to solve these problems, and a theoretical analysis of the proposed algorithms is presented.

3. The matrix factorization based nuclear norm regularization framework is used to tackle the low-rank representation and low-rank tensor completion problems. First, a novel matrix factorization based low-rank representation model with a positive semidefinite (PSD) constraint is presented. A fast matrix factorization based low-rank tensor completion model is also designed. Two efficient solution schemes, combining a first-order method for non-smooth problems with QR decomposition for the orthogonality constraints, are then developed to solve the two proposed models. Promising experimental results on synthetic and real-world data validate both the efficiency and the effectiveness of the two proposed approaches.

4. The Riemannian manifold idea is introduced into nuclear norm least squares minimization (NNLSM) problems, and a hybrid gradient method, alternating between gradient descent on Euclidean space and on a Riemannian manifold, is proposed for solving the NNLSM problem.
The proposed method mitigates the SVD computational burden for large-scale matrices, and the iterative scheme converges to the optimal solution of the original NNLSM problem. First, the original cost function is approximated using a linearization technique, and the linearized cost function on Euclidean space is converted into a nuclear norm minimization over a Grassmann manifold via the matrix bi-factorization approach. Furthermore, a Riemannian gradient is derived for the non-smooth cost function over the Grassmann manifold, and an alternating iterative scheme is provided. Finally, a convergence analysis of the proposed algorithm is presented.

5. A novel core tensor nuclear norm minimization method for low-rank tensor completion problems is proposed. First, the classical Tucker tensor factorization is introduced into the original tensor n-rank minimization model. The tractable convex relaxation (the nuclear norm) is used in place of the n-rank of the involved tensors, yielding a much smaller-scale weighted matrix nuclear norm minimization model. Finally, an efficient solution scheme under the alternating direction method of multipliers (ADMM) framework is developed.

6. A novel factor matrix rank minimization framework is proposed for low-rank tensor completion problems. First, factor matrix rank minimization based on the CANDECOMP/PARAFAC (CP) factorization is used in place of the low-n-rank minimization of the involved tensors employed by popular low-rank tensor completion approaches. A much smaller-scale factor matrix nuclear norm minimization model is then presented. Finally, an efficient solution scheme under the ADMM framework is designed. Promising experimental results on synthetic and real-world data, including natural images and hyperspectral images, validate both the efficiency and the effectiveness of the proposed approach.
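The n-rank appearing in contributions 5 and 6 is the tuple of ranks of a tensor's mode-n unfoldings. As a small illustrative sketch (assumed notation, not the thesis's implementation), the mode-n matricization can be written as:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: arrange the mode-`mode` fibers of T as columns.

    The n-rank of T is the tuple (rank(unfold(T, 0)), ..., rank(unfold(T, N-1))),
    which nuclear-norm-based tensor completion models relax mode by mode.
    """
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# A rank-1 tensor (outer product of three vectors) has n-rank (1, 1, 1).
u = np.array([1.0, 2.0])
v = np.array([1.0, 0.0, 1.0])
w = np.array([1.0, 2.0, 3.0, 4.0])
T = np.einsum('i,j,k->ijk', u, v, w)   # shape (2, 3, 4)
ranks = tuple(np.linalg.matrix_rank(unfold(T, m)) for m in range(3))
```

Here `ranks` comes out as `(1, 1, 1)`, matching the construction; for a general tensor, the sum (or weighted sum) of the nuclear norms of these unfoldings is the convex surrogate referred to in the text.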
Keywords/Search Tags: Low-rank matrix recovery, Low-rank matrix completion, Nuclear norm, l1 or l2,1 norm, Low-rank representation, Robust principal component analysis, Low-rank tensor completion