
Nonconvex Low-rank Matrix Recovery Models And Algorithms

Posted on: 2020-11-19    Degree: Doctor    Type: Dissertation
Country: China    Candidate: H M Zhang    Full Text: PDF
GTID: 1368330602961094    Subject: Control Science and Engineering
Abstract/Summary:
Taking high-dimensional data compression and recovery as its research background, this thesis traces the development from Shannon sampling theory to sparse representation and compressive sensing, and on to low-rank matrix recovery. This leads to relaxation functions for the matrix rank and their decomposition formulations, the corresponding mathematical models, and optimization algorithms, which are applied to tasks such as image classification, matrix completion, and subspace learning. Since the nuclear norm is the tightest convex approximation of the rank function, a low-rank matrix can be recovered with high probability under certain conditions. Unfortunately, when the rank of the matrix is large, the nuclear-norm solution is biased, and computation is relatively inefficient owing to repeated singular value decompositions (SVD) and large iteration counts. To address these drawbacks, the main contributions of this work are as follows:

(1) With the aid of the nonconvex MCP penalty and weighted sparse coding, we establish two regression models, denoted WS?MR and WS?M2R, and apply them to face recognition problems with structural noise and mixed noise, respectively. The representation coefficients are computed by nonconvex ADMM and nonconvex MM-ADMM, which we analyze in terms of both computational complexity and convergence guarantees. To obtain the recognition results, we design a classifier based on the matrix ? norm derived from the objective function.

(2) Motivated by the weighted nuclear norm and the weighted Schatten-p norm, we propose the general WNNR relaxation function and the related DNNR matrix recovery model. The nonconvex IRSVFc algorithm is designed to compute the low-rank solution. Under mild assumptions, the local and global convergence of the proposed algorithm is guaranteed through the Kurdyka-Łojasiewicz (KL) property of the objective function.

(3) Using the Schatten-p norm and its matrix decomposition to describe the low-rank coefficient matrix, we present two nonconvex low-rank representation models, denoted SpNM_LRR and SpNF_LRR; the former attains higher accuracy, while the latter improves computational efficiency for subspace clustering. Since both models involve multiple variables and/or constraints, we analyze the computational complexity and convergence guarantees of the resulting multi-block nonconvex ADMM.

(4) A class of unconstrained nonconvex nonsmooth optimization problems in two independent variables is solved by PJIM and an accelerated PJIM, which we further generalize to the multi-variable case. These algorithms differ from existing methods such as APG, ADMM, and PALM. Under certain conditions, using the KL property of an objective function constructed from convex components, we establish local and global convergence guarantees for the proposed algorithms.

In summary, the above works center on low-rank matrix recovery problems. The basic idea is to use nonconvex rank surrogates to build mathematical models and to devise first-order optimization algorithms with convergence guarantees. Experimental results on both synthetic data and real-world databases show that the proposed methods outperform several state-of-the-art methods in recovery accuracy and computational efficiency.
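To make the singular-value-shrinkage idea behind the nuclear norm and its weighted relaxations concrete, the following minimal Python sketch shows the proximal operator of the nuclear norm (singular value thresholding) and a weighted variant that shrinks large singular values less, which is the mechanism weighted relaxations use to reduce the bias noted above. The function names, the toy rank-5 completion setup, the threshold tau=1.0, and the iteration count are illustrative assumptions; this is not the thesis's IRSVFc algorithm or any of its models.

    import numpy as np

    def svt(X, tau):
        # Singular value thresholding: prox of tau * nuclear norm.
        # Soft-thresholds every singular value by tau.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U * np.maximum(s - tau, 0.0)) @ Vt

    def weighted_svt(X, weights):
        # Weighted variant: shrinks the i-th singular value by weights[i].
        # Assigning smaller weights to larger singular values reduces the
        # bias of plain nuclear-norm shrinkage (illustrative sketch only).
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U * np.maximum(s - weights, 0.0)) @ Vt

    # Toy matrix completion by iterating SVT on the observed entries
    # (hypothetical setup for illustration).
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))  # rank-5 target
    mask = rng.random(M.shape) < 0.5                                 # observed entries
    X = np.zeros_like(M)
    for _ in range(200):
        # Keep observed entries of M, fill the rest from the current
        # estimate, then shrink toward low rank.
        X = svt(np.where(mask, M, X), tau=1.0)
    print("relative error:", np.linalg.norm(X - M) / np.linalg.norm(M))

In this sketch the only design choice is where the shrinkage is applied: replacing svt with weighted_svt (weights decreasing in the singular-value index) mimics, at a toy scale, how weighted nuclear-norm relaxations keep dominant singular values nearly intact while suppressing the rest.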
Keywords/Search Tags: Rank Function, Nuclear Norm, Schatten-p Norm (0 < p < 1), Low-rank Matrix Recovery, First-order Optimization Algorithm, Convergence Analysis, Face Recognition, Matrix Completion, Low-rank Representation