
First-order Optimization Method For Principal Component Pursuit

Posted on: 2015-03-03
Degree: Master
Type: Thesis
Country: China
Candidate: Y Wang
Full Text: PDF
GTID: 2298330467972385
Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Compressed sensing and low-rank matrix recovery are ubiquitous in signal/image processing, computer vision, pattern recognition, and related areas. These applications often involve the analysis and processing of high-dimensional data, which requires fully exploiting the row and column correlations of an image or data matrix. Although directly minimizing objectives such as sparsity or matrix rank is NP-hard, convex relaxation yields a convex surrogate of the original problem, and many efficient methods, such as first-order and second-order methods, can then be applied to the resulting convex program. This thesis focuses on Principal Component Pursuit (PCP), a classic convex optimization model that has been widely used for foreground extraction from surveillance video and as a preprocessing step in face recognition.

Although second-order methods can solve many convex programs, they may suffer from prohibitively high computational cost on large-scale problems, and their per-iteration complexity is higher than that of first-order methods. Moreover, instead of factorizing or inverting a large matrix, first-order methods only require matrix-matrix or matrix-vector products. Because this thesis targets image and video applications, first-order methods are the better choice.

The best uniform convergence rate of first-order methods depends strongly on smoothness. In this thesis, we adopt Nesterov's smoothing technique to smooth the non-smooth terms in the PCP objective. In addition, we improve the original alternating direction method (ADM). First, a proximal gradient term is introduced into ADM, yielding an improved ADM. The convex functions are then alternately replaced by their proximal gradient approximations to the original functions. Finally, by applying an acceleration step to the dual variable λ, we obtain a fast ADM algorithm.
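The PCP model (min ‖L‖<sub>*</sub> + λ‖S‖₁ subject to L + S = M) and the baseline ADM solver discussed above can be sketched as follows. This is a minimal classic ADM, not the thesis's improved or accelerated variant; the choices λ = 1/√max(m, n) and the μ heuristic follow the standard RPCA literature, and all function names are ours.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise shrinkage: the proximal operator of the l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    # Singular value thresholding: the proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * soft_threshold(s, tau)) @ Vt

def pcp_adm(M, lam=None, mu=None, tol=1e-7, max_iter=1000):
    """Baseline ADM for PCP: min ||L||_* + lam * ||S||_1  s.t.  L + S = M."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))          # standard RPCA weight
    if mu is None:
        mu = 0.25 * m * n / np.sum(np.abs(M))   # common penalty heuristic
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                        # dual variable
    norm_M = np.linalg.norm(M)
    for _ in range(max_iter):
        # Alternate the two proximal subproblems, then update the dual.
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = soft_threshold(M - L + Y / mu, lam / mu)
        R = M - L - S                           # primal residual
        Y = Y + mu * R
        if np.linalg.norm(R) <= tol * norm_M:
            break
    return L, S
```

Each iteration costs one SVD (the O(n³) bottleneck the thesis later attacks) plus elementwise operations, which is why the smoothing and acceleration variants above aim at reducing the number of iterations.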
The proposed optimization methods perform well in synthetic-data experiments on large-scale convex optimization problems such as PCP. Experiments on foreground extraction and on preprocessing for face recognition all indicate that they achieve higher computational efficiency and lower iteration complexity than the original ADM.

In general, ADM has a complexity of O(n³) per iteration when solving the nuclear-norm minimization problem, because it performs a singular value decomposition (SVD) at every iteration. To reduce this cost, we replace the SVD with a low-rank matrix separation. The new method performs well on foreground extraction; in the sparse-matrix recovery experiment, the computing time is greatly reduced, and the recovery quality is better than that of ADM when impulsive noise is added.
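The abstract does not spell out the thesis's low-rank matrix separation, but the general idea of avoiding a full O(n³) SVD can be illustrated with a randomized truncated SVD: for a target rank r, the cost drops to roughly O(mnr). This is a generic sketch of that technique, not the thesis's method; the function name and parameter choices are ours.

```python
import numpy as np

def randomized_svd(X, rank, n_oversample=10, n_power=2, seed=None):
    """Rank-r SVD via a randomized range finder, costing O(m*n*r)
    instead of the O(n^3) of a full dense SVD."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    k = min(rank + n_oversample, min(m, n))
    # Sketch the range of X with a Gaussian test matrix.
    Y = X @ rng.standard_normal((n, k))
    for _ in range(n_power):
        # Power iterations sharpen the spectrum for slowly decaying tails.
        Y = X @ (X.T @ Y)
    Q, _ = np.linalg.qr(Y)            # orthonormal basis for range(X), m x k
    B = Q.T @ X                       # small k x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank]
```

Inside an ADM loop, the full SVD in the singular-value-thresholding step can be swapped for such a truncated factorization whenever a good bound on the rank of L is known, which is exactly the regime of foreground extraction (a near-static background is very low-rank).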
Keywords/Search Tags: PCP, convex optimization, first-order method, ADM, smoothing technique, low-rank matrix separation