
Research On Regression Analysis Based On Low-Rank Representation Algorithm For Noise Data

Posted on: 2020-12-03    Degree: Master    Type: Thesis
Country: China    Candidate: K A Li    Full Text: PDF
GTID: 2428330596495134    Subject: Software engineering
Abstract/Summary:
High-dimensional data poses an important problem in data mining and computer vision. The curse of dimensionality makes data analysis more difficult, and the analysis results are often unreliable. Subspace learning maps high-dimensional face data into a low-dimensional subspace while preserving the relationships among the high-dimensional samples. The low-dimensional subspace information is stored in a representation matrix for further analysis and mining, yielding better classification performance and lower computational complexity. However, traditional subspace learning methods suffer from noise and outliers, so the focus of this thesis is designing a robust model for face image classification. Low-rank subspace methods are widely used to recover the essential clean data from high-dimensional data corrupted by noise and outliers. Building on low-rank theory, this thesis proposes a robust regression model based on low-rank representation, comprising LR-RRM and its extended model LR-RRMSp. The main work and contributions of this thesis are summarized as follows.

First, the LR-RRM model is proposed to learn low-rank subspace structures in high-dimensional data and to improve the robustness of regression. Its main contribution is recovering the low-rank subspace and learning the regression model simultaneously. In this model, the global multi-subspace structure is learned by low-rank representation, and the clean data and noise are separated in a supervised way, so that the reconstructed clean data retain the low-dimensional subspaces that are maximally correlated with the label information. By minimizing the rank of the self-expression coefficient matrix of the original data while learning a regression model on the clean data, LR-RRM removes noise and outliers and achieves robust regression performance.

Second, LR-RRM relies on the nuclear norm of the matrix and the ℓ1 norm to obtain a convex relaxation, which causes the solution of the relaxed model to deviate from that of the original problem. A new robust regression method, LR-RRMSp, is therefore proposed. Its main contribution is to use the Schatten-p norm and the ℓp norm as surrogates for the rank function and the ℓ0 norm, respectively. By solving the resulting non-convex Schatten-p norm minimization, a tighter non-convex approximation of the original problem is obtained.

Finally, detailed optimization procedures for LR-RRM and its extended model LR-RRMSp are derived using the augmented Lagrangian multiplier method. Experiments on several face datasets with noise demonstrate the effectiveness of the proposed methods.
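The norm surrogates discussed above can be made concrete with a short sketch. The following is not the thesis's implementation, only an illustration of the standard building blocks it names: elementwise soft-thresholding (the proximal operator of the ℓ1 norm), singular value thresholding (the proximal operator of the nuclear norm, the convex surrogate for rank), and the Schatten-p quasi-norm of the singular values (the non-convex surrogate used in LR-RRMSp). Function names are ours.

```python
import numpy as np

def soft_threshold(M, tau):
    """Elementwise soft-thresholding: proximal operator of the l1 norm,
    typically used to separate a sparse noise/outlier term."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svt(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm.
    Shrinks every singular value of M by tau (clamped at zero), which is
    the convex relaxation of rank minimization."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def schatten_p(M, p):
    """Schatten-p quasi-norm: the l_p norm of the singular values of M.
    p = 1 recovers the nuclear norm, p = 2 the Frobenius norm, and as
    p -> 0 the sum of s_i**p approaches rank(M), which is why Schatten-p
    with small p is a tighter (non-convex) surrogate for rank."""
    s = np.linalg.svd(M, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

# Toy check on a diagonal matrix with singular values 4 and 3.
M = np.diag([4.0, 3.0])
print(schatten_p(M, 1))      # nuclear norm: 4 + 3 = 7
print(schatten_p(M, 2))      # Frobenius norm: 5
print(np.diag(svt(M, 3.5)))  # only the largest singular value survives
```

In ALM-style solvers such as the one described here, each iteration alternates these proximal updates on the low-rank and sparse blocks with a multiplier update, which is why the two operators above are the computational core.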
Keywords/Search Tags:Linear Regression, Low-rank Representation, Noise Data, Subspace Learning, Schatten-p Norm