
Research On Models Of Low-Rank Subspace Learning

Posted on: 2020-03-21    Degree: Master    Type: Thesis
Country: China    Candidate: L J Zhang    Full Text: PDF
GTID: 2428330572967393    Subject: Software engineering
Abstract/Summary:
In recent years, with the growth in the amount of high-dimensional data, processing these data in the service of human needs has become increasingly complex. Fortunately, high-dimensional data usually exhibit strong similarity, as illustrated by the famous Netflix Challenge: unknown movie ratings can be estimated by learning from the known ratings. Because the similarity of high-dimensional data is high, the matrices formed from these data are naturally highly correlated and therefore of low rank. From Compressed Sensing and Sparse Coding through to low-rank models, data-processing algorithms based on low-rank theory have achieved great success; low-rank models have even realized automatic data recovery. In addition, to address the difficulties that arise in processing high-dimensional data, subspace learning was introduced to reduce the dimensionality of the data for subsequent clustering or classification. However, existing subspace learning algorithms are liable to fail when the data are contaminated by non-Gaussian noise. Therefore, to improve the robustness of subspace learning to noise, it has become an urgent need to exploit the correlation of the data during the subspace learning process. Based on the above situation, this thesis studies low-rank subspace learning algorithms from the following two aspects:

(1) From the perspective of constraining the rank of the mapping matrix through matrix factorization, a low-rank spectral regression (LRSR) model based on matrix factorization is proposed. Since the rank of a matrix reflects the correlation of its rows (columns), and the rank of a product is less than or equal to the rank of either factor matrix, LRSR decomposes the mapping matrix into two factor matrices and then constrains the rank of the mapping matrix by limiting the rank of the factors. LRSR optimizes the mapping matrix jointly with the factor matrices and guarantees the low rank of the mapping matrix, thus ensuring the robustness of the model to noise and achieving better dimensionality-reduction results. Finally, experiments show that the classification accuracy after dimensionality reduction with the LRSR model is much better than that of standard spectral regression.

(2) From the perspective of using the spectral norm as a surrogate measure of the rank of the affinity matrix, a model of joint low-rank and subspace learning (JLRSL) via spectral regression for robust feature extraction is proposed. Based on the fact that the rank of a matrix characterizes the correlation of the data, and to address the problem that dimensionality reduction and data recovery are artificially separated in subspace learning, JLRSL integrates low-rank learning and subspace learning into a unified framework by adding different regularization terms to the spectral regression formulation. In a least-squares form, JLRSL combines the norm constraint on the affinity matrix with the constraints on the mapping matrix, and the two are optimized alternately, so that low-rank learning and subspace learning proceed synchronously. JLRSL learns a low-rank affinity matrix and uses it to clean the data; it then projects the cleaned data into a low-dimensional space with the optimized mapping matrix to achieve dimensionality reduction. In the end, a series of experiments verifies the validity of JLRSL in both dimensionality reduction and automatic data recovery.
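The factorization idea behind LRSR can be illustrated with a minimal sketch. The thesis does not give the exact objective here, so the following is only a generic rank-constrained regression in the same spirit: the mapping matrix W is written as a product A @ B of two thin factor matrices (so rank(W) ≤ r by construction) and the factors are updated by alternating least squares against spectral-regression-style responses Y. All names (`lrsr_fit`, `r`, the toy data) are illustrative, not from the thesis.

```python
import numpy as np

def lrsr_fit(X, Y, r, n_iter=50, seed=0):
    """Alternating least-squares sketch of a rank-constrained mapping:
    find W = A @ B with A (d x r) and B (r x c) minimizing ||X W - Y||_F^2.
    Because A has only r columns, rank(W) <= r by construction."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    B = rng.standard_normal((r, Y.shape[1]))
    X_pinv = np.linalg.pinv(X)  # precompute the pseudo-inverse once
    A = np.zeros((d, r))
    for _ in range(n_iter):
        # Fix B, solve min_A ||X A B - Y||_F^2  ->  A = X^+ Y B^+
        A = X_pinv @ Y @ np.linalg.pinv(B)
        # Fix A, solve min_B ||(X A) B - Y||_F^2  ->  B = (X A)^+ Y
        B = np.linalg.pinv(X @ A) @ Y
    return A, B

# Toy usage: responses generated by a rank-2 mapping are recovered
# by a rank-2 factorized fit.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 20))
W_true = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 5))  # rank 2
Y = X @ W_true
A, B = lrsr_fit(X, Y, r=2)
W = A @ B  # low-rank mapping matrix, rank(W) <= 2
```

The point of the factorized form is that the rank bound is structural: no explicit rank penalty is needed, because W can never exceed rank r regardless of how the factors are optimized.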
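The alternating scheme in JLRSL, where a low-rank "cleaned" data matrix and the mapping matrix are optimized in turn, can likewise be sketched. The thesis's exact objective (with its spectral-norm constraint on the affinity matrix) is not reproduced here; instead, this hypothetical objective uses the common nuclear-norm surrogate for rank, handled by singular-value thresholding (SVT), together with a closed-form ridge step for the mapping matrix. All symbols (`Z`, `tau`, `eta`, `jlrsl_sketch`) are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular-value thresholding: prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def jlrsl_sketch(X, Y, tau=0.1, lam=1e-2, eta=0.05, n_iter=100):
    """Alternating sketch of joint low-rank + subspace learning for the
    hypothetical objective
        min_{Z,W} 0.5||Z - X||_F^2 + 0.5||Z W - Y||_F^2
                  + tau ||Z||_* + 0.5 lam ||W||_F^2
    Z is the low-rank cleaned data, W the mapping matrix."""
    d = X.shape[1]
    Z = X.copy()
    W = np.zeros((d, Y.shape[1]))
    for _ in range(n_iter):
        # Z-step: one proximal-gradient step on the smooth terms,
        # then shrink singular values toward low rank via SVT.
        grad = (Z - X) + (Z @ W - Y) @ W.T
        Z = svt(Z - eta * grad, eta * tau)
        # W-step (closed form): ridge regression of Y on the cleaned data Z.
        W = np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ Y)
    return W, Z

# Toy usage: jointly learn a cleaned matrix Z and a projection W.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = rng.standard_normal((100, 3))
W, Z = jlrsl_sketch(X, Y)
```

This mirrors the synchronization described above: each pass updates the low-rank data estimate and the mapping matrix against each other, rather than performing data recovery and dimensionality reduction as two separate stages.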
Keywords/Search Tags: Low-rank Model, Subspace Learning, Spectral Regression, Data Recovery