
Study on Sparse and Low-Rank Subspace Clustering Algorithms

Posted on: 2021-05-03
Degree: Master
Type: Thesis
Country: China
Candidate: Q Q Yu
Full Text: PDF
GTID: 2428330602489022
Subject: Applied Mathematics
Abstract/Summary:
The Sparse Subspace Clustering algorithm (SSC) and Low-Rank Representation subspace clustering (LRR) are two popular clustering methods of recent years. The basic principle of SSC is to use the sparsity of the data's self-representation to build an affinity matrix and then apply spectral clustering to that affinity matrix to obtain the clustering result; LRR instead uses the low rank of the data's self-representation to build the affinity matrix. The key step of both algorithms is constructing the affinity matrix from the data so that points belonging to the same subspace are linearly represented by the other points of that subspace.
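For reference, a minimal sketch of the two standard self-representation models in their noise-free form, as usually stated in the SSC and LRR literature (the noise-handling variants discussed in this thesis add error terms to these constraints):

```latex
% SSC: sparsest self-representation; each point is excluded from representing itself
\min_{C}\; \|C\|_{1} \quad \text{s.t.}\quad X = XC,\ \operatorname{diag}(C) = 0

% LRR: lowest-rank self-representation; the nuclear norm is the convex surrogate of rank
\min_{C}\; \|C\|_{*} \quad \text{s.t.}\quad X = XC

% Affinity matrix passed to spectral clustering
W = |C| + |C|^{\top}
```

Here X is the data matrix with one sample per column and C is the self-representation (coefficient) matrix.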
The two goals differ, however: SSC seeks a representation matrix that is as sparse as possible, while LRR seeks one whose rank is as low as possible. For data sets that are large or corrupted by noise of unknown type, it is often difficult for either method to obtain good clustering results. This thesis studies sparse and low-rank subspace clustering further and proposes the following three new clustering methods to improve the accuracy of subspace clustering.

(1) SSC focuses heavily on the local structure of the data set and may, to some extent, neglect the relations between data points, which shows up as a weak block-diagonal structure in the affinity matrix. LRR builds its objective function by minimizing the rank of the representation matrix; it takes the global structure of the data set into account and yields an affinity matrix with a clear block-diagonal structure. Because rank minimization is difficult to solve directly, the nuclear norm is usually minimized in its place. To account for the local and global structure of the data set simultaneously, and to give the representation matrix both sparsity and a block-diagonal structure, this thesis draws on the idea of the CLAR algorithm and proposes an effective subspace clustering method, called strengthening the local structure and global structure (LSGS), which combines the Logdet function with the Frobenius norm. LSGS effectively improves the grouping effect, and its affinity matrix has a clear block-diagonal structure, especially when the data set is large.

(2) An affinity matrix with block-diagonal structure is an important guarantee of a good clustering result. To obtain such structure as far as possible, the least squares regression subspace clustering algorithm (LSR) defines the grouping effect to measure the degree of block-diagonality and strengthens it by minimizing the Frobenius norm. Building on the proposed LSGS algorithm, this thesis proposes enhancing the local structure and global structure (ELSGS), which adds a regularization term to the model to strengthen the grouping effect. The algorithm not only preserves the local and global structure of the data when the data set is large, but also strengthens the block-diagonal structure of the affinity matrix.

(3) A large number of experiments show that if the l1 norm is weighted reasonably and the weights are updated iteratively, the performance of subspace clustering algorithms based on l1-norm minimization can be greatly improved. Many reweighted (i.e., iteratively weighted) sparse subspace clustering (RSSC) methods have therefore been proposed in recent years, notably the structured weighted sparse subspace clustering (SRSSC) algorithm, which achieves good results by introducing a structured sparse norm into RSSC. In addition, the noisy sparse subspace clustering (TSRSSC) algorithm based on weighted l1 minimization combines the two-step l1-norm minimization of the RSSC algorithm and solves the problem of clustering noisy data without prior information about the noise. Combining the ideas of SRSSC and TSRSSC, the third algorithm proposed in this thesis is an improved structured weighted subspace clustering (RSSCN) algorithm, which extends the two-step l1-norm minimization algorithm and its refinements. The algorithm still achieves a good clustering result even when the type of the noise and its parameters are unknown.
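As an illustration of the reweighting idea behind these methods, the following is a minimal sketch, not the thesis's RSSCN algorithm: iteratively reweighted l1 self-representation followed by spectral clustering. The function name, the parameters lam, eps, and n_iter, the weight rule w = 1/(|c| + eps), and the use of scikit-learn's Lasso are illustrative assumptions, not the formulation used in the thesis.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def reweighted_sparse_representation(X, lam=0.1, n_iter=3, eps=1e-3):
    """Iteratively reweighted l1 self-representation (illustrative sketch).

    X: (d, n) data matrix, one sample per column.
    Returns an (n, n) coefficient matrix C with zero diagonal.
    """
    d, n = X.shape
    C = np.zeros((n, n))
    W = np.ones((n, n))                    # l1 weights, refined at every pass
    for _ in range(n_iter):
        for i in range(n):
            idx = [j for j in range(n) if j != i]     # exclude the point itself
            # Column scaling turns the weighted l1 problem into a standard Lasso
            scale = 1.0 / W[idx, i]
            A = X[:, idx] * scale
            lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
            lasso.fit(A, X[:, i])
            C[idx, i] = lasso.coef_ * scale
        # Classic reweighting: small coefficients receive large weights next pass
        W = 1.0 / (np.abs(C) + eps)
    return C

# Usage sketch: build an affinity matrix and cluster it with spectral clustering
# X = np.random.randn(30, 200)            # hypothetical data: 30-dim features, 200 samples
# C = reweighted_sparse_representation(X)
# A = np.abs(C) + np.abs(C).T
# labels = SpectralClustering(n_clusters=3, affinity='precomputed').fit_predict(A)
```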
Keywords/Search Tags: sparse subspace clustering, weighted subspace clustering, low-rank subspace clustering