
Research On Nonnegative Matrix Factorization With Subspace Constraints

Posted on: 2019-03-21    Degree: Doctor    Type: Dissertation
Country: China    Candidate: G S Cui    Full Text: PDF
GTID: 1368330596456537    Subject: Signal and Information Processing
Abstract/Summary:
In the field of data analysis and processing, one of the most basic tasks is to find a good data representation. A good representation can effectively reveal the latent information in the data, such as major components, implicit concepts, or salient features, and facilitates further analysis and processing. Nonnegative matrix factorization is a basic feature extraction and dimensionality reduction method. Its goal is to decompose the original nonnegative data matrix into the product of two nonnegative factor matrices, so as to extract effective features of the data and reduce the feature dimension. Because it finds a parts-based representation, nonnegative matrix factorization supports a "parts to whole" interpretation of the data. Owing to its great theoretical and practical value, nonnegative matrix factorization has been studied extensively and has developed considerably. However, the following problems remain: insufficient exploration of the low-rank feature space of the original data; the lack of a strategy and a unified framework that can effectively discover and exploit the implicit subspace structure of the data; and unreasonable constraining mechanisms for discriminative structural information in supervised methods.
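For reference, the basic factorization described above can be sketched in a few lines of NumPy using the classical multiplicative update rules. This is a minimal, generic illustration of nonnegative matrix factorization itself, not any of the models proposed in the dissertation; the function name nmf, the rank r, and the iteration and seeding choices are assumptions made only for this example.

```python
import numpy as np

def nmf(X, r, n_iter=200, eps=1e-10):
    """Factor a nonnegative matrix X (m x n) as X ~= W @ H,
    with W (m x r) >= 0 and H (r x n) >= 0, by minimizing ||X - W H||_F^2."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, r)) + eps   # nonnegative basis (the "parts")
    H = rng.random((r, n)) + eps   # nonnegative low-dimensional coefficients
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # multiplicative update for the coefficients
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # multiplicative update for the basis
    return W, H
```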
In response to the above problems, the main research contents of this dissertation are summarized as follows:

(1) Graph regularized nonnegative low-rank matrix factorization. Most existing works directly apply nonnegative matrix factorization to high-dimensional image datasets to obtain a representation of the raw images. In order to obtain a valid low-rank representation of the data, a nonnegative low-rank matrix factorization is proposed. This method recovers the low-rank part of the original data and performs nonnegative matrix factorization on it to extract a nonnegative low-rank representation. Since the essential features of the data reside in its low-rank part, while the sparse part often corresponds to high-frequency noise, this feature extraction framework can obtain an effective low-dimensional representation. To further exploit the manifold structure of the data, a graph regularized nonnegative low-rank matrix factorization is proposed, which improves the effectiveness of the extracted features.

(2) Refined-graph regularization based nonnegative matrix factorization. In this method, a manifold regularized least squares regression model is constructed to fully exploit the subspace structure of the data, and a refined graph based on this model is developed to obtain a robust and efficient encoding of the subspace structure information. By further constraining the low-dimensional representation of nonnegative matrix factorization to preserve the subspace topological relationships of the original data space, an effective low-dimensional representation rich in subspace structure information is obtained. In addition, because computing the closed-form solution of the manifold regularized least squares regression is very time-consuming, the dissertation proposes an alternating iterative optimization scheme that solves the problem efficiently.

(3) Nonnegative subspace clustering guided convex nonnegative matrix factorization. In this method, a unified framework that collaboratively optimizes subspace clustering and convex nonnegative matrix factorization is designed, exploring from another perspective how subspace structure information can improve the quality of the features extracted by nonnegative matrix factorization. Specifically, a subspace clustering term is used to acquire the subspace structure information of the data, and through the collaborative optimization of the model, this information improves the quality of the features extracted by convex nonnegative matrix factorization. The graph regularizer is the bridge that connects the subspace clustering term and convex nonnegative matrix factorization. The nonnegative constraint imposed on the subspace representation coefficients facilitates the optimization of the model, while the local subspace constraint improves the robustness of the proposed model to cross-subspace noise. Two specific implementations of the proposed model are presented, and both can be optimized effectively by the corresponding optimization schemes proposed in the dissertation.

(4) Discriminative and orthogonal subspace constraints based nonnegative matrix factorization. This method applies a discriminant constraint to the subspace projection of the original data with respect to the basis matrix and imposes orthogonality constraints on the basis matrix, so that the basis matrix obtained by the model has good discriminative projection performance for the original data samples. This constraining strategy makes the features obtained by the model more distinguishable in the projection space, and the orthogonality constraints on the basis matrix also improve the sparseness of the extracted features. Two specific implementations of the proposed framework are designed, and both can be optimized using multiplicative update rules.
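A recurring ingredient in methods (1) through (3) is a graph (manifold) regularizer on the low-dimensional representation; in method (3) it is explicitly the bridge between the subspace clustering term and convex nonnegative matrix factorization. The sketch below shows one standard, GNMF-style way such a term, lam * Tr(H L H^T) with graph Laplacian L = D - A, folds into the multiplicative updates. It is an illustrative formulation under assumed names (graph_regularized_nmf, lam, an affinity matrix A over the samples), not the dissertation's refined-graph, clustering-guided, or discriminative models.

```python
import numpy as np

def graph_regularized_nmf(X, A, r, lam=1.0, n_iter=200, eps=1e-10):
    """Illustrative graph-regularized NMF: minimize
    ||X - W H||_F^2 + lam * Tr(H L H^T), where L = D - A is the Laplacian of a
    nonnegative affinity matrix A over the n samples (the columns of X)."""
    m, n = X.shape
    D = np.diag(A.sum(axis=1))     # degree matrix of the affinity graph
    rng = np.random.default_rng(0)
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # the Laplacian splits into its positive part (D) and negative part (A)
        H *= (W.T @ X + lam * H @ A) / (W.T @ W @ H + lam * H @ D + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Method (4) would additionally constrain the basis matrix W with discriminative and orthogonality terms; their exact form is specified in the dissertation and is not reproduced here.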
Keywords/Search Tags: Low rank recovery, Least squares, Subspace structure, Unified framework, Supervised method, Discriminant information