
Study On Deep Non-negative Matrix Factorization Algorithm

Posted on: 2017-06-15
Degree: Master
Type: Thesis
Country: China
Candidate: S W Qu
Full Text: PDF
GTID: 2348330533450186
Subject: Computer technology
Abstract/Summary:
Nonnegative Matrix Factorization (NMF) is an effective method for local feature extraction. It decomposes quickly, is simple to implement, and has a clear physical interpretation, with wide applications in image processing, speech signal processing, text clustering, and other areas; it has become an important research direction in feature extraction and data analysis. For complex data, however, the single-layer network formed by NMF cannot express the characteristics of the data from multiple angles, so learning representations of high-level, complex data requires a multi-layer network for feature extraction. On the other hand, building Deep Nonnegative Matrix Factorization on top of NMF preserves the parts-based nature of the representation (the whole is made of the parts) while also learning hierarchical feature representations, thereby combining the advantages of NMF and deep networks. Further research on it therefore has significant practical value. The main innovative contributions are as follows:

1. A Graph Regularization Semi-nonnegative Matrix Factorization (GR Semi-NMF) algorithm is proposed. Semi-nonnegative Matrix Factorization does not consider the geometric structure of complex data; by exploiting the geometric structure and the local-invariance assumption during factorization, better feature representations can be learned. Building on Semi-NMF, the algorithm encodes the geometry of the data space by constructing an adjacency graph. Experiments on clustering, dimensionality reduction, and feature sparsity on COIL20 and CMU PIE show that GR Semi-NMF achieves better clustering performance and sparseness than NMF, PNMF, Semi-NMF, GNMF, CNMF, and DPNMF, without reducing dimensionality-reduction efficiency.

2. A Graph Regularization Deep Semi-nonnegative Matrix Factorization (GR Deep Semi-NMF) algorithm is proposed. GR Semi-NMF, as a single-layer network, cannot provide deep feature representations, while Deep Semi-NMF does not consider the intrinsic geometric structure of complex data; GR Deep Semi-NMF therefore constructs a deep network that learns features while respecting that structure. Experiments on reconstruction error and clustering on COIL20 and CMU PIE show that GR Deep Semi-NMF achieves better clustering performance than Semi-NMF, GR Semi-NMF, and Deep Semi-NMF, and can extract features layer by layer without increasing the reconstruction error.

3. A Bi-directional Nonnegative Deep Learning (BNDL) algorithm is proposed. Projective Nonnegative Matrix Factorization (PNMF), as a single-layer structure, cannot learn rich data features, and the related NMF-based algorithms proposed in recent years are shallow networks with limited capacity for feature learning. By factorizing the matrix repeatedly with PNMF, we construct a Bi-directional Nonnegative Deep Learning network with multiple hidden layers. BNDL can extract features from each layer of the network and achieve higher-level feature representations; after fine-tuning, the network reconstructs the original data well. Clustering experiments on COIL20, COIL100, and CMU PIE show that BNDL achieves better clustering performance than Deep Semi-NMF and GR Deep Semi-NMF, and can extract features layer by layer.
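To make the adjacency-graph penalty of contribution 1 concrete, here is a minimal sketch of a graph-regularized factorization. Note this is not the thesis's GR Semi-NMF (whose basis matrix may be mixed-sign); it is the closely related fully nonnegative GNMF-style multiplicative update, and the names `gnmf`, `lam`, and the graph construction are illustrative assumptions.

```python
import numpy as np

def gnmf(X, k, A, lam=0.1, n_iter=300, eps=1e-9, seed=0):
    """Graph-regularized NMF sketch: minimize
    ||X - W H||_F^2 + lam * tr(H L H^T), with graph Laplacian L = D - A.
    X: (m, n) nonnegative data (samples as columns).
    A: (n, n) symmetric nonnegative adjacency matrix over samples.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    D = np.diag(A.sum(axis=1))  # degree matrix of the graph
    for _ in range(n_iter):
        # Standard multiplicative update for the basis W.
        W *= (X @ H.T) / (W @ (H @ H.T) + eps)
        # Graph term pulls coefficients of adjacent samples together.
        H *= (W.T @ X + lam * (H @ A)) / (W.T @ W @ H + lam * (H @ D) + eps)
    return W, H
```

In practice the adjacency graph would be built from k-nearest neighbors of the data points, so that samples close in the original space keep close coefficient vectors in H, which is the local-invariance assumption the abstract refers to.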
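Contributions 2 and 3 both build deep networks by repeated factorization. The following is a minimal sketch of the greedy layer-wise pretraining idea behind such models, X ≈ W1 W2 … WL HL, using plain NMF for each layer rather than the thesis's Semi-NMF or PNMF variants; the function names and layer sizes are illustrative assumptions, and the fine-tuning stage is omitted.

```python
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Plain NMF via Lee-Seung multiplicative updates: X ≈ W @ H."""
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], k)) + eps
    H = rng.random((k, X.shape[1])) + eps
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ (H @ H.T) + eps)
    return W, H

def deep_nmf_pretrain(X, layer_sizes):
    """Greedy layer-wise pretraining of a deep factorization:
    X ≈ W1 @ H1, then H1 ≈ W2 @ H2, ..., giving X ≈ W1 W2 ... WL HL.
    Each hidden representation H_l is a feature matrix at one depth."""
    Ws, H = [], X
    for k in layer_sizes:
        W, H = nmf(H, k)
        Ws.append(W)
    return Ws, H
```

Each intermediate H can be read out as the feature representation of that layer, which is the "extract features layer by layer" property the experiments evaluate; a full implementation would follow pretraining with joint fine-tuning of all factors.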
Keywords: Non-negative Matrix Factorization, Deep Learning, Deep Non-negative Matrix Factorization, Dimension Reduction, Clustering