
Improvement For Algorithms Of Nonnegative Matrix Factorization And Its Applications

Posted on: 2012-05-26
Degree: Master
Type: Thesis
Country: China
Candidate: J J Jiang
Full Text: PDF
GTID: 2178330338991485
Subject: Mathematics
Abstract/Summary:
Nonnegative Matrix Factorization (NMF) has recently attracted great attention in machine learning, data mining, image processing, signal processing, and other fields. In this thesis, NMF is improved in three aspects.

Firstly, we propose an interior point method for solving the general nonnegative regularized trust region problem. Using a Taylor expansion, we obtain a problem equivalent to the original one; then, from the KKT system of the nonlinear optimization problem, we derive a simple method for solving it. We apply this method to NMF with different types of cost functions, including the α-divergence, β-divergence, KL-divergence, and dual KL-divergence, where each cost function is suited to a different kind of noise. Numerical experiments on blind source separation demonstrate that the proposed method outperforms the existing update rules.

Secondly, we present a family of projective nonnegative matrix factorization (PNMF) algorithms: PNMF with Bregman divergence. The Bregman divergence is a very popular measure in machine learning and data mining, and we derive multiplicative updates for PNMF with Bregman divergence. Several special cases, such as the Euclidean distance, the Kullback-Leibler (KL) divergence, and the Itakura-Saito (IS) divergence, are studied in this thesis. Experimental results demonstrate that the bases derived by PNMF with Bregman divergence are somewhat better suited to a localized and sparse representation than the bases obtained by traditional NMF, as well as being more orthogonal.

Finally, we propose a novel method for solving the large-scale l1-regularized least squares problem (LSP) using a monotonic fixed-point algorithm. The l1-regularized LSP has been proposed as a promising approach to sparse signal reconstruction (e.g., basis pursuit de-noising and compressed sensing) and feature selection (e.g., the Lasso) in signal processing, statistics, and related fields. We prove the stability and convergence of the proposed method. Furthermore, we generalize the method to the least squares matrix problem and apply it to NMF. The method is illustrated on sparse signal reconstruction, partner recognition, and blind source separation problems, and it is faster and yields sparser solutions than other popular l1-regularized LSP algorithms.
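For reference on the baseline that the first contribution is compared against, the classical Lee-Seung multiplicative update rules for NMF under the generalized KL divergence can be sketched as follows. This is a minimal illustration of the standard update rules, not the interior point method proposed in the thesis; the matrix names V, W, H, the rank r, and the iteration count are illustrative assumptions.

```python
import numpy as np

def nmf_kl_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Baseline NMF under the generalized KL divergence via Lee-Seung
    multiplicative updates: V (m x n, nonnegative) ~= W @ H,
    with W (m x r) and H (r x n) kept elementwise nonnegative."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        # H update: H <- H * (W^T (V / WH)) / (W^T 1)
        H *= (W.T @ (V / WH)) / (W.T @ np.ones_like(V) + eps)
        WH = W @ H + eps
        # W update: W <- W * ((V / WH) H^T) / (1 H^T)
        W *= ((V / WH) @ H.T) / (np.ones_like(V) @ H.T + eps)
    return W, H
```

Because both factors are updated by elementwise multiplication with nonnegative ratios, nonnegativity is preserved automatically, which is the property the interior point formulation has to enforce explicitly through its constraints.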
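As background for the second contribution (standard material, not specific to this thesis): the Bregman divergence generated by a strictly convex, differentiable function φ is

    D_φ(x, y) = φ(x) - φ(y) - ⟨∇φ(y), x - y⟩,

and the special cases named above arise from particular choices of φ: φ(x) = ‖x‖² gives the squared Euclidean distance ‖x - y‖², φ(x) = Σᵢ xᵢ log xᵢ gives the generalized KL divergence, and φ(x) = -Σᵢ log xᵢ gives the Itakura-Saito divergence.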
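The abstract does not spell out the monotonic fixed-point algorithm of the third contribution; as a hedged sketch of the problem class only, the following shows a standard soft-thresholding (ISTA-style) fixed-point iteration for the l1-regularized LSP min_x ‖Ax - b‖₂² + λ‖x‖₁. The names A, b, lam, the step size, and the iteration count are assumptions for illustration, not the thesis's method.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_ls_fixed_point(A, b, lam, n_iter=500):
    """ISTA-style fixed-point iteration for
    min_x ||A x - b||_2^2 + lam * ||x||_1
    (illustrative baseline, not the thesis's monotonic fixed-point algorithm)."""
    # Step size 1/L, where L = 2 * sigma_max(A)^2 bounds the Lipschitz
    # constant of the gradient of the smooth term ||Ax - b||^2.
    L = 2.0 * np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - b)          # gradient of ||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)  # fixed-point / prox step
    return x
```

The minimizers of the objective are exactly the fixed points of this thresholded gradient map, which is why fixed-point formulations are a natural fit for the l1-regularized LSP and extend componentwise to the least squares matrix problems used inside NMF.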
Keywords/Search Tags: Nonnegative Matrix Factorization, Projective Nonnegative Matrix Factorization, Blind Source Separation, Partner recognition, Bregman divergence, Sparse signal reconstruction, Fixed point method