
Stochastic Gradient Descent Algorithm For Non-Negative Matrix Factorization

Posted on: 2022-09-02  Degree: Master  Type: Thesis
Country: China  Candidate: S S Bai  Full Text: PDF
GTID: 2480306545499524  Subject: Mathematics
Abstract/Summary:
In recent years, with the continuous expansion of data scale, the stochastic gradient descent algorithm has become a hot topic in machine learning, especially in deep learning. It features a simple parameter-update process, fast convergence, and low computational complexity, and it is a principal method for solving optimization problems. The application of stochastic gradient descent in different scenarios is therefore worthy of further study.

The multiplicative updating rule for solving non-negative matrix factorization suffers from high computational complexity and low iterative efficiency. This thesis proposes a stochastic variance parameter-adjusted gradient method, SVPAGMU. It combines a variance-reduction strategy with the multiplicative updating rule: a parameter adjusts the stochastic gradient estimator, correcting the gradient descent direction to balance its bias and variance, so that the optimal solution is reached quickly and accurately. Building on SVPAGMU, an accelerated variant, SVPAGMU-ACC, is proposed to further speed up the iterations: the coefficient matrix is updated several times, until a termination condition is satisfied, before the basis matrix is updated. Numerical experiments verify the feasibility and efficiency of the accelerated variant.

To reduce the computational complexity of the multiplicative iterative algorithm for the non-negative Tucker decomposition problem, this thesis proposes a stochastic variance-reduced multiplicative updating algorithm, SVRMU-NTD. The method combines stochastic variance-reduced multiplicative updates with the gradient descent idea to realize the non-negative Tucker decomposition of high-dimensional data: convergence is accelerated, computational complexity is reduced, and tensor decomposition performance is improved. Numerical experiments on both synthetic and real data sets show that the proposed algorithm has high practical value.
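The abstract does not reproduce the algorithms themselves, but the two ingredients it names are standard and can be sketched. Below is a minimal NumPy illustration, assuming the Frobenius-norm objective: the classical multiplicative update rule (Lee and Seung) that SVPAGMU builds on, plus a generic parameter-adjusted variance-reduced gradient estimator of the SVRG form. The function names, the parameter `alpha`, and all details beyond these two textbook formulas are illustrative assumptions, not the thesis's actual method.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iters=200, eps=1e-10, seed=0):
    """Classical multiplicative-update NMF for V ~ W @ H under the
    Frobenius norm. Factors stay non-negative because each update
    multiplies by a ratio of non-negative matrices. (Baseline the
    thesis improves on; not the SVPAGMU algorithm itself.)"""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iters):
        # Update coefficient matrix H, then basis matrix W.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def adjusted_vr_gradient(grad_i, grad_i_snapshot, full_grad_snapshot,
                         alpha=1.0):
    """SVRG-style estimator with an adjustment parameter (illustrative):
    g = grad_i - alpha * (grad_i_snapshot - full_grad_snapshot).
    alpha = 1 recovers the standard unbiased SVRG estimator; alpha < 1
    trades some bias for lower variance, which is the kind of
    bias/variance balance the abstract describes."""
    return grad_i - alpha * (grad_i_snapshot - full_grad_snapshot)
```

The accelerated variant described above would wrap the `H` update in an inner loop (repeating it until a termination condition holds) before touching `W`; that scheduling change is omitted here for brevity.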
Keywords/Search Tags:stochastic gradient descent algorithm, non-negative matrix factorization, parameter adjusted gradient, variance reduction, multiplicative updating, non-negative Tucker decomposition