
Research On Dynamic Sampling Algorithms Based On Gradient Descent

Posted on: 2020-10-12
Degree: Master
Type: Thesis
Country: China
Candidate: L X Zhang
Full Text: PDF
GTID: 2428330596984915
Subject: Engineering
Abstract/Summary:
Because artificial neural networks are nonlinear, non-convex, and widely used, solving the optimization problems they give rise to is both critical and difficult, and effective algorithms are hard to find. In this paper, we propose a gradient-based method for training multi-layer neural networks. Instead of the exact gradient used in traditional gradient descent, we replace the gradient with an unbiased gradient estimator; we call this method dynamic sampling. Using a gradient estimator lowers the computational burden per iteration, but the estimator's variance introduces some inaccuracy. Comparing stochastic gradient descent, mini-batch stochastic gradient descent, and dynamic sampling, we argue that the last is the best choice and formalize it in this paper. We prove that dynamic sampling achieves a very low computational cost while retaining a sublinear convergence rate on non-convex problems. By using a variance reduction technique, the method speeds up the algorithm and reduces the number of iterations. Compared with SVRG, another variance-reduction-based method, our method converges faster in the early iterations and is more accurate in the later ones. In the experimental part, we compare convergence rate, running time, and testing accuracy; the experiments validate the efficiency of our algorithm.
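The abstract describes dynamic sampling as gradient descent driven by an unbiased mini-batch gradient estimator whose variance is reduced over the run. A common way to realize this is to grow the mini-batch size across iterations, so later steps use a lower-variance estimate. The sketch below illustrates that idea on a simple least-squares objective; the function name, the geometric growth schedule, and all parameter values are illustrative assumptions, not the thesis's actual algorithm or settings.

```python
import numpy as np

def dynamic_sampling_sgd(X, y, lr=0.1, n_iters=200, b0=4, growth=1.05, seed=0):
    """Minimize 0.5 * ||X w - y||^2 / batch with a mini-batch gradient
    estimator whose batch size grows geometrically, so the estimator's
    variance shrinks as the iterates approach a stationary point."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    batch = float(b0)
    for _ in range(n_iters):
        # Sample a mini-batch uniformly without replacement: the averaged
        # mini-batch gradient is an unbiased estimator of the full gradient.
        idx = rng.choice(n, size=min(int(batch), n), replace=False)
        grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
        w -= lr * grad
        batch *= growth  # dynamic sampling: enlarge the batch each step
    return w

# Illustrative usage on synthetic data with a known ground truth.
rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(500, 2))
y = X @ w_true + 0.01 * rng.normal(size=500)
w = dynamic_sampling_sgd(X, y)
```

Early iterations are cheap (tiny batches), while later iterations average over more samples, trading extra computation for lower gradient variance, which matches the "fast early, accurate late" behavior the abstract claims relative to SVRG.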
Keywords/Search Tags:Neural network, Back propagation, Stochastic gradient descent, Variance reduction