With the advance of science and technology and the development of artificial intelligence, machine learning, as an important branch of artificial intelligence, has become a popular research topic. Stochastic optimization methods are an important theoretical foundation of machine learning and are widely applied in management science, information engineering, economics, optimal control in agriculture, and industrial engineering. This thesis investigates gradient descent algorithms for a class of stochastic optimization problems and their applications. The main contributions are as follows:

First, a stochastic three-term conjugate gradient method is proposed. It combines an improved Dai-Liao three-term conjugate gradient direction with stochastic variance reduction and obtains the step size through a Wolfe line search on a mini-batch sample function. Under appropriate assumptions, the algorithm is shown to be globally convergent, and numerical experiments against the stochastic variance-reduced gradient (SVRG) algorithm indicate that it has certain advantages.

Second, a stochastic recursive gradient algorithm with a random Barzilai-Borwein step size is proposed. It uses the random Barzilai-Borwein method to set the step size automatically and incorporates an importance sampling strategy, which effectively improves the convergence speed by reducing the variance of the stochastic gradient. Global convergence of the new algorithm is proved under appropriate assumptions, and numerical results show that it is efficient compared with existing stochastic variance-reduced methods.
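To make the first contribution concrete, the following is a minimal sketch of a stochastic three-term conjugate gradient loop with SVRG-style variance reduction, using l2-regularized logistic regression as a sample problem. The Dai-Liao parameters (beta, theta, and the constant t) follow one standard three-term formula, and wolfe_batch is a crude backtracking search over the mini-batch function; both are illustrative assumptions, not the thesis's exact improved variant.

```python
import numpy as np

def make_logreg(X, y, lam=1e-3):
    """Oracle for l2-regularized logistic regression; idx selects a mini-batch."""
    n = X.shape[0]

    def loss(w, idx=None):
        idx = np.arange(n) if idx is None else idx
        z = y[idx] * (X[idx] @ w)
        return np.mean(np.logaddexp(0.0, -z)) + 0.5 * lam * (w @ w)

    def grad(w, idx=None):
        idx = np.arange(n) if idx is None else idx
        z = y[idx] * (X[idx] @ w)
        s = -y[idx] * np.exp(-np.logaddexp(0.0, z))  # -y * sigmoid(-z), stable form
        return X[idx].T @ s / len(idx) + lam * w

    return loss, grad

def wolfe_batch(loss, grad, w, d, idx, c1=1e-4, c2=0.9, max_halvings=30):
    """Crude backtracking toward the weak Wolfe conditions on the mini-batch
    function; a full Wolfe search would also expand the trial interval."""
    f0, slope = loss(w, idx), grad(w, idx) @ d
    a = 1.0
    for _ in range(max_halvings):
        armijo = loss(w + a * d, idx) <= f0 + c1 * a * slope
        curvature = grad(w + a * d, idx) @ d >= c2 * slope
        if armijo and curvature:
            break
        a *= 0.5
    return a

def stochastic_three_term_cg(loss, grad, w0, n, epochs=20, batch=32, t=0.1, seed=0):
    rng = np.random.default_rng(seed)
    m = max(n // batch, 1)                       # inner iterations per epoch
    w = w0.copy()
    for _ in range(epochs):
        w_snap, g_full = w.copy(), grad(w)       # full-gradient snapshot (SVRG style)
        g_prev, d, a = None, None, 1.0
        for _ in range(m):
            idx = rng.choice(n, size=batch, replace=False)
            # variance-reduced mini-batch gradient
            g = grad(w, idx) - grad(w_snap, idx) + g_full
            if d is None:
                d = -g
            else:
                yk, sk = g - g_prev, a * d           # curvature pair from last step
                denom = d @ yk + 1e-12
                beta = (g @ yk - t * (g @ sk)) / denom   # Dai-Liao-type beta
                theta = (g @ d) / denom
                d = -g + beta * d - theta * yk           # three-term direction
                if g @ d >= 0:                           # safeguard: keep a descent direction
                    d = -g
            a = wolfe_batch(loss, grad, w, d, idx)
            g_prev = g
            w = w + a * d
    return w
```

Under these assumptions, the corrected gradient g has the usual SVRG property that its conditional expectation equals the full gradient, which is what makes a deterministic-style line search along d meaningful in a stochastic setting.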
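Similarly, a minimal sketch of the second contribution, reusing the grad oracle above: a SARAH-type recursive gradient loop in which a Barzilai-Borwein step size is recomputed once per epoch from snapshot differences and the sample index is drawn by importance sampling with probabilities proportional to per-sample Lipschitz estimates. The 1/(n p_i) scaling, the BB1 formula with its 1/m factor, and the smoothness estimates are illustrative assumptions; the thesis's random BB rule may compute the gradient difference on a sampled batch rather than from full snapshot gradients as done here.

```python
def sarah_rbb(grad, w0, n, L_i, epochs=20, eta0=0.05, seed=0):
    rng = np.random.default_rng(seed)
    m = n                                    # inner iterations per epoch
    p = L_i / L_i.sum()                      # importance-sampling probabilities
    w, eta = w0.copy(), eta0
    w_snap_prev, g_snap_prev = None, None
    for _ in range(epochs):
        g_full = grad(w)                     # full gradient at the snapshot
        if w_snap_prev is not None:          # BB1 step from snapshot differences
            s, yv = w - w_snap_prev, g_full - g_snap_prev
            eta = (s @ s) / (m * abs(s @ yv) + 1e-12)
        w_snap_prev, g_snap_prev = w.copy(), g_full.copy()
        v, w_old = g_full, w.copy()          # SARAH recursion starts from the full gradient
        w = w - eta * v
        for _ in range(m - 1):
            i = int(rng.choice(n, p=p))
            scale = 1.0 / (n * p[i])         # keeps the data term of the estimate unbiased
            # (the l2 term inside grad is also scaled; exact treatments split it out)
            v = scale * (grad(w, [i]) - grad(w_old, [i])) + v
            w_old = w.copy()
            w = w - eta * v
    return w
```

For instance, with a data matrix X of shape (n, d) and labels y in {-1, +1}, one could call loss, grad = make_logreg(X, y), set L_i = 0.25 * (X ** 2).sum(axis=1) + 1e-3 (the per-sample smoothness bound ||x_i||^2 / 4 for the logistic loss plus the regularization weight), and run w = sarah_rbb(grad, np.zeros(X.shape[1]), len(y), L_i). Sampling harder examples more often is precisely what reduces the variance of the recursive gradient estimator and, with the BB step, removes the need to tune a learning rate by hand.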