
Convergence Of Online Gradient Method Of BP Neural Network With Momentum

Posted on: 2021-05-30  Degree: Master  Type: Thesis
Country: China  Candidate: C Y Pan  Full Text: PDF
GTID: 2428330605453631  Subject: Applied Mathematics
Abstract/Summary:
As information processing systems that simulate biological neural networks, artificial neural networks have attracted wide attention in artificial intelligence in recent years. The BP neural network, whose learning is based on the steepest descent (gradient descent) method, is often used for approximation problems because of its strong nonlinear mapping ability. Adding a momentum term to the gradient learning algorithm can improve the learning speed of a BP neural network. This thesis mainly studies the convergence of the online gradient method with momentum for a two-layer BP neural network when the training samples are randomly rearranged in each iteration cycle. Under an appropriately chosen learning step size, an adaptively selected momentum coefficient, and suitable constraints on the activation function, we prove weak and strong convergence theorems for the algorithm. Furthermore, when the training samples are partitioned into fixed groups, the weights are updated group by group, and the groups are rearranged in each iteration cycle; the corresponding weak and strong convergence results are also given.
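To make the setting concrete, the following is a minimal NumPy sketch of the kind of algorithm the abstract describes: a two-layer network trained by the online gradient method with a momentum term, with the training samples randomly rearranged in every epoch. The particular adaptive rule for the momentum coefficient used here (shrinking it when the gradient is large, so the momentum term cannot dominate the descent step) is an illustrative assumption, not necessarily the scheme analyzed in the thesis; the thesis's convergence results depend on its own step-size and momentum conditions.

    import numpy as np

    def g(z):
        # Sigmoid activation: smooth and bounded, the kind of
        # activation such convergence proofs typically assume.
        return 1.0 / (1.0 + np.exp(-z))

    def g_prime(z):
        s = g(z)
        return s * (1.0 - s)

    def train(X, y, hidden=8, eta=0.05, mu=0.5, epochs=200, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape
        W = rng.normal(scale=0.1, size=(hidden, d))  # input-to-hidden weights
        v = rng.normal(scale=0.1, size=hidden)       # hidden-to-output weights
        dW_prev = np.zeros_like(W)                   # previous updates, kept
        dv_prev = np.zeros_like(v)                   # for the momentum term
        for _ in range(epochs):
            # Online mode: weights are updated after every sample,
            # and the sample order is randomly rearranged each epoch.
            for i in rng.permutation(n):
                z = W @ X[i]
                h = g(z)
                err = v @ h - y[i]                   # residual of squared error
                grad_v = err * h
                grad_W = np.outer(err * v * g_prime(z), X[i])
                # Adaptive momentum coefficient (illustrative choice):
                # damp the momentum when the gradient is large.
                alpha = mu / (1.0 + np.linalg.norm(grad_W))
                dv = -eta * grad_v + alpha * dv_prev
                dW = -eta * grad_W + alpha * dW_prev
                v += dv
                W += dW
                dv_prev, dW_prev = dv, dW
        return W, v

The batch variant mentioned at the end of the abstract would replace the inner per-sample loop with updates computed over each fixed group of samples, with the groups rearranged once per epoch.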
Keywords/Search Tags:BP neural network, momentum, online gradient method, convergence