
The Research Of Improved BP Algorithm Based On Self-adaptive Learning Rate

Posted on: 2009-11-12
Degree: Master
Type: Thesis
Country: China
Candidate: J P Yang
Full Text: PDF
GTID: 2178360272485964
Subject: Signal and Information Processing
Abstract/Summary:
Artificial neural networks are a major branch of intelligent computation. They are now widely used in many fields, including pattern recognition, signal processing, knowledge engineering, expert systems, and machine control. Owing to its simple structure, high accuracy, ease of programming and operation, and good nonlinear mapping capability, the back propagation (BP) network has become one of the most widely used neural networks.

Because the standard back propagation algorithm is based on gradient descent, it suffers from several problems: the rate of convergence is slow, it easily becomes trapped in local minima, the numbers of hidden layers and nodes are hard to determine, and the expected outputs of the input patterns must be known in advance. Accordingly, many scholars have studied these issues and proposed improved back propagation algorithms, including the additional momentum algorithm, the self-adaptive learning rate algorithm, the resilient back propagation algorithm, the Levenberg-Marquardt algorithm, the Gauss-Newton algorithm, the conjugate gradient algorithm, the back propagation-genetic algorithm, and the Metropolis algorithm.

Starting from the principle of the standard back propagation algorithm and aiming to overcome these limitations, this paper puts forward a new improved algorithm in which a self-adaptive learning rate factor is added to the additional momentum algorithm, and gives the mathematical representation of the new algorithm. Because the new algorithm is based on the additional momentum algorithm, it can avoid becoming trapped in local minima; because the self-adaptive learning rate factor is added, it has good convergence behavior.

To verify the effect of the new algorithm, we apply it to the XOR problem and to function approximation, and compare it with the standard back propagation algorithm and with existing improved algorithms. The program simulations show that the new algorithm converges better, avoids trapping in local minima, and also has good stability and robustness.
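The abstract does not reproduce the update rule, but the scheme it describes combines the additional momentum term with a multiplicatively adapted learning rate, roughly dw(t) = alpha * dw(t-1) - eta(t) * dE/dw, where eta(t) grows while the training error keeps falling and shrinks when it rises noticeably. The following Python sketch applies such a scheme to the XOR problem mentioned in the abstract; the network size (2-4-1), the adaptation factors (1.05 up, 0.7 down), and the error-ratio threshold (1.04) are illustrative assumptions, not values taken from the thesis.

```python
import numpy as np

# XOR training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
# 2-4-1 network; the hidden-layer size is an assumption
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)   # hidden-layer activations
    y = sigmoid(h @ W2 + b2)   # network output
    return h, y

eta = 0.5      # learning rate, adapted during training
alpha = 0.9    # momentum coefficient (assumed value)
prev = [np.zeros_like(p) for p in (W1, b1, W2, b2)]
prev_err = np.inf

for epoch in range(5000):
    h, y = forward(X)
    err = 0.5 * np.sum((T - y) ** 2)

    # Self-adaptive learning rate: grow eta while the error keeps
    # falling, shrink it when the error rises past a tolerance.
    # (A simplification: the step is kept rather than rejected.)
    if err < prev_err:
        eta *= 1.05
    elif err > 1.04 * prev_err:
        eta *= 0.7
    prev_err = err

    # Backpropagation of the squared error through the sigmoids
    dy = (y - T) * y * (1 - y)
    dh = (dy @ W2.T) * h * (1 - h)
    grads = [X.T @ dh, dh.sum(0), h.T @ dy, dy.sum(0)]

    # Gradient step with the additional momentum term
    params = [W1, b1, W2, b2]
    for i, (p, g) in enumerate(zip(params, grads)):
        step = -eta * g + alpha * prev[i]
        p += step
        prev[i] = step

print(forward(X)[1].round(3))  # outputs should approach [0, 1, 1, 0]
```

With these settings the outputs typically approach the target [0, 1, 1, 0]. The error-ratio test cuts eta instead of restoring the previous weights, which keeps the sketch simple while still exhibiting the adaptive behavior the thesis describes.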
Keywords/Search Tags: Neural Networks, Back Propagation Algorithm, Self-adaptive Learning Rate, Convergence Rate, Local Minimum