
Improved Algorithm And Application Of Multi-layer Neural Network

Posted on: 2007-03-06
Degree: Master
Type: Thesis
Country: China
Candidate: S W Hu
Full Text: PDF
GTID: 2178360212968517
Subject: Applied Mathematics

Abstract/Summary:
The back-propagation (BP) algorithm is widely recognized as an effective method for training feed-forward neural networks, and it is widely applied in pattern recognition, control engineering, signal processing, economic prediction, and related fields. However, the standard back-propagation algorithm suffers from three essential defects: slow convergence, entrapment in local minima, and poor fault tolerance. This thesis proposes improvements addressing these shortcomings.

Firstly, back-propagation networks are expounded systematically. Addressing the shortcomings of the standard algorithm, the mechanism that gives rise to premature saturation of the output units of a back-propagation network is described mathematically, and a theorem characterizing the occurrence of premature saturation is stated and proved.

Secondly, a three-term back-propagation algorithm is presented, in which each weight update combines a learning-rate term, a momentum term, and a proportional term, and formulas for the optimal estimates of these learning parameters are derived in detail. The computational complexity is also analyzed. The results show that computing the optimal learning parameters of the three-term algorithm imposes little additional computational and storage burden compared with the standard back-propagation algorithm, while efficiently improving its performance.

Finally, by analyzing how the degree of saturation in the hidden layer affects the performance of multilayer feed-forward neural networks, a new error function is constructed and a new adaptive method of magnifying the error signal is designed, yielding an improved back-propagation algorithm. The results show that, in terms of convergence rate and the ability to escape local minima, the new algorithm consistently outperforms the traditional methods.
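The three-term update described above can be sketched in code. The following is a minimal illustration only, not the thesis's derivation: it trains a single sigmoid neuron on logical OR, where each weight change combines a gradient term (learning rate `eta`), a momentum term (`alpha` times the previous change), and a proportional term (`gamma` times the current output error). The exact form of the proportional term and the parameter values are assumptions for illustration, not the optimal estimates derived in the thesis.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Logical-OR training set: (inputs, target)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

# One sigmoid neuron: two input weights plus a bias weight.
w = [random.uniform(-0.5, 0.5) for _ in range(3)]
dw_prev = [0.0] * 3  # previous weight changes, for the momentum term

# Assumed parameter values (illustrative, not the thesis's optimal estimates).
eta, alpha, gamma = 0.5, 0.6, 0.02  # learning rate, momentum, proportional factor

def out(x):
    return sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])

def total_error():
    return sum((t - out(x)) ** 2 for x, t in data)

e0 = total_error()  # error before training

for _ in range(500):
    for x, t in data:
        o = out(x)
        e = t - o                  # current output error
        delta = e * o * (1 - o)    # gradient of the squared error w.r.t. the net input
        for j, inp in enumerate(x + [1.0]):  # inputs plus bias input
            # Three-term update: gradient + momentum + proportional term.
            d = eta * delta * inp + alpha * dw_prev[j] + gamma * e
            w[j] += d
            dw_prev[j] = d

# After training, total_error() has dropped well below its initial value e0.
```

The momentum term smooths the trajectory through flat regions, while the proportional term keeps pushing the weights even when the sigmoid derivative `o * (1 - o)` is near zero, which is the same saturation effect the thesis targets with its error-signal magnification.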
Keywords: Feed-Forward Neural Networks, Back-propagation Algorithm, Premature Saturation, Optimal Learning Parameters, Local Minima, Error Signal