Convergence Analysis Of BP Algorithm Based On Relative Entropy Function Criterion

Posted on: 2008-03-25    Degree: Master    Type: Thesis
Country: China    Candidate: C B Lian    Full Text: PDF
GTID: 2178360242964411    Subject: Applied Mathematics
Abstract/Summary:
The back-propagation (BP) neural network is currently the most widely applied and best-developed neural network model. With its simple structure and strong practicality, it can approximate arbitrary nonlinear relations between input and output. In practice, most feedforward networks are trained with the error back-propagation algorithm (the BP algorithm), whose key feature is that it solves the problem of propagating error signals back to the hidden layers. The BP algorithm is, in essence, a gradient method for a nonlinear optimization problem, and it has several inherent flaws that directly affect network performance. Current research therefore focuses on network weights, learning algorithms, error functions, network architecture, and the related questions of convergence and stability.

To improve the convergence rate of the network, many scholars have studied the structure and application of the error function in depth. In 1992, N. B. Karayiannis proposed an entropy error function to overcome the false-saturation phenomenon that arises when training with the traditional (squared) error function; in 1997, S.-H. Oh modified Karayiannis's entropy error function to avoid the overtraining caused by excessively strong error signals; in 2003, Minghu Jiang et al. again confirmed its effect of accelerating the convergence of feedforward network learning algorithms and revised it further, but the resulting form is too complicated for widespread use.

This thesis mainly analyzes the convergence of the BP algorithm based on the relative entropy error function, and slightly modifies the learning rate to reduce the possibility of oscillation in the network. Under the assumptions stated in the thesis, we derive the error sequence and the weight sequence for a fully connected feedforward network model trained by the BP algorithm with the relative entropy error function, and then, using a lemma, establish the monotonicity of the error sequence and the corresponding convergence result.
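As background for why entropy-type error functions counter false saturation, the following minimal sketch (not from the thesis; the single-neuron network, toy data, and learning rate are illustrative assumptions) trains one sigmoid unit with the entropy error E = -[d·ln y + (1-d)·ln(1-y)]. For 0/1 targets, the relative entropy error differs from this only by a constant, so the gradients coincide: the output delta is simply y - d, and the factor y(1-y) that makes the squared-error gradient vanish on saturated but wrong outputs cancels out.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(samples, lr=0.5, epochs=2000, seed=0):
    """Train a single sigmoid neuron with the entropy-style error
    E = -[d*ln(y) + (1-d)*ln(1-y)].  Its gradient with respect to the
    pre-activation is simply (y - d): the sigmoid derivative y*(1-y),
    which causes false saturation under squared error, cancels."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(2)]
    b = 0.0
    for _ in range(epochs):
        for x, d in samples:
            y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            delta = y - d            # entropy-error output delta
            w[0] -= lr * delta * x[0]
            w[1] -= lr * delta * x[1]
            b    -= lr * delta
    return w, b

# Toy linearly separable data (logical OR) -- an illustrative choice.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train(data)
preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
```

With squared error the same update would carry an extra y(1-y) factor, so an output stuck near the wrong extreme would learn very slowly; the entropy-type delta keeps a strong error signal in exactly that case.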
Finally, we obtain weak and strong convergence results for the weight sequence, that is, the weak convergence and strong convergence of the BP network under the relative entropy function criterion.
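The abstract does not state the results explicitly; for orientation, monotonicity and weak/strong convergence results of this kind in the BP-convergence literature typically take the following form (here E denotes the relative entropy error, w^k the weight sequence; the precise hypotheses are those assumed in the thesis):

```latex
% Monotonicity of the error sequence:
E(w^{k+1}) \le E(w^k), \qquad k = 0, 1, 2, \ldots
% Weak convergence: the gradient of the error vanishes along the iterates,
\lim_{k \to \infty} \bigl\| \nabla E(w^k) \bigr\| = 0.
% Strong convergence: under additional conditions (e.g. finitely many
% stationary points), the weight sequence itself converges,
\lim_{k \to \infty} w^k = w^{*}, \qquad \nabla E(w^{*}) = 0.
```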
Keywords/Search Tags: Error back-propagation algorithm, Relative entropy, Monotonicity, Weak convergence, Strong convergence