
Robust Stability Analysis For Several Classes Of Neural Networks Without Lipschitz Activation Function

Posted on: 2014-02-21
Degree: Master
Type: Thesis
Country: China
Candidate: M Yan
Full Text: PDF
GTID: 2268330422951460
Subject: Computational Mathematics
Abstract/Summary:
As is well known, neural networks are widely used in many fields, and stability is the basis and precondition of their applications. In implementations of neural networks, stability can be destroyed by two main factors: time delays and parameter uncertainties. Therefore, to realize the application value of neural networks, the study of robust stability for delayed neural networks is extremely important.

In this paper, we mainly study the robust stability of two classes of delayed neural networks. First, the robust exponential stability of neural networks with general activation functions is studied. Compared with some previous results, we remove the global Lipschitz condition on the activation function, so the assumptions on the activation function in this paper are more general. Based on topological degree theory, we prove the existence and uniqueness of the equilibrium point. By the Lyapunov method, we establish the global robust exponential stability of the neural network, and the advantages of this conclusion are illustrated by an example.

Second, the global robust stability of neural networks with inverse-Lipschitz activation functions is investigated. Neural networks with inverse-Lipschitz activation functions have important application value and attract increasing attention from researchers. In this paper, a sufficient condition ensuring the global robust stability of such networks is established. Numerical examples are given to verify the effectiveness of the result.
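For orientation, the class of delayed neural networks studied in work of this kind is commonly written in the following standard form; this is an illustrative sketch of the usual model and conditions, not an equation quoted from the thesis itself:

```latex
% Delayed neural network with n neurons, delays \tau_j \ge 0:
\dot{x}_i(t) = -c_i x_i(t)
  + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t-\tau_j)\bigr) + u_i ,
\qquad i = 1,\dots,n,
% where c_i > 0 are self-decay rates, A=(a_{ij}) and B=(b_{ij}) are the
% connection and delayed-connection weight matrices (with interval
% uncertainties in the robust setting), and f_j are the activations.

% Global Lipschitz condition (the assumption this thesis removes):
\bigl| f_j(s_1) - f_j(s_2) \bigr| \le L_j \, |s_1 - s_2| .

% Inverse-Lipschitz condition (roughly, a lower bound on the gain;
% such activations may be unbounded and non-Lipschitz):
\bigl| f_j(s_1) - f_j(s_2) \bigr| \ge l_j \, |s_1 - s_2| , \quad l_j > 0 .
```

Under such assumptions, existence and uniqueness of the equilibrium is typically shown via topological degree theory, and robust exponential stability via a Lyapunov functional over all admissible interval matrices.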
Keywords/Search Tags:delayed neural network, robust stability, topological degree theory, non-Lipschitz, Lyapunov function