
The Stability Analysis Of Neural Networks With Asymmetric Connection Weights

Posted on: 2006-05-27
Degree: Master
Type: Thesis
Country: China
Candidate: J X Yang
Full Text: PDF
GTID: 2190360152498494
Subject: Operational Research and Cybernetics

Abstract/Summary:
This thesis investigates the stability of neural networks with asymmetric connection weights. Sufficient conditions for stability are derived by constructing Liapunov functions and by the method of variation of constants. The thesis consists of three chapters. Chapter 1 reviews the development of neural networks, explains the significance of stability for networks with asymmetric weights, and outlines the methods and main contents of the thesis. In Chapter 2, by constructing Liapunov functions and exploiting M-matrices and the ω-limit set, new sufficient conditions for the asymptotic stability and exponential stability of neural networks are obtained. Furthermore, by partitioning the network state variables into groups according to the structure of the asymmetric network, additional sufficient conditions for exponential stability are derived via the method of variation of constants. Chapter 3 studies the stability of delayed neural networks (DNNs). Stability conditions are derived by introducing several free parameters and applying inequality techniques. By partitioning the state variables of the delayed network, further conditions for exponential stability are obtained. Finally, by means of the contraction mapping theorem, conditions are established that guarantee the existence of a unique, globally exponentially stable periodic solution.
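For orientation only (this model and condition are standard in the literature and are not quoted from the thesis itself), a representative asymmetric-weight network of the kind analyzed here is the Hopfield-type system

\[
\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} w_{ij} f_j\bigl(x_j(t)\bigr) + I_i, \qquad i = 1, \dots, n,
\]

in which the weight matrix \(W = (w_{ij})\) is not required to be symmetric. Writing \(C = \operatorname{diag}(c_1, \dots, c_n)\), \(L = \operatorname{diag}(L_1, \dots, L_n)\) for Lipschitz constants of the activations \(f_j\), and \(|W| = (|w_{ij}|)\), a classical M-matrix sufficient condition in the spirit of the results described above requires

\[
C - |W|\,L \ \text{to be a nonsingular M-matrix},
\]

which guarantees a unique equilibrium that is globally exponentially stable.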
Keywords/Search Tags: neural networks, asymptotic stability, exponential stability, Liapunov function, method of variation of constants