
Asymptotic Stability Analysis Of Discrete Neural Networks With Multiple Additive Delays

Posted on: 2013-05-13    Degree: Master    Type: Thesis
Country: China    Candidate: X F Xu    Full Text: PDF
GTID: 2358330371992438    Subject: Systems analysis and integration
Abstract/Summary:
In recent years, the stability of discrete-time neural networks has become an active research topic. In a network, a signal may pass through several network segments as it travels from one point to another, and different segments generally have different transmission conditions; this gives rise to multiple additive delays. Studying the asymptotic stability of a class of discrete-time recurrent neural networks with multiple additive time-varying delays is therefore of both theoretical and practical significance. Taking two additive delays as a representative case, this thesis first studies the global exponential stability of this class of networks with certain (known) parameters. Using the LMI method and Lyapunov stability theory, a stability criterion is derived; the condition is less conservative and easy to check. For the case of parameter uncertainties, the thesis then studies the robust asymptotic stability of discrete-time neural networks with two additive delay components and derives a new condition for this class of networks. Finally, the stability results for networks with certain parameters are improved, yielding new criteria that involve fewer decision variables yet are less conservative. The thesis is organized as follows:

Chapter 1 introduces the background of discrete-time recurrent neural networks with multiple additive time-varying delays in detail and states the research problems addressed in this thesis.

Chapter 2 presents the preliminary knowledge used in the thesis, including Lyapunov stability theory and LMIs.

Chapter 3 addresses the global exponential stability analysis of a class of discrete-time recurrent neural networks with two additive delay components under certain (known) parameters. By constructing a new Lyapunov functional and using LMI techniques, a new global exponential stability criterion is derived for this class of networks; the criterion can be checked easily as an LMI feasibility problem.

Chapter 4 investigates delayed discrete-time recurrent neural networks with norm-bounded parameter uncertainties. By the Lyapunov functional method, a delay-dependent robust exponential stability criterion is derived for this class of networks. Numerical examples are provided to demonstrate the effectiveness of the proposed criterion.

Chapter 5 improves the stability results of Chapter 3. With a different Lyapunov functional and a novel technique, a new stability criterion is derived that involves fewer decision variables while being less conservative.

Chapter 6 summarizes the thesis and proposes problems for further research.
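For orientation, the class of systems studied is commonly written in the following standard form in the additive-delay literature; the notation below is an illustrative assumption and need not match the thesis exactly:

    x(k+1) = A x(k) + W_0 f(x(k)) + W_1 f(x(k - d_1(k) - d_2(k))),
    0 \le d_1(k) \le \bar{d}_1,   0 \le d_2(k) \le \bar{d}_2,

where A is the state feedback matrix, W_0 and W_1 are connection weight matrices, f(·) is the neuron activation function, and d_1(k), d_2(k) are the two additive time-varying delays contributed by different network segments.

"Checked by LMI easily" refers to posing the derived condition as an LMI feasibility problem and handing it to a semidefinite solver. A minimal sketch of that workflow is given below, using the elementary delay-free Lyapunov condition A^T P A - P < 0 as a stand-in for the thesis's actual (more involved, delay-dependent) criteria; the matrices and the use of cvxpy are illustrative assumptions, not taken from the thesis:

    import numpy as np
    import cvxpy as cp

    # Illustrative system matrix (assumption, not from the thesis).
    A = np.array([[0.5, 0.1],
                  [0.0, 0.4]])
    n = A.shape[0]

    # Decision variable: symmetric Lyapunov matrix P.
    P = cp.Variable((n, n), symmetric=True)

    eps = 1e-6
    constraints = [
        P >> eps * np.eye(n),                 # P > 0
        A.T @ P @ A - P << -eps * np.eye(n),  # A^T P A - P < 0
    ]

    # Feasibility problem: any solution certifies asymptotic stability
    # of x(k+1) = A x(k) in the delay-free case.
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    print("LMI feasible:", prob.status == cp.OPTIMAL)

The thesis's criteria involve additional decision variables (Lyapunov and slack matrices) and delay bounds; the sketch only illustrates how such conditions are checked, not the criteria themselves.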
Keywords/Search Tags: Additive time-varying delays, discrete-time neural networks, global exponential stability, robust stability, linear matrix inequality (LMI)