Since recurrent neural networks were first proposed, their stability has been a central topic in neural network theory. Such networks are widely applied in circuit analysis, optimization, associative memory, and related areas, and all of these applications depend on stability properties. The study of the stability of recurrent neural networks is therefore of both theoretical and practical significance.

Based on the linear matrix inequality (LMI) technique and the Lyapunov function method, this thesis investigates the exponential stability of recurrent neural networks with multiple time delays, derives global exponential stability criteria, and estimates the exponential convergence rate in the case of activation functions satisfying a global Lipschitz condition. The thesis consists of four chapters.

In the first chapter, we present a comprehensive account of the research status of recurrent neural networks with optimization and associative-memory functions. It covers the history of neural networks and the common types and current research directions of recurrent neural networks. The main work of this thesis is also briefly introduced.

In the second chapter, the notation, definitions, hypotheses, and lemmas used throughout the thesis are stated.

In the third chapter, we consider a class of recurrent neural networks with multiple delays in the state. Based on the LMI technique and the Lyapunov function method, we propose a delay-dependent global exponential stability criterion and, on this basis, provide a method for estimating the exponential convergence rate.

In the fourth chapter, we analyze the stability of recurrent neural networks with multiple time delays in the activation functions, give a delay-independent exponential stability criterion, and derive a method for estimating the exponential convergence rate by means of the LMI technique and Lyapunov stability theory.
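For concreteness, a typical model of the kind described above (a sketch of a standard multi-delay recurrent neural network; the specific matrices and notation here are illustrative, not quoted from the thesis) can be written as

```latex
\dot{x}(t) = -C x(t) + A g\bigl(x(t)\bigr)
             + \sum_{k=1}^{m} B_k g\bigl(x(t-\tau_k)\bigr) + J,
```

where $x(t) \in \mathbb{R}^n$ is the state, $C$ is a positive diagonal matrix, $A$ and $B_k$ are connection weight matrices, $\tau_k > 0$ are the delays, and the activation function $g$ satisfies a global Lipschitz condition $|g_i(u) - g_i(v)| \le L_i |u - v|$. Global exponential stability of an equilibrium $x^*$ then means there exist constants $M \ge 1$ and $\varepsilon > 0$ such that $\|x(t) - x^*\| \le M e^{-\varepsilon t} \sup_{s \in [-\tau, 0]} \|\varphi(s) - x^*\|$, where $\varepsilon$ is the exponential convergence rate estimated in Chapters 3 and 4.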