
Stability And Applications Of Several Classes Of Recurrent Neural Networks

Posted on: 2011-12-15    Degree: Doctor    Type: Dissertation
Country: China    Candidate: R Zhang    Full Text: PDF
GTID: 1228330371450244    Subject: Control theory and control engineering
Abstract/Summary:
Since Hopfield introduced the concept of an energy function in the 1980s to study the stability of a class of fixed-weight recurrent neural networks, now called Hopfield networks, and implemented these networks in circuits, the qualitative analysis of the stability of the equilibrium point of this class of recurrent neural networks has been studied intensively by many scholars. Qualitative stability analysis of neural networks is important in both theory and practice, because many applications of neural networks depend on their stability properties. Recurrent neural networks have powerful computational capabilities and are among the most important models in neurocomputing. In this dissertation, the stability of several classes of recurrent neural networks and the application of recurrent neural networks to optimization problems are investigated on the basis of the linear matrix inequality (LMI) technique. The main contents and contributions of this dissertation are summarized as follows.

(1) The state of the art in the stability analysis of fixed-weight recurrent neural networks and their applications in optimal computation is surveyed systematically. A review of the development of artificial neural networks is presented first, followed by a description of several kinds of fixed-weight recurrent neural networks, the kinds of delays and their effects on the networks, and research on recurrent neural networks in optimal computation. The basic theory and definitions used in later chapters are then outlined, and the main work of this dissertation is briefly introduced.

(2) The exponential stability of a class of static neural networks with time-varying delay is investigated. Unlike previous methods, a new Lyapunov functional together with a useful differential inequality is employed to prove the exponential stability of the concerned networks within the LMI framework. Accounting for the different effects of the rate of change of the time-varying delay, two exponential stability criteria are established: one depends on the upper bound of the time-varying delay, and the other depends on its rate of change (see the sketch after item (3)). The obtained results cover both slowly and quickly varying delays, are easy to check, have a wide range of applications, and are less conservative.

(3) The global exponential stability of Cohen-Grossberg neural networks with time-varying delays is discussed. Based on the LMI technique and the Lyapunov functional method combined with the Bellman and Jensen inequality techniques, two conditions are obtained that ensure the global exponential stability of the equilibrium point of this system: one depends on the rate of change of the time-varying delays, and the other on their upper bound. Comparisons with previous works show that the proposed results are less restrictive than those in the earlier literature, easier to check in practice, and applicable to both slowly and quickly varying delays.
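For reference, the static neural network model with time-varying delay studied in item (2) is usually written in the following standard form; the notation is the common one from the literature and is given here only for orientation, not as the dissertation's exact model:

\[
\dot{x}(t) = -Ax(t) + f\bigl(Wx(t-\tau(t)) + J\bigr), \qquad 0 \le \tau(t) \le \tau_{M}, \quad \dot{\tau}(t) \le \mu,
\]

where $A$ is a positive diagonal matrix, $W$ a weight matrix, $f(\cdot)$ an activation function, and $J$ a constant input. The equilibrium $x^{*}$ is globally exponentially stable if there exist constants $M \ge 1$ and $\varepsilon > 0$ such that, for every initial function $\phi$,

\[
\|x(t) - x^{*}\| \le M \sup_{-\tau_{M} \le s \le 0} \|\phi(s) - x^{*}\| \, e^{-\varepsilon t}, \qquad t \ge 0 .
\]

This makes the distinction in items (2) and (3) precise: delay-rate-dependent criteria exploit the bound $\mu$ and suit slowly varying delays, while delay-bound-dependent criteria use only $\tau_{M}$ and therefore also cover quickly varying delays.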
(4) Since the dynamics of discrete-time neural networks differ from those of their continuous-time counterparts, the global exponential stability of discrete-time bidirectional associative memory (BAM) neural networks with time-varying delays is studied. Using the LMI technique and a discrete Lyapunov functional combined with inequality techniques, a new global exponential stability criterion for the equilibrium point of this system is obtained. The Lyapunov functional incorporates information on $x^{T}(k-\sigma_{M})$ and $y^{T}(k-\tau_{M})$ as well as on the rate of change of the neural states, and is therefore more general and less conservative. A simulation example demonstrates the effectiveness of the result.

(5) Based on the description of uncertainty in control systems, several global robust exponential stability conditions are presented for two kinds of uncertain recurrent neural networks with time-varying delays, via the LMI technique and Lyapunov functionals. Simulation examples demonstrate the effectiveness of the results.

(6) The most important advantages of neural networks are massively parallel processing and fast convergence; neural networks are therefore emphasized and employed as a promising approach to optimization problems. A new optimization neural network is presented for nonlinear convex programming problems with both equality and inequality constraints. The proposed network avoids the deficiency of the penalty-function approach and needs fewer neurons than the slack-variable approach, which leads to a simpler circuit implementation, a reduced computational burden, and fast convergence. Based on an energy function, the stability and convergence of the proposed network are analyzed, which guarantees the global stability of the equilibrium point of the network.

(7) Solving quadratic programming problems with equality constraints is studied. Delays can alter the topological structure of a neural network, so the dynamic behavior of a network can be changed by adding delayed state terms without changing its equilibrium points (see the sketch below). A Lagrange network with time-varying delay is therefore proposed to solve quadratic programming problems with equality constraints. Using different methods, three stability criteria are obtained; Theorem 7.2 and Theorem 7.3 apply to both quickly and slowly varying delays, and thus have wider applicability, less conservativeness, and are easy to check.
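For orientation, the delay-free Lagrange network underlying item (7) can be written in a standard form; the system below is a sketch of that standard construction under the usual assumptions, with the delayed state terms of the dissertation omitted. For the quadratic program

\[
\min_{x \in \mathbb{R}^{n}} \ \tfrac{1}{2}x^{T}Qx + c^{T}x \quad \text{subject to} \quad Ax = b, \qquad Q = Q^{T} \succ 0,
\]

with Lagrangian $L(x,\lambda) = \tfrac{1}{2}x^{T}Qx + c^{T}x + \lambda^{T}(Ax - b)$, the Lagrange network performs gradient descent in $x$ and gradient ascent in $\lambda$:

\[
\dot{x}(t) = -\nabla_{x}L = -\bigl(Qx(t) + c + A^{T}\lambda(t)\bigr), \qquad
\dot{\lambda}(t) = \nabla_{\lambda}L = Ax(t) - b .
\]

Every equilibrium $(x^{*}, \lambda^{*})$ satisfies the optimality conditions $Qx^{*} + c + A^{T}\lambda^{*} = 0$ and $Ax^{*} = b$, so $x^{*}$ solves the quadratic program. Replacing $x(t)$ on the right-hand side by a combination of $x(t)$ and a delayed state $x(t - \tau(t))$ alters the dynamics but leaves this equilibrium set unchanged, which is the property item (7) exploits.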
Keywords/Search Tags: recurrent neural network, static neural network, Cohen-Grossberg neural network, BAM neural network, nonlinear programming, quadratic programming, Lyapunov functional, global exponential stability, global robust exponential stability