
Stability Of Recurrent Neural Networks

Posted on: 2008-06-19
Degree: Doctor
Type: Dissertation
Country: China
Candidate: J Xu
Full Text: PDF
GTID: 1118360212489546
Subject: Control Science and Engineering
Abstract/Summary:
A neural network is an artificial network model consisting of many interconnected neurons. Such models were introduced to mimic the nervous systems of animals and humans in learning, associative memory, and pattern recognition. A first wave of interest in neural networks emerged after McCulloch and Pitts introduced simplified neuron models in the 1940s, but research then passed through a slow stage of development. This changed after the Hopfield neural network was proposed in the 1980s. The Hopfield network is a representative recurrent neural network model: because of its fully connected structure, it exhibits complicated dynamic behavior, which is the basis of intelligent activities such as learning, associative memory, and pattern recognition. Stability analysis is one of the most important problems in the study of such dynamical systems.

Neural networks have attracted extensive attention because of their great potential in theory and applications; for example, they have been successfully applied in signal and image processing, pattern recognition, and associative memory. It is well known that such engineering applications rely crucially on the qualitative stability properties and dynamic behaviors of the network. In this thesis, based on Lyapunov stability theory, stability problems of recurrent neural networks (RNNs), including global exponential stability of delayed neural networks, estimation of the domain of attraction, and absolute exponential stability, are studied by means of M-matrix theory, linear matrix inequalities (LMIs), and the LaSalle invariant set principle. The thesis consists of six chapters.

In Chapter 1, a review of the history of neural networks is presented, showing that the Hopfield network is an important model in that history. The structure of neural networks is then described: based on their connections, neural networks fall into two main categories, feedforward networks and recurrent networks. This thesis focuses only on recurrent networks.

In Chapter 2, some preliminaries are given, including definitions, lemmas, and matrix inequalities. Lyapunov stability theory is then introduced in detail, several recurrent neural network models are described, and the common kinds of activation functions are listed.

In Chapter 3, a set of criteria is derived for the global exponential stability of general delayed recurrent neural networks with monotonically nondecreasing, non-constant activation functions. A new stability condition is obtained by the Lyapunov functional approach. Three special cases of the general delayed recurrent neural network are then addressed by the proposed approach, and global exponential stability conditions are obtained for delayed neural networks, Hopfield neural networks, and delayed Hopfield neural networks, respectively. To demonstrate the advantage of the proposed results, comparisons are made with previous ones, and a numerical example is constructed to show the effectiveness of the approach.
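The abstract does not reproduce the network equations. For orientation, a typical form of the general delayed recurrent network studied in this literature is the following; the notation here is assumed rather than quoted from the thesis:

    \dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}\, g_j(x_j(t - \tau_{ij}(t))) + u_i, \qquad i = 1, \ldots, n,

where d_i > 0 are the self-feedback coefficients, A = (a_{ij}) and B = (b_{ij}) are the connection weight matrices, g_j are the activation functions, \tau_{ij}(t) are the (possibly time-varying) transmission delays, and u_i are constant external inputs. An equilibrium x^* is globally exponentially stable if there exist constants M \geq 1 and \varepsilon > 0 such that, for every initial function \phi,

    \| x(t) - x^* \| \leq M e^{-\varepsilon t} \sup_{-\tau \leq s \leq 0} \| \phi(s) - x^* \|, \qquad t \geq 0.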
In Chapter 4, based on the Lyapunov-Krasovskii functional and Lyapunov-Razumikhin function methods together with the invariant set principle, a new method is presented for estimating the domain of attraction of general recurrent neural networks with time-varying delays, and a convex optimization method is proposed to enlarge the estimated domain. Some local exponential stability conditions are derived; they are expressed as linear matrix inequalities (LMIs) in terms of all the varying parameters and hence can be easily checked in both analysis and design (a minimal numerical sketch of such an LMI check follows the abstract).

In Chapter 5, both the existence of an equilibrium point and the absolute exponential stability (AEST) of RNNs with generalized activation functions are addressed. Each component of the activation function is assumed to belong to the convex hull of two piecewise linear functions; this generalized class allows a more flexible, or more specific, description of activation functions. It is shown that, with respect to global exponential stability (GES), the original RNN with a generalized activation function is equivalent to the family of RNNs under all vertex functions of the convex hull. Owing to this equivalence, the stability analysis focuses on the RNNs under the vertex functions, which are then transformed into neural networks under an array of saturated linear activation functions. Finally, a second-order RNN with a generalized activation function is constructed to show the effectiveness of the results.

In Chapter 6, a review of recent results on the stability of neural networks is given, together with prospects for further research.
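To make the LMI-based checks of Chapter 4 concrete: the abstract supplies no specific matrices, so the sketch below solves a hypothetical, delay-free Lyapunov LMI (find P > 0 with M^T P + P M < 0) in Python with the cvxpy package. The matrix M and the margin eps are illustrative assumptions; the thesis' actual delay-dependent LMIs are more involved.

    import numpy as np
    import cvxpy as cp

    # Hypothetical linearization x'(t) = M x(t) of a two-neuron RNN near an
    # equilibrium; the entries are illustrative, not taken from the thesis.
    M = np.array([[-2.0, 0.5],
                  [0.3, -1.5]])
    n = M.shape[0]
    eps = 1e-6  # margin used to enforce the strict inequalities

    # Decision variable: a symmetric matrix P.
    P = cp.Variable((n, n), symmetric=True)

    # Lyapunov LMI: P > 0 and M^T P + P M < 0 together certify exponential
    # stability of the linearized (delay-free) system.
    constraints = [P >> eps * np.eye(n),
                   M.T @ P + P @ M << -eps * np.eye(n)]
    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve(solver=cp.SCS)

    print("LMI feasible:", problem.status == cp.OPTIMAL)
    print("P =\n", P.value)

Feasibility returns a certificate P; in the delay-dependent setting, analogous but larger block LMIs arise from the chosen Lyapunov-Krasovskii functional.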
Keywords/Search Tags: Recurrent neural networks, Time-varying delay, Estimation of domain of attraction, Global exponential stability, Absolute exponential stability