
Stability analysis and control applications of recurrent neural networks

Posted on: 2003-09-28
Degree: Ph.D.
Type: Thesis
University: Chinese University of Hong Kong (People's Republic of China)
Candidate: Hu, San-qing
Full Text: PDF
GTID: 2468390011978322
Subject: Engineering
Abstract/Summary:
Because of the massively parallel, distributed nature of neural computation and their fast convergence rates, recurrent neural networks are widely applied to problems in optimization, control, signal processing, system identification, and related fields. Hence, it is important both to analyze the stability of existing recurrent neural networks and to develop new recurrent neural networks for practical problems. This thesis addresses two main topics: stability analysis of recurrent neural networks, and control system synthesis using recurrent neural networks.

New stability results are presented for several recurrent neural networks, including two general classes of continuous-time networks and two general classes of discrete-time networks. Since global asymptotic stability (GAS) and global exponential stability (GES) are essential for many applications, GAS and GES analyses are important for expanding the application domain of these networks.

For the first type of continuous-time recurrent neural networks, we give three sufficient conditions for GAS. These testable conditions differ from and improve upon existing ones. We extend an existing result from GAS to GES, and we generalize several existing GES results to cases with less restrictive connection weight matrices and/or partially Lipschitz activation functions. We also present new results on absolute exponential stability (AEST) for networks with locally Lipschitz continuous, monotone nondecreasing activation functions, including three necessary and sufficient conditions for AEST of networks with symmetric (or noninhibitory lateral) connection weight matrices or with two neurons.

For the second type of continuous-time recurrent neural networks, we investigate GAS and GES.
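To make the notion of global stability concrete, the following sketch simulates a standard continuous-time recurrent network of the Hopfield type, dx/dt = -x + W·tanh(x) + b. The specific W, b, and contraction condition ||W||₂ < 1 (with the 1-Lipschitz tanh activation) are illustrative assumptions, not the thesis's own conditions; under such a contraction, a unique globally exponentially stable equilibrium exists, so trajectories from different initial states converge to the same point.

```python
import numpy as np

# Illustrative sketch: dx/dt = -x + W @ tanh(x) + b, integrated by
# forward Euler. W is scaled so that ||W||_2 = 0.8 < 1; combined with
# the 1-Lipschitz tanh, a contraction argument gives a unique, globally
# exponentially stable equilibrium. W and b are arbitrary examples.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3))
W *= 0.8 / np.linalg.norm(W, 2)     # enforce spectral norm 0.8
b = np.array([0.5, -0.2, 0.1])

def simulate(x0, dt=0.01, steps=10_000):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):          # forward-Euler integration
        x = x + dt * (-x + W @ np.tanh(x) + b)
    return x

x_a = simulate([5.0, 5.0, 5.0])
x_b = simulate([-5.0, -3.0, 4.0])
# Widely separated initial states reach the same equilibrium.
print(np.allclose(x_a, x_b, atol=1e-6))
```

Running from two far-apart initial conditions and comparing the final states is a simple numerical check of global (rather than merely local) stability.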
After introducing a necessary and sufficient condition for the existence and uniqueness of the equilibrium of such a neural network, we first present two sufficient conditions that guarantee GAS of networks with globally Lipschitz continuous, monotone nondecreasing activation functions. We then give two GES results for networks whose activation functions may or may not be monotone nondecreasing. For globally Lipschitz continuous, monotone nondecreasing activation functions, we provide a Lyapunov diagonal stability result that guarantees GES without requiring the connection weight matrices to be nonsingular. In particular, this Lyapunov diagonal stability result generalizes and unifies all the existing GAS and GES results. Moreover, two improved exponential convergence rates are obtained. (Abstract shortened by UMI.)
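Conditions of the Lyapunov-diagonal-stability type can be checked numerically. The sketch below tests, for an illustrative weight matrix W and candidate diagonal matrix P (neither taken from the thesis), whether P(W − I) + (W − I)ᵀP is negative definite; conditions of this general form appear in the GES literature for networks with 1-Lipschitz, monotone nondecreasing activations.

```python
import numpy as np

# Illustrative check of a diagonal-Lyapunov-type condition for
# dx/dt = -x + W @ g(x) + b with 1-Lipschitz nondecreasing g:
# find a positive diagonal P such that
#     M = P (W - I) + (W - I)^T P
# is negative definite. W and P below are made-up examples.
W = np.array([[0.2, -0.5],
              [0.4,  0.1]])
P = np.diag([1.0, 1.25])            # candidate positive diagonal matrix

A = W - np.eye(2)
M = P @ A + A.T @ P                 # symmetric by construction
eigs = np.linalg.eigvalsh(M)
print(eigs.max() < 0)               # True: condition holds for this P
```

Note that searching over all positive diagonal P is a linear matrix inequality; here a single hand-picked P suffices for this small example.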
Keywords/Search Tags: Neural networks, GES, GAS, Stability, Monotone nondecreasing activation functions, Connection weight matrices, Existing, Lipschitz continuous and monotone nondecreasing