
On The Global Exponential Stability And Convergence Of Neural Networks With Time-varying Delays

Posted on: 2012-12-11
Degree: Master
Type: Thesis
Country: China
Candidate: X G Zou
Full Text: PDF
GTID: 2180330395473508
Subject: Applied mathematics
Abstract/Summary:
In recent years, neural networks have been widely applied, a variety of neural network models has been studied extensively, and many stability conditions have been derived. In this thesis, we discuss continuous neural networks with time-varying delays and stochastic neural networks with time-varying delays.

In Chapter 1, we give a basic overview of neural networks, briefly introduce the basic neural network models, and summarize the current state of research.

In Chapter 2, we discuss the global exponential stability of neural networks with time-varying delays. Based on Lyapunov stability theory and the linear matrix inequality (LMI) technique, several new sufficient conditions guaranteeing the global exponential stability of the equilibrium point are obtained. In addition, the maximum bounds of the time delays and the convergence rate can be estimated. These results are compared with existing ones, and it turns out that our conditions are less conservative than some reported in the literature.

In Chapter 3, using a Lyapunov-Krasovskii functional and the linear matrix inequality (LMI) approach, we analyze the global asymptotic stability of stochastic recurrent neural networks with multiple discrete time-varying delays and distributed delays. A new sufficient condition ensuring the global asymptotic stability of delayed recurrent neural networks in the stochastic sense is obtained, and its feasibility is verified with the MATLAB LMI toolbox.
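The abstract does not reproduce the model or the stability conditions themselves. The following is a minimal sketch of the kind of setup the abstract names, assuming the standard Hopfield-type delayed model; the matrices C, A, B, the activation f, the delay tau(t), and the functional V are illustrative placeholders, not taken from the thesis:

```latex
\begin{align*}
\dot{x}(t) &= -C\,x(t) + A\,f\big(x(t)\big) + B\,f\big(x(t-\tau(t))\big) + u,
\qquad 0 \le \tau(t) \le \bar{\tau},\quad \dot{\tau}(t) \le \mu < 1, \\
V(x_t) &= x^{\top}(t)\,P\,x(t) + \int_{t-\tau(t)}^{t} x^{\top}(s)\,Q\,x(s)\,ds,
\qquad P = P^{\top} \succ 0,\; Q = Q^{\top} \succ 0, \\
\dot{V}(x_t) &\le -\alpha\,V(x_t)
\;\Longrightarrow\;
\|x(t)\| \le M e^{-\alpha t/2}
\quad \text{(global exponential stability with rate } \alpha/2\text{)}.
\end{align*}
```

Chapter 3 reports checking feasibility of the resulting LMIs with the MATLAB LMI toolbox. As a rough stand-in, here is a hedged sketch in Python with cvxpy that tests a classical delay-independent Lyapunov-Krasovskii LMI for the related linear delay system x'(t) = A0 x(t) + A1 x(t - tau(t)); the matrices A0, A1 and the bound mu are made-up examples, and the LMI is a textbook condition, not the thesis's own:

```python
# Hedged sketch: delay-independent Lyapunov-Krasovskii LMI feasibility test
# for x'(t) = A0 x(t) + A1 x(t - tau(t)) with tau'(t) <= mu < 1.
# cvxpy stands in for the MATLAB LMI toolbox named in the abstract;
# A0, A1, and mu are illustrative values, not data from the thesis.
import cvxpy as cp
import numpy as np

n = 2
A0 = np.array([[-3.0, 0.5], [0.2, -4.0]])  # non-delayed part (assumed example)
A1 = np.array([[0.5, -0.3], [0.1, 0.4]])   # delayed part (assumed example)
mu = 0.5                                   # assumed bound on tau'(t)

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # margin to enforce strict inequalities

# Textbook condition: P, Q > 0 and
#   [[A0^T P + P A0 + Q,   P A1        ],
#    [A1^T P,             -(1 - mu) Q ]]  < 0
# certifies global asymptotic stability for any delay within the bound.
M = cp.bmat([[A0.T @ P + P @ A0 + Q, P @ A1],
             [A1.T @ P, -(1 - mu) * Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               M << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)
```

If the problem reports feasible, the recovered P and Q certify stability; the exponential and stochastic variants studied in the thesis add decay-rate and noise terms to the functional but lead to LMIs of the same block form.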
Keywords/Search Tags: Neural network, Global asymptotic stability, Linear matrix inequality, Multiple time-varying delays, Lyapunov-Krasovskii functional, Stochastic recurrent neural networks, Infinite distributed delays