In this paper, we study the global asymptotic stability and exponential stability of neural networks with variable delays described by the following delayed differential equations:

(1) \dot{x}(t) = -Cx(t) + Af(x(t)) + Bg(x(t-\tau(t))) + Dx(t-\tau(t)) + b,

(2) \dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t-\tau(t))) + D\dot{x}(t-\tau(t)) + b,

and stochastic delayed recurrent neural networks with time-varying delays:

dx(t) = [-Cx(t) + Af(x(t)) + Bf(x(t-\tau(t))) + b]\,dt + \sigma(x(t), x(t-\tau(t)))\,d\omega(t),

where f(x(t)) denotes the activation functions of the neurons, and the variable delay \tau(t) is non-negative, bounded, and differentiable, satisfying 0 < \tau(t) \le \tau < \infty and \dot{\tau}(t) \le \eta < 1.

In this paper, the stability of these neural network models is analysed by means of Lyapunov functionals. The paper is divided into three chapters. The first chapter introduces the research background, significance, and related progress of artificial neural network research, and briefly presents the main work and some basic theory. In the second chapter, we prove the global asymptotic stability and exponential stability of the solution by analysing models that exhibit the characteristics of neutral-type neural networks with time-varying delays; at the end of each section, the theoretical results are verified by numerical simulation. Finally, we discuss delay-dependent asymptotic stability for stochastic delayed recurrent neural networks with time-varying delays. We obtain new global asymptotic stability results by constructing different Lyapunov functionals and applying the Itô formula and linear matrix inequality (LMI) techniques.
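The abstract mentions that the theoretical results are verified by numerical simulation. A minimal sketch of one such simulation for the deterministic delayed model \dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t-\tau)) + b is given below, using forward-Euler integration with a constant delay, f = tanh, and small illustrative matrices; the matrices, delay value, and step sizes are assumptions for illustration only, not taken from the paper.

```python
# Minimal forward-Euler sketch (assumed parameters) of the delayed model
#   x'(t) = -C x(t) + A f(x(t)) + B f(x(t - tau)) + b,
# with constant delay tau and activation f = tanh. The matrices below are
# illustrative: -C is chosen to dominate A and B so the trajectory settles.
import numpy as np

def simulate(C, A, B, b, tau, x0, T=50.0, dt=0.01, f=np.tanh):
    """Forward-Euler integration; the history on [-tau, 0] is held at x0."""
    n_delay = int(round(tau / dt))      # delay expressed in time steps
    steps = int(round(T / dt))
    # Store the full trajectory so delayed states can be looked up.
    xs = np.tile(np.asarray(x0, float), (n_delay + 1, 1))
    for _ in range(steps):
        x_now = xs[-1]
        x_del = xs[-1 - n_delay]        # x(t - tau)
        dx = -C @ x_now + A @ f(x_now) + B @ f(x_del) + b
        xs = np.vstack([xs, x_now + dt * dx])
    return xs

C = np.diag([2.0, 2.0])                 # positive diagonal decay
A = np.array([[0.3, -0.2], [0.1, 0.2]]) # weak instantaneous weights
B = np.array([[0.2, 0.1], [-0.1, 0.3]]) # weak delayed weights
b = np.array([0.5, -0.5])

if __name__ == "__main__":
    xs = simulate(C, A, B, b, tau=1.0, x0=[1.0, -1.0])
    # Successive states should settle, consistent with asymptotic stability.
    print(np.linalg.norm(xs[-1] - xs[-2]))
```

Under these dominance-type conditions on C relative to A and B, the computed trajectory approaches an equilibrium, which is the qualitative behaviour the stability theorems predict.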