
Study Of The Dynamical Behavior Of Feedback Neural Networks

Posted on: 2003-10-07
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Q Zhang
Full Text: PDF
GTID: 1118360095451184
Subject: Circuits and Systems
Abstract/Summary:
The study of the dynamical behavior of feedback neural networks has been an important topic in the neural-network field since the Hopfield network model was proposed in 1982, demonstrating the great potential of such networks in applications such as optimization, associative memory, signal processing, image processing, and pattern recognition, to name just a few. For most applications, the relevant qualitative properties of the network model are global stability of the equilibrium point, oscillation, and chaos. For this reason, the theoretical study of neural dynamics has advanced rapidly in recent years. In this dissertation, we investigate stability, bifurcation, and chaotic phenomena in feedback neural networks, including the Hopfield network and cellular neural networks (CNNs). The main contributions of this dissertation are summarized as follows.

For the exponential stability of neural networks, many existing results concern local exponential stability. Since the local exponential stability of a nonlinear system is equivalent to that of its linearized system, it is easily obtained; this conclusion, however, does not hold for global exponential stability (GES). Based on a Lyapunov energy function, we establish three sufficient conditions that guarantee the global exponential stability of CNNs. We also study the asymptotic behavior of CNNs by means of the LaSalle invariance principle, deriving convergence criteria that improve on results available in the literature.

We analyze the global stability of a class of neural networks with discrete delays under three assumptions on the activation functions. Lyapunov functionals are constructed and employed to obtain sufficient conditions for global asymptotic stability (GAS) independent of the delays. Global exponential stability theorems are then given by a method based on a delay differential inequality. The method is simple and direct in its analysis, requiring no Lyapunov functionals, and it yields an estimate of the exponential convergence rate at the same time. Some existing results obtained via Lyapunov functionals and linear analysis are shown to be special cases of the presented result. In addition, two different approaches, Lyapunov functionals and Lyapunov functions combined with Razumikhin conditions, are employed to investigate the global attractivity of the delayed model; the resulting criteria depend on the magnitude of the delays.

In general, a neural network has a spatial nature due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths, so it is desirable to model it with distributed delays. We present new conditions ensuring global exponential stability of the equilibrium point by utilizing Lyapunov functionals and the variation-of-constants method. Compared with those previously given in the literature, our results apply to a wider class of networks. In addition, we also study the stability of neural networks involving both discrete and distributed delays.
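The abstract does not write out the network equations; purely as an illustration, a typical delayed Hopfield-type (or CNN-type) model of the kind analyzed in such stability results is

\[
\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t-\tau_j)\bigr) + I_i , \qquad i = 1,\dots,n,
\]

and global exponential stability of an equilibrium \(x^{*}\) means that there exist constants \(M \ge 1\) and \(\varepsilon > 0\), independent of the initial function \(\phi\), such that

\[
\|x(t) - x^{*}\| \le M \, \|\phi - x^{*}\| \, e^{-\varepsilon t}, \qquad t \ge 0 .
\]

The specific coefficients, activation functions, and delay structure used in the dissertation are not stated in this abstract.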
Besides stability, bifurcation and chaos in neural networks have received much attention recently. In this dissertation, we propose two neuron models with chaotic dynamics, which constitute chaotic neural networks encompassing various associative and back-propagation networks. In order to apply these models to optimization problems, we design a parameter to control the network dynamics; as the parameter is gradually reduced, a reverse bifurcation process results. The process starts with an unstable phase that searches for global minima, followed by a stable, convergent phase. Two representative examples of function optimization are given to demonstrate the efficiency of the approach. Further, we investigate a first-order neural network model with a discrete delay. The linear stability of this model is discussed, and it is found that a Hopf bifurcation occurs when the delay is taken as the bifurcation parameter. As the delay increases, chaotic behavior is observed in computer simulations. Some waveform diagrams, p...
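The abstract does not specify the first-order delayed model. Purely as an illustrative sketch (the model, parameter values, and helper function below are assumptions for the demonstration, not the dissertation's), the classical single neuron with delayed feedback, x'(t) = -x(t) + a*tanh(x(t - tau)), already shows how an equilibrium can lose stability through a Hopf bifurcation as the delay grows:

import numpy as np

def simulate(a=-2.0, tau=1.5, dt=0.001, t_end=60.0, x0=0.1):
    # Forward-Euler integration of x'(t) = -x(t) + a*tanh(x(t - tau))
    # with a constant initial history x(t) = x0 for t in [-tau, 0].
    n_delay = int(round(tau / dt))          # history buffer length
    n_steps = int(round(t_end / dt))
    x = np.full(n_delay + n_steps + 1, x0)
    for k in range(n_delay, n_delay + n_steps):
        x[k + 1] = x[k] + dt * (-x[k] + a * np.tanh(x[k - n_delay]))
    return x[n_delay:]                      # trajectory on [0, t_end]

# For a = -2 the equilibrium x = 0 is stable for small delays and loses
# stability once tau exceeds roughly 1.21; beyond that a sustained
# oscillation (limit cycle born in the Hopf bifurcation) appears.
for tau in (0.8, 1.5):
    tail = simulate(tau=tau)[-5000:]        # last 5 time units
    print(f"tau = {tau}: oscillation amplitude ~ {tail.max() - tail.min():.3f}")

Running the sketch, the trajectory for tau = 0.8 decays to the equilibrium (near-zero amplitude), while for tau = 1.5 a persistent oscillation remains, which is the qualitative transition the delay-as-bifurcation-parameter analysis refers to.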
Keywords/Search Tags: Feedback neural networks, delay-independent stability, delay-dependent stability, Lyapunov functions (functionals), delay differential inequality, Razumikhin condition, Hopf bifurcation, chaos.