
Research On Dynamical Behavior Of Stochastic Recurrent Neural Networks

Posted on: 2011-07-02    Degree: Doctor    Type: Dissertation
Country: China    Candidate: S Zhu    Full Text: PDF
GTID: 1118330332967973    Subject: Systems Engineering
Abstract/Summary:
An artificial neural network is an information-processing system that mimics the behavior of the human brain. According to how the neurons are connected, such networks fall into two classes: feedforward neural networks and recurrent neural networks. In a feedforward network the relation between input and output is static, so it cannot accurately capture the dynamic characteristics of a neuronal system. A recurrent network, by contrast, can reflect the dynamic characteristics of the nervous system and can store information. In practical applications, random fluctuations and other probabilistic factors are present in neuronal transmission, so axonal transmission is in fact a noisy process. The dynamical behavior of stochastic recurrent neural networks has therefore become a central topic at the research frontier of the field.

On the other hand, in practical applications of recurrent neural networks, unpredictable failures may cause parameters such as connection weights or biases to change abruptly. In this case the network can be modeled as a set of network modes that switch from one to another according to a given rule, and most studies adopt Markovian switching. It is therefore important to study the dynamical behavior of neural networks with Markovian switching.

Although there is an abundant literature on recurrent neural networks, very little of it concerns the stability of stochastic recurrent neural networks with Markovian switching. It is well known that noise can both stabilize and destabilize a system: it can produce sustained oscillations and bifurcations in recurrent neural networks, and it can even stabilize a non-periodic attractor, turning a non-chaotic system into a chaotic one. However, very little of the literature considers the impact of noise on the solutions of neural networks with Markovian switching.
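The model class discussed above can be sketched in simulation. The following is an illustrative toy example, not the dissertation's exact model: a two-neuron stochastic recurrent network whose parameters switch between two modes according to a continuous-time Markov chain, integrated with the Euler–Maruyama method. All parameter values are assumptions for demonstration only.

```python
import numpy as np

# Toy stochastic recurrent neural network with Markovian switching:
#   dx = (-D[r(t)] x + A[r(t)] tanh(x)) dt + sigma[r(t)] x dW
# where r(t) is a 2-state continuous-time Markov chain with generator Q.
rng = np.random.default_rng(0)

D = [np.diag([1.0, 1.2]), np.diag([0.8, 1.0])]      # self-decay rates, per mode
A = [np.array([[0.3, -0.2], [0.1, 0.4]]),
     np.array([[0.2, 0.1], [-0.3, 0.2]])]           # connection weights, per mode
sigma = [0.3, 0.5]                                  # noise intensity, per mode
Q = np.array([[-1.0, 1.0], [2.0, -2.0]])            # Markov chain generator

dt, T = 1e-3, 10.0
x = np.array([1.0, -0.5])
mode = 0
for _ in range(int(T / dt)):
    # mode switches with probability -Q[mode, mode] * dt per step
    if rng.random() < -Q[mode, mode] * dt:
        mode = 1 - mode
    dW = rng.normal(0.0, np.sqrt(dt), size=2)
    x = x + (-D[mode] @ x + A[mode] @ np.tanh(x)) * dt + sigma[mode] * x * dW

print(x)  # state at time T; remains bounded for these (stable) parameters
```

Here the self-decay terms dominate the connection weights in both modes, so the sample path stays bounded; weakening the decay or increasing the noise changes the qualitative behavior, which is exactly the kind of question the stability analysis addresses.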
Moreover, how large can the impact of this noise be? The physical meaning of passivity is that the system can only absorb energy from the outside and cannot supply an energy surplus of its own; its essential feature is the ability to maintain internal stability. Research papers on the passivity of recurrent neural networks are also quite scarce. To the best of our knowledge, almost no one has considered the passivity or exponential passivity of neural networks with Markovian switching, and even fewer have considered the connection between passivity and stability of neural networks.

Based on the above considerations, we use Lyapunov stability theory, matrix theory, inequality techniques, and the nonnegative semimartingale convergence theorem as the main mathematical tools to study the dynamical behavior of stochastic recurrent neural networks. The main contents and innovations are as follows.

We discuss the robust asymptotic stability of stochastic recurrent neural networks with interval discrete and distributed delays, and we also examine the exponential stability of uncertain stochastic recurrent neural networks with Markovian switching. Delay-dependent and delay-independent sufficient stability conditions are established on the basis of Lyapunov stability theory and the nonnegative semimartingale convergence theorem. The results include existing results for stochastic recurrent neural networks without a Markov chain as special cases.

We study the impact of noise on the solutions of neural networks with Markovian switching. We show that noise can suppress exponential growth, even when the mode of the Markov chain cannot be observed. On the other hand, we also show that, as long as the dimension of the network is greater than one, noise can induce exponential growth.
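The noise-suppression phenomenon mentioned above can be illustrated with the classical scalar example (an assumed textbook case, not the thesis's theorem): the deterministic system x' = a·x with a > 0 grows exponentially, yet adding multiplicative noise gives dx = a·x dt + σ·x dW, whose exact solution is x(t) = x₀·exp((a − σ²/2)t + σW(t)). The sample Lyapunov exponent is a − σ²/2, so sufficiently strong noise (σ² > 2a) drives the solution to zero almost surely.

```python
import numpy as np

# Empirical sample Lyapunov exponent of dx = a*x dt + sigma*x dW.
# Using the exact solution, log|x(T)/x0| / T = (a - sigma^2/2) + sigma*W(T)/T.
rng = np.random.default_rng(1)
a, sigma, T, paths = 1.0, 2.0, 50.0, 500

W_T = rng.normal(0.0, np.sqrt(T), size=paths)        # Brownian motion at time T
exponents = (a - 0.5 * sigma**2) + sigma * W_T / T   # per-path growth exponent

print(exponents.mean())  # close to a - sigma^2/2 = -1.0: growth is suppressed
```

With a = 1 and σ = 2 the exponent concentrates near −1, even though the noiseless system diverges; this is the scalar prototype of the suppression result described above.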
The noise can also be chosen to be directly linear in the neuron state.

We study the passivity of recurrent neural networks and of stochastic recurrent neural networks with Markovian switching. By applying Lyapunov functions and the free-weighting-matrix method, we obtain delay-dependent and delay-independent passivity criteria in terms of linear matrix inequalities. Moreover, we give a definition of exponential passivity and an exponential passivity condition for recurrent neural networks. We then extend the study to two types of uncertainty: time-varying parameter uncertainty and Markovian-switching-type uncertainty. The results contain existing results for stochastic recurrent neural networks without a Markov chain as special cases.

We investigate the relation between passivity and stability of stochastic recurrent neural networks. To render the network passive, we obtain a proper output by modifying the connection weight matrix. Meanwhile, we prove that a passive stochastic recurrent neural network also satisfies a nonlinear version of the Kalman-Yakubovich-Popov (KYP) criterion, and we give a preliminary discussion of the relation between these two properties.

These studies reveal the essential dynamical characteristics of stochastic recurrent neural networks, enrich the theory of such networks, and provide a reliable basis for the circuit implementation and application of artificial neural networks. They have not only important theoretical value but also important application value.
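The dissipation inequality that defines passivity can be checked numerically on a toy example (an assumed scalar linear system, not the stochastic network studied in the thesis): for x' = −x + u with output y = x and storage function V(x) = x²/2, passivity requires V(x(T)) − V(x(0)) ≤ ∫ u(t)·y(t) dt, i.e. the stored energy never exceeds the energy supplied at the port. For linear systems this is equivalent to the classical KYP condition mentioned above.

```python
import numpy as np

# Forward-Euler check of the dissipation inequality for x' = -x + u, y = x,
# with storage V(x) = x^2/2 and an arbitrary bounded input u(t) = sin(t).
dt, T = 1e-4, 10.0
x = 0.5
V0 = 0.5 * x**2
supplied = 0.0
for k in range(int(T / dt)):
    t = k * dt
    u = np.sin(t)           # external input at the port
    y = x                   # output of the port
    supplied += u * y * dt  # accumulated supply rate  integral of u*y dt
    x += (-x + u) * dt      # Euler step of x' = -x + u
VT = 0.5 * x**2

print(VT - V0 <= supplied)  # True: stored energy <= supplied energy
```

The slack in the inequality is exactly the dissipated term ∫ x² dt, which is the "internal stability margin" that the abstract identifies as the essential feature of passivity.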
Keywords/Search Tags: Recurrent neural networks, Delay, Markov chain, Stability, Exponential growth, Polynomial growth, Passivity, Linear matrix inequality