
Multistability Analysis Of Recurrent Neural Networks With Non-monotonic Activation Functions

Posted on: 2018-06-19  Degree: Doctor  Type: Dissertation
Country: China  Candidate: P Liu  Full Text: PDF
GTID: 1318330515472354  Subject: Control Science and Engineering
Abstract/Summary:
A neural network is an information processing model whose structure resembles the synaptic connections of the brain. In recent years, neural networks have been successfully applied in many fields, such as optimization, pattern recognition, and associative memory. The dynamical behavior of recurrent neural networks is the basis for these successful applications, and different applications rely on different dynamical properties. Some applications, such as optimization, require the network to have a unique asymptotically stable equilibrium point, that is, to be monostable. Others, such as associative memory, require the network to have multiple stable equilibrium points, that is, to be multistable. Multistability analysis of recurrent neural networks has therefore become an important research topic.

The type of activation function plays an important role in the multistability analysis of neural networks. Early work on this problem assumed monotonic activation functions. It should be noted that, compared with monotonic activation functions, non-monotonic activation functions can reduce both the time required to learn to solve a given problem and the size of the resulting network. In view of this, some researchers have investigated the multistability of neural networks with non-monotonic activation functions, but almost all of that work rests on piecewise-linear assumptions. There are, however, many other notable non-monotonic activation functions for which neural networks can possess multiple equilibrium points and corresponding local multistability, for example the Mexican hat, Gaussian, and sinusoidal activation functions. All of these share the common feature of being nonlinear on every open interval as well as non-monotonic; their typical shapes are sketched below. Therefore, building on previous research, this dissertation investigates the multistability of recurrent neural networks with non-monotonic activation functions.
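As a rough illustration of the activation shapes named above, the following Python sketch defines common textbook forms of the Mexican hat (Ricker-wavelet style), Gaussian, and sinusoidal functions. These forms and their parameters are assumptions made for illustration; the exact definitions used in the dissertation may differ.

```python
import numpy as np

# Illustrative non-monotonic activation functions of the kinds named above.
# These are common textbook forms; the precise definitions and parameters
# used in the dissertation may differ.

def mexican_hat(x):
    # Ricker-wavelet-style "Mexican hat": rises to a peak and falls off again
    return (1.0 - x**2) * np.exp(-0.5 * x**2)

def gaussian(x, mu=0.0, sigma=1.0):
    # Bell-shaped Gaussian activation centered at mu
    return np.exp(-((x - mu)**2) / sigma**2)

def sinusoidal(x):
    # Periodic sinusoidal activation
    return np.sin(x)

# Each function is non-monotonic and nonlinear on every open interval,
# which is the common feature emphasized above.
xs = np.linspace(-4.0, 4.0, 9)
print(np.round(mexican_hat(xs), 3))
print(np.round(gaussian(xs), 3))
print(np.round(sinusoidal(xs), 3))
```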
The main contents of this dissertation are as follows:

(1) The multistability of a class of recurrent neural networks with non-monotonic activation functions and mixed time delays is studied. By virtue of the properties of the activation functions, several sufficient conditions are derived for the existence of 3^n equilibrium points, using a state-space partition method and the fixed point theorem. Sufficient criteria are then presented for the exponential stability of 2^n of these equilibrium points via a comparison method. Furthermore, the attraction basins of the exponentially stable equilibrium points are estimated.

(2) The multistability of a general class of recurrent neural networks with time-varying delays is addressed. Without assuming linearity or monotonicity of the activation functions, several new sufficient conditions are obtained that ensure the existence of (2K+1)^n (K ≥ 1) equilibrium points and the exponential stability of (K+1)^n of them for an n-neuron network, where K is a positive integer determined jointly by the type of activation function and the network parameters. Furthermore, the attraction basins of these exponentially stable equilibrium points are estimated.

(3) The multistability of delayed recurrent neural networks with the Mexican hat activation function is explored. Sufficient conditions are obtained under which an n-neuron recurrent neural network has 3^(k1)·5^(k2) equilibrium points, with 0 ≤ k1 + k2 ≤ n, of which 2^(k1)·3^(k2) are locally exponentially stable. Furthermore, the attraction basins of these stable equilibrium points are estimated.

(4) Both the multistability and the complete stability of delayed recurrent neural networks with the Gaussian activation function are investigated. By means of the geometric properties of the Gaussian function, the state-space partition method, and the fixed point theorem, sufficient conditions are obtained under which an n-neuron network has exactly 3^k equilibrium points, with 0 ≤ k ≤ n. Using an iterative approach, it is shown that every trajectory converges to one of these equilibrium points, that is, the network is completely stable. Moreover, based on the comparison method, it is shown that 2^k of the equilibrium points are locally exponentially stable and the remaining 3^k − 2^k are unstable. A minimal numerical illustration of such coexisting stable equilibria is given after this summary.

Finally, the dissertation concludes with a summary of all the results and a discussion of related future work.
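To make the notion of coexisting stable equilibria concrete, here is a minimal numerical sketch, not taken from the dissertation: a two-neuron continuous-time network dx/dt = −x + W f(x) + I with a Gaussian activation, where the weights, the Gaussian parameters, and the forward-Euler integration are all assumptions chosen so that each neuron is bistable, giving 2^2 = 4 stable equilibria out of 3^2 = 9.

```python
import numpy as np

# Minimal multistability sketch (illustrative parameters, not from the
# dissertation): dx/dt = -x + W f(x) + I with a Gaussian activation.

def gaussian(x, mu=2.0, sigma=1.0):
    return np.exp(-((x - mu)**2) / sigma**2)

def simulate(x0, W, I, f, dt=0.01, T=30.0):
    # Forward-Euler integration of the continuous-time network
    x = np.array(x0, dtype=float)
    for _ in range(int(T / dt)):
        x = x + dt * (-x + W @ f(x) + I)
    return x

# Two uncoupled neurons with self-weight 4: the scalar map -x + 4*exp(-(x-2)^2)
# has three zeros (two stable, one unstable), so the 2-neuron network has
# 2^2 = 4 stable equilibria among 3^2 = 9 equilibria in total.
W = np.diag([4.0, 4.0])
I = np.zeros(2)

for x0 in [(0.0, 0.0), (0.0, 3.0), (3.0, 0.0), (3.0, 3.0)]:
    xf = simulate(x0, W, I, gaussian)
    print(f"initial state {x0} -> equilibrium {np.round(xf, 3)}")
```

Initial states on either side of each neuron's unstable equilibrium settle into different stable states, which is the qualitative picture behind the 2^k-out-of-3^k counts summarized above.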
Keywords/Search Tags:Recurrent neural network, Non-mono tonic activation function, Multistability, Complete stability, Attraction basin, State space partition