
Exponential Stability Analysis Of Discrete-Time Recurrent Neural Networks With Multiple Time-Varying Delays

Posted on: 2023-10-17
Degree: Master
Type: Thesis
Country: China
Candidate: A D Zhang
Full Text: PDF
GTID: 2530307088466774
Subject: Biomedical engineering
Abstract/Summary:
Recurrent neural networks (RNNs) are an important branch of neural network research; they have substantially improved the accuracy of training on sequential samples and perform well in semantic analysis, machine translation, and pattern recognition tasks. However, this same recurrent structure means that an unsuitable activation function or an increase in network depth gives rise to vanishing and exploding gradients at the level of data stability. On the other hand, from the point of view of hardware implementation, time delays arise in all kinds of engineering systems and are among the most important factors affecting system stability and performance. The feedback mechanism inherent in recurrent neural networks, together with the chaos and oscillations generated when the dynamical model is implemented, degrades the model's effectiveness; oscillations and delays over multiple intervals also generate large amounts of data redundancy, wasting time and computing resources. To this end, this thesis investigates the exponential stability of discrete-time recurrent neural networks under multiple time-varying delay perturbations: the activation function is required to satisfy a low-conservativeness condition, the existence and uniqueness of the equilibrium point is established using the fixed-point theorem and the free-weighting-matrix approach, and a Lyapunov-Krasovskii functional is constructed to analyse the stability of the network.

First, a model of discrete-time recurrent neural networks with multiple time-varying delays (md-RNN) and its stability conditions are proposed. Motivated by the fact that multiple time-varying delays can occur in practical applications of RNN models, a model accommodating the different delay cases is constructed. Taking the nonlinearity of the md-RNN into account, the global exponential stability of the model is proved using Lyapunov-Krasovskii theory, and comparatively mild constraints guaranteeing stability are given.

Secondly, the stability of the discrete-time recurrent neural network system with multiple time-varying delays is proved according to stability theory. The md-RNN system with multiple time-varying delays is constructed by introducing delayed connection-weight coefficients. On this basis, a Lyapunov-Krasovskii energy function is constructed to prove the asymptotic stability of the system, and a more general stability condition is given. The system is simulated numerically using the LMI (linear matrix inequality) toolbox in MATLAB, and the simulation data show that the system satisfies the stability results in finite time.

Finally, building on the stability proof for stochastic delayed DRNNs, a class of neural network models with multiple time-varying delay intervals is considered. A stability criterion for the new system is derived by setting the feedback coefficients of the delayed connection-weight matrix in the system equations more flexibly, combined with a Lyapunov energy function and the free-weighting-matrix method. The validity and superiority of the criterion are demonstrated by numerical examples.
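The model class described above can be illustrated with a small numerical sketch. The example below is hypothetical and not taken from the thesis: a two-neuron discrete-time RNN with two bounded time-varying delays, x(k+1) = A·x(k) + W1·f(x(k−τ1(k))) + W2·f(x(k−τ2(k))), with f = tanh. The matrices are chosen so that a simple norm-based sufficient condition (max_i |a_ii| + ‖W1‖∞ + ‖W2‖∞ < 1, more conservative than the LMI conditions the thesis derives) holds, so the trajectory should decay exponentially to the origin.

```python
import math

# Illustrative parameters (not from the thesis): diagonal state feedback
# and two delayed connection-weight matrices, chosen so that
# max|a_ii| + ||W1||_inf + ||W2||_inf = 0.3 + 0.15 + 0.15 = 0.6 < 1.
A  = [[0.3, 0.0], [0.0, 0.2]]
W1 = [[0.1, -0.05], [0.05, 0.1]]
W2 = [[-0.05, 0.1], [0.1, -0.05]]

def tau1(k):  # time-varying delay, bounded in [1, 3]
    return 1 + (k % 3)

def tau2(k):  # time-varying delay, bounded in [1, 2]
    return 1 + (k % 2)

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def step(history, k):
    """One step of the delayed recurrence; history[j] holds x(j)."""
    x   = history[k]
    xd1 = history[max(0, k - tau1(k))]   # delayed state x(k - tau1(k))
    xd2 = history[max(0, k - tau2(k))]
    f1  = [math.tanh(v) for v in xd1]    # activation of delayed states
    f2  = [math.tanh(v) for v in xd2]
    terms = (matvec(A, x), matvec(W1, f1), matvec(W2, f2))
    return [sum(t[i] for t in terms) for i in range(len(x))]

# Simulate from a nonzero initial state; under the norm condition above
# the origin is globally exponentially stable, so the state should decay.
history = [[1.0, -1.0]]
for k in range(200):
    history.append(step(history, k))

print(max(abs(v) for v in history[-1]))  # small: the trajectory has decayed
```

The LMI conditions developed in the thesis are less conservative than this crude norm bound, but the simulation shows the qualitative behaviour the stability criteria certify: despite the delays varying from step to step, the state contracts exponentially toward the equilibrium.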
Keywords/Search Tags: discrete recurrent neural networks, exponential stability, multiple time delays, time-varying delays, Lyapunov-Krasovskii functional, linear matrix inequality