
Convergence Analysis On Recurrent Neural Network Model Via Group Lasso Regularization

Posted on: 2022-02-04
Degree: Master
Type: Thesis
Country: China
Candidate: Y J Hu
Full Text: PDF
GTID: 2518306491981469
Subject: Mathematics
Abstract/Summary:
Recurrent neural networks can be divided, according to their structure, into fully recurrent neural networks and local recurrent neural networks. In a fully recurrent neural network the non-external nodes are completely connected; such networks are often used for combinatorial optimization, associative memory, and related tasks. A local recurrent neural network adds, besides the input nodes, a layer of output nodes connected to the outside, which gives it a strong nonlinear feedback capability; such networks are widely used in complex system modeling and time series analysis. To improve the generalization ability of these two models, we study the influence of the network weights on generalization and then introduce a recurrent neural network training method based on group lasso regularization. However, because the group lasso penalty is not differentiable at the origin, it causes oscillation during training; we resolve this numerical oscillation by approximating the penalty with a smooth function. Then, under suitable assumptions on the learning rate, the penalization coefficients, and the smoothing parameter, we prove both the weak convergence and the strong convergence of the new algorithm. Finally, the theoretical results are verified by numerical experiments.
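
As an illustrative sketch only (not the thesis's actual algorithm), the Python code below shows one common way to smooth a group lasso penalty near the origin: replacing each group norm ||w_g|| with sqrt(||w_g||^2 + mu^2), which is differentiable everywhere and tends to ||w_g|| as mu tends to 0. All function names and parameter values here are hypothetical.

import numpy as np

def smoothed_group_lasso(groups, mu):
    # Smooth approximation of sum_g ||w_g||_2: the term
    # sqrt(||w_g||^2 + mu^2) is differentiable everywhere,
    # including at w_g = 0, and approaches ||w_g|| as mu -> 0.
    return sum(np.sqrt(np.dot(w, w) + mu**2) for w in groups)

def smoothed_penalty_grad(w, mu):
    # Gradient of sqrt(||w||^2 + mu^2) with respect to w;
    # well-defined at the origin, unlike the raw subgradient w/||w||.
    return w / np.sqrt(np.dot(w, w) + mu**2)

def update_group(w, data_grad, lr=0.01, lam=1e-3, mu=1e-4):
    # Hypothetical gradient step for one weight group: the
    # data-fitting gradient plus the smoothed penalty gradient.
    return w - lr * (data_grad + lam * smoothed_penalty_grad(w, mu))

Because the smoothed penalty has a bounded, continuous gradient, a plain gradient step of this form no longer jumps back and forth across the origin, which is the oscillation the smoothing is meant to remove.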
Keywords/Search Tags:Recurrent neural network, Group lasso regularization, Smoothing approximation, Convergence