
Sparsification Of Hidden Nodes Of Neural Network

Posted on: 2018-05-05  Degree: Master  Type: Thesis
Country: China  Candidate: S Peng  Full Text: PDF
GTID: 2348330536460968  Subject: Computational Mathematics
Abstract/Summary:
In this paper, we study the sparsification of the BP neural network structure using the L1/2 regularization method, which has become popular in recent years. We add L1/2 regularization terms to the traditional squared-error function, penalizing the L1 norms of the weight vectors that connect the input layer to the hidden layer during training, and on this basis we construct a modified L1/2 regularization method. The modified method yields a sparser neural network without degrading the network's classification and approximation capability. In addition, we ran comparison experiments between the smoothed L1/2 regularization method and our modified L1/2 regularization method; the results show that the modified method is convergent, produces sparser networks, and performs better.

The thesis is divided into four chapters. The first chapter introduces the relevant theory and development of neural networks. The second chapter introduces the regularization framework and several regularization methods, leading to the L1/2 regularization method. Building on the first two chapters, the third chapter describes how regularization is used to improve the BP neural network, presents the batch gradient method with L1/2 regularization in detail, and then constructs the modified L1/2 regularization method. Finally, the fourth chapter presents the numerical experiments and their results.
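To make the training scheme concrete, the following is a minimal NumPy sketch of the batch gradient method with a smoothed L1/2 group penalty on each hidden node's incoming weight vector, in the spirit of the method described above. It assumes a single hidden layer with sigmoid activations; the toy data, network size, and the hyperparameters lam, eps, and lr are illustrative choices, and the specific modification proposed in the thesis is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (hypothetical; the thesis datasets are not reproduced).
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

n_in, n_hidden, n_out = 4, 12, 1
W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))   # input -> hidden weights
b1 = np.zeros((n_hidden, 1))
W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))  # hidden -> output weights
b2 = np.zeros((n_out, 1))

lam, eps, lr = 1e-3, 1e-4, 0.5   # illustrative hyperparameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(3000):
    # Forward pass in batch mode (all samples at once).
    Z1 = W1 @ X.T + b1          # (n_hidden, N)
    A1 = sigmoid(Z1)
    Z2 = W2 @ A1 + b2           # (n_out, N)
    A2 = sigmoid(Z2)

    N = X.shape[0]
    # Backward pass for the squared-error term E = (1/2N) * sum (A2 - y)^2.
    d2 = (A2 - y.T) * A2 * (1 - A2)
    gW2 = d2 @ A1.T / N
    gb2 = d2.mean(axis=1, keepdims=True)
    d1 = (W2.T @ d2) * A1 * (1 - A1)
    gW1 = d1 @ X / N
    gb1 = d1.mean(axis=1, keepdims=True)

    # Smoothed L1/2 penalty on each hidden node's incoming weight vector:
    #   lam * sum_j (||W1[j]||_1)^{1/2},  with |w| smoothed as sqrt(w^2 + eps)
    # so that the gradient is defined everywhere.
    abs_smooth = np.sqrt(W1 ** 2 + eps)               # smoothed |w|
    group = abs_smooth.sum(axis=1, keepdims=True)     # smoothed L1 norm per node
    gW1 += lam * 0.5 / np.sqrt(group) * (W1 / abs_smooth)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# A hidden node is effectively pruned when its incoming weight vector collapses.
norms = np.abs(W1).sum(axis=1)
print("surviving hidden nodes:", int((norms > 1e-2).sum()), "of", n_hidden)
```

Because the penalty acts on the whole incoming weight vector of each hidden node rather than on individual weights, it drives entire rows of W1 toward zero, which is what removes hidden nodes and sparsifies the network structure.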
Keywords/Search Tags: BP Neural Network, L1/2 Regularization Method, Structure Sparsification