
Research on a Mutual Learning Neural Network Training Method

Posted on: 2017-11-11
Degree: Master
Type: Thesis
Country: China
Candidate: S Liu
Full Text: PDF
GTID: 2428330548977839
Subject: Applied Mathematics
Abstract/Summary:
After more than 30 years of development, artificial neural networks have achieved brilliant results in both theory and application. Given a specified network structure, how to choose the weight-updating rule and training method so as to overcome slow convergence, local optima, and poor generalization remains a central issue in neural network research.

To accelerate convergence, we propose a sub-batch weight-updating rule for BP neural networks. By splitting the training data into groups, the sub-batch training method can exploit matrix operations to speed up computation, while also combining the advantages of data similarity and data variation. Experimental comparisons show that sub-batch training converges quickly and stably, and that selecting suitable batch-size and learning-rate parameters can dramatically shorten training time and improve classification accuracy.

To improve generalization, we apply the concept of mutual learning to the training process of neural networks. We first create two networks with symmetrical structures, a positive network and a negative network, whose connection weight matrices are transposes of each other; the matrix values are then shared during training, and the connection weight matrices are updated bidirectionally. Validation on UCI datasets and large image datasets shows that the mutual learning training method converges rapidly and stably, and that it generalizes better than traditional BP training.
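The abstract does not state the update equations, so the following is a minimal NumPy sketch of the sub-batch idea for a one-hidden-layer BP network: the training set is split into sub-batches, and each weight update is computed as a single matrix operation over one sub-batch. The sigmoid activation, mean-gradient update, and all function and parameter names here are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_sub_batch(X, y, hidden=16, batch_size=32, lr=0.1, epochs=100, seed=0):
    """Sketch of sub-batch weight updating for a one-hidden-layer BP network.

    X: (n, d) inputs; y: (n, 1) targets in [0, 1].
    Each update uses matrix products over one sub-batch, not one sample.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.1, (d, hidden))
    W2 = rng.normal(0.0, 0.1, (hidden, 1))
    for _ in range(epochs):
        order = rng.permutation(n)          # reshuffle, then split into sub-batches
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            # Forward pass over the whole sub-batch at once (matrix operations).
            H = sigmoid(Xb @ W1)
            out = sigmoid(H @ W2)
            # Backward pass: gradients averaged over the sub-batch.
            err = (out - yb) * out * (1.0 - out)
            gW2 = H.T @ err / len(idx)
            gW1 = Xb.T @ ((err @ W2.T) * H * (1.0 - H)) / len(idx)
            W2 -= lr * gW2
            W1 -= lr * gW1
    return W1, W2
```

The batch_size and lr arguments correspond to the batch-size and learning-rate parameters whose tuning the abstract credits with shorter training time and better accuracy.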
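For the mutual learning scheme, the abstract specifies only that the positive and negative networks have transposed weight matrices that are shared and updated bidirectionally. Below is a minimal sketch of one plausible reading, assuming a single shared matrix W used as W by the positive network (x to y) and as W.T by the negative network (y to x), with both directions' gradients folded into W; the loss and merging rule are assumptions, not the thesis's exact method.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mutual_learning_step(W, x, y, lr=0.1):
    """One bidirectional update of a shared weight matrix (hypothetical reading).

    W: (d_x, d_y). The positive network predicts y from x via W;
    the negative network predicts x from y via W.T.
    """
    # Positive direction: x -> y through W.
    yp = sigmoid(x @ W)
    err_pos = (yp - y) * yp * (1.0 - yp)
    g_pos = x.T @ err_pos / len(x)
    # Negative direction: y -> x through the transposed weights.
    xp = sigmoid(y @ W.T)
    err_neg = (xp - x) * xp * (1.0 - xp)
    g_neg = y.T @ err_neg / len(x)
    # Bidirectional update: the negative gradient applies to W.T,
    # so its transpose is merged into the shared matrix W.
    W -= lr * (g_pos + g_neg.T)
    return W
```

Because both directions constrain the same matrix, each update is informed by the forward and backward mappings at once, which is one way the shared, transposed weights could act as the regularizer behind the improved generalization the abstract reports.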
Keywords/Search Tags: neural network, sub-batch weight update, mutual learning, weight sharing