
BP Network Algorithm with L1/2 Regularizer

Posted on: 2015-09-03
Degree: Master
Type: Thesis
Country: China
Candidate: Q L Shang
Full Text: PDF
GTID: 2298330467484599
Subject: Computational Mathematics
Abstract/Summary:
The most common learning algorithm for feedforward neural networks is the back-propagation algorithm; a network trained this way is commonly called a BP network. Although BP networks are widely used, one drawback is that the number of hidden nodes cannot easily be decided other than by experience. As is well known, the number of hidden nodes is crucial to the construction of a neural network. Increasing the number of hidden nodes may improve learning accuracy, but too many hidden nodes can cause overfitting, declining generalization performance, and reduced learning efficiency, while too few hidden nodes risk learning the training sample data incompletely.

Thus, for a given problem, it is necessary to determine a suitable number of hidden nodes, and this thesis attempts to solve that problem. We add an L1/2 regularization term to the error minimization model to build a nonlinear L1/2 regularization model. Based on the iterative half thresholding algorithm for the linear L1/2 regularization model, we propose a thresholding algorithm for the nonlinear L1/2 regularization model and prove its convergence under certain conditions. Using this thresholding algorithm, we obtain a sparse solution for the corresponding network weights.

First, this paper introduces the iterative half thresholding algorithm for the linear L1/2 regularization model and explains its derivation in detail. Then we present the thresholding algorithm for the nonlinear L1/2 regularization model and apply the model to a three-layer BP network with a single hidden layer for function approximation. Applying the proposed thresholding algorithm to given training sample data yields a sparse weight solution. A weight of zero means the corresponding link is absent.
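The linear iterative half thresholding step referred to above can be sketched as follows. This is an illustrative implementation of the standard closed-form half thresholding operator for L1/2 regularization (minimizing ||Ax - b||^2 + lam * ||x||_{1/2}^{1/2}), not the thesis's own code; the function names, the step size `mu`, and the iteration count are assumptions.

```python
import numpy as np

def half_threshold(t, lam):
    """Half thresholding operator for L1/2 regularization (illustrative).

    Entries below the threshold (54^(1/3)/4) * lam^(2/3) are set to zero;
    larger entries are shrunk via the closed-form cosine expression.
    """
    t = np.asarray(t, dtype=float)
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    out = np.zeros_like(t)
    big = np.abs(t) > thresh
    phi = np.arccos((lam / 8.0) * (np.abs(t[big]) / 3.0) ** (-1.5))
    out[big] = (2.0 / 3.0) * t[big] * (
        1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def iterative_half_thresholding(A, b, lam, mu, n_iter=200):
    """Approximate solver: a gradient step on the least-squares term,
    followed by half thresholding with parameter lam * mu."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad_step = x + mu * A.T @ (b - A @ x)
        x = half_threshold(grad_step, lam * mu)
    return x
```

The thesis's nonlinear variant replaces the least-squares gradient step with the gradient of the BP network's error function while keeping the same thresholding operator.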
If no link reaches a hidden node, that node is useless and can be deleted. Based on this rule, we obtain a proper number of hidden nodes for a specific problem. Numerical experiments show that the new model outperforms traditional models: it performs well when the training sample data contain noise, and it needs fewer hidden nodes than the traditional steepest descent method. According to the theoretical analysis and the numerical results, the proposed thresholding algorithm for the nonlinear L1/2 regularization model can solve the problem of selecting the number of hidden nodes in a BP network.
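The pruning rule described above can be sketched as follows for a single hidden layer: after sparse training, a hidden node whose incoming or outgoing weights are all zero contributes nothing and can be removed. The array layout and names below are illustrative assumptions, not the thesis's notation.

```python
import numpy as np

def prune_hidden_nodes(W_in, W_out, tol=1e-8):
    """Remove hidden nodes with no remaining links (illustrative sketch).

    W_in  : (n_hidden, n_inputs)  input-to-hidden weights
    W_out : (n_outputs, n_hidden) hidden-to-output weights
    A node is kept only if it has at least one nonzero incoming
    weight and at least one nonzero outgoing weight.
    """
    has_in = np.abs(W_in).max(axis=1) > tol
    has_out = np.abs(W_out).max(axis=0) > tol
    keep = has_in & has_out
    return W_in[keep], W_out[:, keep], int(keep.sum())
```

The number of nodes kept after pruning is the data-driven hidden-layer size for the given problem.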
Keywords/Search Tags: Neural Network, L1/2 Regularization, Thresholding Algorithm, Sparse Solution, Hidden Nodes, BP Network