
Research on L1 Sparse Regularization and AdaBoost Algorithms for Neural Networks with Random Weights

Posted on: 2015-07-27
Degree: Master
Type: Thesis
Country: China
Candidate: X J Wang
Full Text: PDF
GTID: 2298330431989069
Subject: Applied Mathematics
Abstract/Summary:
Artificial Neural Networks (ANNs) are a mathematical model of neural networks and have extensive applications. Compared with traditional training algorithms for ANNs, the Neural Networks with Random Weights (NNRW) algorithm not only trains fast but also has good approximation performance; it has been widely used in various fields and attracts more and more attention.

First, we analyze the advantages and disadvantages of three sparse reconstruction algorithms (Orthogonal Matching Pursuit, Iterative Shrinkage-Thresholding, and the Augmented Lagrange Multiplier method) on audio signal reconstruction, which is helpful for understanding sparse reconstruction.

Then, we combine the ideas of sparse reconstruction and ensemble learning with Neural Networks with Random Weights and put forward two efficient algorithms: an L1 sparse regularization algorithm for feed-forward Neural Networks with Random Weights, and an adaptive Neural Networks with Random Weights algorithm. NNRW is an effective type of feed-forward neural network (FNN). In particular, choosing the input weights and biases of the network at random greatly improves the learning speed and overcomes some challenges faced by other learning techniques. However, computing the output weights suffers from poor stability and large memory consumption. To address these shortcomings, we propose the Sparse Neural Networks with Random Weights (S-NNRW) algorithm; we give an iterative solution based on the gradient projection algorithm, a method of parameter choice, and a termination criterion for the iteration. Comparative experimental results indicate that the proposed algorithm is advantageous when the numbers of hidden neurons and training samples are large: it not only avoids over-fitting but also has better stability.

Finally, we combine Neural Networks with Random Weights with the idea of ensemble learning. We regard NNRW as a weak classifier and give the update formula for the distribution weights of the training samples and the corresponding error formula. Experimental results show that this algorithm achieves better training and testing accuracy than the original algorithm on binary classification and face recognition.
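To make the shrinkage-thresholding idea concrete, the following is a minimal Python sketch of the Iterative Shrinkage-Thresholding Algorithm (ISTA) for the L1-regularized least-squares problem min_x 0.5*||Ax - b||^2 + lam*||x||_1; the function names and parameters are illustrative, not taken from the thesis.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding: the proximal operator of t*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative sketch)."""
    # Step size 1/L, where L is the Lipschitz constant of the gradient of the
    # smooth term, i.e. the largest eigenvalue of A^T A (spectral norm squared).
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                 # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)  # gradient step, then shrink
    return x
```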
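The abstract describes S-NNRW as an NNRW whose output weights are fitted under L1 sparse regularization via a gradient projection iteration. The sketch below follows that general scheme (random, fixed input weights and biases, a sigmoid hidden layer, and L1-penalized output weights) but substitutes the generic ISTA solver from the previous block as a stand-in for the thesis's gradient projection iteration; all names and parameter choices are illustrative assumptions.

```python
import numpy as np

def nnrw_train(X, y, n_hidden=100, lam=1e-2, rng=None):
    """Single-hidden-layer NNRW: random fixed input weights and biases, with
    output weights fitted under an L1 penalty (here via the ista() sketch above,
    standing in for the thesis's gradient projection iteration)."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(d, n_hidden))  # random input weights, never trained
    b = rng.uniform(-1.0, 1.0, size=n_hidden)       # random biases, never trained
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))          # sigmoid hidden-layer outputs
    beta = ista(H, y, lam)                          # sparse output weights
    return W, b, beta

def nnrw_predict(X, W, b, beta):
    """Network output: hidden-layer response times the sparse output weights."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Sparsity in beta prunes the contribution of redundant hidden neurons, which is consistent with the abstract's claim of better stability and lower memory use when the hidden layer is large.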
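For the ensemble part, the abstract mentions an update formula for the distribution weights of the training samples and an error formula. The following is the textbook AdaBoost update for labels in {-1, +1} with NNRW as the weak learner, offered as a plausible reading rather than the thesis's exact formulas; it reuses nnrw_train/nnrw_predict from the previous sketch.

```python
import numpy as np

def adaboost_nnrw(X, y, n_rounds=10, n_hidden=20, seed=0):
    """AdaBoost-style ensemble of NNRW weak classifiers; y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    D = np.full(n, 1.0 / n)                   # distribution weights over samples
    models, alphas = [], []
    for _ in range(n_rounds):
        # The base fit above takes no sample weights, so train each round on a
        # weighted bootstrap resample drawn according to D.
        idx = rng.choice(n, size=n, p=D)
        W, b, beta = nnrw_train(X[idx], y[idx], n_hidden=n_hidden, rng=rng)
        pred = np.sign(nnrw_predict(X, W, b, beta))
        err = np.clip(np.sum(D * (pred != y)), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1.0 - err) / err)  # this weak learner's vote weight
        D *= np.exp(-alpha * y * pred)           # up-weight misclassified samples
        D /= D.sum()
        models.append((W, b, beta))
        alphas.append(alpha)
    return models, alphas

def adaboost_predict(X, models, alphas):
    """Sign of the alpha-weighted sum of the weak learners' outputs."""
    score = sum(a * nnrw_predict(X, W, b, beta)
                for a, (W, b, beta) in zip(alphas, models))
    return np.sign(score)
```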
Keywords/Search Tags: Neural networks with random weights, sparse regularization, audio signal reconstruction, binary classification, face recognition, handwritten numeral recognition, ensemble learning