
Improvement Of A Backpropagation Training Algorithm For Feedforward Neural Networks

Posted on: 2011-12-17
Degree: Master
Type: Thesis
Country: China
Candidate: K Y Jia
Full Text: PDF
GTID: 2178330332461367
Subject: Computational Mathematics
Abstract/Summary:
Feedforward neural networks (FNNs) have been widely used in many areas. The most common method for training an FNN is backpropagation. The major problems with backpropagation training algorithms are slow convergence on complex problems and entrapment in local minima. Many variations have been proposed to improve the performance of the basic algorithm, and new ones continue to appear regularly; improving backpropagation remains an important research topic in neural networks.

The paper is organized as follows. Chapter one reviews background information about neural networks.

Taking BP neural networks as an example, chapter two introduces feedforward neural networks and the backpropagation algorithm. This chapter also introduces a particular variation of backpropagation known as MBP (modified backpropagation).

In chapter three, a new learning approach for single-hidden-layer feedforward networks is proposed. This approach trains the output layer and the hidden layer separately, improving on an existing modified backpropagation training algorithm [28]. It devises a new optimization criterion for training the hidden neurons and, for networks with multiple output neurons, provides a method for finding a fictitious teacher signal for the output of each hidden neuron. Simulation results on the circle-in-the-square problem, a function approximation problem, and a pattern recognition problem show the effectiveness of the proposed algorithm.
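To make the layer-separated idea concrete, the following is a minimal sketch, not the thesis' exact algorithm: it assumes tanh hidden units, linear output units, and no biases, fits the output layer by linear least squares, derives fictitious hidden teacher signals through a pseudoinverse of the output weights, and updates the hidden layer against a "linear error" measured before the nonlinearity. The function name and all numerical choices are illustrative assumptions.

```python
import numpy as np

def train_layers_separately(X, T, n_hidden, n_iters=200, lr=0.1, seed=0):
    """Sketch: alternate a least-squares output fit with a hidden-layer
    step toward fictitious teacher signals (biases omitted for brevity)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))  # hidden weights
    W2 = rng.normal(scale=0.5, size=(n_hidden, T.shape[1]))  # output weights
    for _ in range(n_iters):
        H = np.tanh(X @ W1)                         # hidden activations
        # Output layer: linear least squares with the hidden layer held fixed.
        W2 = np.linalg.lstsq(H, T, rcond=None)[0]
        # Fictitious teacher signals: hidden outputs that would reproduce
        # the targets T through the current output weights W2.
        H_star = np.clip(T @ np.linalg.pinv(W2), -0.999, 0.999)
        # Hidden layer: gradient step on the linear error, i.e. the error
        # in the net input before the tanh nonlinearity.
        A_star = np.arctanh(H_star)                 # desired net inputs
        W1 += lr * X.T @ (A_star - X @ W1) / len(X)
    # Refit the output layer once more against the final hidden layer.
    H = np.tanh(X @ W1)
    return W1, np.linalg.lstsq(H, T, rcond=None)[0]

# Example: the XOR problem with four hidden units.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
W1, W2 = train_layers_separately(X, T, n_hidden=4)
print(np.tanh(X @ W1) @ W2)   # approximately [[0], [1], [1], [0]]
```

The appeal of this style of training is that the output-layer subproblem is linear and can be solved exactly, so the expensive nonlinear optimization is confined to the hidden layer; the thesis' contribution, per the abstract, lies in how the hidden-layer criterion and the fictitious teacher signals are chosen, which this sketch only approximates.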
Keywords/Search Tags: Linear Error, Modified Standard Backpropagation, Error Function, Single Hidden Layer Network, Nonlinear Error, Teacher Signals