
A Study on Improving the Generalization Performance of Multilayer Feedforward Neural Networks

Posted on: 2006-03-08
Degree: Master
Type: Thesis
Country: China
Candidate: W N Wang
Full Text: PDF
GTID: 2178360182965454
Subject: Control theory and control engineering
Abstract/Summary:
Research on neural networks has developed rapidly in recent years. Their capacity to handle highly nonlinear and uncertain control systems gives them broad significance across many scientific fields. Among neural network models, the multilayer feedforward network has been applied widely, for example in function approximation, image processing, pattern recognition, and adaptive control.

In most practical applications, the critical issue in using a neural network is generalization: how well the network predicts on samples that are not in the training set. A network that does not generalize is useless. This thesis briefly introduces the basic concepts of neural networks and then systematically analyzes the key factors that affect generalization. Factors such as the training samples, the network structure, and the training method are discussed in detail in terms of basic concepts, influence mechanisms, and fundamental techniques. Taking the BP algorithm, which is widely used for multilayer feedforward networks, as an example, several methods for improving generalization are proposed, based on randomly expanding the training sample set, training with noise, and optimizing the network structure; test cases demonstrate the effectiveness of the proposed methods. The main results and contributions of the thesis are as follows:

Firstly, the influence of noise on generalization is analyzed in detail. Based on randomly expanding the training sample set, a new approach using the Kullback-Leibler information measure is proposed to estimate the parameters of the samples. Simulation results show that the statistical characterization of the new samples becomes more precise and that the generalization performance of networks trained with the proposed method improves (see the first two sketches below).

Secondly, for the expanded samples described above, a novel training algorithm with a regularization term based on the Kullback-Leibler information measure is presented to counteract the harmful effect of noise in the target values.

Thirdly, to optimize the network structure, a pruning method based on the sensitivity and pseudo-entropy of the weights is employed. Simulation results show that the method effectively removes redundant weights and reduces the degrees of freedom of the network, which in turn improves generalization (see the pruning sketch below).

Fourthly, to address the shortcomings of pruning methods, namely intensive computation and low efficiency, a fast constructive algorithm is put forward. Based on the CC (Cascade-Correlation) algorithm, the network is built up during training, starting not from the simplest possible structure but from a properly sized one. Simulation results indicate that this fast constructive algorithm is a better choice in terms of convergence rate, computational efficiency, and generalization performance (see the constructive sketch below).
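The abstract does not give the author's exact expansion procedure, so the following is a minimal sketch of the generic idea behind randomly expanding a training set: jittering the inputs with Gaussian noise while keeping the targets. The function name `expand_with_noise`, the noise scale `sigma`, and the number of `copies` are illustrative assumptions, not the thesis's method.

```python
import numpy as np

def expand_with_noise(X, y, copies=5, sigma=0.05, rng=None):
    """Randomly expand a training set by adding Gaussian jitter to inputs.

    Each original sample is duplicated `copies` times with zero-mean
    Gaussian noise of scale `sigma` added to the inputs; targets are kept
    unchanged. This is the generic 'training with noise' idea; the thesis
    additionally estimates sample parameters with a Kullback-Leibler
    information measure, which is not reproduced here.
    """
    rng = np.random.default_rng() if rng is None else rng
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    X_new, y_new = [X], [y]
    for _ in range(copies):
        X_new.append(X + rng.normal(0.0, sigma, size=X.shape))
        y_new.append(y)
    return np.concatenate(X_new), np.concatenate(y_new)

# Example: expand 100 samples of a 1-D function to 600.
X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = np.sin(np.pi * X).ravel()
X_big, y_big = expand_with_noise(X, y, copies=5, sigma=0.02)
print(X_big.shape, y_big.shape)  # (600, 1) (600,)
```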
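The Kullback-Leibler information measure itself is standard. Assuming the sample statistics are compared as fitted univariate Gaussians (an assumption, since the abstract does not specify the distributional model), the closed form KL(N(mu1, s1^2) || N(mu2, s2^2)) = ln(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2 can be used to check how closely an expanded sample set preserves the original statistics:

```python
import numpy as np

def kl_gaussian(mu1, s1, mu2, s2):
    """Closed-form KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2))."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Compare the statistics of a jittered feature column with the original.
rng = np.random.default_rng(0)
orig = rng.normal(0.0, 1.0, 1000)
expanded = orig + rng.normal(0.0, 0.05, 1000)  # noisy copy
kl = kl_gaussian(orig.mean(), orig.std(), expanded.mean(), expanded.std())
print(f"KL between fitted Gaussians: {kl:.5f}")  # near zero => statistics preserved
```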
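The abstract names the sensitivity and pseudo-entropy of the weights as pruning criteria but does not define them, so the sketch below substitutes a standard first-order saliency, |w * dE/dw|, as a stand-in and zeroes out the least salient fraction of weights. The helper `prune_by_sensitivity` and the fraction `frac` are assumptions for illustration.

```python
import numpy as np

def prune_by_sensitivity(weights, grads, frac=0.2):
    """Zero the fraction `frac` of weights with the smallest first-order
    saliency |w * dE/dw|, computed jointly across all layers.

    `weights` and `grads` are parallel lists of same-shaped arrays, one
    per layer. This standard saliency stands in for the thesis's
    sensitivity/pseudo-entropy criteria, which the abstract leaves unspecified.
    """
    saliency = [np.abs(W * G) for W, G in zip(weights, grads)]
    flat = np.sort(np.concatenate([s.ravel() for s in saliency]))
    k = int(frac * flat.size)
    if k == 0:
        return [W.copy() for W in weights]
    threshold = flat[k - 1]  # k-th smallest saliency
    return [np.where(s <= threshold, 0.0, W) for W, s in zip(weights, saliency)]

# Example with two random layers and random gradients.
rng = np.random.default_rng(1)
Ws = [rng.normal(size=(4, 8)), rng.normal(size=(8, 1))]
Gs = [rng.normal(size=W.shape) for W in Ws]
pruned = prune_by_sensitivity(Ws, Gs, frac=0.25)
print(sum(int((W == 0).sum()) for W in pruned), "weights pruned of",
      sum(W.size for W in Ws))
```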
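True Cascade-Correlation adds cascaded hidden units with frozen input-side weights; the abstract reports only that the author's fast variant starts from a properly sized network rather than a minimal one. The sketch below captures just that grow-until-no-improvement idea using scikit-learn's MLPRegressor, a simplification for illustration rather than the thesis's algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Toy data: noisy 1-D sine regression.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 1))
y = np.sin(np.pi * X).ravel() + rng.normal(0, 0.05, 400)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)

# Start from a modest (not minimal) hidden layer and grow it while the
# validation error keeps improving. Growing one fully connected layer is
# a simplification of a constructive, Cascade-Correlation-like procedure.
best_err, best_n = np.inf, None
n = 4  # a "proper" starting size rather than a single unit
while n <= 32:
    net = MLPRegressor(hidden_layer_sizes=(n,), max_iter=2000, random_state=0)
    net.fit(X_tr, y_tr)
    err = np.mean((net.predict(X_va) - y_va) ** 2)
    if err < best_err * 0.99:  # require a 1% relative improvement to keep growing
        best_err, best_n = err, n
        n *= 2
    else:
        break
print(f"selected {best_n} hidden units, validation MSE {best_err:.4g}")
```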
Keywords/Search Tags: multilayer feedforward neural network, BP algorithm, generalization performance, Kullback-Leibler information measure, random expansion of training sample sets, training with noise, structure optimization, pseudo-entropy of weights, pruning method