
Study On Structural Optimization Of Fuzzy Neural Network

Posted on: 2007-02-23
Degree: Doctor
Type: Dissertation
Country: China
Candidate: F J Ai
Full Text: PDF
GTID: 1118360185951629
Subject: Computer software and theory
Abstract/Summary:
Since the strengths of fuzzy control and neural networks are complementary, their combination, the Fuzzy Neural Network (FNN), has become a primary research topic in the field of intelligent control. The typical FNN is the Fuzzy Multi-Layer Perceptron (FMLP). In an FNN, the number of fuzzy rule nodes directly determines the complexity and efficiency of the whole network, so optimizing the rules, i.e. simplifying the structure of the FNN, is desirable. At present, most structural simplification algorithms require a retraining phase after pruning, which is very time-consuming. The major work of this thesis therefore addresses the following issues.

The thesis first surveys the structural simplification algorithms for FNNs. The Neural Network Self-configuring Learning (NNSCL) algorithm belongs to the statistical class of pruning methods; it is a pruning structural learning algorithm for multi-layer feedforward neural networks based on the Back-Propagation (BP) algorithm. To remedy the NNSCL algorithm's disadvantage of requiring retraining after pruning, the Improved Neural Network Self-configuring Learning (INNSCL) algorithm is proposed. It modifies the correlation coefficient formula and the sample dispersion formula, and uses the Generalized Inverse Matrix (GIM) algorithm to update the remaining weights, whereas the NNSCL algorithm updates them linearly by a mathematical statistics method. By adjusting the four parameters C1, C2, γ1, γ2, the proposed algorithm can select the best result and find the minimal number of rules without retraining, while preserving the overall behavior of the network.

The Iterative Pruning (IP) algorithm also belongs to the pruning methods; it solves a least-squares problem to update the remaining weights so that the network outputs remain approximately unchanged, and the network therefore needs no retraining phase after pruning. Because the IP algorithm spends much time computing the adjustment factors of the remaining weights, the Improved Iterative Pruning (IIP) algorithm is put forward. It adopts a block-division strategy and uses the Generalized Inverse Matrix (GIM) algorithm in place of the Conjugate Gradient Preconditioned Normal Equation (CGPCNE) algorithm to update the remaining weights, which greatly improves the efficiency of simplifying the network.

A new kind of FNN is then constructed, which adds a few layers to the fuzzification layer of the typical FNN so that the membership functions of the input variables can be adjusted during training. This makes the process of creating the membership functions transparent: the FNN obtains the membership functions and the rules automatically.

The two proposed algorithms are applied in the rule-inference layer of the proposed FNN to optimize the rules, and are extended to the layers below the rule-inference layer to adjust the membership functions of the input variables, so that the structure of the network can be simplified; they can also simplify the structures of other kinds of FNNs. Finally, several simulations were carried out: a driverless model car, the identification of a nonlinear system, and a pattern recognition problem. The results demonstrate the effectiveness and feasibility of the proposed FNN and algorithms.
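To make the pruning criterion concrete, the sketch below illustrates the statistical idea behind NNSCL/INNSCL as summarized above: rule nodes whose activations are nearly constant across the samples (low dispersion) or nearly collinear with another node (high correlation) are marked redundant, and the remaining output weights are refitted with a Moore-Penrose pseudoinverse, one common realization of the GIM update, so the outputs are preserved without retraining. The thresholds `corr_thresh` and `disp_thresh` are hypothetical stand-ins for the thesis parameters C1, C2, γ1, γ2, whose exact formulas the abstract does not give.

```python
import numpy as np

def find_redundant_nodes(H, corr_thresh=0.95, disp_thresh=1e-3):
    """Pick rule nodes to prune from an activation matrix H (n_samples, n_nodes).

    A node is redundant if its activations are almost constant (low sample
    dispersion) or almost perfectly correlated with an earlier, kept node.
    """
    n_nodes = H.shape[1]
    redundant = set()
    stds = H.std(axis=0)
    for j in range(n_nodes):
        if stds[j] < disp_thresh:            # near-constant output
            redundant.add(j)
    C = np.corrcoef(H, rowvar=False)         # node-to-node correlations
    for i in range(n_nodes):
        if i in redundant:
            continue
        for j in range(i + 1, n_nodes):
            if j not in redundant and abs(C[i, j]) > corr_thresh:
                redundant.add(j)             # j duplicates i; keep i
    return sorted(redundant)

def prune_and_refit(H, W, redundant):
    """Drop redundant columns and refit W by least squares (pseudoinverse),
    so that H_kept @ W_new approximates the original outputs H @ W."""
    Y = H @ W
    keep = [j for j in range(H.shape[1]) if j not in set(redundant)]
    H_kept = H[:, keep]
    W_new = np.linalg.pinv(H_kept) @ Y       # generalized-inverse update
    return keep, W_new

# Tiny usage example: node 3 is an exact multiple of node 1, so it is
# flagged as redundant and the refit reproduces the outputs exactly.
H = np.random.default_rng(1).normal(size=(100, 8))
H[:, 3] = H[:, 1] * 2.0
W = np.random.default_rng(2).normal(size=(8, 1))
keep, W_new = prune_and_refit(H, W, find_redundant_nodes(H))
assert np.allclose(H @ W, H[:, keep] @ W_new)
```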
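The IP-to-IIP change can likewise be sketched. Below, a plain (unpreconditioned) conjugate-gradient solve of the normal equations stands in for CGPCNE, and a direct pseudoinverse stands in for the GIM update; both solve the same least-squares problem min ||H w - y|| that keeps the pruned network's outputs approximately unchanged. This is a minimal sketch under those stand-in assumptions, not the thesis's implementation, and it omits the block-division strategy.

```python
import numpy as np

def cg_normal_equations(H, y, iters=200, tol=1e-10):
    """Conjugate gradient on H^T H w = H^T y (a simplified, unpreconditioned
    stand-in for CGPCNE). Solves the least-squares update iteratively."""
    A = H.T @ H
    b = H.T @ y
    w = np.zeros(A.shape[0])
    r = b - A @ w
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        w += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < tol:                     # squared residual small enough
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return w

def pinv_solution(H, y):
    """Direct generalized-inverse (Moore-Penrose) solution of the same problem."""
    return np.linalg.pinv(H) @ y

rng = np.random.default_rng(0)
H = rng.normal(size=(200, 20))               # activations of remaining nodes
y = rng.normal(size=200)                     # target outputs to preserve
assert np.allclose(cg_normal_equations(H, y), pinv_solution(H, y), atol=1e-6)
```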
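Finally, a minimal forward pass of a generic FNN with adjustable Gaussian membership functions, product-t-norm rule nodes, and a normalized weighted-sum output. The abstract does not specify the exact layers the thesis adds to the fuzzification layer, so this ANFIS-style structure is an assumption for illustration only.

```python
import numpy as np

class TinyFNN:
    """Sketch of a fuzzy neural network: each input variable gets n_mf
    Gaussian membership functions whose centers/widths are trainable, and
    each rule node combines one membership degree per input (product t-norm)."""

    def __init__(self, n_inputs, n_mf, rng=None):
        rng = rng or np.random.default_rng(0)
        self.centers = rng.uniform(-1, 1, size=(n_inputs, n_mf))
        self.sigmas = np.full((n_inputs, n_mf), 0.5)
        n_rules = n_mf ** n_inputs           # full grid of rule combinations
        self.rule_weights = rng.normal(size=n_rules)

    def forward(self, x):
        # Fuzzification: membership degree of each input in each fuzzy set.
        mu = np.exp(-((x[:, None] - self.centers) ** 2)
                    / (2.0 * self.sigmas ** 2))           # (n_inputs, n_mf)
        # Rule layer: product t-norm over one MF per input, per combination.
        grids = np.meshgrid(*mu, indexing="ij")
        firing = np.prod(np.stack([g.ravel() for g in grids]), axis=0)
        # Defuzzification: normalized weighted sum of rule firing strengths.
        return (firing @ self.rule_weights) / (firing.sum() + 1e-12)

fnn = TinyFNN(n_inputs=2, n_mf=3)
print(fnn.forward(np.array([0.2, -0.4])))    # scalar network output
```

Because the centers and widths enter the forward pass differentiably, they can be adjusted during training exactly as the abstract describes for the added fuzzification layers, and the rule nodes are the natural target of the two pruning algorithms above.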
Keywords/Search Tags: Fuzzy Neural Network, Neural Network Self-configuring Learning (NNSCL) algorithm, Iterative Pruning (IP) algorithm, Conjugate Gradient Preconditioned Normal Equation (CGPCNE) algorithm, Generalized Inverse Matrix (GIM) algorithm