
The Study Of Self-Organization Modular Neural Network Architecture Design

Posted on: 2014-01-28    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Z Z Zhang    Full Text: PDF
GTID: 1228330392960103    Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
From the point of view of bionics, this dissertation investigates self-organizing modular neural network architecture design based on the principle of brain function partition and brain-like information processing theory, covering the self-organization of sub-network structures, dynamic decomposition of complex tasks, and dynamic integration of the sub-networks. The major contributions of this dissertation are as follows:

1. A connection-weight decay algorithm for feedforward neural network architecture design is presented. An improved pseudo-entropy of the hidden nodes is defined based on Shannon's entropy; the two definitions of entropy have approximately the same effect as measures of uncertainty, but the new definition overcomes the inherent drawbacks of Shannon's entropy. The cross-entropy between the network's actual outputs and the target outputs, together with the pseudo-entropy of the hidden-node outputs, is used as the cost function; an entropy-cycle strategy optimizes the network parameters, and a simplified network structure is finally obtained by deleting the redundant hidden nodes.

2. To overcome the problem that traditional pruning algorithms must first train the objective function to a local minimum, a pruning algorithm based on neural complexity is proposed. In this algorithm, the entropy of the neural network is calculated from the standardized covariance matrix, from which the complexity of the network is obtained. Under the premise of preserving the information-processing capacity of the network, the least important hidden node is deleted. Because the algorithm is concerned only with the characteristics of the network's internal connections, regardless of the external input information, it does not require training the objective function to a local minimum.

3.
Because most algorithms apply a greedy strategy when designing the structure of artificial neural networks, and are therefore susceptible to becoming trapped in a locally optimal structure, an adaptive algorithm for designing an optimal feedforward neural network is proposed. Based on the fact that the output of an output node is a linear combination of the outputs of the hidden nodes, together with the learning dynamics of the network, an optimization strategy is adopted to merge and split hidden nodes. When there are redundant hidden nodes in the network, they are merged according to a mutual-information criterion; when the learning capacity of the network is inadequate, a hidden node is split to improve it. Experimental results show that the algorithm can design an optimal neural network structure.

4. To overcome the defects of long training time and easy trapping in local minima that afflict single neural networks, a multi-hierarchical cooperative modular neural network is presented. Its structure is hierarchical: the sample data are first partitioned by a fuzzy clustering method, and the neural network is divided into several sub-networks based on the clustering results. The connection weights are obtained by solving equations. For a given input, several modules are selected to process it. The approximation performance is improved by combining divide-and-conquer with ensemble learning, and a sub-network selection method based on a distance measure is designed. Simulation results demonstrate that the multi-hierarchical cooperative modular neural network can effectively improve approximation ability on complicated problems, and that it trains faster than a single back-propagation neural network.

5. Because the fully coupled BP neural network suffers from many problems when solving large-scale and complex tasks, a locally coupled modular neural network is proposed.
It employs the physical characteristics of RBF neurons to decompose the input sample space, distributing data from different sub-sample spaces to different sub-areas to be learned automatically. Compared with the fully coupled BP neural network, the weight search space during learning is reduced, and both the learning speed and the generalization performance of the network are improved. In addition, the architecture-design issue is addressed in this algorithm: the size of the sub-networks can be adjusted according to the learning task. Experimental results show that the locally coupled network solves many problems better than the fully coupled BP neural network.

6. The structure of a traditional modular neural network is fixed when dealing with time-varying systems; therefore, an online self-organizing modular neural network architecture design algorithm is presented. It uses the physical characteristics of RBF neurons to decompose the input sample space, and an improved online subtractive clustering algorithm identifies the centers of the RBF neurons online. The structure of the modular network can grow or merge sub-networks to maintain a suitable model complexity, since the centers of the RBF neurons adjust dynamically to the changing environment. To improve learning speed, a fuzzy strategy selects suitable sub-networks to learn the task, and the structure of the sub-networks can also be adjusted self-adaptively during learning. Experimental results on benchmark and real-world time-varying systems show that the proposed strategy is well suited to handling time-varying problems.
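To make the complexity-based pruning criterion of contribution 2 concrete, the sketch below (Python with NumPy; the function names and the Gaussian-entropy estimate are illustrative assumptions, not the dissertation's exact formulation) fits a multivariate Gaussian to the hidden-node activations, takes its differential entropy from the covariance matrix, and picks as "least important" the hidden node whose removal loses the least entropy, i.e. the least information-processing capacity. No training to a local minimum is needed, since only activation statistics are used.

```python
import numpy as np

def gaussian_entropy(acts):
    """Differential entropy of a multivariate Gaussian fitted to the
    hidden-node activations (rows = samples, columns = hidden nodes).
    A small ridge keeps the covariance matrix non-singular."""
    cov = np.atleast_2d(np.cov(acts, rowvar=False))
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov + 1e-8 * np.eye(d))
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

def least_important_node(acts):
    """Index of the hidden node whose deletion reduces the network's
    entropy (information-processing capacity) the least."""
    full = gaussian_entropy(acts)
    losses = [full - gaussian_entropy(np.delete(acts, j, axis=1))
              for j in range(acts.shape[1])]
    return int(np.argmin(losses))
```

A node that merely duplicates another contributes almost no extra entropy, so its entropy loss is the smallest and it is the first candidate for deletion, which matches the intent of pruning redundant hidden nodes while preserving capacity.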
Keywords/Search Tags: Modular neural network, architecture design, self-organizing, information entropy