
Analysis And Optimization Of Several Neural Networks And Their Applications

Posted on: 2013-11-03 | Degree: Doctor | Type: Dissertation
Country: China | Candidate: L Zhang | Full Text: PDF
GTID: 1228330395957248 | Subject: Signal and Information Processing
Abstract/Summary:
Artificial neural networks (ANNs) are one of the important branches of artificial intelligence. They are nature-inspired computing techniques that have been applied widely to tasks such as control, prediction, optimization, system identification, signal processing, and pattern recognition. This dissertation focuses on the analysis and optimization of several important ANNs, including evolutionary ANNs, Integrate-and-Fire ANNs, and cellular ANNs, as well as their applications in function approximation, pattern recognition, data classification, and image processing. The main work can be summarized as follows:

1. For solving unconstrained global optimization problems, a differential evolution (DE) with orthogonal crossover and local search is proposed. In this algorithm, Gaussian mutation and orthogonal crossover are combined with the DE mutation and DE crossover operators, respectively, and simplified quadratic interpolation serves as the local search operator. Simulation experiments on 20 benchmark functions and comparisons with other DE variants show its effectiveness and superiority. After the structures of feedforward ANNs are determined by trial and error, a hybrid training algorithm combining the modified DE with the Levenberg-Marquardt method is proposed to optimize the network weights and biases. The resulting evolutionary ANNs are applied to function approximation, pattern classification, and recognition.

2. For solving nonlinear optimization problems involving binary and real variables, a cooperative binary-real differential evolution is proposed. The algorithm uses a mixed binary-real encoding, introduces the XOR logic operation into the DE mutation to handle the binary variables, and combines orthogonal crossover with the DE crossover. Well-known benchmark problems are used to validate its efficiency; the algorithm performs well, and its results are highly competitive against other existing algorithms. To optimize the structure and weights of feedforward ANNs simultaneously, a two-stage training algorithm is formed by combining the modified cooperative binary-real DE with the scaled conjugate gradient method. The evolutionary ANNs are applied to function approximation and pattern classification.

3. For solving nonlinear discrete optimization problems with binary and/or integer variables, a cooperative binary-integer differential evolution is proposed. The algorithm likewise uses a mixed binary-integer encoding, introduces the XOR logic operation into the DE mutation to handle the binary variables, and combines orthogonal crossover with the DE crossover. Numerical examples are used to validate its effectiveness, and the results are compared against those of other existing algorithms. To optimize the structure and integer weights of feedforward ANNs simultaneously, a cooperative binary-integer DE training algorithm is proposed. The evolutionary ANNs are applied to function approximation and pattern classification. A sketch of the XOR-based mutation shared by items 2 and 3 follows.
4. To study the Integrate-and-Fire (IF) model, a new IF model is presented in which each nerve cell inhibits the cells near it, and the relation between input and output is derived. Although the model is greatly simplified, it characterizes many aspects of real neurons; in particular, it matches the nonlinear behavior of the synaptic connection comparatively well. The earlier firing mechanism is developed into an asynchronous firing mechanism, which makes the network more flexible. The effect of an exponentially decaying threshold on a white-noise-driven IF neuron is then studied, especially its effect on the mean and standard deviation of the interspike interval. It is shown that, for slow threshold decay, the IF model exhibits a minimum in the coefficient of variation whenever the firing rate of the neuron matches the decay rate of the threshold. This novel effect can be seen whether the firing rate is changed by varying the noise intensity or the input current. The errors associated with resetting the potential after a spike in simulations of IF neural networks are also analyzed.

5. A cognitive model exhibiting classical conditioning behavior is presented. The model comprises a number of IF neurons connected into a neural network with a reflex-arc structure, which allows it to fully exhibit the timing dependence of classical conditioning. Simulation results show that the model can successfully reproduce many typical experiments, such as acquisition, extinction, inter-stimulus effects, blocking, and secondary conditioning.

6. A new cellular neural network model with transient chaos is proposed, in which a negative self-feedback is introduced into a cellular neural network after its dynamic equation is converted to discrete time via Euler's method. Simulations of the single-neuron model show its bifurcation and chaos characteristics. In optimization, the model gradually approaches, through a chaotic search along a course of reversed period-doubling bifurcations, a dynamical structure similar to a Hopfield neural network, which converges to a stable equilibrium point. Because the model has rich dynamics, including randomness, it can be expected to have a robust search ability for globally optimal solutions. Simulation results on two function-optimization examples show that the networks are efficient.
Keywords/Search Tags: Evolutionary neural network, Integrate-and-Fire neural network, Cellular neural network, Differential evolution, Orthogonal experimental design, XOR logic operation, Levenberg-Marquardt method, Conjugate gradient method, Function approximation