
Research On Improvement Of Differential Evolution Algorithm

Posted on: 2013-01-20
Degree: Doctor
Type: Dissertation
Country: China
Candidate: P Guo
Full Text: PDF
GTID: 1118330362461061
Subject: Computer application technology
Abstract/Summary:
Differential Evolution (DE) is an evolutionary algorithm based on population differences: it finds solutions to optimization problems through cooperation and competition among individuals. Compared with the Genetic Algorithm, DE is easy to implement and has strong search ability, which makes it well suited to finding optimal solutions of high-dimensional nonlinear problems. Like other evolutionary algorithms, however, DE converges slowly and can become trapped in local optima. This dissertation studies improvements to the search space of the evolutionary algorithm, the configuration of evolution control parameters, and hybrid DE algorithms. The main contributions are:

(1) The evolutionary operations of standard DE involve only one population. Departing from this, a Differential Evolution with a secondary population (DESec) is proposed. In the selection step, unselected individuals are placed into the secondary population; individuals in the main and secondary populations evolve simultaneously, and selection takes place among individuals from the main population, the secondary population, and the parents. Experimental results on benchmark functions show that the secondary population improves DE's search ability and precision, and that DESec converges faster than standard DE. For cluster analysis, a Summary Weighted Validity Function (SWVF) is proposed to evaluate intermediate clustering results, and a clustering algorithm based on DESec (CDESec) is presented; exploiting DESec's global search ability, CDESec searches for the globally optimal clustering. Experimental results show that CDESec performs better.

(2) To address the randomness of parameter self-adaptation in the jDE algorithm, a self-adaptive parameter control method (SelfDE-F) is presented.
The control parameters in SelfDE-F are adapted according to a comparison between the fitness of children and their parents. Experimental results on benchmark functions show SelfDE-F's superiority: it improves the performance of DE efficiently. A SelfDE-FBPNN algorithm, which uses SelfDE-F to train the weights of a Back Propagation (BP) neural network, is presented, and the optimized BP network is used to configure PID control parameters. Experiments show this method is more efficient than a plain BP network, a BP network optimized with standard DE, and a BP network optimized with jDE.

(3) In contrast to the real-number individual encoding of standard DE, a binary individual encoding based on 0/1 matrices is presented, together with a binary parameter-self-adaptive DE that uses the logical operations AND, OR, and XOR. Two Bayesian network learning algorithms are presented: an evolutionary MCMC Bayesian network learning algorithm (EMCMCBN) and a binary DE Bayesian network learning algorithm (BINDEBN), in which Bayesian networks from different individuals exchange information during learning. Experimental results show that BINDEBN depends less on the original datasets and is more efficient.

(4) To address parameter learning and inference in the Gaussian mixture model, a population-based approach is presented. Gibbs sampling is applied within every individual, information is exchanged between individuals via DE, and an MCMC method samples the exchanged results. Applied to probability learning of motifs, the method's experimental results show that the standard deviation of the learned results decreases as the population size increases.
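The standard DE loop that the abstract takes as its starting point (mutation with a scaled difference vector, binomial crossover, greedy one-to-one selection) can be sketched as follows; all names and parameter values are illustrative, not from the dissertation:

```python
import random

def de_rand_1_bin(f, bounds, pop_size=20, F=0.5, CR=0.9, generations=100, seed=0):
    """Minimize f over a box with the classic DE/rand/1/bin scheme."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: a base vector plus a scaled difference of two others.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover: take at least one coordinate from the mutant.
            jrand = rng.randrange(dim)
            trial = [mutant[d] if (d == jrand or rng.random() < CR) else pop[i][d]
                     for d in range(dim)]
            # Clip to the search box, then greedy selection against the parent.
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

On a smooth low-dimensional test function such as the sphere, this sketch converges close to the optimum within a few thousand evaluations, which is the baseline the dissertation's variants improve on.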
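The secondary-population selection of contribution (1) could look roughly like the sketch below. This is one plausible interpretation of the abstract, not the dissertation's exact rule: losers of the parent/trial tournament are archived rather than discarded, and a randomly drawn archived individual may re-enter the main population.

```python
import random

def select_with_secondary(main_pop, fit, trials, secondary, f, rng, sec_cap=50):
    """One selection step of a DESec-like scheme, minimizing f."""
    for i, trial in enumerate(trials):
        ft = f(trial)
        # Standard one-to-one selection; the loser is archived, not discarded.
        if ft <= fit[i]:
            secondary.append(main_pop[i])
            main_pop[i], fit[i] = trial, ft
        else:
            secondary.append(trial)
        # A randomly chosen archived individual may re-enter the main population.
        j = rng.randrange(len(secondary))
        fs = f(secondary[j])
        if fs < fit[i]:
            secondary[j], main_pop[i], fit[i] = main_pop[i], secondary[j], fs
        if len(secondary) > sec_cap:  # keep the archive bounded
            del secondary[0]
    return main_pop, fit, secondary
```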
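The fitness-driven parameter control of contribution (2) compares each child with its parent. One plausible rule in that spirit (the abstract does not give the exact update, so this is an assumption) is: control parameters that produced a winning child survive to the next generation, while parameters that produced a losing child are resampled.

```python
import random

def adapt_params(F_i, CR_i, child_better, rng, F_range=(0.1, 0.9)):
    """Sketch of a SelfDE-F-style update for one individual's F and CR.
    child_better: whether the child's fitness beat its parent's."""
    if child_better:
        return F_i, CR_i          # successful parameters are kept
    lo, hi = F_range
    return lo + rng.random() * (hi - lo), rng.random()  # resample on failure
```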
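For the binary encoding of contribution (3), arithmetic mutation must be replaced by logical operations. A common binary-DE mutation (assumed here as an illustration; the dissertation's exact operator may differ) takes the bitwise difference b XOR c, "scales" it with a random mask whose bits are 1 with probability F, and folds the result into the base vector with XOR:

```python
import random

def binary_mutate(base, b, c, F, rng):
    """Binary-DE mutation over 0/1 vectors using AND and XOR."""
    return [ai ^ (int(rng.random() < F) & (bi ^ ci))
            for ai, bi, ci in zip(base, b, c)]
```

With F = 0 the base individual is returned unchanged; with F = 1 every differing bit of b and c is flipped in the base, mirroring the role of the scale factor in real-coded DE.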
Keywords/Search Tags: Differential Evolution, Secondary Population, Clustering Analysis, Parameter Self-adaptation, Optimization of Neural Network, Bayesian Network Learning, Gaussian Mixture Learning