
Research And Optimization Of Neural Network Based On Evolutionary Algorithm

Posted on: 2019-01-18
Degree: Master
Type: Thesis
Country: China
Candidate: M H He
Full Text: PDF
GTID: 2348330542997630
Subject: Computer application technology
Abstract/Summary:
With the rapid development of intelligent computing, evolutionary algorithms and neural networks, as important branches of intelligent computing, have found increasingly universal application. Among these, the Back Propagation (BP) neural network and its improved models are the most common. The BP neural network is a feed-forward network trained by error back-propagation: it learns by gradient descent, continually adjusting the network's weights and thresholds to minimize the network error. It has a simple topology, non-linear processing ability, and self-learning capability. Despite these significant advantages, the BP neural network still has several defects in practical application: the network structure is difficult to determine, the training parameters are difficult to set, the weights are difficult to select, the training algorithm is difficult to choose, the training accuracy can be low, and training is easily trapped in local minima. In view of these problems, a variety of optimization methods and improved neural network models have been proposed. Among them, combining evolutionary algorithms with the BP neural network to optimize its performance has received increasing attention from scholars. However, most such methods only combine an evolutionary algorithm with the back-propagation algorithm, and ignore optimization of the structure and algorithm of the neural network itself. In this dissertation, firstly, during neural network training, the number of hidden-layer nodes is selected adaptively according to the error value of each training run, which avoids the blindness of setting the network structure manually. Secondly, to address the problem that combining an evolutionary algorithm with neural network training is easily trapped in a local optimum, an improved genetic algorithm and an improved particle swarm optimization algorithm are proposed. Among
them, the improved genetic algorithm is combined with neural network training to select the best training parameters, and the improved particle swarm optimization algorithm is combined with neural network training to select the best weight matrix of the network, thereby improving the training accuracy of the neural network. Finally, combining these optimized algorithms with the BP neural network model, the IGAPSONN neural network model is constructed and applied to practical problems.

The main contributions and novelties of this dissertation are as follows:

(1) To address the difficulty of selecting the number of hidden-layer nodes, an algorithm is proposed that selects this number adaptively according to the mean square error of each training run of the network. The algorithm considers the influence of the numbers of input and output nodes on the number of hidden-layer nodes, limits its range, and selects it adaptively, which minimizes the network error and avoids the blindness of manual setting.

(2) To address the difficulty of determining the learning rate and momentum factor, an algorithm is proposed in which an improved genetic algorithm is combined with neural network training to select the learning rate and momentum factor dynamically. During training, the algorithm sets the chromosome structure according to the value ranges of the learning rate and the momentum factor, and applies multi-point crossover and multi-point mutation during evolution so that each individual evolves fully. The training parameters are thus selected dynamically, avoiding the blindness of manual setting.

(3) To address the inaccurate selection of neural network weights, an algorithm is proposed in which an improved particle swarm optimization algorithm is combined with neural network training to select the weight
matrix dynamically. During network training, the algorithm introduces a dynamic inertia weight and parameter constraints, which allow the parameters to be selected adaptively and balance local and global search performance. At the same time, the improved particle swarm optimization algorithm is combined with the traditional differential evolution algorithm so that particles undergo mutation and crossover, and a particle-crossover operation within a dynamic neighborhood is proposed. These measures enhance the interaction between particles and reduce their aggregation, strengthening the particles' ability to jump out of local optima, increasing particle diversity, selecting precise weights dynamically, and avoiding the low training accuracy of the neural network.

(4) Based on the hidden-layer node adaptive selection algorithm, the dynamic selection algorithm for the learning rate and momentum factor, and the improved particle swarm optimization algorithm, the IGAPSONN neural network model is constructed from the BP neural network and applied to practical problems. Compared with current general neural network models, simulation experiments verify the accuracy and validity of the proposed model from three aspects: training accuracy, correct rate, and algorithm performance.
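The adaptive hidden-layer node selection described in contribution (1) can be sketched as follows. The abstract does not give the exact selection rule, so this is a minimal illustration assuming a common empirical range of the form sqrt(n_in + n_out) + a: each candidate count is scored by a short training run (the `train_fn` callback is a hypothetical stand-in for that run), and the count with the smallest mean square error is kept.

```python
import math

def hidden_node_range(n_in, n_out, a_max=10):
    # Assumed empirical rule: n_hidden = sqrt(n_in + n_out) + a, for a in [1, a_max].
    # This bounds the search range using the input and output node counts.
    base = int(math.sqrt(n_in + n_out))
    return [base + a for a in range(1, a_max + 1)]

def select_hidden_nodes(train_fn, n_in, n_out):
    # train_fn(n_hidden) -> mean square error after a short training run
    # (hypothetical callback; the thesis trains the actual BP network here).
    candidates = hidden_node_range(n_in, n_out)
    errors = {h: train_fn(h) for h in candidates}
    # Keep the node count that minimizes the network error.
    return min(errors, key=errors.get)
```

Because every candidate is evaluated by the same error criterion, the choice is driven by the data rather than by a manually fixed structure.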
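Contribution (2), the improved genetic algorithm that selects the learning rate and momentum factor, can be sketched as below. The gene ranges, population size, and elitist selection scheme are assumptions for illustration; the abstract only states that the chromosome encodes the two parameters within their value ranges and that multi-point crossover and mutation are applied.

```python
import random

# Each chromosome encodes [learning_rate, momentum]; these ranges are assumptions.
LR_RANGE, MOM_RANGE = (0.001, 1.0), (0.0, 0.9)

def random_chromosome():
    return [random.uniform(*LR_RANGE), random.uniform(*MOM_RANGE)]

def crossover(a, b):
    # Multi-point crossover: each gene may be swapped independently.
    return [gb if random.random() < 0.5 else ga for ga, gb in zip(a, b)]

def mutate(chrom, rate=0.1):
    # Multi-point mutation: each gene may be resampled within its own range.
    ranges = [LR_RANGE, MOM_RANGE]
    return [random.uniform(*r) if random.random() < rate else g
            for g, r in zip(chrom, ranges)]

def evolve(fitness, pop_size=20, generations=30):
    # fitness(chrom) -> network error after a short training run (lower is better);
    # here it is a hypothetical callback standing in for BP network training.
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]                      # keep the better half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)
```

The result of `evolve` is the (learning rate, momentum) pair with the lowest observed training error, replacing a manual setting of both parameters.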
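Contribution (3) combines particle swarm optimization (with a dynamic inertia weight) and differential-evolution-style mutation and crossover to search for the network weight vector. A minimal sketch, with a linearly decreasing inertia weight and standard DE rand/1 trial vectors as illustrative choices (the thesis's dynamic-neighborhood crossover is not reproduced here), follows; the `objective` callback stands in for the network training error of a candidate weight vector.

```python
import numpy as np

def pso_de(objective, dim, n_particles=30, iters=100,
           w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, F=0.5, CR=0.9):
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, (n_particles, dim))   # positions = candidate weight vectors
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for t in range(iters):
        # Dynamic inertia weight: large early (global search), small late (local search).
        w = w_max - (w_max - w_min) * t / iters
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        # DE-style mutation and crossover to preserve particle diversity.
        for i in range(n_particles):
            a, b, c = x[rng.choice(n_particles, 3, replace=False)]
            trial = np.where(rng.random(dim) < CR, a + F * (b - c), x[i])
            if objective(trial) < objective(x[i]):   # greedy DE selection
                x[i] = trial
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest
```

The decreasing inertia weight trades global exploration for local refinement over time, while the DE step gives each particle an extra chance to escape a local optimum, which is the stated aim of the improved algorithm.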
Keywords/Search Tags: BP neural network, Genetic algorithm, Particle swarm optimization algorithm, Algorithm combination, Model performance