
Research On BP Neural Network Learning Algorithm Based On Simplex Evolution

Posted on: 2020-03-27    Degree: Master    Type: Thesis
Country: China    Candidate: Z Lin    Full Text: PDF
GTID: 2438330599455722    Subject: Signal and Information Processing
Abstract/Summary:
Artificial neural networks are adaptive systems modeled on biological nervous systems that can process nonlinear data. They have successfully solved many practical problems in pattern recognition, prediction and estimation, and automatic control that conventional computing methods cannot handle. Common artificial neural networks include the BP (Back Propagation) neural network, the Radial Basis Function (RBF) neural network, and Convolutional Neural Networks (CNN), among others. Among them, the BP neural network has a relatively simple topology, and its theory and applications are developing rapidly, so it has gradually become a research hotspot.

The traditional BP neural network learning algorithm is based on gradient descent: the error is propagated backward from the output layer, the error between each neural unit's output and its expected value is calculated, and the hidden-layer weights are corrected layer by layer so as to minimize the expected output error. This method suffers from slow convergence and a tendency to fall into local extrema. To address this problem, many scholars have introduced intelligent optimization algorithms, such as Particle Swarm Optimization (PSO) and the Firefly Algorithm, to improve the learning stage. BP neural networks based on these algorithms usually have multiple control parameters and require the initial point position to be selected; if these parameters and the initial position are not chosen properly, it is difficult to search for the optimal network weights. Many improved variants of these algorithms can obtain better results, but they introduce additional control parameters and increase algorithmic complexity.

Surface-Simplex Swarm Evolution (SSSE) is a new swarm intelligence optimization algorithm with better global search ability and learning efficiency. Its multi-role state search strategy maintains the diversity of the population, and its fully random search mechanism allows the algorithm to perform well on objective functions with many local optima, which improves its efficiency.

Motivated by the problems above, this thesis combines the BP neural network with an optimization algorithm based on simplex neighborhoods and a multi-role evolution strategy, and proposes a BP neural network learning algorithm based on simplex evolution. Without relying on parameter selection, the algorithm quickly and stably converges the neural network to the minimum point of the error surface, improving the accuracy and learning efficiency of the learning algorithm and achieving a higher recognition rate and faster convergence. In addition, ensemble learning and multi-class coding strategies are used to increase classification reliability and the generalization ability of the neural network. The proposed algorithm is applied to supervised classification and face recognition. Compared with other methods, it is effective for the BP neural network; the optimized BP neural network exhibits good robustness and stability and has promising prospects for further development.
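To make the idea of replacing gradient-based BP weight learning with a population-based search concrete, the following minimal sketch encodes a small network's weights as a flat vector and scores each candidate vector by the network's output error. It is not the thesis's SSSE algorithm: the network sizes, the toy XOR data, and the simple "move toward the best member plus random perturbation" update are illustrative assumptions standing in for the simplex-neighborhood and multi-role operators described in the abstract.

```python
# Minimal sketch (not the exact SSSE algorithm): training a small BP-style
# network's weights with a derivative-free population search.
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: 2 inputs -> 4 hidden (sigmoid) -> 1 output
N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def unpack(w):
    """Split a flat weight vector into layer matrices and bias vectors."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # sigmoid hidden layer
    return h @ W2 + b2                          # linear output

def mse(w, X, y):
    """Fitness: mean squared error between network output and targets."""
    return float(np.mean((forward(w, X) - y) ** 2))

# XOR-like toy data (hypothetical example, not from the thesis)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Population-based search over weight vectors.
pop = rng.normal(0.0, 1.0, size=(20, DIM))
for gen in range(2000):
    errors = np.array([mse(w, X, y) for w in pop])
    best = pop[np.argmin(errors)]
    # Each individual moves to the best member plus a shrinking random
    # perturbation; the real SSSE update instead uses simplex neighbourhoods
    # and multi-role search states to preserve population diversity.
    pop = best + rng.normal(0.0, 0.3, size=pop.shape) * np.exp(-gen / 1000)

print("final error:", mse(best, X, y))
```

Because the fitness only requires forward passes, this kind of search needs no gradient information, which is what allows a swarm-style optimizer such as SSSE to sidestep the slow convergence and local extrema of plain gradient-descent BP training.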
Keywords/Search Tags:Neural network, Intelligent optimization, Random search, Evolutionary strategy, Learning algorithm