
Research and Forecasting Applications of Neural Network Parallel Optimization Based on the Beetle Antennae Search Algorithm

Posted on: 2021-03-17
Degree: Master
Type: Thesis
Country: China
Candidate: J Wang
Full Text: PDF
GTID: 2428330605982505
Subject: Computer Science and Technology
Abstract/Summary:
Prediction problems are usually complicated and affected by many latent factors, and traditional prediction models are limited in their ability to capture complex non-linear relationships, which often makes these problems difficult to solve. Because a BP (back-propagation) neural network can model the features of a task and extract hidden relationships among variables, it can serve as an alternative to traditional prediction models; moreover, unlike those models, a BP neural network imposes no restrictions on the distribution of inputs or residuals. However, BP neural networks also have shortcomings, such as slow convergence and a tendency to fall into local minima, which make it difficult to build accurate prediction models. In view of this, this thesis studies methods for optimizing BP neural networks, including:

(1) An intelligent optimization algorithm, BASOW, is proposed to optimize the neural network. During the weight-training phase, it introduces Beetle Antennae Search (BAS) as a training method in place of error back-propagation. Owing to the BAS algorithm's global search ability and individual optimization strategy, BASOW improves the training efficiency of the neural network, and in an application to population prediction it shows good predictive ability.

(2) When the network structure becomes complex and the number of trainable parameters grows, the training complexity of BASOW also increases, and it becomes difficult to complete the training task. To solve this problem, an improved BASOW algorithm (iBASOW) is proposed. Following the idea of cooperative optimization, it fuses the BAS algorithm and the BP neural network into a single system: the former is used to optimize the initial weights, and the latter is used for weight training.

(3) Since both BASOW and iBASOW train only a single network, this single-network training may cause fluctuations in prediction accuracy. To further improve the prediction stability of the neural network, a parallel neural network optimization algorithm based on Spark (P-iBASOW) is proposed on top of iBASOW. Guided by the idea of ensemble learning and supported by the Spark distributed computing framework, it trains the improved neural network models via data parallelism. Tests on four real data sets, comparing the prediction performance of BP and the other algorithms, show that the parallel optimized iBASOW algorithm, i.e. P-iBASOW, achieves better prediction accuracy and stability.
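As a rough illustration of the weight-training idea behind BASOW, the sketch below applies Beetle Antennae Search to minimize an arbitrary loss over a flat parameter vector (such as the concatenated weights of a small network). The beetle probes the loss at its two antennae along a random direction and steps toward the better side. The function name, hyperparameters (antenna length, step size, decay factor), and the single-beetle formulation are assumptions for illustration, not the thesis's exact algorithm.

```python
import numpy as np

def bas_minimize(loss, x0, antenna=1.0, step=1.0, eta=0.95, iters=200, seed=0):
    """Minimize `loss` over a flat parameter vector with Beetle Antennae Search.

    Hypothetical single-beetle sketch: probe the loss at the left/right
    antenna tips along a random unit direction, then step toward the side
    with the smaller loss; antenna length and step size decay each round.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    best_x, best_f = x.copy(), loss(x)
    for _ in range(iters):
        b = rng.standard_normal(x.size)
        b /= np.linalg.norm(b)                       # random unit direction
        f_left = loss(x + antenna * b)               # left antenna probe
        f_right = loss(x - antenna * b)              # right antenna probe
        x = x - step * b * np.sign(f_left - f_right) # move toward better side
        fx = loss(x)
        if fx < best_f:                              # keep the best beetle seen
            best_x, best_f = x.copy(), fx
        antenna *= eta                               # shrink sensing range
        step *= eta                                  # shrink step size
    return best_x, best_f

# Example: minimize a simple quadratic "loss" from a poor starting point.
x_opt, f_opt = bas_minimize(lambda v: float(np.sum(v ** 2)), [3.0, -2.0])
```

In the iBASOW setting described above, a search like this would only supply the initial weights, after which ordinary back-propagation takes over for fine-grained training.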
Keywords: artificial neural network, beetle antennae search algorithm, distributed computing, regression prediction