
Traffic Prediction In Wireless Cellular Networks Based On Attention Mechanism

Posted on: 2022-04-26    Degree: Master    Type: Thesis
Country: China    Candidate: X Hu    Full Text: PDF
GTID: 2518306557470694    Subject: Signal and Information Processing
Abstract/Summary:
With the rapid development of wireless cellular networks, mobile applications have become more widespread and the scale of the networks has grown accordingly. Increasingly complex cellular networks place great pressure on the personnel who design, build, manage, and maintain them, and operators must invest ever more resources in the network. Networks and their traffic also consume growing amounts of spectrum, power, and other resources, which becomes a burden for operators. Moreover, network traffic changes dynamically: at peak times, substantial manpower and resources must be spent to guarantee user experience, whereas at off-peak times only a small investment is needed to do the same. If network traffic can be predicted effectively, network resources can be allocated more efficiently and waste reduced. In view of this, this thesis studies wireless cellular network traffic prediction based on the attention mechanism. The main work of this thesis is as follows:

First, this thesis studies traffic prediction based on an attention Long Short-Term Memory (LSTM) network. The preprocessing pipeline is introduced first: cleaning the raw traffic data, correcting abnormal values, and standardizing the data. Then, according to the characteristics of wireless cellular traffic, an attention LSTM model is adopted for prediction. The main idea of the attention LSTM is to learn weights over the hidden-layer outputs of the LSTM that are relevant to the current prediction, thereby refining the output; the weighting function is learned through back-propagation of the entire network. This thesis also presents an attention-LSTM-based method for filling missing traffic data. Experiments demonstrate the effectiveness of the attention LSTM for traffic prediction.

Next, this thesis studies traffic prediction based on the Broad Learning System (BLS). To address shortcomings of existing attention mechanisms, such as high complexity, the inability to determine the number of bases in advance, and weak ability to learn information, Expectation-Maximization Attention (EMA) is first extended to a variational form. Combining BLS with this Variational Expectation-Maximization Attention (VEMA), a Variational Expectation-Maximization Attention Broad Learning System (VABLS) is proposed. VABLS strengthens the ability of BLS to fit time series: it achieves lower prediction errors while retaining the flexible structure of BLS. After training, if external factors change the distribution of the traffic data, the model can be refitted rapidly through incremental learning. Experiments show that this method achieves better prediction accuracy than competing methods.

Finally, according to the characteristics of the cellular traffic data set, this thesis studies an enhanced VABLS built on the basic VABLS. To help VABLS extract features after highly non-linear changes, a deep neural network is used as the feature extractor; it can be trained in advance in a data center with sufficient computing resources and, once trained, used for traffic prediction at different base stations. To learn the relations and distinctions between base stations, and to account for randomness in the traffic data, a base-station embedding mechanism is used, with the embedding matrix obtained by optimizing the deep network during training. Because the deep network is trained on data from all base stations, the loss function must be modified accordingly. Further experiments confirm the accuracy of this method.
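The attention step described for the attention LSTM, learning weights over the LSTM's hidden-layer outputs and combining them into the prediction input, can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the scoring vector `w`, the use of a softmax over dot-product scores, and the window length are all assumptions, and in a full model `w` would be trained by back-propagation together with the LSTM.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden, w):
    # hidden: (T, d) LSTM hidden states over T time steps (hypothetical values here)
    # w: (d,) scoring vector; in practice learned jointly with the LSTM
    scores = hidden @ w          # one relevance score per time step
    alpha = softmax(scores)      # attention weights, non-negative, sum to 1
    context = alpha @ hidden     # weighted sum of hidden states, fed to the output layer
    return context, alpha

rng = np.random.default_rng(0)
T, d = 24, 8                     # e.g. 24 hourly traffic steps, hidden size 8
hidden = rng.normal(size=(T, d))
w = rng.normal(size=d)
context, alpha = attention_pool(hidden, w)
print(round(alpha.sum(), 6))     # → 1.0
```

The weighted sum lets the predictor emphasize the time steps most relevant to the current forecast (e.g. the same hour on previous days) instead of relying only on the final hidden state.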
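The BLS component underlying VABLS can be sketched as below: random mapped feature nodes, random enhancement nodes, and output weights solved in closed form by ridge-regularized least squares. This is a toy sketch under stated assumptions, not the proposed VABLS: the node counts, `tanh` activations, and regularization strength are arbitrary choices, and the VEMA module and incremental-learning update are omitted.

```python
import numpy as np

def bls_fit(X, y, n_feat=20, n_enh=40, lam=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    Wf = rng.normal(size=(X.shape[1], n_feat)) / np.sqrt(X.shape[1])
    Z = np.tanh(X @ Wf)                      # mapped feature nodes (random, fixed)
    We = rng.normal(size=(n_feat, n_enh)) / np.sqrt(n_feat)
    H = np.tanh(Z @ We)                      # enhancement nodes
    A = np.hstack([Z, H])                    # broad expansion of the input
    # output weights by ridge-regularized least squares (closed form, no backprop)
    W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
    return Wf, We, W

def bls_predict(model, X):
    Wf, We, W = model
    Z = np.tanh(X @ Wf)
    H = np.tanh(Z @ We)
    return np.hstack([Z, H]) @ W

# toy "traffic" series: sliding windows of a noisy daily-periodic signal
t = np.arange(300)
series = np.sin(2 * np.pi * t / 24) + 0.05 * np.random.default_rng(1).normal(size=t.size)
win = 12
X = np.stack([series[i:i + win] for i in range(len(series) - win)])
y = series[win:]
model = bls_fit(X, y)
mse = np.mean((bls_predict(model, X) - y) ** 2)
```

Because the random mappings stay fixed and only the output weights are solved analytically, retraining after a distribution shift is cheap, which is the flexibility the abstract attributes to BLS and which incremental learning exploits.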
Keywords/Search Tags:Cellular network, traffic prediction, attention mechanism, Broad Learning, LSTM