According to past development trends in mobile communications, a new generation of mobile communication system emerges roughly every ten years, and the demand for communication speed and user traffic grows rapidly. To meet the network-rate requirements of diverse application scenarios, China began research on 5G communication technology in 2013, and 5G was first commercialized in 2019; peak 5G transmission rates mostly fall between 1 Gbps and 2 Gbps, far exceeding the network capacity of the 4G era. At present, the focus of communication technology development is shifting further toward 6G. These advances mean that large volumes of traffic flow through the network at all times, and users place ever higher demands on communication delay. It is therefore necessary to allocate network resources reasonably, reduce network congestion, and provide users with high-quality service while ensuring the normal and stable operation of the network.

To plan the network structure in advance and make reasonable use of network resources, this thesis addresses two problems: first, how to use deep learning to predict future traffic volume of a cellular network in a real environment; second, how to infer the corresponding network delay from network traffic. As one of the important indicators of network quality of service (QoS), network delay is of great significance for improving network performance. Generating network delay through deep learning makes it convenient to predict network dynamics and carry out network planning in advance.

First, in implementing deep-learning-based traffic prediction, this thesis uses real telecommunication traffic data and divides the data into cycles along multiple dimensions. The main deep-learning structure adopts a combination of
convolutional neural networks and long short-term memory (LSTM) networks, with an attention mechanism added in the time dimension. The proposed model is compared with a traditional sequence-prediction model (ARIMA), with a deep-learning model that does not use the attention mechanism, and with deep-learning models that do not divide the traffic sequence into multi-dimensional modules. The model is further applied to other, geographically distant cellular networks to explore its robustness, confirming that it can be transplanted to other regions in the future.

Second, in implementing delay inference based on network traffic, this thesis uses backbone-network traffic data and a conditional variational autoencoder (CVAE). On this basis, we improve the model by adding a prediction network built from an LSTM network and a fully connected layer, and the loss function is constructed as a combination of the respective loss functions of the prediction module and the latent-space distribution module. The improved CVAE model is compared with both the generative model proposed in the existing literature [1] and a prediction model built from an LSTM network, and the improvement of the model in inferring network delay is assessed according to various indicators.

Finally, a comprehensive discussion and analysis of the experimental results of the two modules is carried out, demonstrating the feasibility and advantages of deep learning in network optimization.
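To make the traffic-prediction architecture concrete, the following is a minimal PyTorch sketch of a CNN + LSTM model with attention in the time dimension. The layer sizes, window length, and class name `TrafficPredictor` are illustrative assumptions for exposition, not the thesis's actual configuration.

```python
import torch
import torch.nn as nn

class TrafficPredictor(nn.Module):
    """Sketch: Conv1d extracts local patterns, LSTM models temporal
    dependencies, and a learned attention weight per time step pools
    the sequence before the final prediction. Hyperparameters are
    illustrative, not the thesis's settings."""
    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.conv = nn.Conv1d(n_features, 16, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(16, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)   # one score per time step
        self.out = nn.Linear(hidden, 1)    # next-step traffic volume

    def forward(self, x):                  # x: (batch, window, n_features)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (B, T, 16)
        h, _ = self.lstm(h)                               # (B, T, hidden)
        w = torch.softmax(self.attn(h), dim=1)            # attention over time
        context = (w * h).sum(dim=1)                      # weighted pooling
        return self.out(context)

model = TrafficPredictor()
x = torch.randn(8, 24, 1)   # 8 windows of 24 time steps
y = model(x)
print(y.shape)              # torch.Size([8, 1])
```

The softmax over the time axis is what lets the model weight informative periods (e.g. daily peaks) more heavily than quiet ones when forming the prediction.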
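The improved CVAE for delay inference can likewise be sketched: an LSTM + fully-connected prediction branch runs alongside the usual encoder/decoder, and the training loss combines the prediction loss with the CVAE reconstruction and latent-distribution (KL) terms. All dimensions, the condition encoding, and the weighting `beta` below are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DelayCVAE(nn.Module):
    """Sketch of a CVAE conditioned on a traffic window, with an added
    LSTM + FC prediction branch. Sizes are illustrative assumptions."""
    def __init__(self, cond_len=16, latent=8, hidden=32):
        super().__init__()
        # prediction branch: LSTM over the traffic series + FC head
        self.pred_lstm = nn.LSTM(1, hidden, batch_first=True)
        self.pred_fc = nn.Linear(hidden, 1)
        # encoder: (delay, traffic condition) -> latent Gaussian
        self.enc = nn.Linear(1 + cond_len, hidden)
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)
        # decoder: (latent sample, condition) -> reconstructed delay
        self.dec = nn.Sequential(nn.Linear(latent + cond_len, hidden),
                                 nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, delay, traffic):      # delay: (B,1), traffic: (B,T,1)
        h, _ = self.pred_lstm(traffic)
        pred = self.pred_fc(h[:, -1])       # prediction-branch output
        cond = traffic.squeeze(-1)          # flattened traffic as condition
        e = torch.relu(self.enc(torch.cat([delay, cond], dim=1)))
        mu, logvar = self.mu(e), self.logvar(e)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        recon = self.dec(torch.cat([z, cond], dim=1))
        return pred, recon, mu, logvar

def combined_loss(pred, recon, mu, logvar, delay, beta=1.0):
    """Prediction MSE + reconstruction MSE + KL divergence to N(0, I)."""
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return F.mse_loss(pred, delay) + F.mse_loss(recon, delay) + beta * kl

model = DelayCVAE()
traffic = torch.randn(4, 16, 1)
delay = torch.randn(4, 1)
loss = combined_loss(*model(delay, traffic), delay)
```

At inference time only the decoder (fed samples from the prior and the observed traffic condition) and the prediction branch are needed, so delays can be generated for traffic windows whose true delay is unknown.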
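The abstract refers to comparing models "according to various indicators" without naming them; two standard choices for this kind of regression comparison are MAE and RMSE, sketched here with NumPy (the specific indicators used in the thesis may differ).

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error: average magnitude of the prediction error."""
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true, y_pred):
    """Root mean squared error: penalizes large errors more than MAE."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.5])
print(mae(y_true, y_pred))   # 0.333...
print(rmse(y_true, y_pred))  # 0.408...
```

Reporting both is common practice: a model with similar MAE but lower RMSE makes fewer large mistakes, which matters when occasional severe delay mispredictions are costly.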