
Research On Interference Detection For LTE-M Based On Deep Learning

Posted on: 2022-10-12  Degree: Master  Type: Thesis
Country: China  Candidate: B Xie  Full Text: PDF
GTID: 2518306740951759  Subject: Electronics and Communications Engineering
Abstract/Summary:
Travel demand and policy support have promoted the large-scale application of urban rail transit, and the vehicle-ground communication system has evolved from a WLAN-based Communication Based Train Control (CBTC) system to an LTE-M-based CBTC system. To ensure driving safety, interference in the LTE-M-based CBTC system must be detected and identified in a timely and accurate manner. Traditional interference signal detection demands high professional skill and human intervention, while machine learning methods face large and complex feature extraction problems, and the choice of extracted features may also affect the final detection results. Therefore, this thesis applies deep learning theory to the detection of LTE-M interference signals in urban rail transit systems, completing the detection and identification of interference signals in LTE-M-based CBTC systems without manual extraction of signal characteristics.

First, this thesis designs an LTE-M interference signal detection model based on a convolutional neural network (CNN) and elaborates the basic process of training the interference detection model. Second, this thesis uses the railway wireless communication interference detection system to collect real-time LTE-M signal data from the "Rong 2" modern tram in the 1790-1795 MHz band, and generates typical interference signals, including mono-tone interference, multi-tone interference, square-wave pulse interference, noise amplitude modulation interference, and linear frequency sweep interference, through GNU Radio and a Universal Software Radio Peripheral (USRP). Third, the collected LTE-M interference signals are transformed to obtain time-frequency diagrams. This thesis uses a time-frequency transformation method based on multiple superimposed discrete Fourier transforms to generate waterfall diagrams, performs this transformation on the collected LTE-M interference signals, and builds the initial data set according to the differences among the respective waterfall diagrams. The annotated LTE-M initial data set is then expanded through image geometric transformation, noise addition, and other data enhancement methods, yielding the final LTE-M interference data set on which the LTE-M interference detection model is trained. The training and test data sets are used to optimize the model depth by analyzing the training results, keeping the number of parameters in the network model and the training time as small as possible while ensuring training accuracy. The resulting LTE-M interference detection model is then used to detect and identify LTE-M interference signals.

Finally, this thesis conducts a series of experiments and reaches the following conclusions: 1) the training accuracy of the LTE-M interference detection model using the data-enhanced data set is 38.1% higher than without data enhancement; 2) the ReLU activation function and the Adam gradient descent optimization algorithm yield higher recognition accuracy than other activation functions and optimization algorithms; 3) in comparative experiments with other CNNs, both the training accuracy and the test accuracy of the proposed LTE-M interference detection model are higher, demonstrating a stronger ability to detect and identify LTE-M interference signals.
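The abstract does not give implementation details of the waterfall transformation. As a minimal sketch, the "multiple superimposed discrete Fourier transforms" step can be read as stacking windowed DFTs of successive signal segments into a time-frequency matrix; the FFT size, hop length, and Hann window below are illustrative assumptions, not the thesis's settings:

```python
import numpy as np

def waterfall(iq, n_fft=256, hop=128):
    """Stack windowed DFTs of successive segments into a waterfall matrix.

    iq: 1-D complex baseband samples. n_fft and hop are illustrative
    values; the thesis's actual parameters are not stated in the abstract.
    """
    window = np.hanning(n_fft)
    n_rows = (len(iq) - n_fft) // hop + 1
    rows = []
    for i in range(n_rows):
        seg = iq[i * hop:i * hop + n_fft] * window
        spectrum = np.fft.fftshift(np.fft.fft(seg))   # center DC
        rows.append(20 * np.log10(np.abs(spectrum) + 1e-12))  # dB magnitude
    return np.array(rows)  # shape: (time rows, frequency bins)
```

A narrowband tone then appears as a bright vertical line in the output, while a linear frequency sweep appears as a slanted one, which is what lets the CNN separate the interference classes visually.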
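The data-enhancement step (image geometric transformation plus noise addition) could be sketched as follows; the specific transforms and the noise level are assumptions for illustration, since the abstract does not enumerate them:

```python
import numpy as np

def augment(image, rng):
    """Return simple augmented variants of one waterfall image.

    The flips and the Gaussian noise scale (5% of the image's standard
    deviation) are illustrative choices, not the thesis's parameters.
    """
    flipped_time = image[::-1, :]                 # reverse the time axis
    flipped_freq = image[:, ::-1]                 # mirror the frequency axis
    noisy = image + rng.normal(0.0, 0.05 * image.std(), image.shape)
    return [flipped_time, flipped_freq, noisy]
```

Each labeled waterfall image thus yields several training samples with the same label, which is the mechanism behind the reported accuracy gain from data enhancement.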
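A CNN classifier of the kind described, using ReLU activations and the Adam optimizer as the abstract reports, might look like the sketch below. The layer sizes, 64x64 input, and six-class head (five interference types plus the clean signal) are assumptions for illustration, not the thesis's architecture:

```python
import torch
import torch.nn as nn

class InterferenceCNN(nn.Module):
    """Small CNN over waterfall images; sizes are illustrative only."""

    def __init__(self, num_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                    # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                    # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)                    # (N, 32, 16, 16)
        return self.classifier(x.flatten(1))    # class logits

model = InterferenceCNN()
# Adam optimizer, as used in the thesis; the learning rate is assumed.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
```

Trading depth against parameter count and training time, as the thesis describes, amounts to adjusting the number of such conv/pool stages while monitoring training accuracy.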
Keywords/Search Tags:CBTC, Interference detection, LTE-M, Convolutional neural network, Time-frequency analysis