
Research On Technology Of Deep Learning Based OTFS Signal Demodulation

Posted on: 2022-02-08
Degree: Master
Type: Thesis
Country: China
Candidate: Y Tu
Full Text: PDF
GTID: 2518306338467784
Subject: Electronics and Communications Engineering
Abstract/Summary:
Orthogonal Time Frequency Space (OTFS) modulation is expected to address the challenges of wireless communication in high-speed mobile scenarios. However, the effective channel matrix of an OTFS system is very large, and inter-symbol and inter-carrier interference arise, which makes OTFS signal demodulation challenging. OTFS demodulation, which has been studied extensively, consists of channel estimation and signal detection. In OTFS systems, an impulse pilot is usually used and the channel is estimated directly by a threshold-based method, which can lead to a high mean square error (MSE). For signal detection, linear detection algorithms require matrix inversion, which is costly in both time and storage, and their performance is also unsatisfactory; non-linear detection algorithms such as message passing (MP) likewise suffer from high complexity.

In recent years, deep learning has been widely applied to communication systems. Studies have shown that deep learning can outperform traditional methods in some respects while running faster, which suggests a new approach to OTFS demodulation. This thesis uses deep learning to optimize both channel estimation and signal detection in the OTFS system.

For channel estimation, drawing on existing deep-learning-based channel estimation methods, we introduce an image restoration (IR) network from image processing to refine the two-dimensional channel image, simplify the denoising convolutional neural network (DnCNN) according to the characteristics of the channel, and use it as the IR network. Simulations show that the simplified DnCNN achieves a lower MSE than the traditional method and reduces the overall bit error rate (BER) of the system.

For signal detection, we analyze the implementation of traditional linear detection methods and propose a simplified linear detector that greatly reduces storage and computation time. We then study existing detection networks designed for large-scale multiple-input multiple-output (MIMO) systems and, by analyzing the characteristics of OTFS signals, design a deep-learning-based detection network suited to the OTFS system. Comparative simulations verify that the proposed network outperforms existing detection networks; compared with the MP algorithm, it greatly reduces the total running time and achieves a good trade-off between complexity and performance.
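As an illustration of the channel-estimation idea described above, the following is a minimal sketch, not the thesis's exact network, of a simplified DnCNN-style denoiser applied to a two-dimensional delay-Doppler channel estimate; the depth, width, and 32x32 grid size are illustrative assumptions.

    # Minimal sketch (assumed architecture, not the thesis's exact network):
    # a shallow DnCNN-style residual denoiser for a 2-D channel image.
    import torch
    import torch.nn as nn

    class SimplifiedDnCNN(nn.Module):
        def __init__(self, depth=5, width=32):
            super().__init__()
            layers = [nn.Conv2d(1, width, 3, padding=1), nn.ReLU(inplace=True)]
            for _ in range(depth - 2):
                layers += [nn.Conv2d(width, width, 3, padding=1),
                           nn.BatchNorm2d(width),
                           nn.ReLU(inplace=True)]
            layers.append(nn.Conv2d(width, 1, 3, padding=1))
            self.body = nn.Sequential(*layers)

        def forward(self, noisy_channel):
            # Residual learning: predict the noise and subtract it from the
            # coarse (threshold-based) channel estimate.
            return noisy_channel - self.body(noisy_channel)

    # Example: refine a hypothetical 32x32 delay-Doppler channel image.
    coarse_estimate = torch.randn(1, 1, 32, 32)   # placeholder input
    refined_estimate = SimplifiedDnCNN()(coarse_estimate)

For context on the detection side, the sketch below shows a conventional linear minimum mean square error (LMMSE) detector, whose matrix-inversion cost is the motivation for the simplified linear detector mentioned above; the grid size, noise level, and QPSK symbols are placeholders, and the simplified detector itself is not reproduced here.

    # Baseline LMMSE detection sketch (assumed dimensions), showing the
    # matrix inversion that dominates the cost of linear OTFS detection.
    import numpy as np

    def lmmse_detect(H, y, noise_var):
        # x_hat = (H^H H + noise_var * I)^{-1} H^H y
        n = H.shape[1]
        G = H.conj().T @ H + noise_var * np.eye(n)
        return np.linalg.solve(G, H.conj().T @ y)

    rng = np.random.default_rng(0)
    N = 64                       # placeholder delay-Doppler grid size (M*N)
    H = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
    x = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N) / np.sqrt(2)  # QPSK
    y = H @ x + 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    x_hat = lmmse_detect(H, y, noise_var=0.02)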
Keywords/Search Tags:OTFS, Deep Learning, Channel Estimation, Signal Detection