
Study And Hardware Implementation Of NB-IoT Downlink Channel Estimation Algorithm

Posted on: 2022-05-26
Degree: Master
Type: Thesis
Country: China
Candidate: X N Wang
Full Text: PDF
GTID: 2518306569979329
Subject: IC Engineering
Abstract/Summary:
Mobile communication is still undergoing rapid transformation. The low power consumption and wide coverage of the Narrowband Internet of Things (NB-IoT) open it up to a wide range of application scenarios in many traditional fields. Because the typical operating SNR of NB-IoT is below 0 dB and its physical channels are simplified, the channel estimators widely used in OFDM (Orthogonal Frequency Division Multiplexing) systems are not directly applicable. It is therefore particularly important to study a channel estimation algorithm with better performance and lower complexity for NB-IoT systems.

Starting from the NB-IoT physical layer, this thesis analyzes the influence of the wireless channel model on signal transmission. Based on the NB-IoT downlink design, channel estimation at the pilot positions and interpolation along the time-domain direction are studied. Weighing the complexity of hardware implementation against system performance, an improved LMMSE channel estimation algorithm (IA-LMMSE) is proposed. On this basis, a Xilinx FPGA development board is used as the hardware platform, and the algorithm is implemented in the Verilog hardware description language.

Regarding the performance of channel estimation, the classical algorithms are studied and an improved LMMSE algorithm based on parameter estimation is proposed: it uses the frequency-domain autocorrelation of the pilots to estimate the root-mean-square (RMS) delay spread, which reduces the complexity of the traditional LMMSE algorithm. To further improve performance, two inter-frame filtering modes are proposed in the frequency domain, and linear interpolation is adopted in the time domain, offering lower complexity and better performance than other interpolation algorithms.

Regarding the hardware implementation, to reduce its complexity, the pilot-generating signal and the coefficient values of the LMMSE algorithm are extracted into lookup tables, and a study of the coefficient matrix shows that its partial symmetry reduces the required storage by 44%. To lower resource utilization and operation latency, the divider in the parameter estimation module is designed with time-division multiplexing, and a two-stage pipeline is used in the time-domain interpolation module.

Finally, functional simulation and performance testing of the NB-IoT channel estimation system are carried out with Matlab and Vivado. The results show that the proposed IA-LMMSE channel estimation algorithm outperforms the least-squares (LS) method, improving the bit error rate performance by 8 dB, and that the proposed inter-frame iterative filtering method is effective at low SNR; it can also be applied to other mobile communication systems. On the hardware side, the channel estimation of one subframe takes 0.01 ms, and the FPGA resource utilization remains below 10%, which meets the application requirements.
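The parameter-estimation idea above can be illustrated with a short Python sketch. It assumes an exponential power delay profile, under which the frequency correlation is r(Δf) = 1/(1 + j2π·τ_rms·Δf), so the magnitude of the lag-1 pilot autocorrelation determines the RMS delay spread; the function names, the lag-1 choice, and the plain matrix inverse are illustrative assumptions, not the thesis's exact IA-LMMSE construction.

```python
import numpy as np

def estimate_rms_delay(h_ls, delta_f):
    # Lag-1 autocorrelation across pilot subcarriers. For an exponential
    # power delay profile, |r(1)| = 1 / sqrt(1 + (2*pi*tau_rms*delta_f)^2),
    # which can be inverted for tau_rms.
    r1 = np.vdot(h_ls[:-1], h_ls[1:]) / np.vdot(h_ls, h_ls)
    mag = min(abs(r1), 0.999)  # guard: noise can push |r1| out of range
    return np.sqrt(1.0 / mag**2 - 1.0) / (2.0 * np.pi * delta_f)

def lmmse_estimate(h_ls, delta_f, snr_linear):
    # LMMSE smoothing of the LS pilot estimates with a model-based
    # frequency correlation matrix built from the estimated tau_rms.
    n = len(h_ls)
    tau = estimate_rms_delay(h_ls, delta_f)
    dk = np.subtract.outer(np.arange(n), np.arange(n))  # subcarrier lags
    r_hh = 1.0 / (1.0 + 1j * 2.0 * np.pi * tau * delta_f * dk)
    w = r_hh @ np.linalg.inv(r_hh + np.eye(n) / snr_linear)
    return w @ h_ls

# Example: 12 pilot subcarriers spaced 15 kHz apart at 0 dB SNR.
# h_hat = lmmse_estimate(h_ls, 15e3, 10 ** (0 / 10))
```

Because the weights depend on the channel only through τ_rms and the SNR, they can be precomputed over a grid of parameter values and fetched at run time, which matches the lookup-table approach the abstract describes for the hardware.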
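The time-domain step can be sketched the same way: per-subcarrier channel estimates available at the pilot-bearing OFDM symbols are linearly interpolated across the remaining symbols. The name interp_time and its argument layout are hypothetical, and holding the endpoints constant outside the pilot range is one reasonable convention the abstract does not specify.

```python
import numpy as np

def interp_time(h_at_pilots, pilot_sym_idx, n_symbols):
    # h_at_pilots: (num_pilot_symbols, num_subcarriers) complex estimates.
    # pilot_sym_idx must be increasing. Each subcarrier is linearly
    # interpolated along the OFDM-symbol axis; real and imaginary parts
    # are handled separately, and np.interp clamps outside the pilot range.
    t = np.arange(n_symbols)
    re = np.stack([np.interp(t, pilot_sym_idx, col.real)
                   for col in h_at_pilots.T], axis=1)
    im = np.stack([np.interp(t, pilot_sym_idx, col.imag)
                   for col in h_at_pilots.T], axis=1)
    return re + 1j * im
```

A two-point linear interpolator maps naturally onto a short multiply-accumulate datapath, which is consistent with the two-stage pipelining the abstract reports for this module.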
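One plausible reading of the 44% figure is that the coefficient matrix is close to Hermitian, so only one triangle of it needs to live in the lookup table: for a 12-subcarrier NB-IoT resource block that keeps 78 of 144 entries, about a 46% saving, in line with the reported number. The packing sketch below illustrates the idea only; the thesis's actual FPGA table layout is not given in the abstract.

```python
import numpy as np

def pack_coeffs(w):
    # Store only the upper triangle (diagonal included) of a Hermitian
    # coefficient matrix in the lookup table.
    return w[np.triu_indices(w.shape[0])]

def unpack_coeffs(packed, n):
    # Rebuild the full matrix: the lower triangle is the conjugate
    # of the mirrored upper triangle.
    w = np.zeros((n, n), dtype=complex)
    w[np.triu_indices(n)] = packed
    il = np.tril_indices(n, -1)
    w[il] = np.conj(w.T)[il]
    return w
```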
Keywords/Search Tags: NB-IoT, Channel Estimation, LMMSE, FPGA