
Research On Uplink Channel Estimation Algorithm In LTE-Advanced System

Posted on: 2019-07-19
Degree: Master
Type: Thesis
Country: China
Candidate: P Shuai
Full Text: PDF
GTID: 2428330545984704
Subject: Communication and Information System
Abstract/Summary:
Channel estimation is a key technique for estimating the characteristics of the transmission channel in mobile communication systems, and it safeguards communication quality to the greatest possible extent. Its performance directly affects data equalization and detection at the receiver, and therefore the accuracy of the recovered signal. LTE-Advanced (LTE-A) inherits the key technologies of LTE and introduces several new ones; because channel estimation underpins these technologies, its performance directly determines the overall performance of an LTE-A communication system. Improving channel estimation is therefore of great significance both for LTE-A and for the upcoming 5G communication systems.

The LTE-A uplink adopts SC-FDMA as its access scheme. Compared with OFDM, SC-FDMA has a lower peak-to-average power ratio (PAPR), which reduces the hardware requirements on the terminal equipment while still allowing good channel estimation performance. This thesis first studies the fading characteristics of the transmission channel and selects a channel model suitable for the work. SC-FDMA is then studied in detail: simulations verify its PAPR performance and compare it with OFDM, and the results show that SC-FDMA achieves a lower PAPR than OFDM.

The thesis then focuses on pilot-based channel estimation algorithms for the LTE-A uplink, covering the selection of reference signals, the design of pilot patterns, and the estimation algorithms applied at the pilot positions, such as the LS, MMSE, and DFT algorithms. The research concentrates on the DFT channel estimation algorithm, analyzes its performance from several aspects, and improves it in three respects: to deal with the discontinuity at the front and back of the sequence, the sequence is flipped (mirrored) so that the energy of the transformed sequence is concentrated more tightly at its beginning and end; to address the denoising step of the DFT algorithm, a cluster-analysis method is introduced to decide which taps are treated as noise and filtered out; and to mitigate energy leakage, an energy leakage rate is defined and used to select the effective paths, reducing the impact of leakage on estimation performance.

The improved algorithm and several classical algorithms were simulated in the MATLAB environment and evaluated in terms of bit error rate and mean square error. The results show that the improved DFT algorithm outperforms both the LS algorithm and the conventional DFT algorithm, and that its complexity is lower than that of the LMMSE algorithm while its performance remains close to it. A comparison of the improved algorithm under different energy leakage rates further shows that it has strong applicability. Overall, the improved DFT algorithm delivers better performance.
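As a rough illustration of the PAPR comparison described above, the following sketch (not the thesis code) contrasts plain OFDM with SC-FDMA, i.e. DFT-spread OFDM with localized subcarrier mapping. The subcarrier counts, QPSK modulation, and number of symbols are illustrative assumptions.

```python
# Minimal sketch: compare the peak-to-average power ratio (PAPR) of
# plain OFDM against SC-FDMA (DFT-spread OFDM). Sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_data, n_fft, n_symbols = 64, 256, 2000   # assumed, not from the thesis

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def qpsk(n):
    bits = rng.integers(0, 2, (n, 2))
    return ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

ofdm, scfdma = [], []
for _ in range(n_symbols):
    d = qpsk(n_data)

    # OFDM: map data straight onto subcarriers, then IFFT to the time domain.
    grid = np.zeros(n_fft, complex)
    grid[:n_data] = d
    ofdm.append(papr_db(np.fft.ifft(grid)))

    # SC-FDMA: DFT-precode the data first, then the same localized mapping.
    grid[:n_data] = np.fft.fft(d) / np.sqrt(n_data)
    scfdma.append(papr_db(np.fft.ifft(grid)))

print(f"mean PAPR  OFDM: {np.mean(ofdm):.2f} dB   SC-FDMA: {np.mean(scfdma):.2f} dB")
```

Running the sketch shows the DFT-precoded signal with a noticeably lower average PAPR, which is the property the thesis verifies for the SC-FDMA uplink.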
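The baseline DFT estimation chain that the thesis improves can be sketched as follows: an LS estimate at the pilot subcarriers, an IDFT to the delay domain, suppression of taps treated as noise, and a DFT back to the frequency domain. This is a minimal sketch only; the fixed tap cutoff below stands in for the thesis's cluster-analysis threshold and energy-leakage-rate path selection, which are not reproduced here, and all sizes, the channel, and the pilot sequence are assumptions.

```python
# Minimal sketch of baseline DFT channel estimation: LS at the pilots,
# IDFT to the delay domain, zero the taps treated as noise, DFT back.
import numpy as np

rng = np.random.default_rng(1)
n_pilot = 128                       # pilot subcarriers (assumed)
cp_len = 16                         # taps kept, e.g. cyclic-prefix length
snr_db = 10.0

# Assumed multipath channel and unit-modulus (QPSK-like) pilot sequence.
h = (rng.standard_normal(6) + 1j * rng.standard_normal(6)) / np.sqrt(12)
H = np.fft.fft(h, n_pilot)                                  # true response
x = np.exp(1j * np.pi / 2 * rng.integers(0, 4, n_pilot))    # pilots

noise = rng.standard_normal(n_pilot) + 1j * rng.standard_normal(n_pilot)
noise *= np.sqrt(10 ** (-snr_db / 10) / 2)
y = H * x + noise                   # received pilot symbols

H_ls = y / x                        # LS estimate at the pilots

h_time = np.fft.ifft(H_ls)          # go to the delay domain
h_time[cp_len:] = 0                 # denoise: zero the taps treated as noise
H_dft = np.fft.fft(h_time)          # back to the frequency domain

mse = lambda a, b: np.mean(np.abs(a - b) ** 2)
print(f"MSE  LS: {mse(H_ls, H):.4f}   DFT-denoised: {mse(H_dft, H):.4f}")
```

The gap between the two MSE values illustrates why discarding taps outside the expected delay spread helps; the thesis replaces this fixed cutoff with a cluster-analysis threshold and an energy-leakage-rate criterion for selecting the effective paths.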
Keywords/Search Tags: LTE-Advanced (Long Term Evolution Advanced), Channel estimation, DFT (Discrete Fourier Transform), Clustering analysis, Energy leakage