
On The Synchronization And Channel Estimation Technique For Practical TD-LTE System

Posted on: 2015-02-25 | Degree: Master | Type: Thesis
Country: China | Candidate: Z Q He | Full Text: PDF
GTID: 2268330428976655 | Subject: Electronics and Communications Engineering
Abstract/Summary:
With the ever-increasing demand for mobile multimedia services, broadband data provisioning capability is strongly desired. To meet this requirement, existing mobile technology is quickly evolving from 3G to the 3GPP Long Term Evolution (LTE). In the physical layer, the LTE system employs a combination of MIMO and OFDM modulation to improve system capacity and enhance transmission performance. The LTE system should not only provide high data rates in low-mobility scenarios, but also maintain its efficiency in high-mobility environments. Traditional channel estimation schemes, which assume static channel coefficients within a certain time period, may work well at low mobility but fail in high-mobility scenarios, where the fading channel becomes doubly selective in both time and frequency; direct use of a traditional channel estimation scheme therefore cannot achieve good estimation quality. Compared with single-carrier modulation, OFDM achieves high spectral efficiency and effectively combats strong multipath fading. However, OFDM modulation is sensitive to timing errors and carrier frequency offset: the inter-symbol interference (ISI) and inter-carrier interference (ICI) they introduce can significantly degrade the achieved performance. Clearly, effective channel estimation and accurate synchronization are critical for the practical application of LTE systems.

Aiming at the design of feasible, high-performance channel estimation and synchronization for practical LTE systems, this thesis develops link-layer simulation platforms for both the uplink and the downlink, strictly following the specifications of the LTE technical standard. The effect of non-ideal synchronization and channel estimation on the achieved bit error rate is assessed through simulation, demonstrating the importance of well-designed synchronization and channel estimation. For the LTE uplink, DMRS-based joint frequency offset and channel estimation schemes are investigated, using both maximum likelihood and Bayesian estimators. For the LTE downlink, the good correlation properties of the primary synchronization signal (PSS) are exploited to derive timing and frequency offset estimators, from which a joint timing and frequency offset estimation scheme based on the cyclic prefix (CP) of the OFDM symbol is further derived. Finally, the dedicated pilot design and placement in both the uplink and downlink of the LTE system are presented, and the corresponding channel estimation schemes are analyzed and compared: the traditional LS and LMMSE algorithms, the maximum likelihood estimator, and a Kalman filtering estimator built on a basis expansion model (BEM) of the fast time-varying channel. The achieved channel estimation performance and its applicability are validated and discussed through simulations on the LTE link-layer platform.

The analysis in this thesis reveals that the Bayesian carrier frequency offset estimator, as well as the Bayesian joint timing and frequency offset estimation scheme, provides a feasible synchronization technique for practical LTE systems, while in fast time-varying environments the BEM-based Kalman filtering scheme offers promising channel estimation performance.
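As an illustration of the DMRS-based frequency offset estimation described above, the following sketch implements the classic correlation-based (maximum-likelihood-style) CFO estimator that exploits two received symbols carrying the same reference sequence. The sampling rate, symbol spacing, and all variable names are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def dmrs_cfo_estimate(r1: np.ndarray, r2: np.ndarray,
                      delta_n: int, fs: float) -> float:
    """Estimate the carrier frequency offset (Hz) from two received
    DMRS symbols r1, r2 that carry the same reference sequence and are
    spaced delta_n samples apart. Over a slowly varying channel their
    correlation accumulates a phase of 2*pi*f_off*delta_n/fs."""
    phase = np.angle(np.vdot(r1, r2))          # angle of sum(conj(r1)*r2)
    return phase * fs / (2.0 * np.pi * delta_n)

# Example: a 0.5 ms DMRS spacing at fs = 1.92 MHz (delta_n = 960) limits
# the unambiguous range to +/- fs/(2*delta_n) = +/- 1 kHz.
fs, delta_n, f_off = 1.92e6, 960, 300.0        # illustrative numbers
x = np.exp(1j * 2 * np.pi * np.random.rand(64))     # stand-in DMRS symbol
n1 = np.arange(64)
r1 = x * np.exp(1j * 2 * np.pi * f_off * n1 / fs)
r2 = x * np.exp(1j * 2 * np.pi * f_off * (n1 + delta_n) / fs)
print(dmrs_cfo_estimate(r1, r2, delta_n, fs))       # ~300 Hz
```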
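The PSS-based timing estimation can be sketched as a matched-filter search. The snippet below builds a time-domain PSS replica from the length-63 Zadoff-Chu sequence (roots 25, 29, or 34, with the DC element punctured, per the LTE specification) and locates the cross-correlation peak; the FFT size and function names are assumptions for illustration.

```python
import numpy as np

def pss_sequence(root: int) -> np.ndarray:
    """Length-62 LTE PSS: a length-63 Zadoff-Chu sequence with the
    centre (DC) element punctured. Valid roots are 25, 29, 34."""
    n = np.arange(63)
    zc = np.exp(-1j * np.pi * root * n * (n + 1) / 63)
    return np.delete(zc, 31)

def pss_time_replica(root: int, nfft: int = 128) -> np.ndarray:
    """Map the 62 PSS samples onto the central subcarriers around DC
    (DC itself unused) and IFFT to obtain a time-domain replica."""
    seq = pss_sequence(root)
    grid = np.zeros(nfft, dtype=complex)
    grid[1:32] = seq[31:]          # subcarriers +1 .. +31
    grid[nfft - 31:] = seq[:31]    # subcarriers -31 .. -1
    return np.fft.ifft(grid)

def pss_timing_estimate(rx: np.ndarray, replica: np.ndarray) -> int:
    """Coarse symbol timing: peak of the sliding cross-correlation
    between the received samples and the local PSS replica."""
    corr = np.abs(np.correlate(rx, replica, mode="valid"))
    return int(np.argmax(corr))

# Example: detect a replica buried at offset 200 in light noise.
rep = pss_time_replica(25)
rx = np.concatenate([np.zeros(200), rep, np.zeros(100)])
rx = rx + 0.01 * (np.random.randn(len(rx)) + 1j * np.random.randn(len(rx)))
print(pss_timing_estimate(rx, rep))   # ~200
```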
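The CP-based joint timing and frequency offset estimation mentioned above can be sketched with the well-known CP correlation metric (a simplified form of the van de Beek maximum likelihood estimator, omitting its SNR-dependent energy term). The FFT and CP sizes below are illustrative.

```python
import numpy as np

def cp_joint_estimate(rx: np.ndarray, nfft: int = 128, ncp: int = 9):
    """Joint timing/CFO from the cyclic prefix. The CP repeats the
    symbol tail nfft samples later, so the magnitude of the CP
    correlation peaks at the true symbol start, and its angle there
    equals 2*pi*eps, with eps the CFO in subcarrier spacings."""
    nmax = len(rx) - nfft - ncp
    metric = np.array([np.abs(np.vdot(rx[s:s + ncp],
                                      rx[s + nfft:s + nfft + ncp]))
                       for s in range(nmax)])
    theta = int(np.argmax(metric))                     # timing estimate
    c = np.vdot(rx[theta:theta + ncp],
                rx[theta + nfft:theta + nfft + ncp])
    eps = float(np.angle(c) / (2.0 * np.pi))           # fractional CFO
    return theta, eps
```

Because the correlation phase wraps at +/- pi, this estimator resolves only the fractional part of the CFO (within half a subcarrier spacing); the integer part is typically recovered separately, e.g. from the PSS in the frequency domain.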
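For the pilot-based channel estimation, a minimal sketch of the textbook LS and LMMSE estimators follows. The frequency-domain channel correlation matrix r_hh and the SNR are assumed known (in practice they are derived from the power-delay profile), and the constellation-dependent constant beta is taken as 1, which holds for constant-modulus (QPSK-like) pilots.

```python
import numpy as np

def ls_estimate(y_pilot: np.ndarray, x_pilot: np.ndarray) -> np.ndarray:
    """Least-squares estimate at the pilot subcarriers: H_LS = Y / X."""
    return y_pilot / x_pilot

def lmmse_estimate(h_ls: np.ndarray, r_hh: np.ndarray,
                   snr: float, beta: float = 1.0) -> np.ndarray:
    """LMMSE smoothing of the LS estimate:
    H_LMMSE = R_hh @ inv(R_hh + (beta/SNR) I) @ H_LS."""
    p = r_hh.shape[0]
    w = r_hh @ np.linalg.inv(r_hh + (beta / snr) * np.eye(p))
    return w @ h_ls
```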
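Finally, the BEM-based Kalman tracking of a fast time-varying channel can be sketched as follows. The channel gain is expanded as h[n] = B[n, :] @ c over a small basis (here complex exponentials spanning the Doppler spread), and the basis coefficients are tracked with a scalar-observation Kalman filter under an assumed first-order Gauss-Markov state model. The single-tap channel, basis choice, and state parameters are simplifying assumptions, not the thesis's exact formulation.

```python
import numpy as np

def kalman_bem_track(y: np.ndarray, x: np.ndarray, B: np.ndarray,
                     a: float = 0.999, q: float = 1e-4,
                     r: float = 1e-2) -> np.ndarray:
    """Track a single-tap time-varying channel h[n] = B[n, :] @ c[n],
    observed through y[n] = x[n] * h[n] + noise, where the BEM
    coefficients follow c[n] = a * c[n-1] + process noise."""
    nb = B.shape[1]
    c = np.zeros(nb, dtype=complex)          # coefficient estimate
    P = np.eye(nb, dtype=complex)            # estimate covariance
    h_hat = np.zeros(len(y), dtype=complex)
    for n in range(len(y)):
        c = a * c                            # state prediction
        P = (a * a) * P + q * np.eye(nb)     # covariance prediction
        g = x[n] * B[n, :]                   # observation row vector
        s = (g @ P @ g.conj()).real + r      # innovation variance
        k = P @ g.conj() / s                 # Kalman gain
        c = c + k * (y[n] - g @ c)           # measurement update
        P = P - np.outer(k, g) @ P
        h_hat[n] = B[n, :] @ c               # channel estimate at time n
    return h_hat

# Example basis: K complex exponentials over a block of N samples.
N, K = 512, 3
n = np.arange(N)[:, None]
B = np.exp(1j * 2 * np.pi * (np.arange(K) - K // 2) * n / N)
```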
Keywords/Search Tags: Timing Synchronization, Frequency Synchronization, Channel Estimation, Kalman Filtering, LTE