
Applied Research and FPGA Implementation of DAGC Based on the LTE System

Posted on: 2010-11-04
Degree: Master
Type: Thesis
Country: China
Candidate: C Liu
Full Text: PDF
GTID: 2208360275483100
Subject: Information and Communication Engineering
Abstract/Summary:
Mobile communication is evolving toward the fourth generation (4G) today. As a key technology of 4G digital mobile communication, OFDM has been adopted by many quasi-4G protocols, including LTE. The IDFT/DFT is a key functional module of an OFDM system, and its accuracy has a significant impact on baseband demodulation performance, especially in the LTE uplink, which adopts SC-FDMA. To achieve better fixed-point IDFT/DFT performance, this thesis applies digital AGC (DAGC) technology to counter the degradation of the IDFT/DFT output signal-to-noise ratio (SNR) caused by the large dynamic range of the input signal.

First, the thesis introduces the mature analog AGC (AAGC) technology and then focuses on the digital AGC technology that has emerged in recent years to improve on it; both are mainly used to compress the dynamic range of the ADC input and prevent saturation. Targeting the accumulative nature of fixed-point IDFT/DFT in baseband processing, the thesis further analyzes the similarities and differences in object and method between AAGC and baseband DAGC, and points out the importance of baseband DAGC.

Second, a baseband PUSCH uplink processing chain from modulation to demodulation is built according to the LTE protocol. To address a disadvantage of the DFT-based channel estimation method, an easily implemented two-point replacement is adopted as an optimization; MATLAB simulation shows that it achieves the desired effect over the Gaussian channel.
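The core idea of the baseband DAGC described above is to normalize the power of each block before the fixed-point IDFT/DFT so that the signal occupies the optimal dynamic range. A minimal sketch of this idea follows; the function name, target RMS level, and test signals are illustrative assumptions, not the thesis's actual implementation:

```python
import numpy as np

def dagc_gain(iq, target_rms=0.25, eps=1e-12):
    """Estimate a per-block gain that scales the input to a target RMS
    level, so the fixed-point DFT input occupies its optimal dynamic
    range regardless of the received signal strength."""
    rms = np.sqrt(np.mean(np.abs(iq) ** 2)) + eps
    return target_rms / rms

# A weak block and a strong block are both brought to the same level
# before entering the fixed-point DFT.
rng = np.random.default_rng(0)
weak = 0.01 * (rng.standard_normal(64) + 1j * rng.standard_normal(64))
strong = 3.0 * (rng.standard_normal(64) + 1j * rng.standard_normal(64))
for block in (weak, strong):
    scaled = dagc_gain(block) * block
    # RMS after scaling is close to the target for both input levels
    print(round(float(np.sqrt(np.mean(np.abs(scaled) ** 2))), 3))
```

In a real receiver the gain estimate would typically be smoothed across blocks; the sketch applies it instantaneously for clarity.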
Without taking synchronization into account, the simulation results also show that the baseband processing chain built in this thesis, using 64QAM modulation over the Gaussian channel with hard-decision decoding, achieves a very low bit error rate (BER) at SNRs above 17 dB.

Third, based on the chain built above, the thesis proves by theoretical analysis and MATLAB simulation that baseband DAGC, comprising time-domain and frequency-domain DAGC, can keep the receiver's demodulation performance stable. After comparing several DAGC algorithms, a set of baseband DAGC algorithms suitable for implementation, which brings the IDFT/DFT output SNR into its optimum range, is selected to meet the requirements of the LTE baseband demodulation system. According to the differences between time-domain and frequency-domain DAGC, shift-and-add and look-up-table methods are selected respectively to implement the baseband DAGC algorithm.

Finally, the selected baseband DAGC algorithm is implemented as an FPGA design. Simulation, synthesis, and demo-board results all show that the time-domain and frequency-domain DAGC implementations occupy few resources, are easy to integrate, and achieve a high maximum operating frequency that fully meets the baseband processing rate requirement; they process every IQ sample in a pipelined fashion, thereby meeting the baseband demodulation performance requirement.
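The shift-and-add technique chosen for the time-domain DAGC replaces a hardware multiplier with shifts and additions, which is why it occupies few FPGA resources. A minimal sketch of the idea, assuming a hypothetical function name, an 8-bit fractional gain quantization, and illustrative test values (none taken from the thesis):

```python
def shift_add_scale(x, gain, frac_bits=8):
    """Approximate y = gain * x using only shifts and adds, as a
    multiplier-free FPGA datapath would. The gain is first quantized
    to frac_bits fractional bits; each set bit of the quantized gain
    contributes one shifted copy of x to the accumulator."""
    q = int(round(gain * (1 << frac_bits)))  # fixed-point gain
    acc = 0
    bit = 0
    while q:
        if q & 1:
            acc += x << bit  # one adder per set bit of the gain
        q >>= 1
        bit += 1
    return acc >> frac_bits  # drop the fractional bits

# gain 0.75 = 0.11b needs only two shifted adds: (x>>1) + (x>>2)
print(shift_add_scale(100, 0.75))   # -> 75
print(shift_add_scale(1024, 0.5))   # -> 512
```

The frequency-domain DAGC mentioned above would instead use a look-up table, trading block RAM for latency; the sketch covers only the shift-and-add path.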
Keywords/Search Tags: Long Term Evolution (LTE), Orthogonal Frequency Division Multiplexing (OFDM), Digital Automatic Gain Control (DAGC), FPGA implementation