
Research Of Digital Calibration Technique Based On Modulation For Time-interleaved ADC

Posted on: 2017-05-01
Degree: Master
Type: Thesis
Country: China
Candidate: Y S Pan
Full Text: PDF
GTID: 2308330488495487
Subject: Electronic and communication engineering

Abstract/Summary:
The evolution of microelectronics technology has greatly increased monolithic integration density, but it has done little to ease the design of conventional analog-to-digital converter (ADC) architectures. Limited by various non-ideal factors, it is very difficult to achieve both high speed and high precision in a single-chip ADC. Fortunately, the time-interleaved technique offers designers of high-performance ADCs a way forward.

A time-interleaved ADC achieves a high aggregate sampling rate by having multiple sub-ADCs sample alternately, which is an effective way to resolve the conflict between high speed and high precision, and more and more high-end ADCs now adopt this architecture. However, because of manufacturing-process variations, a variety of mismatches arise among the sub-channels of a time-interleaved ADC, and they seriously degrade its dynamic performance. These mismatches fall into three main kinds: offset mismatch, gain mismatch, and sampling-time mismatch. Only by eliminating these errors can high speed and high precision be achieved in the true sense. Strict matching of the sub-channels in the analog circuit yields little benefit, whereas exploiting the strengths of digital circuits through digitally assisted design can remove the errors with relative ease; digitally assisted design has therefore become the mainstream of current ADC design.

Based on a study of existing time-interleaved ADC calibration algorithms and their characteristics, and after an in-depth analysis of the mismatch errors, an all-digital calibration algorithm for gain mismatch and sampling-time mismatch based on Walsh-sequence modulation was designed.
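The three mismatch mechanisms described above can be illustrated with a small behavioral model of a two-channel time-interleaved ADC. All numeric values here (2% gain error, a small offset, 0.2% timing skew) are illustrative assumptions, not figures from the thesis:

```python
import numpy as np

# Behavioral model of a two-channel time-interleaved ADC sampling a sine.
# Mismatch values below are illustrative assumptions only.
N = 4096
fin = 505 / N                         # coherent input tone (fraction of fs)
n = np.arange(N)
even = (n % 2 == 0)                   # which sub-ADC takes each sample
gain = np.where(even, 1.00, 1.02)     # gain mismatch between channels
offset = np.where(even, 0.00, 0.01)   # offset mismatch
skew = np.where(even, 0.000, 0.002)   # timing skew (fraction of Ts)

x = gain * np.sin(2 * np.pi * fin * (n + skew)) + offset

# Gain/timing mismatch creates an image spur at fs/2 - fin;
# offset mismatch creates a spur at fs/2 (and a DC shift).
X = np.abs(np.fft.rfft(x)) / N
sig_bin = 505                         # input tone
img_bin = N // 2 - 505                # gain/timing image spur
```

Plotting `X` shows exactly the spurious tones the calibration must remove: a large peak at the input frequency, an image at fs/2 - fin from the gain/timing mismatch, and a tone at fs/2 from the offset mismatch.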
The algorithm uses digital signal processing techniques and introduces a Walsh-sequence modulation signal, which can eliminate the spurious spectral components caused by gain and time mismatches without introducing new errors. Compared with existing algorithms, it offers advantages in both calibration effect and hardware cost.

To verify the algorithm at the behavioral level, 12-bit two-channel and four-channel time-interleaved ADC calibration models were built in MATLAB/Simulink and simulated. To verify the algorithm's realizability, RTL code was written in Verilog HDL and functionally simulated with Mentor's ModelSim. Finally, to verify hardware realizability, FPGA verification was performed on an Altera Stratix IV FPGA, and the results from the three platforms were compared. The results show that, for input signals within the Nyquist band (except for a few special frequencies), the spurious components caused by gain and time mismatches are completely eliminated after calibration, and the signal-to-noise-and-distortion ratio (SNDR) and spurious-free dynamic range (SFDR) are greatly improved.
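The abstract does not reproduce the algorithm itself, but the core idea of Walsh-sequence modulation can be sketched for the two-channel gain-mismatch case, where the length-2 Walsh sequence reduces to the chopper sequence (-1)^n. This is a simplified illustration under assumed mismatch values, not the author's exact implementation:

```python
import numpy as np

# Sketch of Walsh-modulation-based gain calibration for a two-channel
# TI-ADC (simplified illustration, not the thesis's exact algorithm).
N = 4096
fin = 505 / N                              # coherent input tone
n = np.arange(N)
gain = np.where(n % 2 == 0, 1.00, 1.02)    # assumed 2% gain mismatch
x = gain * np.sin(2 * np.pi * fin * n)     # mismatched TI-ADC output

w = (-1.0) ** n                            # length-2 Walsh (chopper) sequence
e = w * x                                  # demodulate: image spur -> signal band
c = -0.5 * np.dot(e, x) / np.dot(x, x)     # c ~= delta_g / (2 * mean gain)
y = x * (1.0 + c * w)                      # re-modulate to cancel the image

def image_spur(v):
    # Magnitude of the image tone at fs/2 - fin (normalized FFT).
    return np.abs(np.fft.rfft(v))[N // 2 - 505] / N
```

After correction, `image_spur(y)` is several orders of magnitude below `image_spur(x)`: demodulating by the Walsh sequence shifts the mismatch image into the signal band, where correlating it against the raw output yields a first-order estimate of the gain error, consistent with the abstract's claim that the modulation approach suppresses the mismatch spurs in the digital domain.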
Keywords/Search Tags:Time-interleaved ADC, Channel mismatch, All digital background calibration, Walsh transform