
Research Of Background Digital Calibration Algorithm For Time-Interleaved ADCs

Posted on: 2017-03-02
Degree: Master
Type: Thesis
Country: China
Candidate: Y Y Liu
Full Text: PDF
GTID: 2308330488995472
Subject: Microelectronics and Solid State Electronics

Abstract/Summary:
The Analog-to-Digital Converter (ADC) plays an increasingly important role in modern electronic systems. However, as CMOS feature sizes shrink into the deep-submicron and nanometer regime, device scaling and reduced supply voltages introduce non-ideal effects that pose great challenges to the design of high-performance ADCs. The speed and accuracy of the traditional single-channel ADC have reached the limits of current process and design capabilities, yet the demand for ADCs with higher speed and precision continues to grow. The Time-Interleaved ADC (TIADC), in which several sub-ADCs operate in parallel, maintains high accuracy at high sampling rates and has become a mainstream architecture for high-speed ADCs.

In theory, the speed of a TIADC scales linearly with the number of channels. In practice, however, offset mismatch, gain mismatch, and sampling-time mismatch between channels seriously degrade the dynamic performance of the system. This thesis calibrates these three main mismatches. Building on a study of existing calibration algorithms, it adopts a cascaded digital background calibration algorithm: offset and gain mismatches are calibrated by LMS iteration, and sampling-time mismatch is calibrated by a statistical feedback method.
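The offset- and gain-mismatch loops described above can be illustrated with a minimal sketch. The function below is an assumption-laden toy model, not the thesis's exact loop: each channel's offset is driven to zero by an LMS integrator on the corrected output, and each channel's gain is matched to the running signal power of a reference channel (the function name `calibrate_offset_gain` and all step sizes are hypothetical).

```python
import numpy as np

def calibrate_offset_gain(ch, mu_o=5e-3, mu_g=2e-3, ref=0):
    """Background LMS calibration of offset and gain mismatch for a TIADC.

    ch : (M, N) array, one row per sub-ADC output stream.
    Offsets are driven to zero by an LMS integrator on each corrected
    sample; gains are matched to the reference channel's running power.
    Illustrative sketch only, not the thesis's exact update equations.
    """
    M, N = ch.shape
    off = np.zeros(M)             # per-channel offset estimates
    gain = np.ones(M)             # per-channel gain corrections
    p_ref = 0.5                   # initial guess for reference-channel power
    y = np.empty_like(ch, dtype=float)
    for n in range(N):
        for m in range(M):
            s = gain[m] * (ch[m, n] - off[m])      # corrected sample
            off[m] += mu_o * s                     # LMS: push channel mean to 0
            if m == ref:
                p_ref += mu_g * (s * s - p_ref)    # track reference power
            else:
                gain[m] += mu_g * (p_ref - s * s)  # LMS: match power to reference
            y[m, n] = s
    return y, off, gain
```

Because both loops use only the converter's own output statistics, they run in the background with no restriction on the input beyond it being a busy (non-DC) signal, which is the property that lets such calibration track slow drift during normal operation.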
The algorithm imposes no restriction on the input signal frequency, extends to an arbitrary number of channels, and has a simple structure, so hardware implementation is relatively easy.

To verify the algorithm, a four-channel 400 MHz 12-bit TIADC model was built in MATLAB/Simulink; after calibration, the ENOB improved from 3.56 bits to 11.75 bits at a normalized input frequency of fm/fs = 0.4115. The algorithm was then implemented in Verilog and simulated with ModelSim; the RTL code was compiled with Quartus II and downloaded to an FPGA, where calibration improved the ENOB by 8.14 bits, completing the hardware verification. Finally, the algorithm was implemented as an ASIC in the SMIC 180 nm process: it was synthesized with Design Compiler, post-layout simulation and PrimeTime (PT) power analysis were completed, the ENOB improved from 3.56 bits to 11.70 bits after calibration, and the layout was generated with IC Compiler.
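The ENOB figures above follow the standard relation ENOB = (SNDR_dB − 1.76) / 6.02. As a sketch of how such a number is extracted from a simulated output record, the helper below (the name `enob` and the coherent-sampling assumption are mine, not from the thesis) takes the FFT of a sine-wave record, treats the largest non-DC bin as the fundamental, and lumps everything else into noise-plus-distortion.

```python
import numpy as np

def enob(x):
    """Estimate ENOB from a coherently sampled sine-wave record.

    Assumes coherent sampling (an integer number of signal cycles in
    the record), so no window is applied. Uses the standard relation
    ENOB = (SNDR_dB - 1.76) / 6.02.
    """
    X = np.fft.rfft(x - np.mean(x))   # remove DC, one-sided spectrum
    p = np.abs(X) ** 2                # bin powers
    k = np.argmax(p[1:]) + 1          # fundamental = largest non-DC bin
    p_sig = p[k]
    p_nd = np.sum(p[1:]) - p_sig      # noise + distortion power
    sndr_db = 10 * np.log10(p_sig / p_nd)
    return (sndr_db - 1.76) / 6.02
```

For an ideal 12-bit quantized full-scale sine this evaluates to roughly 12 bits, which is why the post-calibration figures of 11.75 and 11.70 bits indicate that the mismatch spurs have been suppressed nearly to the quantization floor.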
Keywords/Search Tags: Time-Interleaved ADC, Cascaded calibration, Digital background calibration algorithm, LMS iteration