
Research And Implementation Of Full-digital Background Calibration Algorithm For Multi-channel ADCs

Posted on: 2014-05-23    Degree: Master    Type: Thesis
Country: China    Candidate: M Wang    Full Text: PDF
GTID: 2268330401488413    Subject: Circuits and Systems

Abstract/Summary:
As the interface between the analog and digital domains, the ADC is widely used in consumer electronics, communications, computing and control, and instrumentation. The time-interleaved ADC (TIADC), in which several sub-ADCs operate in parallel, is especially favored in high-speed systems for its high sampling rate. However, mismatch errors between the sub-ADCs, introduced during manufacturing, are inevitable and severely limit the dynamic performance of the TIADC.

This thesis studies digitally assisted analog design techniques and their implementation for a high-performance TIADC. A 14-bit TIADC whose sub-channels use the "split ADC" architecture is introduced, and a mismatch model for this TIADC is analyzed and established. A digital background calibration algorithm for the TIADC is studied and implemented. For timing-error calibration, three schemes are studied and compared at both low and high input frequencies: a first-order calibration algorithm, a high-order calibration algorithm, and a cascade of first-order calibrations.

In the first-order calibration algorithm, an LMS filter adaptively estimates the timing error, and the first-order term of the timing error is corrected using a first-order Taylor expansion. Similarly, the high-order calibration algorithm corrects the higher-order terms of the timing error. The cascaded first-order calibration algorithm, proposed on the basis of the first two, first corrects the first-order term of the timing error at two consecutive sample points and then uses the calibrated outputs to compute the first derivative required by the first-order correction at the earlier point. Simulation results show that the three algorithms perform equally well at low frequency, while at high frequency the ENOB of the calibrated output is about 10 bits for the first-order algorithm, 11.5 bits for the high-order algorithm, and 13.5 bits for the cascaded first-order algorithm.
Finally, the functional design and verification of all calibration algorithms in this thesis are completed in MATLAB, which serves as the golden model for the RTL-level design in Verilog HDL. The hardware circuit is functionally verified in ModelSim, and an ASIC implementation is obtained with synthesis tools.
Keywords/Search Tags: time-interleaved ADC, digitally assisted analog, digital background calibration, first-order calibration