Digital background calibration of time-interleaved analog-to-digital converters

Posted on: 2002-11-06 | Degree: Ph.D | Type: Thesis
University: University of California, Davis | Candidate: Jamal, Shafiq Mohammad | Full Text: PDF
GTID: 2468390011491928 | Subject: Engineering
Abstract/Summary:
Digital signal processing (DSP) systems operating on analog inputs are often limited by the speed of the analog-to-digital (A/D) interface. In a given fabrication process, the maximum sample rate of an analog-to-digital converter (ADC) is limited at a given resolution. A well-known technique for increasing the maximum sample rate is to time-interleave two or more ADCs [1]. Because time-interleaving requires two or more channels operating in parallel, it has drawbacks: beyond increased area and power, the converter becomes sensitive to offset mismatches, gain mismatches, and sample-time errors between the individual channels. This thesis introduces an adaptive digital background calibration technique that reduces the effects of offset and gain mismatches and of sample-time errors between the time-interleaved channels.

Background calibration is used because it can track variations in offset and gain mismatches and in sample-time errors over time, temperature, and process while the ADC operates normally. The technique presented here does not reduce the input signal bandwidth and avoids the difference in interference between calibration and conversion modes that can accompany foreground calibration. Although analog background calibration is possible, a digital approach is used because digital circuitry scales with technology while preserving the calibration functionality.

To demonstrate the approach, a prototype 10-bit, 120 MS/s, two-channel parallel pipelined ADC was designed and fabricated in a 0.35-μm CMOS process. The digital calibration algorithm uses the input signal itself to calibrate gain mismatches and sample-time errors, while the offset calibration algorithm uses random chopping to avoid holes in the ADC output spectrum.

Test results show that the ADC system achieves a peak SNDR of 56.8 dB for a 0.99 MHz sinusoidal input, a peak INL of −0.88 LSB, and a peak DNL of +0.44 LSB. The 5.17 mm² IC dissipates 234 mW from a 3.3 V supply.
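
The mismatch effects described above can be illustrated numerically. The following Python sketch is not the thesis's calibration algorithm; the channel offsets, gain values, and the simple mean/RMS estimators are illustrative assumptions. It simulates a two-channel interleaved converter at 120 MS/s with a 0.99 MHz input and shows that an offset mismatch produces a spur at fs/2 and a gain mismatch produces an image at fs/2 - fin, both of which drop after a data-driven digital correction.

import numpy as np

fs = 120e6                           # aggregate sample rate (120 MS/s, as in the prototype)
fin = 0.99e6                         # input tone frequency
n = np.arange(65536)
x = np.sin(2 * np.pi * fin / fs * n) # ideal sampled input

# Two interleaved channels with assumed offset and gain mismatches.
offset = np.array([+0.02, -0.02])
gain = np.array([1.00, 0.98])
ch = n % 2                           # even samples -> channel 0, odd -> channel 1
y = gain[ch] * x + offset[ch]        # uncalibrated interleaved output

# Background-style correction using only the converted data: estimate each
# channel's offset as its mean and its relative gain from its RMS value.
off_est = np.array([y[ch == k].mean() for k in (0, 1)])
rms = np.array([np.std(y[ch == k] - off_est[k]) for k in (0, 1)])
gain_est = rms / rms[0]
y_cal = (y - off_est[ch]) / gain_est[ch]

# Offset mismatch appears as a spur at fs/2; gain mismatch as an image at fs/2 - fin.
def spur_level(sig, f):
    spec = np.abs(np.fft.rfft(sig * np.hanning(len(sig))))
    k = int(round(f / fs * len(sig)))
    return 20 * np.log10(spec[k] / spec.max())

for f, name in [(fs / 2 - fin, "gain-mismatch image"), (fs / 2, "offset-mismatch spur")]:
    print(f"{name}: {spur_level(y, f):6.1f} dB before, {spur_level(y_cal, f):6.1f} dB after")

The sketch only reproduces the spectral signatures of the two mismatches and a crude correction; the prototype's actual calibration is adaptive, runs in the background on the input signal itself, and handles offsets with random chopping as described above.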
Keywords/Search Tags: Background calibration, ADC, Sample-time errors, Gain mismatches