
Study on the Integrated-Circuit Implementation of Multi-Channel Time-Interleaved ADC Correction

Posted on: 2011-04-03    Degree: Doctor    Type: Dissertation
Country: China    Candidate: F Ye    Full Text: PDF
GTID: 1118360305997167    Subject: Microelectronics and Solid State Electronics
Abstract/Summary:
Analog-to-digital converters (ADCs) are widely used in communications, computing, instrumentation and other fields. High-performance ADCs, which are subject to US export controls, have significant strategic value. After years of academic and industrial research, ADC performance is approaching the speed and accuracy limits of the available technology. A time-interleaved architecture with several ADC channels has therefore become a necessary solution to break the speed bottleneck of a single ADC. However, offset mismatch, gain mismatch, bandwidth mismatch and sample-time error among the channels seriously degrade the performance of a time-interleaved ADC. For sample-time error in particular, existing methods either cannot improve performance substantially, cannot be extended to more channels, or are effective only under special input conditions such as low-frequency band-limited signals or single-tone sinusoids.

This dissertation presents a digital background calibration method for time-interleaved ADCs based on an LMS-adapted FIR filter and interpolation. The method calibrates sample-time error among channels as well as gain mismatch and bandwidth mismatch. The correlation between the interpolated data of a reference channel and the sample data of the channel under calibration yields the correct expected value for the channel under calibration, from which the calibrated output is obtained. Moreover, the LMS iteration is controlled adaptively, which keeps the calibration applicable in all cases and avoids false convergence. Compared with existing methods, the proposed calibration covers all the conditions they handle, is applicable in most cases, and can be extended to any number of channels.

Based on this calibration method, a 14-bit 200-MS/s two-channel time-interleaved ADC chip was implemented in a 0.18-μm CMOS process. A high-performance sampling switch and clock scheme is adopted to optimize speed, and background random-noise injection is adopted to optimize accuracy. Chip test results show that the proposed calibration eliminates the non-ideal characteristics between channels: the ENOB improves by more than 2 bits to 11.3 bits, and the SFDR improves by about 30 dB to 88.9 dBc.
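To make the calibration idea concrete, the sketch below illustrates the general principle in Python/NumPy: the reference channel's samples are interpolated to the sample instants of the skewed channel, and an LMS-adapted FIR filter on the skewed channel is driven toward that interpolated expected value. This is only an illustrative model, not the dissertation's implementation; the test tone, filter lengths, step size and the windowed-sinc interpolator are assumptions chosen for demonstration.

```python
# Minimal sketch (assumed parameters, not from the dissertation) of LMS-FIR
# background calibration of sample-time skew in a 2-channel time-interleaved ADC.
import numpy as np

N = 1 << 14        # number of interleaved samples
F_IN = 0.137       # test-tone frequency, normalized to the overall sample rate
SKEW = 0.1         # channel-1 sample-time error, in overall sample periods

t = np.arange(N, dtype=float)
x_ideal = np.sin(2 * np.pi * F_IN * t)               # skew-free samples
x_skewed = np.sin(2 * np.pi * F_IN * (t + SKEW))     # channel-1 samples taken off-time

ref_idx = np.arange(0, N, 2)   # channel 0: reference, assumed skew-free
cal_idx = np.arange(1, N, 2)   # channel 1: under calibration

# Interpolation filter: a windowed-sinc half-sample interpolator applied to the
# reference channel estimates the signal at the channel-1 sample instants.
N_INT = 32
frac = np.arange(N_INT) - (N_INT - 1) / 2            # fractional offsets -15.5 ... +15.5
h_int = np.sinc(frac) * np.hamming(N_INT)
expected = np.convolve(x_ideal[ref_idx], h_int)[N_INT // 2 : N_INT // 2 + cal_idx.size]

# LMS-adapted FIR correction filter on the channel under calibration.
TAPS, MU = 15, 0.01
w = np.zeros(TAPS)
w[TAPS // 2] = 1.0             # start from a pass-through filter
delay = TAPS // 2              # group delay of the correction filter
buf = np.zeros(TAPS)
y_cal = np.zeros(cal_idx.size)

for k in range(cal_idx.size):
    buf = np.roll(buf, 1)
    buf[0] = x_skewed[cal_idx[k]]
    y_cal[k] = w @ buf
    if k >= delay:
        err = expected[k - delay] - y_cal[k]   # error vs. interpolated reference
        w += MU * err * buf                    # LMS coefficient update

# Residual error on the calibrated channel before and after correction
# (steady-state region, aligned for the correction filter's delay).
before = np.std((x_ideal[cal_idx] - x_skewed[cal_idx])[2000:])
after = np.std((x_ideal[cal_idx][:-delay] - y_cal[delay:])[2000:])
print(f"rms skew error before: {before:.4f}, after LMS-FIR calibration: {after:.4f}")
```

In an actual background calibration the ideal waveform is of course unknown; the interpolated reference channel itself provides the expected value, as in the error term above, and convergence is monitored over the LMS iteration rather than against a known input.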
Keywords/Search Tags:Analog-to-Digital Converter (ADC), Time-Interleaved, Digital Background Calibration, Least Mean Square (LMS) Method, Interpolation Filter, Sample-Hold Amplifier (SHA), Pipelined