
Research on Calibration and Design Techniques for Multi-Channel Time-Interleaved Analog-to-Digital Converters

Posted on: 2007-07-12
Degree: Doctor
Type: Dissertation
Country: China
Candidate: G L Wu
Full Text: PDF
GTID: 1118360212965456
Subject: Microelectronics and Solid State Electronics
Abstract/Summary:
With the development of integrated circuit processes, the speed of analog-to-digital converters has improved continuously. However, the speed gained from reduced feature size comes with a reduced supply voltage, which, because of kT/C noise, requires larger sampling capacitors and thus limits speed. To push the sampling rate of an analog-to-digital converter beyond the limit of a given process technology, an effective approach is a parallel structure: multiple ADCs are placed in an array, each ADC samples the input during a different clock phase, and the digital outputs of the channels are recombined into a single stream. This is multi-channel time-interleaved ADC (MTIADC) technology.

An MTIADC performs high-throughput conversion with no degradation in spectral purity if all channel ADCs have identical electrical characteristics (e.g., gain, sampling time, DC offset, nonlinearity). In practice, however, various electrical mismatches are inevitable. They degrade the dynamic range, reducing the signal-to-noise ratio (SNR) and spurious-free dynamic range (SFDR), and in an FFT plot they show up as spurious frequency components called image spurs and offset spurs. The image spurs of an MTIADC system are a direct result of gain and phase mismatch between the ADC channels, while the offset spurs are generated by offset differences between the channels.

First, linearity and nonlinearity mismatch models were built for the MTIADC, and the influence of offset mismatch, gain mismatch, clock mismatch, and nonlinearity mismatch between the channels on ADC performance was studied theoretically. Then a new offset- and gain-mismatch calibration algorithm based on least mean squares (LMS) was described for the multi-channel time-interleaved ADC, and an improved global sampling clock was used to reduce the clock mismatch between channels.
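To make the mismatch effects and the LMS idea concrete, the following is a minimal Python sketch, not the dissertation's implementation: the channel offsets, gains, step size `mu`, and the use of the ideal signal as a foreground calibration reference are all assumptions for illustration. It builds a four-channel interleaved output with offset and gain mismatch (which produces offset spurs at k·fs/M and image spurs at k·fs/M ± fin), then adapts a per-channel correction a·y + b toward the reference with LMS.

```python
import numpy as np

M, N = 4, 4096                        # interleaved channels, samples
fin = 101 / N                         # coherent input frequency (bin 101)
n = np.arange(N)
x = np.sin(2 * np.pi * fin * n)       # ideal input (calibration reference)

# Hypothetical per-channel offset and gain mismatch
offsets = np.array([0.01, -0.02, 0.015, -0.005])
gains = np.array([1.00, 1.02, 0.98, 1.01])

# Interleaving: channel m converts samples m, m+M, m+2M, ...
ch = n % M
y = gains[ch] * x + offsets[ch]       # mismatched MTIADC output

# Spectrum of y shows an offset spur at fs/M (bin N//M) and image
# spurs at bins N//M +/- 101, on top of the signal at bin 101.
Y = np.abs(np.fft.rfft(y))

# LMS calibration: per channel, adapt (a, b) so that a*y + b tracks x.
# At convergence a -> 1/gain and b -> -offset/gain for each channel.
mu = 0.05
a = np.ones(M)
b = np.zeros(M)
for i in range(N):
    m = i % M
    e = x[i] - (a[m] * y[i] + b[m])   # error against the reference
    a[m] += mu * e * y[i]             # gradient step for gain term
    b[m] += mu * e                    # gradient step for offset term

y_cal = a[ch] * y + b[ch]             # calibrated output
```

The same update also runs in a background (blind) fashion in many published schemes; the foreground reference here just keeps the sketch short.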
In addition, the mismatch calibration scheme and calibration circuits were designed. To verify the above calibration algorithm and scheme, a 10-bit 180-Msample/s and a 10-bit 720-Msample/s four-channel parallel pipelined analog-to-digital converter with digital calibration were designed and fabricated in a 0.18-μm single-poly six-metal CMOS technology. Implementation details of the clock generator, bootstrap switch, sample-and-hold circuit, operational amplifier, multiplying digital-to-analog converter (MDAC), comparator, bandgap circuits, and high-speed I/O interface (low-voltage differential signaling, LVDS) were elaborated from the standpoint of low-supply-voltage deep-submicron design. Finally, the test scheme and measured results were given.
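To illustrate the role of the MDAC in a pipelined channel ADC, here is a minimal sketch of a common textbook 1.5-bit-per-stage pipeline; the reference level, stage count, and function names are assumptions for illustration, not the fabricated design.

```python
VREF = 1.0  # assumed reference voltage, input range +/- VREF

def stage_1p5bit(vin):
    """One 1.5-bit stage: the sub-ADC decides d in {-1, 0, +1}
    with thresholds at +/- VREF/4, then the MDAC forms the
    amplified residue 2*vin - d*VREF for the next stage."""
    if vin > VREF / 4:
        d = 1
    elif vin < -VREF / 4:
        d = -1
    else:
        d = 0
    return d, 2 * vin - d * VREF

def convert(vin, nstages=10):
    """Cascade the stages and reconstruct the input with the
    standard digitally corrected weighted sum of stage decisions."""
    v, est = vin, 0.0
    for i in range(1, nstages + 1):
        d, v = stage_1p5bit(v)
        est += d * VREF / 2 ** i      # stage i carries weight VREF/2^i
    return est                        # error bounded by VREF / 2**nstages
```

The redundancy of the 1.5-bit code (thresholds at ±VREF/4 rather than ±VREF/2) is what lets comparator offsets of up to VREF/4 be absorbed by the digital correction, relaxing the comparator design in each stage.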
Keywords/Search Tags: Time-interleaved, LMS algorithm, ADC, mismatch, calibration, pipelined