Blind calibration for time-interleaved analog-to-digital converters

Posted on: 2007-07-10 | Degree: Ph.D | Type: Dissertation
University: University of California, Davis | Candidate: Huang, Yuhui | Full Text: PDF
GTID: 1448390005460852 | Subject: Engineering
Abstract/Summary:
Digital signal processing applications rely on high-speed, high-precision data conversion systems. A time-interleaved analog-to-digital converter (ADC) achieves a high sampling rate by multiplexing a bank of slow ADCs operating in parallel. However, timing and gain mismatches among the slow ADCs can significantly degrade the performance of the overall converter. To overcome this problem, the mismatches can first be estimated and then corrected by applying digital filters to the sample sequences produced by the slow ADCs. This dissertation presents and analyzes an efficient digital blind calibration method for estimating the timing offsets and gain mismatches of an M-channel time-interleaved ADC.

The calibration method is first illustrated for the case of M = 2 slow ADCs by considering the joint estimation of the gain and timing mismatch. When the input signal is slightly oversampled, a frequency band around zero frequency exists where the subchannel ADC outputs are alias-free. Lowpass filtering the ADC subchannels to this alias-free region converts the blind calibration problem into a conventional gain and time-delay estimation problem for an unknown signal in noise. An adaptive filter structure with three fixed FIR filters and two adaptive gain and timing-offset parameters is employed to perform the calibration. A convergence analysis and numerical simulations validate the method. A Cramér-Rao lower bound is evaluated and used to analyze the dependence of the calibration performance on design parameters such as the oversampling ratio and the SNR in the alias-free band.

This principle generalizes to timing-mismatch calibration for an M-channel time-interleaved ADC. The proposed method requires that the input signal be slightly oversampled and that it contain energy, at least intermittently, in M/2 frequency bands. The oversampling condition ensures that there exists a frequency band around zero frequency where the Fourier transforms of the ADC subchannels contain only M - 1 alias components, so the matrix power spectral density (PSD) of the ADC subchannels is rank deficient over this band. Accordingly, when the timing offsets are known, it is possible to construct a filter bank that nulls the vector signal formed by the ADC outputs. This filter bank is used to develop an adaptive algorithm for estimating the ADC timing offsets: it employs 2M - 1 fixed FIR filters and M - 1 unknown timing-offset parameters, which are estimated with an adaptive stochastic gradient technique. A convergence analysis is performed to justify the method, and a Mathematica-based technique is developed to obtain a closed-form parametrization of the null-steering filter bank for arbitrary M. The effectiveness of the calibration method is demonstrated by numerical simulations for a bandlimited white-noise input and for inputs containing several sinusoidal components.
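To make the two-channel (M = 2) scheme concrete, the following minimal Python sketch, not taken from the dissertation, emulates a two-channel time-interleaved ADC with gain and timing mismatch and estimates both parameters with an LMS-style stochastic gradient loop, mirroring the alias-free-band idea described above. The three fixed FIR filters (a lowpass restricted to the alias-free band, the same lowpass combined with the known half-sample advance, and its derivative) are simple windowed-sinc designs chosen here for illustration; all names, filter lengths, and step sizes are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)

    # --- Emulate a "continuous-time" bandlimited input on a fine grid -------
    FINE, N, FMAX = 50, 4000, 0.40  # fine-grid factor, samples/subchannel, band edge
    t = np.arange(2 * N * FINE) / FINE            # time in overall sample periods
    x = np.zeros(t.size)
    for f, p in zip(rng.uniform(0.005, FMAX, 64), rng.uniform(0, 2 * np.pi, 64)):
        x += np.cos(2 * np.pi * f * t + p)        # random multi-tone test input
    x /= np.sqrt(32)

    # --- Two-channel time-interleaved sampling with gain/timing mismatch ----
    g_true, dt_true = 1.05, 0.08   # channel-1 gain and timing offset (samples)
    m = np.arange(N)
    y0 = x[2 * m * FINE]                                       # ch 0: x(2n)
    y1 = g_true * x[np.round(((2 * m + 1) + dt_true) * FINE).astype(int)]

    # --- Three fixed FIR filters (windowed-sinc designs) ---------------------
    # In subchannel time (1 unit = 2 overall samples) the alias-free band is
    # |nu| < 1 - 2*FMAX; channel 1 samples half a subchannel period after ch 0.
    L = 81
    k = np.arange(L) - L // 2
    w = np.hamming(L)
    fc = 0.8 * (1 - 2 * FMAX)      # cutoff kept inside the alias-free band
    u = k + 0.5
    lp   = w * 2 * fc * np.sinc(2 * fc * k)   # zero-phase lowpass (for y1)
    lpa  = w * 2 * fc * np.sinc(2 * fc * u)   # lowpass + half-sample advance (for y0)
    lpad = w * 2 * fc * (np.cos(2 * np.pi * fc * u) - np.sinc(2 * fc * u)) / u  # its derivative

    y1f = np.convolve(y1, lp,   mode="same")  # ~ g * x_lp(n + 1/2 + dt/2)
    y0a = np.convolve(y0, lpa,  mode="same")  # ~ x_lp(n + 1/2)
    y0d = np.convolve(y0, lpad, mode="same")  # ~ x_lp'(n + 1/2)

    # --- LMS loop: adapt the two parameters (gain, timing offset) ------------
    g_hat, d_hat = 1.0, 0.0        # d_hat tracks dt/2 in subchannel periods
    mu = 0.05 / np.mean(y0a ** 2)
    for _ in range(5):             # a few passes over the record
        for n in range(L, N - L):  # skip filter edge transients
            ref = y0a[n] + d_hat * y0d[n]   # first-order Taylor model of y1f
            err = y1f[n] - g_hat * ref
            g_hat += mu * err * ref
            d_hat += mu * err * g_hat * y0d[n]

    print(f"gain:   true {g_true:.3f}, estimated {g_hat:.3f}")
    print(f"timing: true {dt_true:.3f}, estimated {2 * d_hat:.3f}")

The first-order Taylor expansion x_lp(u + dt/2) ~ x_lp(u) + (dt/2) x_lp'(u) is accurate here because the residual offset is a small fraction of a sample within the lowpassed band; for larger offsets a fractional-delay filter parametrized by the offset would replace it.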
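The rank-deficiency property underlying the M-channel method can also be checked numerically. The sketch below, again an illustration rather than the dissertation's implementation, simulates an M = 4 interleaved ADC with timing offsets, estimates the matrix PSD of the subchannel outputs by averaging windowed FFT outer products, and compares the eigenvalue spread of that matrix at a frequency inside the alias-free band (where the rank should drop to M - 1) against one outside it; the segment length, offsets, and test frequencies are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)

    M, FINE, N = 4, 50, 20000      # channels, fine-grid factor, samples/subchannel
    FMAX = 0.40                    # input band edge (cycles per overall sample)

    # Bandlimited white-noise input, synthesized exactly on a fine time grid
    Nf = M * N * FINE
    X = np.fft.rfft(rng.standard_normal(Nf))
    f = np.fft.rfftfreq(Nf, d=1.0 / FINE)     # cycles per overall sample
    X[f > FMAX] = 0.0
    x = np.fft.irfft(X, n=Nf)

    # M interleaved subchannels with distinct timing offsets (in samples)
    delta = np.array([0.0, 0.06, -0.04, 0.10])
    n = np.arange(N)
    Y = np.stack([x[np.round((M * n + m + delta[m]) * FINE).astype(int)]
                  for m in range(M)])

    # Averaged cross-spectral (matrix PSD) estimate of the subchannel vector
    seg, hop = 256, 128
    win = np.hanning(seg)
    nseg = (N - seg) // hop
    S = np.zeros((seg // 2 + 1, M, M), dtype=complex)
    for s in range(nseg):
        Z = np.fft.rfft(win * Y[:, s * hop:s * hop + seg], axis=1)
        S += np.einsum('af,bf->fab', Z, Z.conj())
    S /= nseg

    # Eigenvalue spread inside vs. outside the alias-free band
    # (alias-free: |nu| < M*(0.5 - FMAX) = 0.4 cycles per subchannel sample)
    nu = np.fft.rfftfreq(seg)
    for label, target in [("in-band  nu=0.20", 0.20), ("out-band nu=0.45", 0.45)]:
        b = np.argmin(np.abs(nu - target))
        ev = np.linalg.eigvalsh(S[b])
        print(f"{label}: lambda_min/lambda_max = {ev[0] / ev[-1]:.2e}")

Inside the alias-free band each windowed FFT snapshot lies (up to window leakage) in the span of the M - 1 alias components, so the smallest eigenvalue is driven toward zero; this near-null direction is what the dissertation's null-steering filter bank exploits to estimate the timing offsets.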
Keywords/Search Tags: Calibration, ADC, Time-interleaved, Digital, Signal, Slow ADCs, Filter bank, Band