
Research on Inter-Channel Sampling-Time Mismatch Calibration for TIADC

Posted on: 2018-12-29    Degree: Master    Type: Thesis
Country: China    Candidate: X Ye    Full Text: PDF
GTID: 2348330512989825    Subject: Engineering
Abstract/Summary:
With the rapid development of the information age, the demand for analog-to-digital converters (ADCs) keeps growing, while designing high-speed, high-precision, low-power ADCs is becoming increasingly difficult; time-interleaving is one way to achieve such performance. However, unintended inter-channel mismatches greatly degrade the overall performance of a time-interleaved ADC (TIADC). The inter-channel sampling-time mismatch in particular, owing to its complex behavior, has become a hot topic in both academia and industry.

This thesis studies and discusses the influence of inter-channel sampling-time mismatch on the TIADC. The advantages and disadvantages of conventional calibration algorithms are analyzed in terms of both mismatch estimation and mismatch compensation. Building on previous research, a new all-digital, background, wide-band, channel-count-scalable calibration algorithm is proposed, based on a minimum-value function and a digital delay line. For mismatch estimation, a function of the inter-channel sampling-time mismatch is constructed whose minimum corresponds to the mismatch-free state; driving the function value to that minimum yields the mismatch estimate. For mismatch compensation, the conventional digital-delay-line technique is extended: its key module, the fractional delay filter, is modified by introducing the Hilbert transform and correction factors, so that the compensation frequency range is extended from the first Nyquist zone to any Nyquist zone.

Following this algorithmic idea, a MATLAB behavioral-level model is constructed and an RTL-level description is completed in Verilog; the design is then taken through back-end implementation to obtain the post-layout netlist, which is verified within a 12-bit, 1.2 GSPS, four-channel TIADC MATLAB model. The simulation results show that, with the proposed algorithm, the ENOB improves from 9.51 bits to 11.81 bits for a 36.32 MHz input signal, and from 3.51 bits to 11.42 bits for a 2.356 GHz input signal.
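To make the two-stage idea concrete, the Python sketch below pairs a toy skew estimator with a windowed-sinc fractional-delay compensator on a behavioral four-channel TIADC model. The skew values, tap count, step size, and the simple boundary-imbalance cost are illustrative assumptions standing in for the thesis's minimum-value function and modified fractional delay filter; the sketch also stays within the first Nyquist zone, whereas the thesis extends compensation to any Nyquist zone via the Hilbert transform and correction factors.

```python
import numpy as np

# --- behavioral 4-channel TIADC model with sampling-time skew (assumed values) ---
M, N = 4, 1 << 14                            # channels, interleaved samples
fs, fin = 1.2e9, 36.32e6                     # aggregate rate / input tone (from the abstract)
skew = np.array([0.0, 0.04, -0.03, 0.02])    # per-channel skew, in aggregate periods (assumed)

n = np.arange(N)
t = (n + skew[n % M]) / fs                   # skewed sampling instants
x = np.sin(2 * np.pi * fin * t)              # interleaved output with timing mismatch

def frac_delay_fir(delta, taps=41):
    """Windowed-sinc fractional-delay FIR: delays a channel-rate signal by
    `delta` channel periods (|delta| << 1), valid in the first Nyquist zone."""
    k = np.arange(taps) - (taps - 1) / 2
    h = np.sinc(k - delta) * np.hamming(taps)
    return h / h.sum()

def compensate(x, d_hat):
    """Cancel the estimated skews d_hat (aggregate periods), channel by channel."""
    y = np.empty_like(x)
    for m in range(M):
        h = frac_delay_fir(-d_hat[m] / M)    # aggregate -> channel-rate periods
        y[m::M] = np.convolve(x[m::M], h, mode="same")
    return y

def boundary_imbalance(y):
    """Normalized mean |first difference| at each channel boundary, minus 1:
    all entries are zero exactly when the residual skews are equal, i.e. the
    mismatch-free state (a stand-in for the thesis's minimum-value function)."""
    d = np.abs(np.diff(y))
    b = np.array([d[m::M].mean() for m in range(M)])
    return b / b.mean() - 1.0

# Drive the cost toward its minimum: each step diffuses the per-boundary
# error back onto the per-channel delay estimates (step size mu assumed).
d_hat, mu = np.zeros(M), 0.3
for _ in range(60):
    e = boundary_imbalance(compensate(x, d_hat))
    d_hat += mu * (np.roll(e, 1) - e)

print("estimated:", np.round(d_hat - d_hat[0], 3))   # relative skews
print("true:     ", skew - skew[0])
```

Since a delay common to all channels is only a harmless global phase shift, only the relative skews matter, which is why the printout compares estimates and true values referenced to channel 0.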
Keywords/Search Tags: TIADC, inter-channel sampling-time mismatch, calibration algorithm, minimum-value function, fractional delay filter