
Research And Design Of All-Digital Calibration Algorithm For Time-Interleaved ADC

Posted on: 2022-11-30  Degree: Master  Type: Thesis
Country: China  Candidate: R G Hua  Full Text: PDF
GTID: 2518306764463254  Subject: Wireless Electronics
Abstract/Summary:
With the advent of a new round of the information technology revolution, China has begun to prioritize the development of the integrated circuit industry as a strategic basic industry. Improving its technical level can promote economic development, contribute to scientific and technological progress, and enhance national defense strength. The high-speed, high-precision analog-to-digital converter (ADC) is a core component of integrated circuits: it converts continuous analog signals from the physical world into discrete digital signals that can be processed in the digital domain. The conversion accuracy and sampling rate of the ADC directly limit the processing capability of mixed-signal systems, and applications such as radar, navigation, 5G communications, and high-speed interfaces in data centers demand ever higher ADC accuracy and speed, making the ADC a key enabling technology. Common ADC architectures include the flash ADC, pipelined ADC, sigma-delta (Σ-Δ) ADC, successive-approximation ADC, and time-interleaved (TI) ADC. Because the TI architecture multiplies the effective sampling rate of its constituent sub-ADCs, it has attracted growing attention from academics and engineers in recent years and has become a hotspot in both academia and industry.

This thesis focuses on the TI ADC. It first introduces the basic principles, performance metrics, and circuit structures of ADCs, together with the fundamentals of time-interleaving, and analyzes the unique characteristics of the TI ADC by establishing an error model and carrying out the corresponding mathematical derivation. The effects of offset mismatch, gain mismatch, and time mismatch on ADC performance are then summarized and compared.

On this basis, the thesis concentrates on estimating and compensating time mismatch in the digital domain and proposes an all-digital calibration algorithm for the time mismatch error of the TI ADC based on the autocorrelation function. The calibration principles for offset mismatch and gain mismatch are also briefly introduced. Python simulations show that the proposed time mismatch calibration algorithm can be applied to a TI ADC with any number of channels and achieves good calibration performance and robustness over a wide input-frequency range.

Finally, to meet the requirements of a practical project, the proposed all-digital calibration algorithm is applied to a 14-bit 600-MS/s time-interleaved pipelined ADC. The digital calibration circuit goes through RTL design, functional simulation, and back-end implementation in a 40 nm CMOS process. Functional simulation results show that, after calibration, the effective number of bits of the TI ADC improves from 3.23 bits to 12.63 bits, and the spurious-free dynamic range improves from 20.69 dB to 84.92 dB, verifying the validity of the proposed calibration algorithm.
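To make the mismatch error model concrete, the sketch below (illustrative only, not the circuit-level model derived in the thesis) generates the output of an M-channel TI ADC in which sub-ADC m contributes an offset o_m, a gain g_m, and a timing skew Δt_m, so that y[n] = g_m·x(n·Ts + Δt_m) + o_m for n ≡ m (mod M). The function name `ti_adc_sample` and all mismatch values are hypothetical:

```python
import numpy as np

def ti_adc_sample(x_fn, M, fs, N, offsets, gains, skews):
    """Interleave M sub-ADCs with per-channel offset, gain, and timing skew."""
    Ts = 1.0 / fs
    n = np.arange(N)
    m = n % M                              # which sub-ADC takes sample n
    t = n * Ts + np.take(skews, m)         # ideal instant shifted by channel skew
    return np.take(gains, m) * x_fn(t) + np.take(offsets, m)

# Example: a 4-channel TI ADC digitizing a sine; the mismatches create
# spurious tones at offsets of fs/M around the input, degrading ENOB and SFDR.
fs, M, N, fin = 600e6, 4, 4096, 70e6
x = lambda t: np.sin(2 * np.pi * fin * t)
y = ti_adc_sample(x, M, fs, N,
                  offsets=np.array([0.0, 1e-3, -2e-3, 0.5e-3]),
                  gains=np.array([1.0, 1.001, 0.999, 1.002]),
                  skews=np.array([0.0, 2e-12, -3e-12, 1e-12]))
```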
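The following sketch shows one way the autocorrelation function can expose timing skew; the thesis's estimator is more elaborate, so this only illustrates the principle. For a wide-sense-stationary input, the expected lag-1 product E[y[n]·y[n-1]] depends only on the spacing between adjacent sampling instants; skew stretches or shrinks that spacing at each channel boundary, so the per-channel average of lag-1 products deviates from the global average by an amount roughly proportional to the skew difference across that boundary. `lag1_error_signals` is a hypothetical helper name:

```python
import numpy as np

def lag1_error_signals(y, M):
    """Deviation of each channel's mean lag-1 product from the global mean;
    for a wide-sense-stationary input this is roughly proportional to the
    skew difference across the corresponding channel boundary."""
    r1 = y[1:] * y[:-1]                    # products of adjacent samples
    ch = np.arange(1, len(y)) % M          # channel that produced the later sample
    per_ch = np.array([r1[ch == m].mean() for m in range(M)])
    return per_ch - r1.mean()
```

Turning these error signals into absolute skew estimates additionally requires the slope of the input autocorrelation at one sample period, which a background calibration loop can estimate or absorb into an adaptive step size.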
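Once the skews are estimated, one plausible digital compensator (the abstract does not detail the thesis's own) is a per-channel fractional-delay filter: delaying the interleaved stream by Δt_m·fs sample periods with a windowed-sinc filter cancels, to first order, the error introduced by sampling Δt_m too late. `fractional_delay_correct` is a hypothetical name:

```python
import numpy as np

def fractional_delay_correct(y, M, dt_hat, fs, taps=33):
    """Undo estimated per-channel skews dt_hat (in seconds) with windowed-sinc
    fractional-delay filters applied to the interleaved stream (first order)."""
    out = y.astype(float).copy()
    k = np.arange(taps) - taps // 2
    for m in range(M):
        d = dt_hat[m] * fs                      # skew in sample periods
        h = np.sinc(k - d) * np.hamming(taps)   # delay the stream by d samples
        h /= h.sum()                            # unity DC gain
        z = np.convolve(y, h, mode="same")
        out[m::M] = z[m::M]                     # keep channel m's corrected samples
    return out
```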
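The offset and gain calibration principles that the thesis only briefly introduces are commonly realized in the digital background as per-channel statistics matching. A minimal sketch, assuming a zero-mean input whose power is identical across channels (`calibrate_offset_gain` is an illustrative name, not the thesis's scheme):

```python
import numpy as np

def calibrate_offset_gain(y, M):
    """Zero each channel's mean (offset), then scale each channel's RMS to
    the whole stream's RMS (gain) - a common background approach."""
    out = y.astype(float).copy()
    for m in range(M):
        out[m::M] -= out[m::M].mean()           # offset: remove per-channel DC
    ref = np.sqrt(np.mean(out ** 2))            # common RMS reference
    for m in range(M):
        out[m::M] *= ref / np.sqrt(np.mean(out[m::M] ** 2))  # gain match
    return out
```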
Keywords/Search Tags: Time-Interleaved Technology, Analog-to-Digital Converter, Digital Background Calibration, Time Mismatch Error, Autocorrelation Function