
Research and Implementation of a Calibration Algorithm for a Time-Interleaved Analog-to-Digital Converter

Posted on: 2021-02-26
Degree: Master
Type: Thesis
Country: China
Candidate: C Gao
Full Text: PDF
GTID: 2428330626456086
Subject: Microelectronics and Solid State Electronics
Abstract/Summary:
In recent years, with the rapid development of semiconductor manufacturing processes and design technologies, the performance of integrated circuits has improved continuously. High-technology fields built on integrated circuits, such as artificial intelligence, autonomous driving, biomedicine, the Internet of Things, and 5G, are in turn driving the rapid development of integrated circuits themselves. Digital integrated circuits offer high integration density and ease of large-scale development, and are widely used in general-purpose computing, signal processing, and other technical fields. To exploit the powerful signal-processing and computation capabilities of digital circuits when handling the analog signals found in nature, an analog-to-digital converter (ADC) is required to convert analog signals into digital form. Common ADC architectures include flash ADCs, sigma-delta ADCs, successive-approximation ADCs, and pipelined ADCs.

This thesis focuses on the time-interleaved (TI) ADC. A TI ADC is attractive because its multiple sub-ADCs sample alternately and interleave their conversion results, making it possible to achieve higher conversion rates in the same semiconductor manufacturing process. However, because the sub-ADCs sample in turn, the offset, gain, and sample-time mismatches between them significantly degrade the overall performance of the TI ADC. This thesis therefore proposes a digital background calibration algorithm for TI ADCs.

First, MATLAB is used for mathematical modeling: the operation of the TI ADC is analyzed, and the impact of the mismatch effects is simulated by injecting mismatch errors into the model. Next, the calibration algorithm itself is modeled and simulated in MATLAB to verify its correctness, and candidate implementation methods and parameter choices are simulated to settle the design details. Finally, the calibration circuit is implemented in the Verilog-HDL hardware description language; the design ideas and details of each circuit module are analyzed, and pre-layout simulation verifies the correctness of its timing and function.

Because it runs in the background, the algorithm corrects the mismatch effects without interrupting the operation of the TI ADC, performing error estimation and compensation entirely in the digital domain, which also gives it good process portability. Pre-layout simulation of the 14-bit TI ADC designed in this work shows that, compared with the uncalibrated case, the ENOB improves by more than 6 bits and the SFDR by more than 30 dB. These clear improvements in the dynamic performance of the TI ADC verify the correctness of the calibration algorithm.
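The behavioral modeling step described above can be sketched in Python (NumPy standing in for MATLAB). All channel counts and error values below are illustrative assumptions, not figures from the thesis; the point is only to show how offset, gain, and sample-time mismatch between sub-ADCs produce spurious tones in the interleaved output.

```python
import numpy as np

# Hypothetical 4-channel TI ADC sampling a coherent sine wave.
M = 4                    # number of interleaved sub-ADCs (assumed)
N = 4096                 # total number of samples
fs = 1.0                 # normalized aggregate sampling rate
fin = 101 / N * fs       # input frequency on an odd FFT bin (coherent)

# Per-channel mismatch errors (illustrative values only)
offset = np.array([0.0, 1e-3, -2e-3, 1.5e-3])        # offset mismatch
gain   = np.array([1.0, 1.002, 0.998, 1.001])        # gain mismatch
skew   = np.array([0.0, 2e-3, -1e-3, 1.5e-3]) / fs   # sample-time skew

n = np.arange(N)
ch = n % M                         # which sub-ADC takes each sample
t = n / fs + skew[ch]              # ideal sample instant plus channel skew
x = gain[ch] * np.sin(2 * np.pi * fin * t) + offset[ch]
x_ideal = np.sin(2 * np.pi * fin * n / fs)   # mismatch-free reference

# Gain/timing mismatch creates spurs at fin +/- k*fs/M (bin 1024 +/- 101);
# offset mismatch creates spurs at k*fs/M (bin 1024) independent of fin.
win = np.hanning(N)
X = np.abs(np.fft.rfft(x * win))
X_ideal = np.abs(np.fft.rfft(x_ideal * win))
```

Injecting the errors through per-channel arrays indexed by `n % M` mirrors the round-robin sampling of the sub-ADCs, so the same model extends directly to other channel counts.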
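The reported ENOB figure is related to the measured signal-to-noise-and-distortion ratio (SINAD) by the standard conversion ENOB = (SINAD − 1.76 dB) / 6.02, so the quoted improvement of more than 6 bits corresponds to a SINAD gain of more than about 36 dB. A minimal helper (the 86.04 dB figure below is the textbook ideal for a 14-bit converter, not a measurement from this work):

```python
def enob(sinad_db: float) -> float:
    """Effective number of bits from SINAD in dB (standard formula)."""
    return (sinad_db - 1.76) / 6.02

# An ideal 14-bit ADC: SINAD = 14 * 6.02 + 1.76 = 86.04 dB.
ideal_14bit = enob(86.04)
```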
Keywords/Search Tags: TI ADC, Time-Interleaved, Mismatch effect, Digital background calibration, TI ADC correction