
Research On Signal Integrity Analysis Based On Machine Learning

Posted on: 2022-11-22
Degree: Master
Type: Thesis
Country: China
Candidate: Y N Wang
Full Text: PDF
GTID: 2518306764479054
Subject: Automation Technology

Abstract/Summary:
In recent years, as high-speed circuits have moved toward greater integration, larger scale, and higher density, rising operating frequencies have introduced problems such as reflection and simultaneous switching noise into high-speed channels. These effects interfere with signal reception and jeopardize the correctness and stability of electronic systems, which makes signal integrity analysis particularly important. However, the algorithms in existing traditional simulation software are time-consuming and inefficient, and can no longer keep pace with the simulation and modeling demands of dense circuits.

To address the slow speed, low efficiency, and heavy data dependence of existing signal integrity analysis methods, this thesis takes machine-learning-based signal integrity analysis and its optimization as its research focus. It first establishes an end-to-end neural network to address the low efficiency of traditional signal integrity analysis methods. Then, in view of the strong dependence on data volume and the low accuracy of existing machine learning methods, it proposes a hybrid neural network and a digital-analog hybrid optimization method, designing and implementing a fast, efficient, low-data-volume machine learning algorithm for signal integrity. Compared with existing machine learning approaches to signal integrity analysis, the models proposed in this thesis offer the following innovations:

1) A hybrid network model based on semi-supervised learning. To address traditional machine learning's heavy dependence on data volume, this thesis proposes a semi-supervised network optimization method. The hybrid network automatically labels a large amount of unlabeled data, mitigating the low accuracy and overfitting that arise when predicting from small samples. The model compares the confidence of the predictions of the different networks, uses the high-confidence predictions as labels, and gradually refines the best-performing network. The model reduces error by about 33% and saves 50% of the labeled data.

2) A neural network architecture and training method that combines machine learning models with prior knowledge. The algorithm mainly targets high-speed serial links and power distribution networks. A time-domain network operating on S-parameters is connected in parallel with a network operating on circuit parameters to form a serial-parallel model that predicts eye height and eye width simultaneously, saving 42.8% of the data and reducing traditional machine learning's dependence on large data volumes.

3) A fully connected neural network (FC-NN) + transposed convolutional neural network (TCNN) + prior algorithm. For predicting the impedance of the power distribution network (PDN), based on the parallel translation of the decoupling capacitors and the offset of the parallel resonance frequency, an FC-NN + TCNN + prior algorithm is proposed: the FC-NN and the prior are responsible for predicting the general trend of the curve, while the TCNN is responsible for predicting its local features. The model predicted 2,000 groups of impedances in under 1 s, whereas the same task takes about 24 hours in ADS, effectively solving the slow impedance prediction of the PDN board.
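The confidence-comparison step in innovation 1 can be illustrated with a small sketch. The thesis does not publish its implementation; the function below is a hypothetical interface showing the general self-training idea: two networks predict class probabilities on the same unlabeled samples, and only samples where both networks are confident and agree are kept as pseudo-labels for further training.

```python
def select_pseudo_labels(probs_a, probs_b, threshold=0.9):
    """Confidence-based pseudo-labeling sketch (hypothetical interface).

    probs_a, probs_b: per-sample class-probability lists from two networks.
    Returns (index, class) pairs for samples where both networks exceed
    the confidence threshold and agree on the predicted class.
    """
    kept = []
    for i, (pa, pb) in enumerate(zip(probs_a, probs_b)):
        conf_a, cls_a = max(pa), pa.index(max(pa))
        conf_b, cls_b = max(pb), pb.index(max(pb))
        if conf_a >= threshold and conf_b >= threshold and cls_a == cls_b:
            kept.append((i, cls_a))
    return kept

# Toy predictions from two networks on 4 unlabeled samples:
probs_a = [[0.95, 0.05], [0.60, 0.40], [0.05, 0.95], [0.92, 0.08]]
probs_b = [[0.93, 0.07], [0.91, 0.09], [0.10, 0.90], [0.30, 0.70]]
pseudo = select_pseudo_labels(probs_a, probs_b)
# Sample 1 is rejected (network A not confident); sample 3 is rejected
# (the networks disagree); samples 0 and 2 become pseudo-labeled data.
```

In a full self-training loop, the pseudo-labeled samples would be appended to the labeled set and the networks retrained, repeating until no new high-confidence samples appear.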
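The TCNN in innovation 3 acts as a decoder: a transposed convolution upsamples a few latent values into a dense output curve, which is why it is suited to painting local features (resonance peaks) onto the smooth trend supplied by the FC-NN and the prior. As a minimal illustration of the operation itself (not the thesis's network), a 1-D transposed convolution can be written in a few lines:

```python
def transposed_conv1d(x, kernel, stride=2):
    """Minimal 1-D transposed convolution: each input value 'paints' a
    scaled copy of the kernel into the output at stride intervals,
    upsampling a short latent vector into a longer dense curve."""
    out_len = (len(x) - 1) * stride + len(kernel)
    out = [0.0] * out_len
    for i, v in enumerate(x):
        for j, k in enumerate(kernel):
            out[i * stride + j] += v * k
    return out

# Two latent values expand into a 5-point curve; overlapping kernel
# copies sum where they meet (index 2 here).
curve = transposed_conv1d([1.0, 2.0], [1.0, 1.0, 1.0], stride=2)
# -> [1.0, 1.0, 3.0, 2.0, 2.0]
```

Stacking such layers (with learned kernels) lets a network map a compact representation of the decoupling-capacitor configuration to a finely sampled impedance-versus-frequency curve.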
Keywords/Search Tags:Signal Integrity Analysis, Channel Simulation, PDN Impedance Prediction, Machine Learning, Transpose Convolution