
Research on Off-Grid Time Delay Estimation Methods Based on Sparse Bayesian Learning

Posted on: 2022-12-10
Degree: Master
Type: Thesis
Country: China
Candidate: W Y Li
Full Text: PDF
GTID: 2518306746468674
Subject: Information and Communication Engineering
Abstract/Summary:
Time delay estimation is widely used in emerging fields such as intelligent transportation, intelligent agriculture, and high-precision positioning, and is a key technology for the new generation of mobile communications. Traditional delay estimation based on sparse optimization suffers from a mismatch between the delay parameters to be estimated and the predefined grid points; solving this off-grid problem is a hot and difficult topic in the field of signal estimation. Existing off-grid sparse Bayesian learning (SBL) algorithms handle the off-grid problem effectively by introducing and iteratively updating off-grid hyperparameters, but in adverse channel environments such as dense multipath and low signal-to-noise ratio (SNR), their estimation accuracy remains limited. Building on existing off-grid SBL algorithms, this thesis proposes a new method for delay estimation of dense multipath signals and extends it to joint delay-angle estimation. The main work is as follows:

(1) The time-domain delay model, the frequency-domain model, and the frequency-domain cross-correlation model are combined with several off-grid sparse Bayesian learning algorithms to estimate multipath delays. Compared with the subspace algorithm (MUSIC), the on-grid orthogonal matching pursuit algorithm (OMP), and the sparse iterative covariance-based estimation algorithm (SPICE), simulation results show that the off-grid algorithms achieve significantly better estimation accuracy under the frequency-domain and frequency-domain cross-correlation models.

(2) In dense delay estimation, when the true receive delays are closely spaced, or even when several delays fall within a single predefined grid interval, existing sparse Bayesian learning methods cannot reach the accuracy required in practice. To solve this problem, an off-grid Bayesian method suitable
for dense-signal delay estimation is proposed. The method introduces a biased total grid (BTG) strategy: the whole grid is iteratively shifted according to the positions of the off-grid points between grid points, which reduces the correlation between the grid and the off-grid vector during the iteration. Two algorithms are designed for different models: for low SNR, a high-accuracy sparse Bayesian learning algorithm that applies the biased total grid to the frequency-domain cross-correlation model (FR-OGSBI-BTG); for high SNR, one that applies it to the time-domain delay model (TD-OGSBI-BTG).

(3) Single-parameter delay estimation is extended to joint multi-parameter estimation of delay and angle, and the off-grid sparse Bayesian algorithm is introduced into two models for dense joint estimation. The first method sets off-grid parameters for both the delay grid and the angle grid in order to reduce the correlation between them. The second method lowers the computational complexity of joint estimation: the delays are first estimated by the SBL algorithm, the angles and delays are then paired by matching, and the angle estimates are obtained by a sparse iterative algorithm. Simulation results show that both proposed strategies achieve good joint delay-angle estimation performance.
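The grid mismatch that motivates the off-grid treatment above can be illustrated with the first-order (Taylor) dictionary correction used in OGSBI-style algorithms. The sketch below is illustrative only: the subcarrier spacing, grid resolution, and delay values are assumptions for the example, not parameters from the thesis.

```python
import numpy as np

# Illustrative frequency-domain delay model (parameters are assumed):
# y[k] = sum_i a_i * exp(-j*2*pi*f[k]*tau_i) + noise
K = 64                  # number of frequency bins
df = 15e3               # subcarrier spacing in Hz (assumption)
f = np.arange(K) * df   # baseband subcarrier frequencies

def atom(tau):
    """Frequency-domain steering vector for a single path delay tau (s)."""
    return np.exp(-2j * np.pi * f * tau)

def atom_deriv(tau):
    """Derivative of atom() w.r.t. tau, used for the first-order
    off-grid (Taylor) correction in OGSBI-style algorithms."""
    return (-2j * np.pi * f) * atom(tau)

# Uniform delay grid; the true delay deliberately falls between grid points
dt = 100e-9                 # grid resolution: 100 ns (assumption)
grid = np.arange(32) * dt
n = 10
beta = 0.37 * dt            # off-grid offset inside one grid interval
tau_true = grid[n] + beta

# On-grid modeling error vs. error after the first-order off-grid correction
err_on_grid = np.linalg.norm(atom(tau_true) - atom(grid[n]))
err_off_grid = np.linalg.norm(
    atom(tau_true) - (atom(grid[n]) + beta * atom_deriv(grid[n])))

print(f"on-grid error:  {err_on_grid:.4f}")
print(f"off-grid error: {err_off_grid:.4f}")
```

The residual left by the nearest on-grid atom is what a purely on-grid solver such as OMP must absorb as spurious components; the linearized atom shrinks that residual considerably, which is the effect the off-grid hyperparameters (and, for closely spaced delays, a shifted grid in the spirit of the BTG strategy) exploit.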
Keywords/Search Tags: dense time delay estimation, sparse Bayesian learning, compressed sensing, off-grid, joint time delay and angle estimation