
Research On Fast Approximate Algorithm Based Bayesian Inference

Posted on: 2020-06-18
Degree: Master
Type: Thesis
Country: China
Candidate: Q Y Zou
Full Text: PDF
GTID: 2428330596995007
Subject: Control Science and Engineering
Abstract/Summary:
This thesis mainly studies Bayesian algorithms and their application to inverse problems for recovering a signal of interest. Specifically, it derives two families of state-of-the-art algorithms: approximate-message-passing-like (AMP-like) algorithms and expectation-propagation-like (EP-like) algorithms. The AMP-like family includes approximate message passing (AMP), generalized approximate message passing (GAMP), and multi-layer generalized approximate message passing (ML-GAMP). The EP-like family comprises expectation propagation (EP), expectation consistent (EC), generalized expectation consistent (GEC), generalized expectation consistent signal recovery (GEC-SR), and multi-layer generalized expectation consistent (ML-GEC).

The AMP algorithm was initially proposed by Donoho et al. to solve signal recovery in the standard linear model of compressive sensing. Subsequently, Rangan et al. removed AMP's restriction on the noise distribution and proposed the GAMP algorithm, which extends the approach to the generalized linear model. Both AMP and GAMP are derived from the sum-product algorithm on a factor graph, and both have attracted wide attention for their low complexity and excellent performance. The experimental part of the thesis also verifies the excellent signal-recovery performance of the AMP-like algorithms.

EP first appeared in Minka's doctoral thesis, where it was applied to approximate factorable distributions in Bayesian networks. EP develops the message-update form of assumed density filtering. In addition, there is a strong relationship between variational inference and EP: the difference between them lies in the form (direction) of the KL divergence being minimized. Moreover, there is also a strong connection between EP and AMP. Representing EP on a factor graph, one can verify that the sum-product algorithm is a class of EP; this also shows that AMP is a class of EP techniques in the fully factorable case. Similar to AMP, EP also has extensions to the generalized linear model
and the multi-layer generalized linear model. GEC and GEC-SR extend EP to the generalized linear model for recovering the signal of interest in generalized inverse problems. Although GEC applies to the generalized linear model, its convergence is poor; GEC-SR overcomes this problem by working in a linear space. In summary, this thesis derives AMP-like and EP-like algorithms, with the following contributions:

· A derivation of AMP using a split factor graph. Compared with the traditional factor graph, this factor graph is easy to extend to multi-layer models.
· A novel set of message-passing rules called "moment-matched message passing". With these update rules, the derivation of the VAMP algorithm follows naturally; combining the rules with the split factor graph is also appropriate for multi-layer models.
· A derivation of GAMP based on EP. It sharply reduces the number of steps in GAMP's derivation, covers the complex domain, and builds a bridge between EP and GAMP.
· A derivation of GAMP based on the split factor graph. Compared with the original derivation of GAMP, it sharply reduces the number of steps, covers both the real and complex domains, and, more essentially, is easy to extend to multi-layer models.
· A proof of GEC-SR.
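As a concrete illustration of the AMP-like algorithms discussed above, the following is a minimal sketch of AMP for the standard linear model y = Ax + w with a soft-threshold (sparsity-promoting) denoiser. The function names and the residual-based threshold rule are illustrative choices for this sketch, not the thesis's exact formulation.

```python
import numpy as np

def soft_threshold(v, theta):
    # Proximal operator of the l1 norm: shrink each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, n_iters=50):
    """AMP for y = A x + w with a soft-threshold denoiser."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iters):
        # Pseudo-data: current estimate plus matched-filtered residual.
        r = x + A.T @ z
        # Threshold tied to the residual energy (a common heuristic).
        theta = np.sqrt(np.mean(z ** 2))
        x = soft_threshold(r, theta)
        # Residual with the Onsager correction term, which keeps the
        # effective noise in r approximately Gaussian across iterations.
        z = y - A @ x + (z / m) * np.count_nonzero(x)
    return x
```

The Onsager term distinguishes AMP from plain iterative thresholding: it is what allows the pseudo-data r to be treated as the true signal corrupted by additive Gaussian noise, which underlies AMP's low per-iteration cost and fast convergence.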
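The "moment-matched message passing" idea can be illustrated by EP's core projection step: an intractable tilted distribution (cavity message times a prior factor) is replaced by the Gaussian matching its first two moments, which is the minimizer of KL(p‖q) over Gaussians q. Below is a small numerical sketch of that projection; the Laplace prior factor, the grid discretization, and the function names are illustrative assumptions, not the thesis's specific setup.

```python
import numpy as np

def match_moments(grid, log_tilted):
    """Gaussian projection used in EP: match the mean and variance of an
    (unnormalized) tilted distribution evaluated on a grid."""
    w = np.exp(log_tilted - log_tilted.max())  # stabilize before exponentiating
    w /= w.sum()                               # grid spacing cancels in the moments
    mean = np.sum(grid * w)
    var = np.sum((grid - mean) ** 2 * w)
    return mean, var

# Tilted distribution: Gaussian cavity message N(r, tau) times a Laplace
# prior factor exp(-|x| / b) (an illustrative sparsity-promoting choice).
r, tau, b = 0.8, 0.5, 1.0
x = np.linspace(-10.0, 10.0, 4001)
log_tilted = -0.5 * (x - r) ** 2 / tau - np.abs(x) / b
mu, sigma2 = match_moments(x, log_tilted)
# The sparsity-promoting factor pulls the mean toward zero and reduces
# the variance relative to the cavity message.
```

In EP-like algorithms this matched Gaussian is then divided by the cavity message to produce the updated outgoing message, so the whole iteration stays within the Gaussian family.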
Keywords/Search Tags:signal recovery, Bayesian estimation, approximate message passing, expectation propagation