
Research On Sparse Bayesian Learning Theory And Its Application

Posted on: 2013-08-15
Degree: Master
Type: Thesis
Country: China
Candidate: J Wang
Full Text: PDF
GTID: 2248330395956766
Subject: Circuits and Systems
Abstract/Summary:
Recently, Sparse Bayesian Learning (SBL) has become a research hotspot in machine learning. By fully discovering and exploiting prior information about the signal, SBL can learn low-dimensional models. SBL assumes that the prior information takes the form of a distribution function, builds a probabilistic model of the problem, and designs an efficient optimization algorithm to solve it. There are two difficulties in applying sparse Bayesian learning: one is how to choose the optimization model, and the other is how to assume the prior probability. Different models can solve the problem with similar solutions, under which the data achieves a sparse representation. In this thesis, we investigate sparse Bayesian learning algorithms under several models and their applications to signals and images. The main contributions can be summarized as follows:

(1) A sparse learning machine based on the Fast Bayesian Matching Pursuit (FBMP) algorithm is proposed, in which the components of the sparse parameter vector are assumed to follow a prior distribution generated from a Gaussian mixture density, and a matching pursuit algorithm is introduced to locate the nonzero coefficients. FBMP performs better than conventional matching pursuit methods. We conduct experiments on one-dimensional signals and sparse machine learning under the Compressive Sensing (CS) framework; taking the double helix as an example, the results show better performance than other methods.

(2) A Bayesian framework for learning the optimal regularization parameter in the l1-norm penalized Least-Mean-Square (LMS) problem is introduced. It defines the optimal sparseness of the solution via the optimal sparsity regularization parameter under the Bayesian framework and infers that parameter by learning directly from the data. In this chapter we introduce two types of algorithm: one is Ordinary Least Squares, and the other is Nonnegative Least Squares, in which the sparse optimal solution is subject to a nonnegativity constraint. The results show that both methods can achieve the optimal sparse solution.
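To make the greedy-recovery idea behind contribution (1) concrete, the sketch below implements a plain Orthogonal Matching Pursuit recovery of a sparse vector from compressed random measurements. This is an illustrative simplification, not the FBMP algorithm itself: FBMP additionally ranks candidate supports by their Bayesian posterior probability under a Gaussian mixture prior, whereas this sketch only does the greedy atom selection and least-squares refit that both methods share. All variable names (`Phi`, `x_true`, `omp`) are illustrative.

```python
import numpy as np

def omp(Phi, y, k, tol=1e-8):
    """Greedy orthogonal matching pursuit: estimate a k-sparse x from y = Phi @ x."""
    n = Phi.shape[1]
    support = []
    residual = y.copy()
    x_hat = np.zeros(n)
    for _ in range(k):
        # Select the dictionary column most correlated with the current residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares refit of the coefficients on the selected support.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        x_hat = np.zeros(n)
        x_hat[support] = coef
        residual = y - Phi @ x_hat
        if np.linalg.norm(residual) < tol:
            break
    return x_hat

# Noiseless CS toy problem: 3-sparse signal, 40 random Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 64, 40, 3
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = Phi @ x_true

x_rec = omp(Phi, y, k)
print(np.linalg.norm(x_rec - x_true))  # recovery error; near zero when the support is found
```

In the noiseless setting with so few nonzeros, the greedy selection almost surely identifies the true support, after which the least-squares step recovers the coefficients exactly; FBMP's Bayesian scoring chiefly pays off in noisy or ambiguous cases.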
(3) A Mixture of Factor Analyzers (MFA) based nonparametric hierarchical Bayesian model is introduced, in which a hierarchical Bayesian algorithm learns the MFA model from the training data. This method can simultaneously infer the number of clusters and the intrinsic subspace dimensionality of high-dimensional datasets. By using the probability density function learned by the MFA as the prior distribution, the CS reconstruction can be found analytically by Bayes' rule from compressed random measurements. The results show that the MFA-based nonparametric hierarchical Bayesian model achieves better performance than methods that do not exploit the structure of the data sets.

This work was supported by the National Natural Science Foundation of China (Nos. 61072108, 60601029, 60971112, 61173090), the New-Century Training Programme Foundation for the Talents (NCET-10-0668), the Programme of Introducing Talents of Discipline to Universities (111 Program, No. B0704), and the Fundamental Research Funds for the Central Universities (JY10000902041).
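The analytic reconstruction claimed in contribution (3) can be illustrated for a single factor-analyzer component: when the prior on the signal is Gaussian, x ~ N(0, Σ), and the measurements are y = Φx + noise, the posterior mean is available in closed form. The full thesis model uses a mixture of such components learned nonparametrically; the sketch below, with assumed illustrative names (`W`, `Sigma_x`, `x_map`), shows only the single-component case.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, d = 20, 10, 2  # ambient dimension, measurements, latent subspace dimension

# One factor analyzer: x = W z + noise, giving a low-rank-plus-jitter prior covariance.
W = rng.standard_normal((n, d))
Sigma_x = W @ W.T + 0.01 * np.eye(n)

# Compressed random measurements of a signal lying in the learned subspace.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
sigma2 = 1e-4
x_true = W @ rng.standard_normal(d)
y = Phi @ x_true + np.sqrt(sigma2) * rng.standard_normal(m)

# Closed-form posterior mean under x ~ N(0, Sigma_x), y = Phi x + N(0, sigma2 I):
#   x_map = Sigma_x Phi^T (Phi Sigma_x Phi^T + sigma2 I)^{-1} y
G = Phi @ Sigma_x @ Phi.T + sigma2 * np.eye(m)
x_map = Sigma_x @ Phi.T @ np.linalg.solve(G, y)

rel_err = np.linalg.norm(x_map - x_true) / np.linalg.norm(x_true)
print(rel_err)  # small, since 10 measurements constrain a 2-degree-of-freedom signal
```

Because the prior concentrates on a 2-dimensional subspace, far fewer measurements than the ambient dimension suffice, which is exactly the advantage the thesis attributes to exploiting the structure of the data set.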
Keywords/Search Tags: Sparse Representation, Sparse Bayesian Learning, Compressive Sensing, Mixture of Factor Analyzers, Nonparametric Hierarchical Bayesian