
Sparse Bayesian Regularization Method

Posted on: 2017-01-14
Degree: Master
Type: Thesis
Country: China
Candidate: Q Dong
Full Text: PDF
GTID: 2349330512469256
Subject: Statistics
Abstract/Summary:
With the development of technology, high-dimensional and massive data are collected in scientific research and social activities. Studying how to extract important information and features from such data is therefore of great significance. The development of sparsity-inducing regularization methods provides a unified framework for dealing with high-dimensional and massive data.

Research on sparse regularization methods is one of the hot topics in machine learning and statistics. In this thesis, we study the choice of the regularization parameter λ and the theoretical properties of the Lasso under noise with a bounded second moment. The main contributions of this thesis are the following.

In Chapter 1, we review the background of high-dimensional analysis and the regularization framework.

In Chapter 2, a sparse Bayesian linear regression model is proposed that generalizes the Bayesian Lasso to a class of Bayesian models with scale mixtures of normal distributions as priors for the regression coefficients. We assume a hierarchical Bayesian model with a binary indicator for whether a predictor variable is included in the model, a generalized normal prior distribution for the coefficients of the included variables, and a Student-t error model for robustness to heavy-tailed noise. Our model outperforms other popular sparse regression estimators on synthetic and real data.

In Chapter 3, we study the high-dimensional statistical theory of the Lasso under noise with a bounded second moment. We establish nonasymptotic bounds for the Lasso that generalize existing results.
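To make the two technical components of the abstract concrete, a minimal sketch is given below. The notation is ours, not taken from the thesis: the inclusion indicator γ_j with probability π, the generalized normal scale τ and shape q, the Student-t degrees of freedom ν, and the 1/(2n) scaling of the Lasso objective are all illustrative assumptions about the general form described above, not the exact specification used in the thesis.

A hedged sketch of the Chapter 2 hierarchical model (spike at zero via a binary indicator, generalized normal slab, Student-t errors):
\[
\begin{aligned}
y_i &= x_i^{\top}\beta + \varepsilon_i, \qquad \varepsilon_i \sim t_{\nu}(0,\sigma^2)
  && \text{(Student-$t$ errors for heavy-tailed noise)}\\
\gamma_j &\sim \mathrm{Bernoulli}(\pi), \qquad \gamma_j \in \{0,1\}
  && \text{(inclusion indicator for predictor $j$)}\\
\beta_j \mid \gamma_j = 1 &\sim \mathrm{GN}(0,\tau,q), \qquad
  p(\beta_j) \propto \exp\!\bigl(-|\beta_j/\tau|^{q}\bigr)
  && \text{(generalized normal prior; a scale mixture of normals)}\\
\beta_j \mid \gamma_j = 0 &= 0.
\end{aligned}
\]
The Bayesian Lasso is recovered as the special case q = 1 (Laplace prior) with all predictors included.

For Chapter 3, the Lasso estimator studied is of the standard penalized least-squares form, where the only distributional assumption on the noise is a bounded second moment rather than sub-Gaussian tails:
\[
\hat{\beta} \in \arg\min_{\beta}\;\frac{1}{2n}\,\lVert y - X\beta\rVert_2^2
  + \lambda\,\lVert\beta\rVert_1,
\qquad \mathbb{E}\bigl[\varepsilon_i^2\bigr] \le \sigma^2 .
\]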
Keywords/Search Tags: Lasso, Sparse, Bayesian, Variable selection