
Application And Comparison Of L_q Penalty Functions In Variable Selection

Posted on: 2013-04-26    Degree: Master    Type: Thesis
Country: China    Candidate: W H Ma    Full Text: PDF
GTID: 2230330374982937    Subject: Probability theory and mathematical statistics
Abstract/Summary:
Variable selection is a very important issue in linear regression analysis. The most basic method is least squares: the least squares estimator is the best linear unbiased estimator, but when the design matrix is collinear it is no longer efficient. In such cases, combining least squares with Lq penalty functions can produce good results.

This paper introduces applications of Lq penalty functions in variable selection. It presents a series of penalty functions, such as ridge, the bridge, LASSO and the Elastic Net, and describes their advantages, disadvantages and oracle properties. In addition, it describes combining the Bayesian approach with Lq penalty functions for variable selection. Finally, numerical examples show the performance of these penalty functions. The paper consists of the following three parts.

In Chapter Ⅰ, we introduce several variable selection methods together with their background, advantages and disadvantages, and we explain the practical significance of each method.

In Chapter Ⅱ, we introduce ridge regression, the bridge, LASSO and the Elastic Net in detail, study the properties of each method, and compare their advantages and disadvantages. We also discuss the bridge, LASSO and the Elastic Net from a Bayesian perspective and obtain the same results.

In Chapter Ⅲ, we first compare OLS, ridge regression and LASSO on a set of simulated data, then examine the grouping effect of the Elastic Net. Finally, we compare these methods on a real data set.
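The kind of comparison described in Chapter Ⅲ can be sketched as follows. This is a minimal illustration using scikit-learn, not code from the thesis; the data-generating model, the penalty weights (`alpha`, `l1_ratio`) and the coefficient vector are all illustrative assumptions, chosen only to create a collinear design on which the four estimators behave differently.

```python
# Illustrative comparison of OLS, ridge (L2), LASSO (L1) and Elastic Net
# (L1 + L2) on simulated collinear data. All settings here are assumptions
# for illustration, not the thesis's actual simulation design.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet

rng = np.random.default_rng(0)
n, p = 50, 8

# Build a strongly collinear design: all columns share one latent factor z.
z = rng.normal(size=(n, 1))
X = 0.9 * z + 0.1 * rng.normal(size=(n, p))

# Sparse true coefficients (hypothetical), so LASSO-type penalties can shine.
beta = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
y = X @ beta + rng.normal(scale=1.0, size=n)

models = {
    "OLS": LinearRegression(),
    "Ridge (L2)": Ridge(alpha=1.0),
    "LASSO (L1)": Lasso(alpha=0.1),
    "Elastic Net (L1+L2)": ElasticNet(alpha=0.1, l1_ratio=0.5),
}
for name, model in models.items():
    model.fit(X, y)
    print(f"{name:>20}: {np.round(model.coef_, 2)}")
```

On a design this collinear, OLS coefficients are typically unstable, ridge shrinks all coefficients toward zero, LASSO sets some exactly to zero, and Elastic Net combines shrinkage with sparsity while tending to keep correlated variables together (the grouping effect the thesis examines).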
Keywords/Search Tags: Ridge regression, Bridge, LASSO, Elastic Net, Variable selection