
Boosting methods for variable selection in high dimensional sparse models

Posted on: 2010-01-10  Degree: Ph.D  Type: Dissertation
University: North Carolina State University  Candidate: Hwang, Wook Yeon  Full Text: PDF
GTID: 1440390002978988  Subject: Biology
Abstract/Summary:
First, we propose new variable selection techniques for regression in high dimensional linear models, based on forward selection versions of the least absolute shrinkage and selection operator (LASSO), the adaptive LASSO, and the elastic net, called, respectively, the forward iterative regression and shrinkage technique (FIRST), adaptive FIRST, and elastic FIRST. We exploit the fact that the LASSO, adaptive LASSO, and elastic net have closed-form solutions when the predictor is one-dimensional. Second, we propose a new variable selection technique for binary classification in high dimensional models, based on a forward selection version of the squared support vector machine (SVM) or the one-norm SVM, called the forward iterative selection and classification algorithm (FISCAL). We propose squared support vector machines that use the ℓ1-norm and ℓ2-norm penalties simultaneously; the resulting objective is convex and, when the predictor is one-dimensional, differentiable everywhere except at zero. We also apply the same procedure to the original one-norm support vector machine. By carefully considering the relationship between estimators at successive stages, we develop fast algorithms to compute our estimators. Our approaches are observed to give better prediction performance for high dimensional sparse models.
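The key building block mentioned above is that the one-dimensional LASSO has a closed-form solution (soft-thresholding), which makes coordinate-at-a-time forward updates cheap. The sketch below illustrates that idea only; it is not the dissertation's FIRST algorithm. The function names (soft_threshold, forward_lasso_sketch), the stopping rule, and the objective scaling are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, lam):
    """Closed-form solution of the one-dimensional LASSO problem
    argmin_b 0.5*(b - z)**2 + lam*|b| (soft-thresholding)."""
    return np.sign(z) * max(abs(z) - lam, 0.0)

def forward_lasso_sketch(X, y, lam, max_steps=50, tol=1e-6):
    """Hypothetical forward-selection sketch (not the FIRST algorithm of the
    dissertation): at each step, update the single coordinate whose univariate
    closed-form LASSO solution most reduces the penalized objective."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - X @ beta

    def objective(b, r):
        return 0.5 * np.dot(r, r) / n + lam * np.abs(b).sum()

    for _ in range(max_steps):
        best_j, best_obj, best_bj = None, objective(beta, resid), None
        for j in range(p):
            xj = X[:, j]
            # partial residual: remove the current contribution of x_j
            r_j = resid + xj * beta[j]
            z = xj @ r_j / n
            denom = xj @ xj / n
            b_new = soft_threshold(z, lam) / denom  # univariate closed form
            r_new = r_j - xj * b_new
            obj = 0.5 * np.dot(r_new, r_new) / n + lam * (
                np.abs(beta).sum() - abs(beta[j]) + abs(b_new))
            if obj < best_obj - tol:
                best_obj, best_j, best_bj = obj, j, b_new
        if best_j is None:  # no single-coordinate move improves the objective
            break
        resid += X[:, best_j] * (beta[best_j] - best_bj)
        beta[best_j] = best_bj
    return beta
```

Because each candidate update is available in closed form, every forward step costs only one pass over the predictors, which is what makes this style of algorithm attractive when the number of predictors is much larger than the sample size.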
Keywords/Search Tags:High dimensional, Selection, Models, Squared support vector machines, FIRST, Forward, LASSO