Research On Sparse Regression For Feature Selection

Posted on: 2019-08-10
Degree: Master
Type: Thesis
Country: China
Candidate: T. Y. Huang
GTID: 2428330545985541
Subject: Computer technology

Abstract/Summary:
In many areas such as machine learning, data mining, and computer vision, feature selection is a crucial and challenging task: finding a relevant subset of the original features. Within feature selection, unsupervised feature selection and cost-sensitive feature selection are two important tasks that have attracted much research attention. In this dissertation, we propose two feature selection methods, one for unsupervised feature selection and the other for cost-sensitive feature selection. The details of these two methods are as follows:

1. For unsupervised feature selection, we first propose the difference degree matrix of the features. Based on this matrix, we then propose a new unsupervised feature selection method, called unsupervised Feature Selection with the largest Angle Coding (FSAC). Different from existing unsupervised feature selection methods, FSAC selects features through the self-representation of the difference degree matrix, so that features are chosen from different aspects. It can therefore select features with sufficient information to distinguish the samples in the dataset. To make the self-representation of the difference degree matrix more effective and to suppress noisy features, an L2,1-norm constraint is added to the objective function of FSAC.

2. For cost-sensitive feature selection, we first define the cost-distance among the samples. Based on this distance, we combine manifold learning and sparse regression into cost-sensitive feature selection to develop a corresponding method, called Cost-sensitive Feature Selection via Manifold learning (CFSM). Most previous cost-sensitive feature selection methods rank features individually and select them solely by the correlation between the cost and the features. With the help of manifold learning and sparse regression, our new method selects features using not only the correlation between the cost and the features but also the discriminative information implied within the features. Compared with previous cost-sensitive feature selection methods, it can therefore select features more effectively.
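Both methods build on sparse regression with an L2,1-norm penalty, which drives entire rows of the projection matrix toward zero so that row norms can rank features. The abstract does not give the exact objectives of FSAC or CFSM, but the shared core can be sketched with the common formulation min_W ||XW − Y||_F² + λ||W||_{2,1}, solved by the standard iterative reweighting scheme. The function name and all parameter choices below are illustrative assumptions, not the thesis's actual algorithms.

```python
import numpy as np

def l21_feature_selection(X, Y, lam=0.1, n_iter=50):
    """Illustrative L2,1-regularized regression for feature ranking.

    Minimizes ||XW - Y||_F^2 + lam * ||W||_{2,1} via iterative
    reweighted least squares; rows of W with large L2 norms mark
    informative features. (Sketch only, not the thesis's FSAC/CFSM.)
    """
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, refined at each iteration
    for _ in range(n_iter):
        # Closed-form update for the current weights:
        # W = (X^T X + lam * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
        row_norms = np.linalg.norm(W, axis=1)
        # d_ii = 1 / (2 ||w_i||_2), guarded against division by zero
        D = np.diag(1.0 / (2.0 * np.maximum(row_norms, 1e-8)))
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1]  # feature indices, best first

# Usage: only the first two of ten features carry signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
Y = X[:, :2] @ np.array([[1.0], [-1.0]])
ranked = l21_feature_selection(X, Y)
```

The reweighting trick replaces the non-smooth L2,1 term with a quadratic surrogate, so each iteration reduces to a ridge-like linear solve; rows that shrink get ever-larger penalties, producing the row sparsity that makes feature selection possible.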
Keywords/Search Tags: feature selection, unsupervised learning, cost-sensitive learning, manifold learning, sparse regression