
Research On Graph Regularized And Discriminant Information Based Feature Selection

Posted on: 2018-02-01
Degree: Master
Type: Thesis
Country: China
Candidate: W B Wang
GTID: 2348330518499553
Subject: Engineering

Abstract/Summary:
With the development of science and technology, information has become increasingly rich and complex, and data sets are increasingly high dimensional. In many high dimensional data sets only a small subset of the available features is useful: most features are redundant, and some correspond to uninformative noise. To facilitate subsequent processing, it is therefore often necessary to reduce the dimensionality of such data. Feature selection is a commonly used dimensionality reduction approach that selects representative features from the original feature set according to some evaluation criterion. It has become a hot research topic in data mining, machine learning, pattern recognition and other fields, and the processing of high dimensional data remains a challenge for researchers. In recent years a number of new feature selection algorithms have been proposed, but many of them ignore the geometric structure information and discriminant information of the data, and therefore do not achieve the best possible selection results. In view of these problems, this thesis studies several feature selection algorithms. The main contributions and research contents of this thesis are as follows:

1) A new method called nonnegative spectral learning and sparse regression-based dual-graph regularized feature selection (NSSRD) is proposed. NSSRD builds on the feature selection framework of joint embedding learning and sparse regression, and extends it by introducing a feature graph, so that the geometric information of both the data space and the feature space is exploited simultaneously. Secondly, nonnegative constraints are imposed on the low dimensional embedding matrices of both the feature space and the data space, ensuring that their elements are nonnegative. Thirdly, to ensure the sparsity of the selected features, the sparse transformation matrix is constrained with the l2,1-norm, so that feature selection can obtain accurate discriminative information from these matrices. Finally, NSSRD optimizes its objective function with an iterative, alternating updating rule, enabling it to select representative features quickly and efficiently. (A generic objective of this form is sketched below.)

2) A new method called subspace learning-based graph regularized feature selection (SGFS) is proposed. SGFS is built on the subspace learning framework for feature selection, which exploits the advantages of matrix factorization techniques. On this basis, the algorithm introduces graph regularization to preserve the local structure information of the feature space of the data, which is used to guide the learning of the feature selection matrix. Additionally, the l2,1-norm is used to constrain the feature selection matrix, ensuring the sparsity of the selected features and avoiding trivial solutions. The resulting method provides more accurate discriminative information for feature selection. (A generic objective of this form is also sketched below.)
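To make the structure of such an objective concrete, the following is a rough, generic sketch of a dual-graph regularized "embedding learning plus sparse regression" objective in the spirit of NSSRD. The notation (X, Y, W, the Laplacians L_D and L_F, and the trade-off parameters alpha, beta, lambda) is introduced here purely for illustration; the exact formulation, weighting, and constraints used in the thesis may differ.

\min_{Y \ge 0,\; W}\ \operatorname{tr}\!\left(Y^{\top} L_{D}\, Y\right) \;+\; \lambda\, \operatorname{tr}\!\left(W^{\top} L_{F}\, W\right) \;+\; \beta\left( \left\| X^{\top} W - Y \right\|_{F}^{2} \;+\; \alpha \left\| W \right\|_{2,1} \right)

Here X is the d-by-n data matrix, L_D and L_F are graph Laplacians built over the samples and over the features respectively, Y is the nonnegative low dimensional embedding of the data, and W is the transformation matrix; the l2,1-norm drives entire rows of W toward zero, so features can be ranked by the l2-norms of the corresponding rows of W.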
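Similarly, a minimal sketch of a matrix-factorization-based subspace learning objective with feature-graph regularization in the spirit of SGFS (again with illustrative notation; the thesis's exact objective may differ) is:

\min_{W,\; H}\ \left\| X - X W H \right\|_{F}^{2} \;+\; \alpha\, \operatorname{tr}\!\left(H\, L_{F}\, H^{\top}\right) \;+\; \beta \left\| W \right\|_{2,1}

Here X is the n-by-d data matrix (samples by features), W is the d-by-k feature selection matrix, H is the k-by-d coefficient matrix, and L_F is the Laplacian of the feature graph. The reconstruction term X ≈ XWH realizes the subspace learning idea, the trace term preserves the local structure of the feature space, and the l2,1-norm keeps W row-sparse so that informative features can be read off its rows.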
3) A new method called local discriminative sparse subspace learning for feature selection (LDSSL) is proposed. LDSSL first introduces a local discriminant model into the subspace learning framework for feature selection. This model preserves the local discriminant structure and the local geometric structure of the data simultaneously, which not only improves the discriminative ability of the algorithm but also exploits the local geometric structure information of the data. Because the local discriminant model is linear and cannot handle nonlinear data effectively, the method kernelizes it to obtain a nonlinear version. Finally, the algorithm uses the l1-norm to constrain the feature selection matrix, which ensures the sparsity of that matrix and further improves the discriminative ability of the algorithm.
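As a rough illustration of contribution 3 (with notation introduced here only for the sketch; the thesis's exact local discriminant model, kernelization, and objective may differ), a sparse subspace learning objective with a local discriminant regularizer could take the form:

\min_{W,\; H}\ \left\| X - X W H \right\|_{F}^{2} \;+\; \alpha\, \operatorname{tr}\!\left( (X W)^{\top} \tilde{L}\, (X W) \right) \;+\; \beta \left\| W \right\|_{1}

Here \tilde{L} is a Laplacian-like matrix assembled from local neighborhoods that encodes both local discriminant structure and local geometry, and the elementwise l1-norm makes the feature selection matrix W sparse. A kernel version would replace the linear local discriminant model with one computed from kernel evaluations between samples, giving the nonlinear variant described above.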
Keywords/Search Tags:Graph regularization, nonnegative spectral learning, feature manifold, sparse regression, subspace learning, local discriminant model, sparse constraint, feature selection