
On Sparse Learning and Its Application in Multilabel Learning

Posted on: 2011-08-16  Degree: Master  Type: Thesis
Country: China  Candidate: S Xiang  Full Text: PDF
GTID: 2248330338496181  Subject: Computer application technology
Abstract/Summary:
Machine Learning aims to design and implement algorithms that enable computers to improve automatically from empirical data. It is one of the most active research areas in Artificial Intelligence, with extensive applications in Computer Vision, Natural Language Processing, biomedical informatics and bioinformatics. As an important property of data, sparsity simplifies the representation, reduces computational cost and yields interpretable global features. How to exploit sparsity to improve Machine Learning systems is therefore an interesting and important research topic.

In this thesis, we derive different Sparse Learning models from various practical problems. By reviewing the theory of sparse solutions to underdetermined systems of linear equations, we explain why l1-constrained optimization leads to sparsity. We use the LASSO to reduce variance in linear regression, and introduce the Group LASSO, the Tree-structured Group LASSO and Sparse Inverse Covariance Estimation to address group variable selection, tree-structured feature selection and graph-based relation mining, respectively.

We then explore algorithms for the optimization problems underlying Sparse Learning. We adopt first-order black-box methods to avoid the convergence issues of Coordinate Descent and the large time and space costs of second-order methods. To exploit the optimal O(1/√ε) algorithm for smooth convex optimization, we need efficient l1 projection algorithms, for which we consider the Pivot Algorithm and the Zero Finding Algorithm. We prove that the two are equivalent by analyzing their procedures, variables and time complexity, and we propose an Improved Zero Finding Algorithm that retains the O(n) complexity while reducing the implementation effort.

Finally, we apply Sparse Learning to Multilabel Learning. We construct an adaptive neighborhood to avoid the difficulty of choosing both the similarity measure and the neighborhood size that arises in traditional instance-based Multilabel Learning, and we formulate this construction as an optimization problem motivated by sparse representation in Face Recognition. Based on this adaptive neighborhood, we design a weighted-sum algorithm for Multilabel Classification; experiments validate the idea and achieve state-of-the-art results.
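To illustrate the first-order, l1-driven approach summarized above, the following is a minimal FISTA-style sketch for the LASSO, min_w 0.5·||Xw − y||² + λ||w||₁, in which the soft-thresholding step is what produces exactly-zero coefficients. This is a generic textbook variant, not the thesis's implementation; the function names and the toy data are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (sets small entries exactly to zero)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista_lasso(X, y, lam, n_iter=500):
    """Accelerated proximal gradient (FISTA) for 0.5*||Xw - y||^2 + lam*||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    z, t = w.copy(), 1.0
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ z - y)           # gradient of the least-squares term at z
        w_next = soft_threshold(z - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = w_next + ((t - 1.0) / t_next) * (w_next - w)
        w, t = w_next, t_next
    return w

# Toy usage: only 3 of 50 features truly matter; the l1 penalty recovers that sparsity.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
w_true = np.zeros(50)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = fista_lasso(X, y, lam=1.0)
print(np.count_nonzero(np.abs(w_hat) > 1e-6))  # expect a small number of nonzeros
```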
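The l1 projection step mentioned above can be made concrete as well. Below is a minimal sketch of Euclidean projection onto the l1 ball {w : ||w||₁ ≤ z} using the standard sort-and-threshold scheme; it runs in O(n log n) rather than the O(n) expected time of the pivot/zero-finding variants the thesis analyzes, and the function name is illustrative.

```python
import numpy as np

def project_l1_ball(v, z=1.0):
    """Euclidean projection of v onto the l1 ball of radius z (sort-based, O(n log n))."""
    if np.abs(v).sum() <= z:
        return v.copy()                    # already inside the ball
    u = np.sort(np.abs(v))[::-1]           # sorted magnitudes, descending
    cssv = np.cumsum(u)
    k = np.arange(1, len(u) + 1)
    rho = np.nonzero(u * k > cssv - z)[0][-1]
    theta = (cssv[rho] - z) / (rho + 1.0)  # threshold: the root of the piecewise-linear equation
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

# In a projected (accelerated) gradient loop, this replaces soft-thresholding when the
# constrained form  min 0.5*||Xw - y||^2  s.t.  ||w||_1 <= z  is used instead of the penalty.
v = np.array([0.9, -0.5, 0.3, 0.05])
w = project_l1_ball(v, z=1.0)
print(w, np.abs(w).sum())                  # the result lies on the l1 ball of radius 1
```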
Keywords/Search Tags: Machine Learning, Sparse Learning, Optimization, l1 ball projection, Multilabel Learning