
Spectral Gradient Algorithm For Matrix L2,1-Norm Minimization

Posted on: 2014-07-15
Degree: Master
Type: Thesis
Country: China
Candidate: L H Liu
Full Text: PDF
GTID: 2250330401475074
Subject: Operational Research and Cybernetics
Abstract/Summary:
Multi-task feature learning has recently received increasing attention in machine learning, with applications including medical diagnosis, text classification, and biomedical informatics. Recent results show that it can generally be realized by solving a non-smooth convex minimization problem involving the l2,1-norm. However, solving this optimization problem is challenging due to the non-smoothness of the regularization term. In this thesis, we propose spectral gradient algorithms for solving the matrix l2,1-norm minimization problem. We establish the global convergence of the proposed algorithms and carry out numerical experiments to illustrate their effectiveness.

In Chapter 1, we introduce the background and significance of the l2,1-norm minimization problem, covering the problem formulations, their development, and some existing methods. We present some preliminaries, including line search techniques and spectral gradient methods. Moreover, some important notation and symbols used throughout the thesis are also introduced.

In Chapter 2, we propose a spectral gradient algorithm for matrix l2,1-norm minimization. The proposed algorithm only requires the gradient of the smooth function and the value of the objective function at each step. To improve its performance, a nonmonotone line search technique is incorporated. Under some mild conditions, we show that the proposed algorithm converges globally. Numerical experiments show that it works quite well and performs better than the algorithms SLEP and IAMD MFL.

In Chapter 3, to improve the method given in the previous chapter, we use a finite difference approximation with a parameter h. Under some mild conditions, we show that the resulting algorithm converges globally. Moreover, numerical experiments illustrate that the proposed method is very efficient.

In Chapter 4, we give a summary of this thesis and list some further research topics.
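For orientation, the class of problems described above is typically written as the l2,1-norm regularized program

\min_{W \in \mathbb{R}^{d \times k}} \; f(W) + \rho\,\|W\|_{2,1}, \qquad \|W\|_{2,1} = \sum_{i=1}^{d} \|w^i\|_2,

where f is a smooth loss (for instance, a least-squares multi-task loss), \rho > 0 is the regularization parameter, and w^i is the i-th row of W. This generic form follows the standard multi-task feature learning literature and is not quoted from the thesis itself.

The sketch below illustrates one common way to combine a spectral (Barzilai-Borwein) step length with a nonmonotone line search for such a problem, here in a proximal-gradient form that handles the l2,1 term through row-wise soft-thresholding. All function names, the proximal treatment of the non-smooth term, and the parameter values are illustrative assumptions; the abstract does not specify the thesis's exact update rule.

```python
import numpy as np

def prox_l21(W, tau):
    """Row-wise soft-thresholding: proximal operator of tau * ||W||_{2,1}."""
    row_norms = np.linalg.norm(W, axis=1, keepdims=True)
    shrink = np.maximum(0.0, 1.0 - tau / np.maximum(row_norms, 1e-12))
    return shrink * W

def l21_norm(W):
    """Sum of the Euclidean norms of the rows of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def spectral_gradient_l21(f, grad_f, W0, rho, max_iter=500, tol=1e-6,
                          memory=5, sigma=1e-4, eta=0.5):
    """Barzilai-Borwein (spectral) proximal-gradient iteration with a
    nonmonotone line search that compares the trial objective against
    the largest of the last `memory` objective values (GLL-type rule).
    This is a generic sketch, not the algorithm proposed in the thesis."""
    W = np.asarray(W0, dtype=float).copy()
    g = grad_f(W)
    step = 1.0                                   # initial spectral step length
    history = [f(W) + rho * l21_norm(W)]         # recent objective values
    for _ in range(max_iter):
        F_ref = max(history[-memory:])
        t = step
        while True:                              # backtracking line search
            W_trial = prox_l21(W - t * g, t * rho)
            D = W_trial - W
            F_trial = f(W_trial) + rho * l21_norm(W_trial)
            if F_trial <= F_ref - (sigma / (2.0 * t)) * np.sum(D * D):
                break
            t *= eta                             # shrink the step and retry
        g_trial = grad_f(W_trial)
        S, Y = W_trial - W, g_trial - g
        sy = np.sum(S * Y)
        step = np.sum(S * S) / sy if sy > 1e-12 else 1.0   # BB1 step length
        W, g = W_trial, g_trial
        history.append(F_trial)
        if np.linalg.norm(S) <= tol * max(1.0, np.linalg.norm(W)):
            break
    return W
```

For a least-squares multi-task loss f(W) = 0.5 * ||XW - Y||_F^2, one would pass grad_f(W) = X.T @ (X @ W - Y); the nonmonotone reference value F_ref allows occasional increases in the objective, which is what lets the aggressive Barzilai-Borwein step lengths be accepted without sacrificing global convergence.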
Keywords/Search Tags: multi-task feature learning, non-smooth convex optimization, matrix l2,1-norm, spectral gradient method, nonmonotone line search