
A New High-dimensional Nonconvex Sparse Optimization Algorithm

Posted on: 2021-05-08
Degree: Master
Type: Thesis
Country: China
Candidate: Y Q Liu
Full Text: PDF
GTID: 2518306224994199
Subject: Mathematical Statistics
Abstract/Summary:
Sparse recovery is a fundamental problem in statistics, machine learning, and signal processing. In statistics, sparsity is an important variable selection tool for constructing parsimonious models that are easy to interpret. In signal processing, and especially in compressed sensing, sparsity is a key structural property that can be exploited for data acquisition, transmission, storage, and processing. The sparse recovery problem is usually formulated as a high-dimensional linear regression model, which is mainly solved by adding a penalty term that turns it into either a convex or a nonconvex optimization problem. In this paper, a primal-dual active set algorithm is proposed for a class of nonconvex sparse optimization models.

Specifically, we consider recovering a sparse signal through penalized least squares formulations. A new primal-dual active set algorithm is proposed for a class of nonconvex sparsity-promoting penalties, including the ℓ0, bridge, capped-ℓ1, smoothly clipped absolute deviation (SCAD), and minimax concave penalty (MCP). We first study the existence of a global minimizer of the associated nonconvex sparse optimization model, an issue that has not been thoroughly examined in prior work. Using the thresholding operator associated with each penalty, we then derive a new necessary optimality condition for the global minimizer, prove that solutions of this condition are coordinate-wise minimizers, and show that, under certain conditions, such a solution is also a local minimizer. Finally, we introduce a dual variable and determine the active set jointly from the primal and dual variables. This relationship leads to a class of active set iterative algorithms in which each step first updates only the primal variable on the active set and then updates the dual variable explicitly. Combined with continuation on the regularization parameter, the resulting primal-dual active set with continuation (PDASC) algorithm is proved to converge globally to the underlying regression coefficients under a suitable regularity condition. Extensive numerical experiments on both simulated and real data show that the method is more efficient and accurate than existing sparse recovery methods.
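To make the iteration concrete, the following is a minimal sketch, written here for illustration only, of a primal-dual active set loop with continuation for the ℓ0 penalty. The function name pdasc_l0, the λ-grid, the inner iteration cap, and the stopping rule are our assumptions rather than the thesis's implementation; the other penalties listed above would substitute their own thresholding rules.

```python
import numpy as np

def pdasc_l0(X, y, lam_grid, max_inner=10):
    """Illustrative primal-dual active set loop with continuation for
    l0-penalized least squares: min_x (1/(2n))*||y - X x||^2 + lam*||x||_0.
    Assumes the columns of X are standardized (||X[:, j]||^2 approx n)."""
    n, p = X.shape
    x = np.zeros(p)                        # primal variable (regression coefficients)
    d = X.T @ (y - X @ x) / n              # dual variable: scaled residual correlations
    for lam in lam_grid:                   # continuation: lam decreases, warm-started
        tau = np.sqrt(2.0 * lam)           # hard-threshold level for the l0 penalty
        for _ in range(max_inner):
            A = np.abs(x + d) > tau        # active set read off primal + dual jointly
            x = np.zeros(p)
            if A.any():
                # primal update on the active set only: least squares on X[:, A]
                x[A] = np.linalg.lstsq(X[:, A], y, rcond=None)[0]
            d = X.T @ (y - X @ x) / n      # explicit dual update from the new residual
            if np.array_equal(A, np.abs(x + d) > tau):
                break                      # active set has stabilized at this lam
    return x
```

One reasonable way to call this sketch is with a geometric grid such as np.geomspace(lam_max, 1e-3 * lam_max, 50), swept from large to small so that each subproblem is warm-started from the previous solution; that warm start is what the continuation step contributes.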
The rest of the paper is organized as follows. The introduction presents the background and significance of the work, the state of related research at home and abroad, and the main contents of the thesis. Chapter 1 reviews related concepts and algorithms: basic notions from nonconvex optimization theory, the primal-dual active set (PDAS) algorithm, and two commonly used algorithms for high-dimensional linear regression, namely GIST and GLMNET. Chapter 2 constructs the primal-dual active set algorithm, covering the optimality analysis behind PDAS, the construction of the PDASC algorithm, and the consistency of PDASC. Chapter 3 presents numerical simulations and a case study: simulated and real data are analyzed with the PDASC algorithm and compared with other common methods for high-dimensional nonconvex sparse optimization. The results show that, relative to other algorithms for the high-dimensional linear regression model, PDASC converges quickly and predicts accurately, and is robust to the choice of the concavity parameter and to the correlation parameter of the design matrix. Finally, we summarize the main conclusions of the paper and discuss future directions for research on nonconvex sparse optimization.
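The concavity parameter referred to above is the γ that shapes the SCAD and MCP penalties. As a reference point, and as our own illustration rather than code from the thesis, their coordinate-wise thresholding operators have the following closed forms (applied elementwise to arrays):

```python
import numpy as np

def scad_threshold(z, lam, gamma=3.7):
    """Thresholding operator of the SCAD penalty (requires gamma > 2):
    argmin_x (1/2)*(x - z)**2 + scad(x; lam, gamma)."""
    z = np.asarray(z, dtype=float)
    a = np.abs(z)
    out = np.sign(z) * np.maximum(a - lam, 0.0)   # soft threshold for |z| <= 2*lam
    mid = (a > 2.0 * lam) & (a <= gamma * lam)
    out[mid] = ((gamma - 1.0) * z[mid] - np.sign(z[mid]) * gamma * lam) / (gamma - 2.0)
    out[a > gamma * lam] = z[a > gamma * lam]     # identity for large |z|
    return out

def mcp_threshold(z, lam, gamma=3.0):
    """Thresholding operator of the MCP penalty (requires gamma > 1)."""
    z = np.asarray(z, dtype=float)
    a = np.abs(z)
    out = np.zeros_like(z)                        # zero for |z| <= lam
    mid = (a > lam) & (a <= gamma * lam)
    out[mid] = np.sign(z[mid]) * (a[mid] - lam) / (1.0 - 1.0 / gamma)
    out[a > gamma * lam] = z[a > gamma * lam]     # identity for large |z|
    return out
```

As γ grows, both operators shrink large coefficients less, which is one intuition for why performance can remain stable over a range of concavity parameters.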
Keywords/Search Tags:nonconvex, sparsity, primal-dual active set algorithm, continuation, consistency