
Piecewise-linear Homotopy Path Tracking Methods And Decomposition Techniques For Large-scale Quadratic Programming And Sparse Optimization

Posted on: 2020-07-04    Degree: Doctor    Type: Dissertation
Country: China    Candidate: G Q Wang
GTID: 1360330575956995    Subject: Computational Mathematics
Abstract/Summary:
Quadratic programming and sparse optimization are two important and fundamental classes of optimization models with wide applications in science, engineering, and economics. Their numerical solution is an important research topic in numerical algebra and optimization. Although rich results have been obtained on this topic, advances in computer hardware, data collection, and information technology, and especially the rapid development and popularization of machine learning and big data technology, mean that the scale of practical problems keeps growing. Hence, the study of efficient algorithms for large-scale quadratic programming and sparse optimization remains a challenging and active topic. This thesis conducts a thorough and systematic study of piecewise-linear homotopy path tracking methods, decomposition techniques, warm-start techniques, and proximal point methods for large-scale quadratic programming and sparse optimization, and presents several efficient algorithms for these problems.

In Chapter 1, we review the background and current status of numerical algorithms for quadratic programming and sparse optimization.

In Chapter 2, based on the parametric active set method, we present an APG-based warm-start technique, an ε-precision verification and correction technique, and a fast Cholesky update technique. With these techniques, we propose a new piecewise-linear homotopy path tracking method. Numerical experiments demonstrate that this method is much more efficient than state-of-the-art algorithms and is robust on ill-conditioned problems.

We prove that the proximal point method for nonconvex box-constrained quadratic programming problems Q-linearly converges to a local minimizer with probability 1, and give an estimate of the linear convergence factor (a schematic formulation of the proximal point subproblem is sketched below). Based on this result, we propose an accelerated proximal point (APP) method. Numerical results show that the APP method yields a clear acceleration effect. By combining the piecewise-linear homotopy path tracking method with the APP method, we propose an accelerated proximal point homotopy (APP-Hom) method for nonconvex box-constrained quadratic programming problems. Numerical experiments show that APP-Hom is much more efficient than state-of-the-art algorithms.

Since the piecewise-linear homotopy path tracking method is not as efficient as expected when extended to general quadratic programming problems, in Chapter 3 we present a proximal augmented Lagrangian homotopy (PAL-Hom) method for general convex quadratic programming. This method uses the piecewise-linear homotopy path tracking method to solve the proximal augmented Lagrangian subproblems (a generic form of these subproblems is sketched below), thereby combining the advantages of the piecewise-linear homotopy path tracking method and the augmented Lagrangian method. Numerical results demonstrate that PAL-Hom outperforms interior-point methods on problems with a dense Hessian Q, a small number of constraints, and solutions with few free variables; moreover, compared to the active set method and the parametric active set method, it is more efficient and robust.
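For reference, a generic statement of the box-constrained quadratic program of Chapter 2 and of one proximal point iteration is given below. This is a minimal sketch: the symbols Q, c, l, u and the proximal parameter σ are generic notation introduced here, not necessarily the thesis's.

\[
\min_{l \le x \le u}\ \tfrac{1}{2}\,x^{\top}Qx + c^{\top}x,
\qquad
x^{k+1} \in \operatorname*{arg\,min}_{l \le x \le u}\ \tfrac{1}{2}\,x^{\top}Qx + c^{\top}x + \tfrac{\sigma}{2}\,\|x - x^{k}\|^{2}.
\]

When σ exceeds the magnitude of the most negative eigenvalue of Q, each subproblem is strongly convex, so it can be handled by a piecewise-linear homotopy path tracking (parametric active set) solver even though the outer problem is nonconvex.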
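Likewise, a generic proximal augmented Lagrangian subproblem for a convex quadratic program, as used conceptually in Chapter 3, can be written as follows. The equality-constrained form and the parameters ρ, σ, λ^k are illustrative assumptions; the thesis's exact problem format may differ.

\[
\min_{x}\ \tfrac{1}{2}\,x^{\top}Qx + c^{\top}x \quad \text{s.t.}\ \ Ax=b,\ \ l \le x \le u,
\]
\[
x^{k+1} \in \operatorname*{arg\,min}_{l \le x \le u}\ \tfrac{1}{2}\,x^{\top}Qx + c^{\top}x - (\lambda^{k})^{\top}(Ax-b) + \tfrac{\rho}{2}\,\|Ax-b\|^{2} + \tfrac{\sigma}{2}\,\|x-x^{k}\|^{2},
\qquad
\lambda^{k+1} = \lambda^{k} - \rho\,(Ax^{k+1}-b).
\]

Each subproblem is a strongly convex box-constrained QP, which is exactly the setting where the piecewise-linear homotopy path tracking method, warm-started from the previous iterate, is effective.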
In Chapter 4, to solve large-scale quadratic programming problems arising from support vector machines (SVM), we propose an efficient and globally convergent proximal stochastic block coordinate minimization augmented Lagrangian homotopy (PSBCM-ALH) method (an illustrative decomposition sketch is given below). Based on an efficient heuristic update strategy for the block coordinates, this method decomposes the large-scale problem into a sequence of small-scale strongly convex quadratic programming subproblems and solves them by augmented Lagrangian homotopy methods. Moreover, by exploiting the sparsity of the solution and the similarity of the subproblems, we design a shrinking technique and an adaptive parameter updating technique to improve the efficiency and robustness of the algorithm. We prove that the complexity of PSBCM-ALH is linear in n for training linear SVM and quadratic for nonlinear SVM. Numerical experiments show that for training large-scale linear and nonlinear SVM, our method is much more efficient than the well-known LIBSVM package; moreover, for training large-scale linear SVM, our method outperforms the state-of-the-art linear SVM trainer LIBLINEAR.

Chapter 5 is devoted to designing efficient algorithms for large-scale sparse optimization. We present a proximal block coordinate minimization ℓ1-homotopy (PBCM-ℓ1-Hom) method for large-scale LASSO problems and a proximal block coordinate difference-of-convex ℓ1-homotopy (PBCDCA-ℓ1-Hom) method for large-scale ℓ1-2 minimization problems. These two methods exploit the sparsity of the solution and the separable structure of the problem, and only need to solve a small-scale strictly convex ℓ1-regularized subproblem at each iteration (see the coordinate-wise and DC sketches below). We prove that, with a well-designed block coordinate updating strategy, they converge to the optimal solution of LASSO and to a KKT point of the ℓ1-2 minimization problem, respectively. In addition, we adopt the shrinking technique and the parameter updating technique to improve the efficiency of the algorithms. Numerical results show that our methods are more time- and space-efficient than the state-of-the-art algorithms.
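To make Chapter 4's decomposition idea concrete, the toy Python sketch below performs block coordinate minimization on the bias-free SVM dual (box constraints only). The block-selection rule, the projected-gradient inner solver, and all names in the code are illustrative assumptions and not the PSBCM-ALH algorithm itself, which uses augmented Lagrangian homotopy subproblem solvers, shrinking, and adaptive parameter updates.

```python
import numpy as np

def svm_dual_block_cd(K, y, C=1.0, block_size=10, max_iter=200, tol=1e-6):
    """Toy block coordinate minimization for the bias-free SVM dual
        min_a 0.5 * a^T Qbar a - e^T a   s.t.  0 <= a <= C,
    with Qbar = diag(y) K diag(y).  Blocks are picked by the largest
    projected-gradient (KKT) violation; each small subproblem is solved
    by a few projected-gradient steps (illustrative only)."""
    n = len(y)
    Qbar = K * np.outer(y, y)
    a = np.zeros(n)
    grad = -np.ones(n)                       # gradient of the dual at a = 0
    for _ in range(max_iter):
        # per-coordinate projected gradient: zero where a bound is active
        pg = np.where((a <= 0.0) & (grad > 0.0), 0.0,
             np.where((a >= C) & (grad < 0.0), 0.0, grad))
        if np.max(np.abs(pg)) < tol:
            break                            # approximate KKT point reached
        B = np.argsort(-np.abs(pg))[:block_size]          # working set
        L = max(np.linalg.eigvalsh(Qbar[np.ix_(B, B)]).max(), 1e-12)
        for _ in range(50):                  # inner projected-gradient solver
            a[B] = np.clip(a[B] - grad[B] / L, 0.0, C)
            grad = Qbar @ a - 1.0            # full gradient refresh (toy scale)
    return a

# toy usage: a small linearly separable problem with a linear kernel
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
alpha = svm_dual_block_cd(X @ X.T, y, C=1.0)
```

The working set here is chosen by the largest KKT violation, a simple stand-in for the thesis's heuristic block update strategy.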
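For Chapter 5, the following sketch illustrates the simplest instance of the block idea for LASSO: single-coordinate blocks, for which the ℓ1-regularized subproblem has the closed-form soft-thresholding solution. It is a minimal stand-in for PBCM-ℓ1-Hom under that assumption; the block selection, shrinking, homotopy, and parameter updates of the thesis are not reproduced, and all names are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Closed-form solution of min_x 0.5*(x - z)^2 + t*|x|."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_coordinate_descent(A, b, lam, max_sweeps=100, tol=1e-8):
    """Toy coordinate minimization for LASSO,
        min_x 0.5*||A x - b||^2 + lam*||x||_1,
    using single-coordinate blocks with exact soft-thresholding updates."""
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                             # residual b - A x
    col_sq = np.einsum('ij,ij->j', A, A)      # ||A_j||^2 per column
    for _ in range(max_sweeps):
        max_change = 0.0
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            old = x[j]
            # partial residual excluding coordinate j, then exact 1-D solve
            z = A[:, j] @ r + col_sq[j] * old
            x[j] = soft_threshold(z, lam) / col_sq[j]
            if x[j] != old:
                r -= A[:, j] * (x[j] - old)   # keep the residual consistent
                max_change = max(max_change, abs(x[j] - old))
        if max_change < tol:
            break
    return x
```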
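Finally, a generic difference-of-convex (DC) step for the ℓ1-2 model shows why each iteration again reduces to a convex ℓ1-regularized subproblem; the least-squares data term and the parameter λ are illustrative assumptions, and the thesis's exact DC splitting may differ.

\[
\min_{x}\ \tfrac{1}{2}\|Ax-b\|^{2} + \lambda\bigl(\|x\|_{1} - \|x\|_{2}\bigr),
\qquad
x^{k+1} \in \operatorname*{arg\,min}_{x}\ \tfrac{1}{2}\|Ax-b\|^{2} + \lambda\|x\|_{1} - \lambda\Bigl\langle \tfrac{x^{k}}{\|x^{k}\|_{2}},\, x \Bigr\rangle \quad (x^{k}\neq 0),
\]

where the linear term comes from a subgradient of the concave part \(-\lambda\|x\|_{2}\); the resulting subproblem becomes strictly convex once a proximal term is added and is exactly the kind of ℓ1-regularized problem a homotopy solver handles.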
Keywords/Search Tags: quadratic programming, homotopy method, support vector machine, sparse optimization, LASSO