
Nonlinear Cone-constrained Optimization And Nonconvex Nonsmooth Optimization: Methods And Applications In Management

Posted on: 2020-12-22
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Zhao
Full Text: PDF
GTID: 1368330620959475
Subject: Management Science and Engineering
Abstract/Summary:
Motivated by applications in management in the Big Data era, this thesis considers two classes of problems: Composite Optimization with Composite Cone-constraints (COCC) and NonConvex and Nonsmooth Optimization (NCNO).

For COCC, we introduce a flexible first-order primal-dual algorithm called the Varying Auxiliary Problem Principle (VAPP). Each iteration of VAPP generates a nonlinear approximation to the primal problem of the augmented Lagrangian method. The approximation incorporates both linearization and a variable distance-like function (auxiliary core function). In this way, the primal problem can be decomposed into smaller subproblems, each of which has a closed-form or easily approximated solution; moreover, these subproblems can be solved in parallel. The thesis proves convergence and an O(1/t) average convergence rate for primal suboptimality, feasibility, and dual suboptimality. For the strongly convex case, an o(1/t²) primal error bound and an O(1/t²) convergence rate are also established. A backtracking scheme is discussed for cases where the Lipschitz constants are unknown or not computable.

Additionally, we introduce a stochastic coordinate extension of VAPP for COCC, which we name the Stochastic Primal-Dual Coordinate method with Large step-size (SPDCL). In this method, a block of variables is chosen uniformly at random; applying the linearization and a Bregman-like (core) function to the selected block yields a simple parallel primal-dual decomposition of COCC. We obtain almost sure convergence and an O(N/t) expected convergence rate for this scheme, and also derive a high-probability complexity bound.

For NCNO, we derive a Variable Bregman Stochastic Coordinate Descent (VBSCD) method. We establish its convergence: any accumulation point of the sequence generated by VBSCD is almost surely a critical point. Moreover, we develop a new variational approach on level sets aimed at convergence rate analysis. If the level-set subdifferential error bound holds, we derive a linear rate of convergence for the expected objective values and the expected values of the random variables generated by VBSCD.

Applications of COCC and NCNO to procurement, portfolio selection, and regression are also discussed in this thesis.
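To illustrate the kind of iteration the abstract describes for VAPP, the following is a minimal sketch (not the thesis's exact algorithm) of a linearized augmented-Lagrangian primal-dual step for a hypothetical linearly constrained composite problem, using the Euclidean distance as a special case of the auxiliary core function; all problem data, step sizes, and the l1 nonsmooth term are assumptions for illustration only.

```python
# Sketch: linearized augmented-Lagrangian primal-dual iteration for
#   minimize 0.5*||Ax - b||^2 + reg*||x||_1  subject to  Cx = d
# (hypothetical data; Euclidean distance as the core function).
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
C, d = rng.standard_normal((5, 10)), rng.standard_normal(5)
reg = 0.1    # weight of the nonsmooth l1 part
rho = 1.0    # augmented Lagrangian penalty parameter
# Step size 1/L, with L a Lipschitz constant of the smooth part's gradient.
tau = 1.0 / (np.linalg.norm(A, 2) ** 2 + rho * np.linalg.norm(C, 2) ** 2)

x, y = np.zeros(10), np.zeros(5)
for _ in range(500):
    # Gradient of the smooth part of the augmented Lagrangian at x.
    grad = A.T @ (A @ x - b) + C.T @ (y + rho * (C @ x - d))
    # Linearize the smooth part; the resulting proximal subproblem has a
    # closed-form solution (soft-thresholding for the l1 term).
    z = x - tau * grad
    x = np.sign(z) * np.maximum(np.abs(z) - tau * reg, 0.0)
    # Dual (multiplier) update on the constraint residual.
    y = y + rho * (C @ x - d)

print("constraint residual:", np.linalg.norm(C @ x - d))
```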
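Similarly, the stochastic coordinate methods summarized above (SPDCL, VBSCD) build on randomized block updates with a linearization and a Bregman-like core function on the selected block. The sketch below shows one such randomized block proximal step for an unconstrained composite quadratic; the objective, block partition, and per-block step sizes are hypothetical and only meant to illustrate the block-selection-plus-closed-form-prox pattern.

```python
# Sketch: randomized block proximal step for
#   minimize 0.5*x^T Q x - c^T x + reg*||x||_1
# with uniformly random block selection (Euclidean core function).
import numpy as np

rng = np.random.default_rng(1)
Q = rng.standard_normal((20, 20))
Q = Q.T @ Q + np.eye(20)          # symmetric positive definite quadratic
c = rng.standard_normal(20)
reg = 0.05                         # nonsmooth l1 part
blocks = np.array_split(np.arange(20), 4)   # four coordinate blocks
# Per-block Lipschitz estimates of the partial gradients.
L = np.array([np.linalg.norm(Q[np.ix_(blk, blk)], 2) for blk in blocks])

x = np.zeros(20)
for _ in range(2000):
    i = rng.integers(len(blocks))          # pick one block uniformly at random
    blk = blocks[i]
    grad_blk = Q[blk] @ x - c[blk]         # partial gradient on that block
    step = 1.0 / L[i]
    z = x[blk] - step * grad_blk           # linearized step on the block
    # Closed-form block prox of the l1 term (soft-thresholding).
    x[blk] = np.sign(z) * np.maximum(np.abs(z) - step * reg, 0.0)
```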
Keywords/Search Tags: constrained optimization problem, nonconvex optimization, first-order method, stochastic coordinate method, augmented Lagrangian, level-set subdifferential error bound