
Study On The Splitting Algorithms For Separable Convex Programming And Their Applications

Posted on: 2013-03-25    Degree: Doctor    Type: Dissertation
Country: China    Candidate: M Tao    Full Text: PDF
GTID: 1360330473459265    Subject: Computational Mathematics
Abstract/Summary:
Many applications arising in various areas can be captured by the task of solving a large-scale separable convex minimization problem, including compressive sensing, image processing, machine learning, and multistage stochastic programming. Many renowned researchers, such as Terence Tao, Donald Goldfarb, and Emmanuel Candès, have shown great interest in it. Because the objective function is expressed as the sum of m operators (m ≥ 3), classical splitting algorithms, such as the forward-backward algorithm and the Douglas-Rachford algorithm, cannot handle it directly whenever the number of individual operators is greater than two. The difficulty of high dimensionality also excludes direct applications of second-order algorithms and of state-of-the-art yet general-purpose solvers such as SeDuMi and SDPT3. Therefore, we design customized splitting algorithms for large-scale separable convex problems. By exploiting the special structure and certain partial relaxation techniques, we can alleviate the hardness brought by high dimensionality.

In Chapter 1, we introduce the separable convex minimization problem with linear constraints, together with its motivation. To simplify the subsequent analysis of different splitting algorithms, we list some basic concepts, variational characterizations, and crucial inequalities.

In Chapter 2, we introduce the background of designing splitting algorithms and the motivating problem. Principal Component Analysis (PCA) is a popular tool for data analysis, but classical PCA breaks down even if a single entry of the observed data is corrupted by outliers. Therefore, Candès et al. proposed the Robust Principal Component Analysis (RPCA) model to accomplish the task of recovering the low-rank and sparse components (sparse but large in magnitude) of a data matrix. They verified that the principal component (the low-rank part) can be successfully recovered under certain conditions.
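The RPCA model of Candès et al. is a convex program with two operators in the objective; a standard formulation (with λ > 0 a balancing parameter) reads:

```latex
% RPCA: split an observed matrix M into a low-rank part L and a
% sparse part S, using the nuclear norm and the entrywise l1 norm
% as convex surrogates for rank and sparsity.
\min_{L,\,S}\; \|L\|_{*} + \lambda \|S\|_{1}
\quad \text{s.t.} \quad L + S = M
```

Here ‖·‖_* denotes the nuclear norm (sum of singular values) and ‖·‖_1 the entrywise ℓ1 norm; the particular symbols L, S, M, λ are the conventional choices in the RPCA literature, not notation fixed by this abstract.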
However, because of experimental budget or reachability, often only a fraction of the entries can be observed. Moreover, the observed data may be corrupted by both impulsive noise (sparse but large) and Gaussian noise (small but dense). To handle this more general case, we put forward a more general model for recovering the low-rank and sparse components of matrices from incomplete and noisy observations, which captures more concrete applications in reality. This Extended Robust Principal Component Analysis (ERPCA) model can easily be formulated as a separable convex program with three operators in the objective function and a linear linking constraint, and it falls squarely within the applicable scope of the classical augmented Lagrangian method (ALM). Since the alternating direction method (ADM) is an efficient algorithm for separable convex programming, it has been widely used in many concrete applications, and the global convergence of the classical ADM is well established. The direct extension of ADM to separable convex programs with multiple operators is referred to as the alternating splitting augmented Lagrangian method (ASALM). The numerical performance of ASALM for solving ERPCA is very attractive; however, since the convergence of ASALM is still open, we propose another splitting algorithm, referred to as the variant alternating splitting augmented Lagrangian method (VASALM). When applied to the ERPCA problem, this algorithm makes full use of the nice structure so that each subproblem admits a closed-form solution. In addition, the resulting recovered components are very accurate, so the recovery model is justified empirically.

In Chapter 3, we propose a novel splitting method for solving a separable convex minimization problem with linear constraints, where the objective function is expressed as the sum of many individual functions without coupled variables.
The new method is suitable for exploiting the properties of these individual functions separately, resulting in subproblems that have closed-form solutions. The algorithm can be regarded as an extension of the VASALM proposed in Chapter 2: when the number of operators in the objective function is three, it reduces to VASALM. Moreover, an improvement of the new method over some existing splitting methods is that no correction step is required. We verify these advantages numerically on particular applications in image processing and matrix decomposition.

In Chapter 4, we show that the straightforward extension of ADM remains valid for the general case of m ≥ 3 if a Gaussian back substitution procedure is appended. The resulting ADM with Gaussian back substitution is a novel approach towards extending ADM from m = 2 to m ≥ 3, and its algorithmic framework is new in the literature. We prove its convergence via the analytic framework of contraction-type methods and demonstrate its numerical efficiency on several application problems.
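To illustrate the kind of splitting scheme discussed above, the sketch below runs the direct extension of ADM (an ASALM-style Gauss-Seidel sweep with no correction step) on a toy three-block problem. The quadratic objective, the constraint, and the penalty parameter β = 1 are illustrative choices made here, not taken from the thesis or from the ERPCA model.

```python
# Direct extension of the alternating direction method (ADM) to a
# toy three-block separable problem (illustrative only):
#
#   minimize   (x1^2 + x2^2 + x3^2) / 2
#   subject to  x1 + x2 + x3 = 3        (optimum: x1 = x2 = x3 = 1)
#
# Augmented Lagrangian:
#   L(x, lam) = sum_i xi^2 / 2 - lam * (x1 + x2 + x3 - 3)
#               + (beta / 2) * (x1 + x2 + x3 - 3)^2
# Each block minimization has a closed-form solution, which is the
# structural property the splitting methods above are designed to exploit.

def direct_admm_3block(beta=1.0, iters=200):
    x1 = x2 = x3 = 0.0
    lam = 0.0
    for _ in range(iters):
        # Gauss-Seidel sweep: minimize over each block with the
        # other blocks fixed at their most recent values.
        x1 = (lam + beta * (3 - x2 - x3)) / (1 + beta)
        x2 = (lam + beta * (3 - x1 - x3)) / (1 + beta)
        x3 = (lam + beta * (3 - x1 - x2)) / (1 + beta)
        # Multiplier update using the current constraint residual.
        lam -= beta * (x1 + x2 + x3 - 3)
    return x1, x2, x3, lam

x1, x2, x3, lam = direct_admm_3block()
print(x1, x2, x3, lam)  # all four values approach 1.0
```

On this strongly convex toy problem the sweep converges; the point stressed in the abstract is that for general convex objectives with m ≥ 3 blocks the convergence of this direct extension is open, which is exactly what motivates VASALM (Chapters 2-3) and the Gaussian back substitution correction (Chapter 4).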
Keywords/Search Tags: matrix recovery, principal component analysis, sparse, low rank, alternating direction method, augmented Lagrangian method, convex programming, linear constraint, separable structure, contraction method, operator splitting methods, image processing