
Algorithms for Sparsity-Constrained Optimization

Posted on: 2014-12-05
Degree: Ph.D
Type: Dissertation
University: Carnegie Mellon University
Candidate: Bahmani, Sohail
Full Text: PDF
GTID: 1458390005496232
Subject: Engineering
Abstract/Summary:
Sparsity-constrained optimization has wide applicability in machine learning, statistics, and signal processing problems such as feature selection and Compressed Sensing. A vast body of work has studied sparsity-constrained optimization from theoretical, algorithmic, and application perspectives in the context of sparse estimation in linear models, where the fidelity of the estimate is measured by the squared error. In contrast, relatively little effort has been devoted to sparsity-constrained optimization in cases where nonlinear models are involved or the cost function is not quadratic. We propose a greedy algorithm, Gradient Support Pursuit (GraSP), to approximate sparse minima of cost functions of arbitrary form. We show that if the cost function has "well-behaved" second-order variation over sparse subspaces, our algorithm is guaranteed to produce a sparse vector within a bounded distance of the true sparse optimum. Our approach generalizes known results for quadratic cost functions that arise in sparse linear regression and Compressed Sensing. We also evaluate the performance of GraSP through numerical simulations on synthetic and real data, where the algorithm is employed for sparse logistic regression with and without ℓ2-regularization. Furthermore, we formulate the 1-bit Compressed Sensing problem as a sparsity-constrained optimization with a non-quadratic objective, and show through numerical simulations that a slightly modified GraSP performs better than existing algorithms.

Moreover, we study structured sparsity estimation problems that involve nonlinear statistical models. Several methods have previously been proposed for these problems using convex relaxation techniques. These methods usually require a carefully tuned regularization parameter, a process that is often cumbersome or heuristic. Furthermore, the estimate that these methods produce might not belong to the desired sparsity model, even if it accurately approximates the true parameter. Greedy-type algorithms can therefore be more desirable for estimating structured-sparse parameters. To date, however, greedy methods have mostly been studied for linear statistical models. We study a non-convex Projected Gradient Descent (PGD) algorithm for estimating parameters with structured sparsity. Similar to the requirements for GraSP, if the cost function has proper second-order variation over the structured subspaces, the PGD algorithm converges to the desired minimizer up to an approximation error. As an example, we elaborate on the application of the main results to estimation in Generalized Linear Models.

Furthermore, we study the performance of the PGD algorithm for the ℓp-constrained least squares problems that arise in Compressed Sensing. Relying on the well-known Restricted Isometry Property, we provide convergence guarantees for this algorithm over the entire range 0 ≤ p ≤ 1; these guarantees include and generalize the existing results for the Iterative Hard Thresholding (IHT) algorithm and yield a new accuracy guarantee for the Iterative Soft Thresholding (IST) algorithm as special cases. Our results suggest that within this family of algorithms, as p increases from zero to one, the conditions required to guarantee accuracy become stricter and robustness to noise deteriorates.
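As a concrete illustration of the GraSP iteration described in the first paragraph above, the following NumPy sketch alternates between merging the 2s largest-magnitude gradient coordinates with the current support, approximately minimizing the cost over the merged support, and pruning back to the s largest entries. This is a minimal sketch rather than the dissertation's exact procedure: the inner restricted minimization is approximated here by a few gradient steps, and the names (grad_f, make_logistic_grad) and step sizes are illustrative assumptions.

    import numpy as np

    def grasp(grad_f, n, s, iters=100, inner_iters=20, lr=0.1):
        """Minimal sketch of Gradient Support Pursuit (GraSP)."""
        x = np.zeros(n)
        for _ in range(iters):
            g = grad_f(x)
            # Merge the 2s largest-magnitude gradient coordinates
            # with the support of the current estimate.
            Z = np.argsort(np.abs(g))[-2 * s:]
            T = np.union1d(Z, np.nonzero(x)[0])
            # Approximate the restricted minimization over T by a few
            # gradient steps confined to the merged support.
            b = x.copy()
            for _ in range(inner_iters):
                b[T] -= lr * grad_f(b)[T]
            # Prune: keep only the s largest-magnitude entries.
            x = np.zeros(n)
            keep = np.argsort(np.abs(b))[-s:]
            x[keep] = b[keep]
        return x

    # Illustrative non-quadratic cost: logistic loss on data A
    # with labels y in {0, 1}, as in the sparse logistic
    # regression experiments mentioned in the abstract.
    def make_logistic_grad(A, y):
        def grad_f(x):
            p = 1.0 / (1.0 + np.exp(-A @ x))
            return A.T @ (p - y) / len(y)
        return grad_f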
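The non-convex PGD studied in the last two paragraphs admits an equally short sketch: iterate x ← P(x − η∇f(x)), where P is the projection onto the model set. In the sketch below, hard thresholding gives the s-sparse projection that recovers IHT (the p = 0 case), while the soft-thresholding operator is the proximal map of the ℓ1 penalty underlying IST rather than an exact ℓ1-ball projection; projections for 0 < p < 1 are more involved and omitted. The names and step size are again illustrative assumptions.

    import numpy as np

    def pgd(grad_f, project, x0, iters=200, lr=0.5):
        """Projected gradient descent: x <- P(x - lr * grad f(x))."""
        x = x0.copy()
        for _ in range(iters):
            x = project(x - lr * grad_f(x))
        return x

    def hard_threshold(s):
        """s-sparse projection (p = 0, i.e. the IHT update):
        keep the s largest-magnitude entries, zero the rest."""
        def project(v):
            out = np.zeros_like(v)
            keep = np.argsort(np.abs(v))[-s:]
            out[keep] = v[keep]
            return out
        return project

    def soft_threshold(tau):
        """Soft thresholding at level tau, the operator used by IST
        (proximal form of the l1 penalty)."""
        def project(v):
            return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)
        return project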
Keywords/Search Tags: Algorithm, Sparsity-constrained optimization, Sensing