
Splitting Algorithms for Convex Optimization and Applications to Sparse Matrix Factorization

Posted on: 2014-05-06
Degree: M.S
Type: Thesis
University: University of California, Los Angeles
Candidate: Rong, Rong
GTID: 2458390005994092
Subject: Engineering
Abstract/Summary:
Several important applications in machine learning, data mining, and signal and image processing can be formulated as the problem of factoring a large data matrix into a product of sparse matrices. Sparse matrix factorization problems are usually solved via alternating convex optimization: each iteration requires solving a large convex optimization problem with non-differentiable cost and constraint functions, which is typically handled by a block coordinate descent algorithm. In this thesis, we investigate first-order methods based on forward-backward splitting and Douglas-Rachford splitting as alternatives to block coordinate descent. We describe efficient methods for evaluating the proximal operators and resolvents needed in the splitting algorithms. We discuss two applications in detail: structured sparse principal component analysis and sparse dictionary learning. For these two applications, we compare the splitting algorithms with block coordinate descent on synthetic data and benchmark data sets. Experimental results show that several of the splitting methods, in particular Tseng's modified forward-backward method and the Chambolle-Pock splitting method, are often faster and more accurate than the block coordinate descent algorithm.
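To illustrate the kind of first-order splitting method the abstract refers to, the following is a minimal sketch of forward-backward splitting (proximal gradient iteration) applied to an l1-regularized least-squares problem, the sparse-coding subproblem that arises in dictionary learning when the dictionary is held fixed. This is a generic textbook instance, not the thesis's actual implementation; the function names, step-size rule, and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, num_iters=500):
    """Forward-backward splitting for
        minimize 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Alternates a gradient (forward) step on the smooth term with a
    proximal (backward) step on the non-differentiable l1 term."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L                         # fixed step size (assumed choice)
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)           # forward step: gradient of 0.5*||Ax-b||^2
        x = soft_threshold(x - step * grad, step * lam)  # backward step: prox of lam*||.||_1
    return x

# Example usage on synthetic data:
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
b = rng.standard_normal(100)
x_hat = forward_backward(A, b, lam=0.1)
```

Because the l1 proximal operator separates across coordinates, each iteration costs only a matrix-vector product plus an elementwise shrinkage, which is what makes splitting methods attractive for the large subproblems described above.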
Keywords/Search Tags:Splitting, Block coordinate descent algorithm, Applications, Convex optimization, Sparse, Matrix, Data