
Block, group, and affine regularized sparse coding and dictionary learning

Posted on: 2014-02-05  Degree: Ph.D.  Type: Dissertation
University: University of Florida  Candidate: Chi, Yu-Tseh  Full Text: PDF
GTID: 1458390005995404  Subject: Computer Science
Abstract/Summary:
In this dissertation, I first propose a novel approach for sparse coding that improves upon the sparse representation-based classification (SRC) framework. The proposed framework, affine constrained group sparse coding, extends the current SRC framework to classification problems with multiple inputs. Geometrically, constrained group sparse coding searches for the vector in the convex hull spanned by the input vectors that can best be sparse coded using the given dictionary. The resulting objective function is still convex and can be efficiently optimized using an iterative block-coordinate descent scheme that is guaranteed to converge. Furthermore, I provide a sparse recovery result guaranteeing, at least theoretically, that the classification performance of constrained group sparse coding is at least as good as that of group sparse coding.

While utilizing the affine combination of multiple input test samples can improve the performance of the conventional sparse representation-based classification framework, it is difficult to integrate this approach into a dictionary learning framework. I therefore propose to combine (1) imposing group structure on the data, (2) imposing block structure on the dictionary, and (3) using different regularizer terms to sparsely encode the data. I call this approach block/group (BGSC) or reconstructed block/group (R-BGSC) sparse coding. Incorporating either of them with the novel Intra-block Coherence Suppression Dictionary Learning (ICS-DL) algorithm, which, as the name suggests, suppresses the coherence of atoms within the same block, results in a novel dictionary learning framework.

An important and distinguishing feature of the proposed framework is that all dictionary blocks are trained simultaneously with respect to each data group, while the intra-block coherence is explicitly minimized as an important objective. I provide both empirical evidence and heuristic support for this feature, which can be considered a direct consequence of incorporating both the group structure for the input data and the block structure for the dictionary in the learning process. The optimization problems for both the dictionary learning and the sparse coding can be solved efficiently using block-coordinate descent, and the details of the optimization algorithms are presented.

In both parts of this work, the proposed methods are evaluated on several classification (supervised) and clustering (unsupervised) problems using well-known datasets. Favorable comparisons with state-of-the-art methods demonstrate the viability and validity of the proposed frameworks.
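The abstract does not include an implementation, but the kind of block-coordinate descent it describes can be sketched. The following Python sketch alternates between a sparse code and affine weights constrained to the simplex (the convex hull of the inputs); all names (project_simplex, affine_group_sparse_code, D, A, groups, lam) are hypothetical, and the objective below (a least-squares term plus an l1/l2 group penalty) is an assumption inferred from the geometric description above, not the author's exact formulation.

import numpy as np

def project_simplex(v):
    # Euclidean projection onto the probability simplex,
    # so the affine weights satisfy w >= 0 and sum(w) = 1.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > css - 1)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def group_soft_threshold(x, groups, t):
    # Block soft-thresholding: the proximal operator of
    # t * sum_g ||x_g||_2 over a partition of the coefficients.
    out = np.zeros_like(x)
    for g in groups:
        nrm = np.linalg.norm(x[g])
        if nrm > t:
            out[g] = (1.0 - t / nrm) * x[g]
    return out

def affine_group_sparse_code(D, A, groups, lam=0.1, iters=200):
    # Block-coordinate descent on
    #   min_{x, w}  0.5 * ||A w - D x||^2 + lam * sum_g ||x_g||_2
    #   s.t.        w in the simplex (convex hull of the input columns).
    # D: dictionary (d x k); A: input samples as columns (d x m).
    k, m = D.shape[1], A.shape[1]
    x = np.zeros(k)
    w = np.full(m, 1.0 / m)                 # start at the centroid
    Lx = np.linalg.norm(D, 2) ** 2          # Lipschitz constant, x-block
    Lw = np.linalg.norm(A, 2) ** 2          # Lipschitz constant, w-block
    for _ in range(iters):
        r = D @ x - A @ w
        # proximal gradient step on the sparse code x (group-lasso block)
        x = group_soft_threshold(x - (D.T @ r) / Lx, groups, lam / Lx)
        r = D @ x - A @ w
        # projected gradient step on the affine weights w
        w = project_simplex(w + (A.T @ r) / Lw)
    return x, w

# Example usage on random data (shapes are arbitrary):
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 12))
A = rng.standard_normal((20, 3))
groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
x, w = affine_group_sparse_code(D, A, groups, lam=0.5)

Under the same caveat, the intra-block coherence that ICS-DL suppresses can be read as the off-diagonal Gram-matrix energy within each dictionary block; a hypothetical measure of that quantity:

def intra_block_coherence(D, blocks):
    # Off-diagonal Gram energy within each dictionary block:
    # sum over blocks of ||D_b^T D_b||_F^2 minus its diagonal part.
    total = 0.0
    for b in blocks:
        G = D[:, b].T @ D[:, b]
        total += np.sum(G ** 2) - np.sum(np.diag(G) ** 2)
    return total

In an ICS-DL-style dictionary update, the gradient of such a penalty would be added to the data-fitting gradient when each block is updated, though the dissertation's exact algorithm is not reproduced here.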
Keywords/Search Tags: Sparse coding, Dictionary learning, Framework, Block, Classification, Using, Affine, Data