
High Sparsity Representation and Applications

Posted on: 2014-12-31
Degree: Ph.D
Type: Thesis
University: The Chinese University of Hong Kong (Hong Kong)
Candidate: Lu, Cewu
Full Text: PDF
GTID: 2458390005995241
Subject: Computer Science
Abstract/Summary:
Sparse representation has achieved great success in many domains. Most sparse representation methods employ the ℓ1-norm for regularization and the ℓ2-norm for data-fitting in the ℓ2 + ℓ1 framework. In this thesis, we discuss a new representation that uses even higher sparsity measures, which we name high-sparsity representation. We present several novel methods that apply this model to challenging computer vision and imaging problems.

For regularization, we study two useful properties of the ℓ0-norm, namely main-contrast preservation and mode selection, which enable tackling problems in principal structure representation and contrast-preserving decolorization. Specifically, exploiting the contrast-preserving property of the ℓ0-norm, we propose a new image editing tool that sharpens major edges by increasing the steepness of transitions while eliminating a manageable degree of low-amplitude structure. This seemingly contradictory effect is achieved in an unconventional optimization framework based on ℓ0 gradient minimization, which globally controls how many non-zero gradients result when approximating prominent structures. We also present a novel pencil-sketch production method that takes advantage of principal structure representation. We then study the mode-selection property of the ℓ0-norm and an optimization approach that aims to maximally preserve the original color contrast. Based on the human visual system, we relax the strict order constraint on color mapping to enlarge the space of possible solutions; this relaxation allows the "contrast sign" to be chosen freely at each pixel. A bimodal distribution derived from ℓ0-norm minimization is therefore designed to constrain spatial pixel differences and allows suitable gray scales to be selected automatically so as to preserve the original contrast.
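The ℓ0 gradient minimization the abstract describes is typically solved by half-quadratic splitting: a hard-thresholding step on an auxiliary gradient variable alternates with an exact global quadratic solve. The sketch below is a minimal one-dimensional illustration of that alternating scheme; the parameter values (`lam`, `kappa`, `beta_max`) and the circular boundary handling are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np

def l0_gradient_min_1d(s, lam=0.02, kappa=2.0, beta_max=1e5):
    """Approximately minimize ||u - s||_2^2 + lam * ||grad(u)||_0 for a 1-D signal s.

    Half-quadratic splitting: an auxiliary variable h stands in for grad(u),
    and the penalty weight beta is raised geometrically to enforce h ~ grad(u).
    """
    n = len(s)
    d = np.zeros(n)
    d[0], d[1] = -1.0, 1.0                  # difference kernel, applied circularly
    Fd = np.fft.fft(d)
    Fs = np.fft.fft(s)
    u = np.asarray(s, dtype=float).copy()
    beta = 2.0 * lam
    while beta < beta_max:
        g = np.real(np.fft.ifft(Fd * np.fft.fft(u)))  # current gradient of u
        h = np.where(g * g > lam / beta, g, 0.0)      # hard threshold: the l0 step
        # quadratic subproblem min_u ||u - s||^2 + beta*||grad(u) - h||^2,
        # solved exactly in the Fourier domain
        num = Fs + beta * np.conj(Fd) * np.fft.fft(h)
        u = np.real(np.fft.ifft(num / (1.0 + beta * np.abs(Fd) ** 2)))
        beta *= kappa
    return u
```

A 2-D image version follows the same recipe with FFTs of horizontal and vertical difference kernels; the hard threshold is what lets the method cap the number of non-zero gradients globally rather than penalizing their magnitudes.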
Both quantitative and qualitative evaluations bear out the effectiveness of the proposed methods.

For high-sparsity representation in data-fitting, we improve dictionary learning. We propose an online robust dictionary learning framework that uses an ℓ1 data-fitting term, which induces higher sparsity than the traditional ℓ2-norm. We develop a detailed algorithm for online learning and show that it produces results of similar quality to the batch robust method. The online dictionary update scheme saves time and memory without sacrificing much accuracy, and our framework can handle many practical problems involving dynamic and large-scale data.
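As a toy illustration of the online robust idea, the sketch below sparse-codes one incoming sample and then takes a single stochastic subgradient step on the ℓ1 fitting term ||x − Da||₁. The ISTA coder, the fixed step size, and the column-norm projection are simplifying assumptions for illustration; the thesis's actual online update scheme is more elaborate.

```python
import numpy as np

def ista_code(D, x, lam=0.1, n_iter=50):
    """Sparse coding min_a 0.5*||x - D a||_2^2 + lam*||a||_1 via ISTA."""
    L = np.linalg.norm(D, 2) ** 2 + 1e-8    # Lipschitz constant of the smooth part
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = a - (D.T @ (D @ a - x)) / L     # gradient step on the quadratic term
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

def robust_dict_step(D, x, a, step=0.1):
    """One online update of D descending the robust l1 fit ||x - D a||_1."""
    r = x - D @ a
    D = D + step * np.outer(np.sign(r), a)  # subgradient descent on the l1 residual
    norms = np.maximum(np.linalg.norm(D, axis=0), 1.0)
    return D / norms                        # project atoms back into the unit ball
```

Streaming over samples, one calls `ista_code` then `robust_dict_step` per sample. Because the fitting term is ℓ1 rather than ℓ2, a grossly corrupted sample moves the dictionary by a bounded amount (only its residual sign pattern enters the update), which is the robustness the abstract refers to.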
Keywords/Search Tags: Representation, Sparsity, Framework