
Rank Constraints on Joint Dictionary Learning

Posted on: 2017-01-15
Degree: Master
Type: Thesis
Country: China
Candidate: H H Meng
Full Text: PDF
GTID: 2308330503958927
Subject: Computer Science and Technology
Abstract/Summary:
Sparse coding and low-rank learning are currently among the frontier research topics in machine learning and signal processing. Because the sparse coding model conforms to the way the human visual system processes external information, it has seen unprecedented development in image classification, image denoising, image compression, and signal transmission. Sparse coding finds a sparse representation of a signal over a set of basis vectors (also called a dictionary), reducing the storage cost of image data while enhancing the interpretability of the image. Low-rank learning removes random but large noise from original samples that possess an underlying low-rank structure. This thesis reviews the related techniques and achievements of sparse coding and low-rank learning, and studies their application to image classification and face recognition algorithms.

The Bag-of-Words algorithm represents an image as a word histogram; it does not care whether the current dictionary can accurately reconstruct the original sample, and it ignores the locations of sample points and the class label information. A large body of research has shown that a learned dictionary usually outperforms a fixed dictionary thanks to better reconstruction and discrimination performance.

In this thesis, the philosophical notions of particularity and universality are introduced into dictionary learning. By designing a new dictionary structure that contains both common dictionaries and particular (class-specific) dictionaries, the learned dictionary achieves lower reconstruction error. We place a rank constraint on the data samples in the low-dimensional space; in order to obtain correct common dictionaries, we add a rank function to the common dictionary subspace. Through an equivalent substitution in the objective function, we finally arrive at a rank-constrained joint dictionary learning model.
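The sparse coding step described above — finding a sparse representation of a signal over a dictionary — is commonly posed as an l1-regularized least-squares (lasso) problem. The sketch below illustrates that generic formulation with the ISTA algorithm; it is not the thesis's exact model, and the names `soft_threshold`, `sparse_code`, and all parameter values are hypothetical choices for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(D, x, lam=0.1, n_iter=200):
    """Approximately solve  min_a 0.5*||x - D a||^2 + lam*||a||_1  with ISTA.

    D : (d, k) dictionary with (ideally) unit-norm columns (atoms)
    x : (d,) signal to be sparsely represented
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)           # gradient of the least-squares term
        a = soft_threshold(a - grad / L, lam / L)
    return a
```

With a small penalty `lam`, most coefficients are driven exactly to zero while the few active atoms reconstruct the signal, which is the storage/interpretability benefit the abstract refers to.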
Although the objective function is not a convex optimization problem, it can be split into two convex sub-problems, which we solve alternately to obtain the visual dictionary. We also redesign the classification algorithm to exploit the learned dictionary and produce better classification performance. Finally, experimental results show that our method indeed achieves better classification performance.
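The alternating scheme mentioned above — splitting a non-convex dictionary learning objective into two convex sub-problems solved in turn — can be illustrated with a generic sketch. This is not the thesis's rank-constrained joint model; `dict_learn` and all its parameters are hypothetical, and the two sub-problems here are the standard code update (convex in the codes for a fixed dictionary) and dictionary update (convex in the dictionary for fixed codes).

```python
import numpy as np

def soft_threshold(V, t):
    # Elementwise soft-thresholding (proximal operator of the l1 norm).
    return np.sign(V) * np.maximum(np.abs(V) - t, 0.0)

def dict_learn(X, k=8, lam=0.1, n_outer=10, n_inner=50, seed=0):
    """Generic alternating dictionary learning sketch (not the thesis model).

    X : (d, n) data matrix, one sample per column.
    Returns a dictionary D (d, k) and sparse codes A (k, n).
    """
    rng = np.random.default_rng(seed)
    d, n = X.shape
    D = rng.standard_normal((d, k))
    D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
    A = np.zeros((k, n))
    for _ in range(n_outer):
        # Sub-problem 1: update codes A by ISTA (convex for fixed D).
        L = np.linalg.norm(D, 2) ** 2
        for _ in range(n_inner):
            A = soft_threshold(A - D.T @ (D @ A - X) / L, lam / L)
        # Sub-problem 2: update D by regularized least squares (convex for fixed A),
        # then renormalize atoms to avoid scale drift.
        D = X @ A.T @ np.linalg.pinv(A @ A.T + 1e-6 * np.eye(k))
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-8)
    # Refit the codes once more to the final dictionary.
    L = np.linalg.norm(D, 2) ** 2
    for _ in range(n_inner):
        A = soft_threshold(A - D.T @ (D @ A - X) / L, lam / L)
    return D, A
```

Each sub-problem decreases the shared objective, so the alternation converges to a stationary point even though the joint problem is non-convex; this is the same structure as the two-sub-problem decomposition described in the abstract.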
Keywords/Search Tags: Sparse Coding, Low-Rank Learning, Dictionary Learning, Commonality, Particularity