
Research Of Dictionary Learning Algorithm On Massive Remote Sensing Images Of Long Time Sequences

Posted on: 2015-11-09    Degree: Master    Type: Thesis
Country: China    Candidate: J Wen    Full Text: PDF
GTID: 2298330452453384    Subject: Information and Communication Engineering
Abstract/Summary:
In recent years, sparse representation of images has become a research hotspot in image processing. Work in this area focuses on the design of dictionary learning methods, on effective and fast sparse representation algorithms, and on their application to image processing. Transforming image information into the sparse domain greatly simplifies subsequent image analysis and processing and helps advance the field, so it is of great importance in both theory and application.

Supported by the "One-Three-Five" planning project of the Chinese Academy of Sciences, this thesis studies dictionary learning algorithms for massive remote sensing images with long time sequences, so that this kind of data can be represented more sparsely and efficiently, thereby promoting research on sparse image representation.

Firstly, the reasons why traditional dictionary learning algorithms are not suited to massive data are analyzed, and a dictionary learning algorithm that trains on batches of samples is proposed by combining the idea of incremental learning from the classification field with the classical K-SVD dictionary learning algorithm. The algorithm regards each image as a small sample set, selectively trains a certain number of atoms from each new batch of samples added to the training process, and adds those new atoms to the current dictionary. The atomic characteristics of the dictionary therefore continue to extend as samples are added, and the dictionary can represent the current samples efficiently and sparsely without degrading the sparse representation of the earlier samples, which makes dictionary training on massive sample sets feasible.

Secondly, a dictionary atom initialization method based on information entropy theory is proposed. Setting the initial values is the first step of the dictionary learning process. The thesis judges the difference in the distribution of the sparse coefficients by calculating the entropy of each column of the sparse coefficient matrix. The structures corresponding to coefficient columns with larger entropy are not easy to represent sparsely and are therefore used as the initial values of new atoms, which makes the trained atomic structures match the structure of the current samples more closely and enriches the information in the atom library.

Then, the correlation within the dictionary is removed. Although many algorithms for designing incoherent dictionaries have been studied, this thesis proposes a model that dynamically eliminates the correlation of atoms during dictionary learning. The model introduces a coherence threshold as a criterion in the learning process: for each added atom combination, it judges the effect on the coherence of the dictionary. If the mutual coherence of the dictionary exceeds the threshold, iterative projection is first applied to the added atom combination to ensure that the incoherence constraint is satisfied; in a second step, the atom combination is updated by rotation with respect to the objective function without affecting its mutual coherence. The model thereby constrains the mutual coherence while remaining well adapted to the training samples.

Finally, the proposed algorithm is compared with two state-of-the-art dictionary learning algorithms on long time sequence data. Experimental results are reported for all three algorithms on a massive data set of remote sensing images with long time sequences, taken from the Landsat satellite. The experiments show that the dictionary trained by the proposed algorithm represents the test data set more sparsely than the dictionaries trained by the other two algorithms.
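To make the batch-wise training step concrete, the following is a minimal sketch in NumPy/scikit-learn of how a dictionary might be extended from each new batch of samples. The function name extend_dictionary, the atom-selection rule (keeping the worst-reconstructed samples as new atoms), and all parameter values are illustrative assumptions, not the algorithm as specified in the thesis.

    import numpy as np
    from sklearn.linear_model import orthogonal_mp

    def extend_dictionary(D, batch, n_new_atoms=8, sparsity=5):
        """Sparse-code a new batch with the current dictionary D (atoms in
        columns, unit norm) and append atoms built from the worst-represented
        samples, leaving the existing atoms untouched."""
        # Sparse coefficients of the batch under the current dictionary
        X = orthogonal_mp(D, batch, n_nonzero_coefs=sparsity)
        residual = batch - D @ X
        # Pick the samples the current dictionary reconstructs worst
        err = np.linalg.norm(residual, axis=0)
        worst = np.argsort(err)[-n_new_atoms:]
        new_atoms = batch[:, worst]
        new_atoms = new_atoms / np.linalg.norm(new_atoms, axis=0, keepdims=True)
        return np.hstack([D, new_atoms])

    # Toy usage: 64-dimensional patches, starting from a random dictionary
    rng = np.random.default_rng(0)
    D = rng.standard_normal((64, 128))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    for _ in range(3):                       # three "image" batches
        batch = rng.standard_normal((64, 500))
        D = extend_dictionary(D, batch)
    print(D.shape)                           # the dictionary grows with each batch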
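The entropy-based initialization can be sketched in the same spirit: compute the entropy of each column of the sparse coefficient matrix and use the samples with the largest entropy (those the current dictionary represents least sparsely) as initial values for new atoms. The helper entropy_init below is a hypothetical illustration of that idea, not the thesis's implementation.

    import numpy as np

    def entropy_init(samples, coeffs, n_new_atoms=8, eps=1e-12):
        """Pick initial values for new atoms from the samples whose sparse
        coefficient columns have the largest entropy, i.e. whose energy is
        spread over many atoms instead of being concentrated in a few."""
        p = np.abs(coeffs) + eps
        p = p / p.sum(axis=0, keepdims=True)        # per-column distribution
        H = -(p * np.log(p)).sum(axis=0)            # entropy of each coefficient column
        hardest = np.argsort(H)[-n_new_atoms:]      # least sparsely represented samples
        init = samples[:, hardest].copy()
        init /= np.linalg.norm(init, axis=0, keepdims=True)
        return init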
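For the incoherence constraint, one common way to realize an iterative projection is to shrink the off-diagonal entries of the Gram matrix toward the coherence threshold and factor the result back into a dictionary. The sketch below follows that generic recipe and omits the rotation update described in the thesis; mutual_coherence, project_incoherent, and the eigen-decomposition route are assumptions for illustration only.

    import numpy as np

    def mutual_coherence(D):
        """Largest absolute inner product between two distinct unit-norm atoms."""
        G = D.T @ D
        return np.max(np.abs(G - np.diag(np.diag(G))))

    def project_incoherent(D, mu0, n_iter=20):
        """Iteratively clip off-diagonal Gram entries above the coherence
        threshold mu0, then map the Gram matrix back to a d x k dictionary
        (d <= k) via its eigen-decomposition."""
        d, k = D.shape
        for _ in range(n_iter):
            G = D.T @ D
            G = np.clip(G, -mu0, mu0)               # limit atom correlations
            np.fill_diagonal(G, 1.0)                # keep atoms unit norm
            w, V = np.linalg.eigh(G)                # eigenvalues in ascending order
            w = np.clip(w[-d:], 0.0, None)          # rank-d approximation of G
            D = np.diag(np.sqrt(w)) @ V[:, -d:].T
            D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12
        return D

In a full pipeline this projection would only be triggered when mutual_coherence of the extended dictionary exceeds the chosen threshold, mirroring the thresholded check described in the abstract.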
Keywords/Search Tags: sparse representation, dictionary learning, K-SVD, dictionary initialization, mutual coherence