
Dictionary Learning Based On Bayesian Inference And Its Application In Image Denoising

Posted on: 2020-03-09    Degree: Master    Type: Thesis
Country: China    Candidate: M Y Li    Full Text: PDF
GTID: 2428330623456430    Subject: Computer technology
Abstract/Summary:
Sparse representation based on over-complete dictionaries is a research hotspot in computer vision and machine learning. Classical dictionary learning algorithms such as K-SVD have been widely used, but they are algebraic methods and generally cannot quantify the confidence of the learned dictionary or of the representation coefficients. Probabilistic dictionary learning methods were proposed to address this limitation, but existing probabilistic methods are mainly designed for Gaussian noise. Moreover, much real-world data, such as images and videos, naturally takes matrix or tensor form. For such data, traditional dictionary learning first vectorizes the tensor data and then applies an existing dictionary learning algorithm; in the course of this vectorization, the structural information of higher-order data is lost or destroyed.

Building on this observation, this thesis studies vector and tensor dictionary learning algorithms under Laplace noise within a probabilistic framework. The work covers two aspects:

(1) A vector dictionary learning method based on Bayesian inference is proposed for Laplace noise. Because the Laplace likelihood is computationally inconvenient, the Laplace distribution is expressed as a superposition of infinitely many Gaussian distributions, with the mixture weights controlled by an additional latent variable. All variables in the probabilistic model are learned by variational inference. Experimental results show that the method can remove not only salt-and-pepper noise but also mixed noise.

(2) The above method is extended to tensor data, yielding a Bayesian tensor dictionary learning method for Laplace noise. This method avoids the vectorization step and directly learns a dictionary for each mode of the tensor. During model learning, the L1 density in the Laplace distribution is again replaced by a superposition of infinitely many Gaussians whose weights are controlled by latent variables, and the random variables in the model are obtained by variational inference. Experiments show that the method can remove salt-and-pepper noise as well as mixed Gaussian and salt-and-pepper noise.
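The replacement of the Laplace noise model by an infinite superposition of Gaussians relies on a standard scale-mixture identity: a Laplace variable can be generated by first drawing its variance from an exponential mixing density and then drawing a Gaussian sample with that variance. The short Python sketch below is not code from the thesis; the scale value b and the sample size are illustrative assumptions used only to check the identity numerically.

```python
import numpy as np
from scipy import stats

# Gaussian scale-mixture view of the Laplace distribution (a standard identity,
# not the thesis code): if the variance tau ~ Exponential with mean 2*b^2
# and x | tau ~ N(0, tau), then marginally x ~ Laplace(0, b).
rng = np.random.default_rng(0)
b = 1.5          # Laplace scale parameter (illustrative value)
n = 200_000      # number of Monte Carlo samples

tau = rng.exponential(scale=2 * b**2, size=n)    # variance from the mixing density
x = rng.normal(loc=0.0, scale=np.sqrt(tau))      # Gaussian draw given that variance

# Compare the empirical density of the mixture samples with the exact Laplace pdf.
hist, edges = np.histogram(x, bins=200, range=(-8, 8), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
laplace_pdf = stats.laplace(loc=0.0, scale=b).pdf(centers)
print("max |empirical - Laplace pdf|:", np.abs(hist - laplace_pdf).max())
```

In the Bayesian models described above, this identity is what lets the heavy-tailed Laplace likelihood be handled with conjugate Gaussian computations inside variational inference, with the per-sample variances playing the role of the additional latent variables.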
Keywords/Search Tags:dictionary learning, salt and pepper noise, beta process, tensor dictionary learning, Laplace distribution