
Research And Application On Sparse Deep Models Based On Autoencoders

Posted on: 2018-02-09
Degree: Master
Type: Thesis
Country: China
Candidate: C H Zhang
Full Text: PDF
GTID: 2348330518498647
Subject: Computer application technology
Abstract/Summary:
Compared with traditional shallow linear networks, deep networks with nonlinear activation functions possess stronger representation and approximation ability, and can learn hierarchically effective feature representations for highly nonlinear, highly complex functions. With well-designed pre-training and fine-tuning techniques, deep learning algorithms have largely overcome the difficulties of the training process, giving deep models strong generalization ability; they are therefore widely used in image processing, computer vision, text, multimedia, and other fields.

As an unsupervised learning model, the autoencoder is often employed to pre-train deep neural networks and obtain good initial values in parameter space, so as to avoid the local minima that the underlying non-convex optimization may fall into and the vanishing gradients of back-propagation. However, for image processing and computer vision tasks, the autoencoder and its variants have not incorporated the statistical characteristics and domain knowledge of the training data, such as the sparsity of natural images, into the design of deep neural networks. Moreover, since the autoencoder usually serves only as a pre-training method for deep models, the wealth of hierarchical features at different levels of abstraction obtained during pre-training is discarded, wasting computing and storage resources.

This thesis focuses on the research and application of sparse deep models based on the autoencoder. Encouraging sparsity during the training of the autoencoder can significantly improve its performance, and making full use of the hierarchical features obtained from pre-training can improve the classification performance of deep neural networks. The main contents are as follows:

(1) Inspired by the topological structure of convolutional neural networks, a novel sparse autoencoder based on a sparsity-induced layer, named Sparsity AE, is proposed. By analogy with the receptive fields of the mammalian visual system, introducing and encouraging sparsity can improve the performance of the algorithm. The traditional sparse autoencoder induces sparsity through a KL-divergence penalty, but this penalty punishes only weakly those hidden neurons whose outputs are far from the desired activation value, which occasionally causes under-fitting. Sparse representation theory for image features provides a more effective incentive mechanism that can fully encourage sparsity. Experimental results show that the novel sparse autoencoder based on a sparsity-induced layer outperforms the traditional sparse autoencoder based on the KL sparsity penalty.

(2) An ensemble sparse feature learning algorithm based on the above novel sparse autoencoder, named Boosting AE, is proposed. On the one hand, pre-training the sparsity-induced autoencoder yields multiple sparse feature representations at different levels of abstraction; on the other hand, ensemble learning over multiple classifiers can effectively improve the recognition rate and stability of a single classifier. Based on these two observations, the Boosting AE algorithm trains multiple classifiers on the hierarchical features obtained from pre-training and obtains the final image classification by combining the outputs of these classifiers. Experimental results on three different datasets show that the proposed ensemble feature learning method significantly improves overall performance.
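To make the contrast in (1) concrete, the following is a minimal numpy sketch of the two sparsity mechanisms discussed: the traditional KL-divergence penalty, and a top-k hard-sparsity layer as one plausible example of a sparsity-induced layer. The abstract does not give the thesis's exact formulation, so the function names and the top-k mechanism are illustrative assumptions, not the Sparsity AE definition itself.

```python
import numpy as np

def kl_sparsity_penalty(activations, rho=0.05, beta=3.0):
    """Classic KL-divergence sparsity penalty for a sigmoid hidden layer.

    activations: (n_samples, n_hidden) hidden outputs in (0, 1).
    rho: target average activation; beta: weight added to the reconstruction loss.
    """
    rho_hat = np.clip(activations.mean(axis=0), 1e-8, 1.0 - 1e-8)
    kl = (rho * np.log(rho / rho_hat)
          + (1.0 - rho) * np.log((1.0 - rho) / (1.0 - rho_hat)))
    return beta * kl.sum()

def topk_sparsity_layer(activations, k):
    """One possible 'sparsity-induced layer' (an assumption, not the thesis's):
    keep the k largest activations per sample and zero the rest, enforcing
    exact sparsity directly instead of penalizing the average activation."""
    out = np.zeros_like(activations)
    idx = np.argsort(activations, axis=1)[:, -k:]   # top-k indices per row
    rows = np.arange(activations.shape[0])[:, None]
    out[rows, idx] = activations[rows, idx]
    return out
```

The KL penalty only constrains the *average* activation of each hidden unit, so individual outputs can still drift far from the target; a hard sparsity layer fixes the number of active units per sample by construction, which is the kind of stronger incentive the abstract alludes to.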
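The combination step in (2) can be sketched as a weighted majority vote over the classifiers trained on each layer's features. The abstract does not specify Boosting AE's exact combination rule, so `ensemble_predict` and its per-classifier weighting are illustrative assumptions only.

```python
import numpy as np

def ensemble_predict(layer_predictions, weights=None):
    """Combine per-layer classifier outputs by (weighted) majority vote.

    layer_predictions: list of 1-D integer label arrays, one per classifier
                       (e.g. one classifier per pre-trained feature level).
    weights: optional per-classifier weights, e.g. from boosting-style training.
    """
    preds = np.asarray(layer_predictions)            # (n_classifiers, n_samples)
    if weights is None:
        weights = np.ones(len(preds))
    n_classes = int(preds.max()) + 1
    votes = np.zeros((n_classes, preds.shape[1]))
    for w, p in zip(weights, preds):
        votes[p, np.arange(preds.shape[1])] += w     # accumulate weighted votes
    return votes.argmax(axis=0)                      # winning class per sample
```

With equal weights this reduces to plain majority voting; unequal weights let more reliable feature levels dominate, which is the usual payoff of ensembling over a single classifier.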
Keywords/Search Tags:Deep Neural Network, Sparse Autoencoder, Ensemble Learning, Image Denoising, Image Classification