
Research On Incremental Learning Algorithm Based On Pseudo-Sample Rehearsal

Posted on: 2021-11-18
Degree: Master
Type: Thesis
Country: China
Candidate: Y J Chen
Full Text: PDF
GTID: 2518306554965729
Subject: Master of Engineering
Abstract/Summary:
On the way to artificial intelligence, a major open problem is building incremental learning systems that can accumulate more and more knowledge from data streams over time. In recent years, deep convolutional neural networks have achieved impressive performance in image classification, and deep neural networks are currently the best solution to many machine learning problems. However, when a deep neural network is trained incrementally, it suffers from an unavoidable problem: while learning a new task, the model overwrites the parameters learned on previous tasks. This phenomenon is known as catastrophic forgetting. To address catastrophic forgetting in class-incremental learning, this thesis proposes class-incremental learning methods based on pseudo-sample rehearsal. The main research work is as follows:

(1) A classic remedy for catastrophic forgetting is pseudo-rehearsal, which generates pseudo samples of old classes by feeding randomly generated inputs through the network and recording the network's outputs as their targets, thereby preserving the network's performance on old classes. However, pseudo-rehearsal struggles to generate representative images, so it is not effective in image classification. To address this problem, this thesis proposes an incremental learning method based on pseudo-rehearsal with a variational autoencoder (VAE). The method first uses a VAE to generate pseudo samples for rehearsal, producing pseudo images similar to real images. However, because the VAE is an unsupervised model and its sampling is random, it becomes difficult, as the number of classes grows, to control the number and quality of the pseudo samples generated for each class through random VAE sampling alone. Therefore, on the one hand, the method introduces the Large-Margin Softmax Loss to train the classifier, so that the classifier learns features with larger inter-class distances and smaller intra-class distances. On the other hand, a pseudo-sample selection strategy based on class-average features is used: pseudo samples are filtered according to the class-average features of the classifier's training samples, making the retained pseudo samples more representative.

(2) The Large-Margin Softmax Loss aids pseudo-sample selection based on class-average features, but its computational cost is much higher than that of the ordinary Softmax loss. To remedy this deficiency, a class-incremental learning method based on a variational pseudo-sample generator with classification-feature constraints is proposed. Building on the ordinary Softmax loss, the method first introduces rejection sampling: applying it to the classifier yields a pseudo-sample selection strategy based on the classifier's score, which brings the probability distribution of the generated pseudo samples closer to that of the real samples. Second, so that the pseudo samples generated by the VAE better preserve the old classes' performance on the classifier, the VAE is constrained with classification features, yielding a variational pseudo-sample generator with classification-feature constraints (CF-VAE). Finally, following the idea of knowledge distillation, the old classifier's outputs are used as distillation labels for the pseudo samples, further retaining the knowledge of the old classes. Experimental results on the MNIST, FASHION, E-MNIST and SVHN datasets show that the proposed methods retain old-class information more effectively, reduce the impact of catastrophic forgetting, and improve image classification accuracy.
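The two pseudo-sample selection strategies described above, and the distillation labels of method (2), can be sketched as follows. This is a minimal NumPy illustration under assumed toy features and scores, not the thesis's actual implementation; all function names and thresholds are hypothetical.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def select_by_class_mean(features, labels, cand_features, cand_labels, k):
    """Class-average-feature strategy of method (1): for each class, keep the
    k pseudo samples whose features lie closest to the mean feature of that
    class's real training samples."""
    selected = []
    for c in np.unique(labels):
        mean = features[labels == c].mean(axis=0)
        idx = np.where(cand_labels == c)[0]
        dists = np.linalg.norm(cand_features[idx] - mean, axis=1)
        selected.extend(idx[np.argsort(dists)[:k]].tolist())
    return sorted(selected)

def select_by_classifier_score(scores, labels, threshold=0.9):
    """Rejection-sampling flavour of method (2): keep a pseudo sample only
    when the old classifier assigns its intended label a probability above
    `threshold`, pushing the kept samples toward the real distribution."""
    return [i for i, (s, y) in enumerate(zip(scores, labels)) if s[y] >= threshold]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Knowledge-distillation term: the old classifier's softened outputs act
    as soft (distillation) labels for the pseudo samples."""
    p = softmax(teacher_logits / T)              # teacher's soft labels
    log_q = np.log(softmax(student_logits / T))  # student's log-probabilities
    return float(-(p * log_q).sum(axis=-1).mean() * T * T)
```

In this sketch the class-mean strategy needs labelled candidate features, while the score-based strategy only needs the old classifier's probabilities, which is one way to see why method (2) can drop the more expensive Large-Margin Softmax training.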
Keywords/Search Tags:class incremental learning, catastrophic forgetting, pseudo-rehearsal, VAE, distillation label