
Research On Few Shot Incremental Learning Methods For Catastrophic Forgetting

Posted on: 2024-02-19  Degree: Master  Type: Thesis
Country: China  Candidate: G Li  Full Text: PDF
GTID: 2568307118479724  Subject: Computer application technology
Abstract/Summary:
Thanks to the rapid development of deep learning, neural networks are widely used in natural language processing, image classification, object tracking, and other fields. Catastrophic forgetting remains one of the important open problems in neural network research, and few-shot incremental learning also suffers from it. This thesis alleviates catastrophic forgetting from two aspects: the convolutional neural network classifier and the implicit temporal relations that arise during incremental training. First, a few-shot incremental learning algorithm based on multi-hop graph attention is proposed by introducing a multi-hop graph attention module. Second, a Long Short-Term Memory (LSTM) model is used to model the temporal relations across incremental sessions and to explicitly exploit local context information, yielding a few-shot incremental learning algorithm based on LSTM feature memory. The contributions are as follows:

1. To overcome the limitation that a single-hop graph attention network can only aggregate information from adjacent nodes, a few-shot incremental learning algorithm based on multi-hop graph attention is proposed to explicitly model global context information. The computation of the attention scores is improved so that the attention network can fully explore and exploit context during aggregation. The algorithm maps feature information into a topological space for node creation and update, and the resulting module replaces the classifier of the original network. Experiments in the 5-way 5-shot setting on the CIFAR100 and CUB200 datasets show that the proposed algorithm fully exploits the multi-hop structure to model global context, achieving high accuracy and a low forgetting rate.

2. To strengthen the correlation of local context features, a feature memory algorithm based on LSTM is proposed. The gate mechanism of the LSTM explicitly explores local context by using the feature information of the previous and current session states: both states are fed into the LSTM to obtain new feature information after state stacking. Coarse-label information is then used to enhance the features memorized by the LSTM during incremental training. Experiments in the 5-way 5-shot setting on the CIFAR100 and CUB200 datasets show that the proposed algorithm effectively uses local context information, obtaining good classification performance and a low forgetting rate.

There are 33 figures, 21 tables and 84 references in this thesis.
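To make contribution 1 concrete, the following is a minimal PyTorch sketch of multi-hop graph attention aggregation. It is an illustrative reconstruction under stated assumptions, not the thesis's actual implementation: the class name MultiHopGraphAttention, the parameter n_hops, and the use of class prototypes as graph nodes are all assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiHopGraphAttention(nn.Module):
        # Hypothetical sketch: aggregate node features over k hops by
        # accumulating powers of a learned attention matrix.
        def __init__(self, dim, n_hops=3):
            super().__init__()
            self.query = nn.Linear(dim, dim)
            self.key = nn.Linear(dim, dim)
            self.n_hops = n_hops

        def forward(self, x):
            # x: (num_nodes, dim) -- e.g. class prototypes treated as graph nodes.
            scores = self.query(x) @ self.key(x).t() / x.size(-1) ** 0.5
            attn = F.softmax(scores, dim=-1)   # single-hop attention matrix
            hop, out = attn, attn @ x          # 1-hop aggregation
            for _ in range(self.n_hops - 1):
                hop = hop @ attn               # k-hop reachability via matrix powers
                out = out + hop @ x            # accumulate multi-hop context
            return out / self.n_hops           # average over all hops

Multiplying the attention matrix by itself is one standard way to reach k-hop neighbours; the thesis may instead modify the score computation directly, which this sketch does not capture.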
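Similarly, a minimal sketch of the LSTM feature-memory idea in contribution 2: features from the previous and current sessions are stacked as a length-2 sequence so the LSTM gates fuse them. The class name LSTMFeatureMemory and the two-step sequence layout are assumptions, and the coarse-label enhancement step is omitted.

    import torch
    import torch.nn as nn

    class LSTMFeatureMemory(nn.Module):
        # Hypothetical sketch: fuse previous- and current-session features
        # through LSTM gates ("state stacking").
        def __init__(self, dim):
            super().__init__()
            self.lstm = nn.LSTM(input_size=dim, hidden_size=dim, batch_first=True)

        def forward(self, prev_feat, cur_feat):
            # prev_feat, cur_feat: (num_classes, dim) from consecutive sessions.
            seq = torch.stack([prev_feat, cur_feat], dim=1)  # (N, 2, dim)
            out, _ = self.lstm(seq)                          # gated fusion over the sequence
            return out[:, -1]                                # fused features after state stacking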
Keywords/Search Tags: few-shot incremental learning, catastrophic forgetting, multi-hop graph, long short-term memory, coarse label