In recent years, machine learning models have achieved excellent performance in many applications. However, existing models typically rely on the closed-world assumption and can handle only a fixed set of classes after deployment, so they generalize only to the preset classes. In practical applications, new classes and tasks continually emerge; the resulting poor generalization in the open world degrades both performance and reliability, making such models hard to apply in real scenarios. To solve these problems, models must be able to continuously incorporate new classes. The most straightforward approach is to retrain the model on a mixture of all old and new class data, but this requires storing all previously seen data, which is often infeasible due to data privacy and copyright constraints as well as storage and computation limits. Class-incremental learning aims to learn new classes in the absence of old class data. It is a challenging task that faces two main difficulties: (1) how to overcome the catastrophic forgetting caused by the lack of old class data, and (2) how to overcome the confusion between tasks caused by the lack of joint training across tasks.

In response to these problems, this thesis conducts the following work. (1) To address catastrophic forgetting, in the exemplar-based class-incremental setting this thesis proposes a distillation-based representation expansion strategy and a queue-based knowledge distillation method. The representation expansion strategy enlarges the feature space, encoding new classes in an additional subspace while knowledge distillation keeps the old classes' feature space invariant. The queue-based knowledge distillation maintains a queue of models from previous tasks, so that the models of early tasks also participate in distillation, which effectively alleviates catastrophic forgetting over long task sequences. In the non-exemplar setting, this thesis introduces a diverse image inversion technique: by clustering and aligning local prototypes in the old classes' feature space, a conditional generator is trained to capture the local distributions of the old classes, and combining these local distributions yields diverse pseudo-samples that effectively reduce catastrophic forgetting. (2) To overcome task confusion, in the exemplar-based setting this thesis proposes incremental semantics mining, which learns discriminative representations for new classes without redundant semantics, improving the separability of old and new classes and reducing misclassification between them. In the non-exemplar setting, this thesis introduces a hybrid classifier that uses local prototypes to correct the linear classifier's inter-task predictions, yielding unbiased and robust predictions across tasks.
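To make the representation expansion idea concrete, the following minimal PyTorch sketch encodes new classes in an extra feature subspace while distillation against a frozen copy of the previous model keeps the old subspace stable. The module names, the deep-copied branches, and the MSE feature distillation term are illustrative assumptions, not the thesis's exact design.

```python
# A minimal sketch, assuming feature-level MSE distillation and deep-copied
# branches (hypothetical design choices, not the thesis's exact method).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExpandedNet(nn.Module):
    def __init__(self, prev_backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.teacher = copy.deepcopy(prev_backbone).eval()  # frozen previous model
        for p in self.teacher.parameters():
            p.requires_grad = False
        self.old_path = copy.deepcopy(prev_backbone)  # trainable old subspace
        self.new_path = copy.deepcopy(prev_backbone)  # extra subspace for new classes
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x):
        f_old = self.old_path(x)          # should stay close to the teacher
        f_new = self.new_path(x)          # encodes the new classes
        with torch.no_grad():
            f_teacher = self.teacher(x)
        logits = self.classifier(torch.cat([f_old, f_new], dim=1))
        return logits, f_old, f_teacher

def expansion_loss(logits, labels, f_old, f_teacher, lam=1.0):
    # Cross-entropy on all classes plus a distillation term that preserves
    # the old classes' feature space.
    return F.cross_entropy(logits, labels) + lam * F.mse_loss(f_old, f_teacher)
```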
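The queue-based distillation can be sketched in the same spirit: a bounded queue of frozen task snapshots all serve as teachers, so early tasks keep contributing to the distillation signal. The queue length, temperature, and the averaged KL objective below are assumptions for illustration.

```python
# A sketch of a model queue for distillation; queue length, temperature,
# and the averaged KL objective are illustrative assumptions.
import copy
from collections import deque
import torch
import torch.nn.functional as F

class ModelQueue:
    def __init__(self, maxlen: int = 5):
        self.queue = deque(maxlen=maxlen)  # frozen snapshots of past models

    def push(self, model):
        snapshot = copy.deepcopy(model).eval()
        for p in snapshot.parameters():
            p.requires_grad = False
        self.queue.append(snapshot)

    def distill(self, student_logits, x, T: float = 2.0):
        # Every queued model, including ones from early tasks, acts as a
        # teacher; each teacher only covers the classes it was trained on.
        loss = student_logits.new_zeros(())
        for teacher in self.queue:
            with torch.no_grad():
                t_logits = teacher(x)
            k = t_logits.size(1)  # classes known to this teacher
            loss = loss + F.kl_div(
                F.log_softmax(student_logits[:, :k] / T, dim=1),
                F.softmax(t_logits / T, dim=1),
                reduction="batchmean",
            ) * (T * T)
        return loss / max(len(self.queue), 1)
```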
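For the non-exemplar image inversion, a rough sketch under stated assumptions: local prototypes come from k-means over one old class's features, and a conditional generator is trained so that a generated sample's feature aligns with the prototype it was conditioned on. The network shapes, cluster count, and the MSE alignment loss are hypothetical.

```python
# A rough sketch: k-means local prototypes plus a prototype-conditioned
# generator; shapes, losses, and cluster count are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.cluster import KMeans

def build_local_prototypes(class_features: torch.Tensor, n_clusters: int = 5):
    # Cluster one old class's (CPU, detached) features into local prototypes.
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(class_features.numpy())
    return torch.tensor(km.cluster_centers_, dtype=torch.float32)

class ConditionalGenerator(nn.Module):
    # Maps (noise, local prototype) to a flattened pseudo-image.
    def __init__(self, z_dim: int, feat_dim: int, img_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim + feat_dim, 512), nn.ReLU(),
            nn.Linear(512, img_dim), nn.Tanh(),
        )

    def forward(self, z, proto):
        return self.net(torch.cat([z, proto], dim=1))

def alignment_loss(frozen_encoder, fake_images, proto):
    # Train the generator so a generated sample's feature matches the local
    # prototype it was conditioned on; sampling different prototypes of the
    # same class then yields diverse pseudo-samples.
    return F.mse_loss(frozen_encoder(fake_images), proto)
```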
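Finally, one plausible reading of the hybrid classifier: local prototype distances re-rank the linear classifier's probabilities so that no single task's head dominates across tasks. The softmax blending weight and the per-class nearest-prototype scoring below are assumptions for illustration.

```python
# One plausible reading of the hybrid prediction rule; the softmax blending
# weight and nearest-prototype scoring are assumptions for illustration.
import torch
import torch.nn.functional as F

def hybrid_predict(features, linear_logits, prototypes, proto_labels, alpha=0.5):
    # features: [B, D]; prototypes: [P, D] local prototypes with class
    # labels proto_labels: [P] (every class assumed to own >= 1 prototype).
    d = torch.cdist(features, prototypes)  # [B, P] feature-prototype distances
    num_classes = linear_logits.size(1)
    proto_scores = features.new_full((features.size(0), num_classes), float("-inf"))
    for c in range(num_classes):
        mask = proto_labels == c
        if mask.any():
            # Score a class by its nearest local prototype.
            proto_scores[:, c] = -d[:, mask].min(dim=1).values
    # Prototype probabilities correct the linear head's inter-task bias.
    probs = alpha * F.softmax(linear_logits, dim=1) \
          + (1 - alpha) * F.softmax(proto_scores, dim=1)
    return probs.argmax(dim=1)
```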