
Research on Progressive Deep Neural Networks

Posted on: 2024-06-15    Degree: Master    Type: Thesis
Country: China    Candidate: C Wu    Full Text: PDF
GTID: 2568307127453404    Subject: Software engineering
Abstract/Summary:
In recent years, Machine Learning (ML) has become a focus of artificial intelligence research thanks to its powerful processing capability and wide range of application scenarios, and it has achieved remarkable results in computer vision, natural language processing, and other fields. Deep learning, an important branch of machine learning, owes its main advantage to wider and deeper network structures. By learning the mapping from input to output through multi-level data representations and complex nonlinear transformations, it can automatically learn effective feature representations from large-scale, high-dimensional data such as images and videos, greatly improving a model's representational and generalization ability. These learned features are also of great value to upstream and downstream tasks, driving the development of the entire field of artificial intelligence.

Admittedly, deep neural networks achieve high performance on individual tasks, but in practice a machine learning system should, like organisms in nature, continuously acquire, update, accumulate, and utilize knowledge to cope with external change. This continuous, adaptive learning ability is a necessary condition for future intelligent systems. Traditional machine learning paradigms, however, presuppose a static data distribution, whereas in dynamic environments the probability distribution over tasks may change significantly. Intelligent systems therefore need to learn different tasks continuously, a setting known as Continual Learning (CL). Because continual learning must learn from dynamic data distributions, it inevitably faces Catastrophic Forgetting (CF), in which the model's predictive ability on earlier tasks degrades significantly as it continues to learn. How to learn new knowledge without forgetting old knowledge is the central question of continual learning research. In addition, data privacy and knowledge reuse are urgent challenges in continual learning. To address these problems, this paper proposes two continual learning algorithms.

First, this paper presents a Task-Similarity Guided Progressive Neural Network (TSGPNN). Built on a deep neural network, the method assigns an independent network branch (a progressive block) to each task and learns progressively. During progressive learning, it assesses the similarity between tasks and selectively transfers knowledge from earlier tasks to improve performance on the current one. Experiments show that, in continuous task-sequence scenarios, this method outperforms single-task learning, multi-task learning, and other continual learning methods.
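The abstract does not give implementation details, so the following PyTorch sketch is only one plausible reading of the progressive design: each task gets its own column (progressive block), earlier columns are frozen, and lateral connections from them are weighted by a task-similarity score. The class names, the prototype-based cosine similarity, and the adapter layout are illustrative assumptions, not the thesis's specification.

```python
# Minimal sketch of a task-similarity guided progressive network.
# All names and the similarity measure are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProgressiveBlock(nn.Module):
    """One column per task; earlier columns feed in via lateral adapters."""

    def __init__(self, in_dim, hidden_dim, out_dim, n_prev):
        super().__init__()
        self.layer1 = nn.Linear(in_dim, hidden_dim)
        # One lateral adapter per previously learned column.
        self.laterals = nn.ModuleList(
            nn.Linear(hidden_dim, hidden_dim) for _ in range(n_prev)
        )
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, x, prev_feats, sims):
        h = F.relu(self.layer1(x))
        # Selective transfer: scale each lateral path by task similarity.
        for adapter, feat, s in zip(self.laterals, prev_feats, sims):
            h = h + s * F.relu(adapter(feat))
        return self.head(h), h


class TSGPNN(nn.Module):
    def __init__(self, in_dim=784, hidden_dim=128):
        super().__init__()
        self.in_dim, self.hidden_dim = in_dim, hidden_dim
        self.columns = nn.ModuleList()
        self.prototypes = []  # mean input per finished task, for similarity

    def add_task(self, out_dim):
        """Create a new column before training a new task."""
        col = ProgressiveBlock(self.in_dim, self.hidden_dim, out_dim,
                               n_prev=len(self.columns))
        # Freeze existing columns so old knowledge cannot be overwritten.
        for old in self.columns:
            old.requires_grad_(False)
        self.columns.append(col)
        return col

    def similarities(self, x):
        # Cosine similarity between the batch mean and each stored task
        # prototype -- one illustrative choice of similarity measure.
        q = x.mean(dim=0)
        return [F.cosine_similarity(q, p, dim=0).clamp(min=0.0)
                for p in self.prototypes]

    def forward(self, x, task_id):
        sims = self.similarities(x)
        prev_feats = []
        with torch.no_grad():  # frozen columns only provide features
            for col in self.columns[:task_id]:
                _, h = col(x, prev_feats, sims[:len(prev_feats)])
                prev_feats.append(h)
        logits, _ = self.columns[task_id](x, prev_feats, sims[:task_id])
        return logits


# After training task t, a prototype would be stored for later similarity
# queries, e.g. model.prototypes.append(train_inputs.mean(dim=0).detach()).
```

Because earlier columns are frozen, old tasks cannot be forgotten; the similarity weights decide how much each old column's features influence the new one, which is one way to read "selective migration" in the abstract.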
Second, this paper proposes a data-free contrastive replay network for continual learning (DFCRCL). Its core idea is to synthesize pseudo samples by inverting the previous task's model, embed a contrastive learning framework to improve the semantic diversity of the replayed samples, and combine this semantic information with knowledge distillation to help the current model recall the knowledge of previous tasks; the thesis also explores the role of semantic diversity in continual knowledge transfer. Compared with mainstream data-free replay continual learning methods, DFCRCL delivers significant gains in performance and classification stability across multiple continual learning scenarios. It also alleviates the large model scale and the dependence on task data and task identity that burden progressive network methods, giving it stronger applicability in practical environments.
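As a rough illustration of the data-free replay idea, the sketch below synthesizes pseudo samples by optimizing random inputs against the frozen previous-task model (model inversion), adds an InfoNCE-style penalty that pushes the synthesized samples apart (a stand-in for the contrastive diversity objective), and distills the old model's predictions on those samples into the current model. All objectives, names, and hyperparameters here are assumptions; the abstract does not specify DFCRCL's actual losses.

```python
# Minimal sketch of data-free replay with a contrastive diversity term
# and knowledge distillation; all details are illustrative assumptions.
import torch
import torch.nn.functional as F


def invert_pseudo_samples(old_model, n, shape, n_classes,
                          steps=200, lr=0.1, tau=0.1, div_weight=1.0):
    """Synthesize pseudo samples from the frozen previous-task model by
    optimizing random inputs toward sampled labels (model inversion),
    with an InfoNCE-style penalty that raises semantic diversity."""
    for p in old_model.parameters():
        p.requires_grad_(False)
    old_model.eval()
    x = torch.randn(n, *shape, requires_grad=True)
    y = torch.randint(0, n_classes, (n,))
    opt = torch.optim.Adam([x], lr=lr)
    mask = torch.eye(n, dtype=torch.bool)
    for _ in range(steps):
        opt.zero_grad()
        logits = old_model(x)
        ce = F.cross_entropy(logits, y)  # match the old model's classes
        # Contrastive diversity: penalize similarity between every pair
        # of synthesized samples in the old model's output space.
        z = F.normalize(logits, dim=1)
        sim = (z @ z.t() / tau).masked_fill(mask, float('-inf'))
        diversity = torch.logsumexp(sim, dim=1).mean()
        (ce + div_weight * diversity).backward()
        opt.step()
    return x.detach(), y


def replay_distill_loss(new_model, old_model, x_new, y_new, x_pseudo,
                        T=2.0, alpha=0.5):
    """Current-task cross-entropy plus knowledge distillation from the
    old model on the replayed pseudo samples (standard KD objective)."""
    ce = F.cross_entropy(new_model(x_new), y_new)
    with torch.no_grad():
        teacher = F.softmax(old_model(x_pseudo) / T, dim=1)
    student = F.log_softmax(new_model(x_pseudo) / T, dim=1)
    kd = F.kl_div(student, teacher, reduction='batchmean') * (T * T)
    return (1.0 - alpha) * ce + alpha * kd
```

Because only the previous model's weights are needed to generate the replayed samples, no task data is stored, which is what makes this family of methods attractive when data privacy matters.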
Keywords/Search Tags:Continual learning, Catastrophic forgetting, Progressive neural networks, Data-free replay, Contrastive learning