
Research On Lie Group Continual Meta Learning Algorithm

Posted on: 2023-05-18
Degree: Master
Type: Thesis
Country: China
Candidate: M J Jiang
Full Text: PDF
GTID: 2530306629975789
Subject: Software engineering
Abstract/Summary:
In recent years, deep learning has achieved great success with the support of large numbers of labeled samples. However, when labeled samples are insufficient, models overfit. Neural networks are also prone to catastrophic forgetting, so they cannot learn continually the way humans do. Continual learning, a field that has grown rapidly alongside deep learning, alleviates catastrophic forgetting to some extent, but most methods learn inductive bias through hand-designed mechanisms, which limits their development. Meta learning can not only treat the ability to learn continually without forgetting as the meta-objective for model optimization, but can also achieve good results from only a few learning steps when samples are limited. Using meta learning for continual learning is therefore a promising research direction. However, the traditional Euclidean metric is ill-suited to the nonlinear data found in the real world, which can make meta learning algorithms unstable in some cases. Motivated by the problems of catastrophic forgetting, insufficient labeled samples, and model instability, this thesis uses the mathematics of Lie groups to study a theoretical framework for Lie group continual meta learning. The research results of this thesis include:

(1) The Multi-Task Meta Learning based on Attention Mechanism (MTMLAM) model is proposed. Data collected in real life always contain noise. Moreover, a shallow network structure extracts sample features incompletely, while a deep network extracts more features but fails to give some important features enough attention. To address these problems, the model embeds an attention-mechanism module in the network to obtain the most critical contrast features of the sample image.

(2) The Continual Meta Learning (CML) model is proposed. To address catastrophic forgetting and insufficient labeled samples, this model uses a cosine-similarity judgment mechanism to optimize how task gradients are applied during updates. The effectiveness of the model is demonstrated on standard benchmark datasets.

(3) The Lie Group Continual Meta Learning (LGCML) model is proposed. To address the slow update rate and instability of the continual meta learning model, this model improves continuity by changing the inner-loop update rule on the one hand, and on the other hand uses orthogonal constraints to limit the parameter space and adopts natural gradient descent. The effectiveness of the model is verified by experiments.
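A cosine-similarity judgment mechanism for task gradients, as described for the CML model in (2), can be illustrated with a minimal sketch. This is not the thesis's exact algorithm; the function name, the projection rule, and the learning rate are illustrative assumptions, following the common idea of detecting gradient conflict between tasks via cosine similarity and removing the conflicting component:

```python
import numpy as np

def cosine_guided_update(grad_new, grad_old, lr=0.1):
    """Illustrative sketch (not the thesis's exact rule): compare a new
    task's gradient with a stored previous-task gradient by cosine
    similarity; if they conflict (negative cosine), project out the
    component of grad_new that points against grad_old."""
    eps = 1e-12
    cos = np.dot(grad_new, grad_old) / (
        np.linalg.norm(grad_new) * np.linalg.norm(grad_old) + eps)
    if cos < 0:
        # Conflict: remove the component along grad_old so the update
        # no longer undoes progress on the previous task.
        grad_new = grad_new - (np.dot(grad_new, grad_old)
                               / (np.dot(grad_old, grad_old) + eps)) * grad_old
    # Return the parameter update (negative gradient step) and the score.
    return -lr * grad_new, cos
```

After the projection, the update is orthogonal to the old task's gradient, so (to first order) the loss on the old task is left unchanged, which is one way a similarity judgment can mitigate forgetting.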
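The orthogonal constraint on the parameter space mentioned for the LGCML model in (3) can be sketched as follows. This is a generic illustration under assumed details (the QR-based retraction and the step size are the author of this sketch's choices, not taken from the thesis): after an ordinary gradient step, the weight matrix is retracted back onto the set of column-orthonormal matrices, which is a standard way to keep parameters on a matrix manifold:

```python
import numpy as np

def orthogonal_retraction(W):
    """Project a weight matrix onto the set of column-orthonormal
    matrices via QR decomposition -- one common retraction used to
    enforce an orthogonality constraint after a gradient step."""
    Q, R = np.linalg.qr(W)
    # Fix column signs using diag(R) so the retraction is well defined.
    return Q * np.sign(np.diag(R))

def constrained_step(W, grad, lr=0.01):
    """Take one gradient step, then restore the orthogonality constraint."""
    return orthogonal_retraction(W - lr * grad)
```

Keeping the columns orthonormal bounds the conditioning of the weight matrix, which is consistent with the stability motivation stated above; the natural-gradient part of the model would additionally precondition `grad` by the inverse Fisher information, which is omitted here for brevity.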
Keywords/Search Tags:Machine Learning, Meta Learning, Continual Learning, Continual Meta Learning, Lie Group Continual Meta Learning