
Research On Few-Shot Learning Classification Algorithm Based On GNN

Posted on: 2022-01-23
Degree: Master
Type: Thesis
Country: China
Candidate: X C Song
Full Text: PDF
GTID: 2518306512971869
Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
With the development of artificial intelligence, many deep-learning applications no longer depend solely on large amounts of data; they can exploit past experience to learn effectively from only a few samples of a new problem. In practice, problems with insufficient data are unavoidable, so enabling machines to learn classification, as humans do, from a small number of labeled samples by reusing prior learning experience has become an important research direction. In recent years, few-shot learning classification algorithms have appeared one after another, with notable progress in transfer learning, metric-space methods, and data augmentation. However, few-shot classification based on graph neural networks (GNNs) is still being explored: how to make full use of sample characteristics when labeled data are scarce, how to improve feature extraction without overfitting, and how to obtain more suitable initial edge annotations for the network all require further study. This thesis therefore investigates three aspects: mining hidden features, feature attention, and the optimization of initial edge weights. The main contributions are as follows:

(1) An edge-weighted single-step memory constrained network (ESMC) is proposed. Based on the implicit distribution characteristics of the edge-weight data, a new graph structure is designed and fused with the node features during the update, enriching the features and exploiting them more fully under limited sample data (a minimal sketch of this kind of update is given below).

(2) Building on the convolutional block attention mechanism, different ways of integrating channel attention and spatial attention are proposed, and feature attention is used to help the model extract more meaningful features from the samples (see the second sketch below).

(3) A self-learning initial edge weight modification module is proposed, which uses the initial node-feature measurement result as a constraint to find more suitable initial edge-weight parameters and further raise the learning capacity of the network (see the third sketch below).

Finally, combining the above methods, ablation experiments and comparative analyses are conducted in each training mode on standard datasets, and the results confirm the soundness and novelty of the proposed approach.
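The following is a minimal PyTorch sketch of the kind of alternating edge/node update described in contribution (1): edge weights are re-estimated from pairs of node features and then used to aggregate and update the node features within a few-shot episode graph. The module and parameter names (EdgeNodeUpdate, hidden_dim) are illustrative assumptions, and the single-step memory constraint of ESMC is not reproduced here.

```python
# Sketch of one edge/node fusion-and-update step in an edge-weighted few-shot GNN.
# Names are illustrative; this is not the thesis' exact ESMC layer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeNodeUpdate(nn.Module):
    def __init__(self, feat_dim, hidden_dim):
        super().__init__()
        # MLP that re-estimates an edge weight from a pair of node features
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1)
        )
        # Linear map applied after aggregating edge-weighted neighbor features
        self.node_fc = nn.Linear(2 * feat_dim, feat_dim)

    def forward(self, x, edge_w):
        # x: (N, feat_dim) node features; edge_w: (N, N) current edge weights
        n = x.size(0)
        pair = torch.cat([x.unsqueeze(1).expand(n, n, -1),
                          x.unsqueeze(0).expand(n, n, -1)], dim=-1)
        # Refine edge weights by mixing the old weights with a feature-based estimate
        edge_w = 0.5 * (edge_w + torch.sigmoid(self.edge_mlp(pair)).squeeze(-1))
        agg = torch.softmax(edge_w, dim=-1) @ x                  # weighted aggregation
        x = F.relu(self.node_fc(torch.cat([x, agg], dim=-1)))    # node feature update
        return x, edge_w

# Usage on one episode graph with 10 nodes and uniform initial edge weights
layer = EdgeNodeUpdate(feat_dim=64, hidden_dim=96)
x, w = layer(torch.randn(10, 64), torch.ones(10, 10))
```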
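For contribution (2), the sketch below shows one standard way of combining channel attention and spatial attention on a convolutional feature map, in the spirit of the convolutional block attention the abstract mentions. The sequential "channel then spatial" order shown here is only one of the possible integration schemes the thesis compares; the reduction ratio and kernel size are illustrative assumptions.

```python
# Sketch of a channel + spatial attention block applied to a feature map.
import torch
import torch.nn as nn

class ChannelSpatialAttention(nn.Module):
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels)
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                         # x: (B, C, H, W)
        # Channel attention: pool over space, then weight each channel
        avg = x.mean(dim=(2, 3))
        mx = x.amax(dim=(2, 3))
        ca = torch.sigmoid(self.channel_mlp(avg) + self.channel_mlp(mx))
        x = x * ca.unsqueeze(-1).unsqueeze(-1)
        # Spatial attention: pool over channels, then weight each location
        sa = torch.sigmoid(self.spatial_conv(
            torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)))
        return x * sa
```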
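For contribution (3), the sketch below shows one plausible shape of a self-learned initial edge weight: a learnable correction is added on top of a node-feature similarity, and that similarity also acts as a soft constraint through a regularization term. The class name InitialEdgeWeight and the exponential-distance similarity are assumptions for illustration, not the thesis' exact module.

```python
# Sketch of a learnable initial edge weight constrained by node-feature similarity.
import torch
import torch.nn as nn

class InitialEdgeWeight(nn.Module):
    def __init__(self, feat_dim, hidden_dim=64):
        super().__init__()
        # MLP that produces a small correction to the similarity-based prior
        self.correction = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1)
        )

    def forward(self, x):
        # x: (N, feat_dim) initial node features
        n = x.size(0)
        sim = torch.exp(-torch.cdist(x, x))             # similarity prior in (0, 1]
        pair = torch.cat([x.unsqueeze(1).expand(n, n, -1),
                          x.unsqueeze(0).expand(n, n, -1)], dim=-1)
        delta = torch.tanh(self.correction(pair)).squeeze(-1)
        init_w = (sim + delta).clamp(0, 1)               # corrected initial weights
        return init_w, sim

    @staticmethod
    def constraint_loss(init_w, sim):
        # Regularizer keeping the learned weights close to the similarity prior
        return ((init_w - sim) ** 2).mean()
```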
Keywords/Search Tags: Few-shot learning classification, Graph neural network, Edge weight single-step memory constraint, Feature enhancement, Learnable initial edge weight