
Research On Relation Extraction Based On Few-Shot Learning

Posted on: 2022-12-19    Degree: Master    Type: Thesis
Country: China    Candidate: B N Ji    Full Text: PDF
GTID: 2518306758991609    Subject: Automation Technology
Abstract/Summary:
Since the concept of deep learning was proposed, it has triggered rapid development in artificial intelligence, especially in natural language processing, and its impact has been profound and lasting. For natural language processing tasks, combining traditional models with annotated data alone cannot capture semantics accurately; to understand semantics deeply, prior knowledge must be incorporated. Practice has shown that knowledge-guided natural language processing is the way forward. As research has deepened, knowledge graphs have been widely adopted, but their coverage remains incomplete. To add richer world knowledge to knowledge graphs as quickly and accurately as possible, researchers proposed relation extraction, which aims to identify and extract the semantic relations between entity pairs.

Mainstream entity relation extraction relies on large-scale data, but relations and entity pairs follow a long-tailed distribution that remains unsolved, so many relations lack sufficient labeled data. Neural networks, however, require large amounts of valid data, and their performance degrades sharply when training data are limited. Relation extraction with limited training data can therefore be treated as a few-shot learning task. Few-shot learning is inspired by the human ability to "learn by analogy"; introducing it into a relation extraction model allows the model to acquire useful prior knowledge from previously trained data and combine it with new data, ultimately giving computers a comparable ability to learn by analogy.

This paper combines few-shot learning with relation extraction and uses a neural network model to extract the semantic relation between a given entity pair from a small amount of sample data. To address few-shot relation extraction, the traditional induction network is improved by adding a multi-level self-attention mechanism. First, a hierarchical self-attention mechanism is proposed, comprising instance-level and task-level attention, which adjusts the support set to capture high-level information shared between the support set and the query set. Second, the network is combined with the self-attention mechanism: an attention-based dynamic routing algorithm produces class representations conditioned on the query set, yielding the Improved Induction Network with Hierarchical Self-attention Scheme model. The model is then applied to the FewRel dataset, and experiments verify the effect of the multi-level self-attention mechanism on model performance. Finally, compared with the traditional induction network and other few-shot relation extraction models, the results show that the proposed model adapts well to the task and outperforms state-of-the-art models.
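To make the induction step concrete, the following is a minimal PyTorch sketch of query-conditioned, attention-based dynamic routing that turns the K support encodings of each relation into a class vector, in the spirit of an induction network. All names, dimensions, and the exact routing update are illustrative assumptions, not the thesis's actual implementation.

```python
# Minimal sketch (assumed names and shapes): attention-initialized dynamic
# routing over an N-way K-shot support set, conditioned on one query.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(x, dim=-1, eps=1e-8):
    """Capsule-style squash nonlinearity commonly used in induction networks."""
    norm_sq = (x ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * x / torch.sqrt(norm_sq + eps)


class AttentiveInduction(nn.Module):
    """Induce one class vector per relation from its support instances,
    weighting instances by their relevance to the current query."""

    def __init__(self, hidden_dim: int, routing_iters: int = 3):
        super().__init__()
        self.transform = nn.Linear(hidden_dim, hidden_dim)  # shared transformation
        self.routing_iters = routing_iters

    def forward(self, support: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # support: [N, K, H] encoded support instances (N relations, K shots)
        # query:   [H]       encoded query instance
        n, k, h = support.shape
        s_hat = squash(self.transform(support))                      # [N, K, H]

        # Instance-level attention: score each support instance against the query.
        attn = torch.einsum('nkh,h->nk', s_hat, query) / h ** 0.5    # [N, K]

        # Dynamic routing with attention-initialized coupling logits.
        logits = attn.clone()
        for _ in range(self.routing_iters):
            weights = F.softmax(logits, dim=-1)                      # [N, K]
            class_vec = squash(torch.einsum('nk,nkh->nh', weights, s_hat))  # [N, H]
            # Increase logits where instance and induced class vector agree.
            logits = logits + torch.einsum('nkh,nh->nk', s_hat, class_vec)
        return class_vec                                             # [N, H]


if __name__ == "__main__":
    N, K, H = 5, 5, 230          # 5-way 5-shot episode; hidden size is an assumption
    support = torch.randn(N, K, H)
    query = torch.randn(H)
    class_vectors = AttentiveInduction(H)(support, query)
    print(class_vectors.shape)   # torch.Size([5, 230])
```

In this sketch the query enters only through the initialization of the routing logits; a full model would also apply task-level attention across the N relations and score the query against each induced class vector to classify it.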
Keywords/Search Tags:Relation Extraction, Few-shot Learning, Relation Classification, Induction Network, Self-Attention