
Research On Entity Relation Extraction Based On Joint Learning

Posted on: 2022-08-25
Degree: Master
Type: Thesis
Country: China
Candidate: B B Tong
Full Text: PDF
GTID: 2518306530980169
Subject: Electronics and Communications Engineering
Abstract/Summary:
Entity recognition and relation extraction are two classic information extraction tasks, and both are of vital importance to downstream applications such as automatic question answering and knowledge graph construction. This thesis therefore focuses on the entity relation extraction task: identifying the entities in a text and determining the relations that may hold between them. The main work and innovations of this thesis are as follows:

1. To address the low quality of existing data sets for entity relation extraction, a new labeling strategy was adopted to relabel the English SemEval-2010 Task 8 public data set and a Chinese judicial theft-judgment document data set. This improved the quality of the data and provided a solid foundation for the subsequent extraction tasks.

2. A relation extraction model based on dynamic pre-training was proposed. First, the BERT pre-trained model is fine-tuned and used as the word embedding layer, allowing the model to better capture the semantic information of the sentence. Then a Long Short-Term Memory (LSTM) network is built as the decoding layer to learn deeper textual features. Finally, a word-level attention layer increases the weight of keywords, and the resulting representation is sent to a Softmax layer for classification. Experimental results show that the F1 score of this model improves on the baseline by 5% on the English SemEval-2010 Task 8 data set and by 7% on the Chinese judicial theft-judgment document data set; compared with other improved models, it also performs better.

3. To address the error accumulation and lack of interaction between subtasks in traditional pipeline methods, an improved joint entity relation extraction model was proposed. First, BERT is used as the bottom layer of the model to obtain better word vector representations. Then a Bi-directional LSTM (BiLSTM) network is built as the decoding layer; the forward and backward features it extracts are concatenated and sent to a multi-head attention layer to increase the weight of keywords. Finally, global normalization is added as the output layer to further improve performance. Experimental results show that the F1 score of the proposed model improves on the baseline by 11% on both the English SemEval-2010 Task 8 data set and the Chinese judicial theft-judgment document data set, and that the model is competitive with other improved models.
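The word-level attention step described above can be illustrated with a minimal plain-Python sketch (this is not the thesis's actual implementation; the function names, the toy vectors, and the two-dimensional sizes are all illustrative assumptions). Each token vector, such as an LSTM output, is scored against a learned query vector; the scores are softmax-normalized into weights, and the weighted sum forms the sentence representation passed on to the classifier:

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def word_attention(token_vectors, query):
    """Word-level attention sketch: score each token vector against a
    (learned) query, normalize the scores with softmax, and return the
    attention-weighted sentence vector along with the weights."""
    scores = [dot(t, query) for t in token_vectors]
    weights = softmax(scores)
    dim = len(token_vectors[0])
    sentence = [sum(w * t[i] for w, t in zip(weights, token_vectors))
                for i in range(dim)]
    return sentence, weights

# Toy example: three 2-d token vectors (standing in for LSTM outputs)
# and a query vector; the token most aligned with the query gets the
# largest attention weight.
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 1.0]
sentence_vec, attn = word_attention(tokens, query)
print(attn)
```

In a trained model the query would be a learned parameter and the token vectors high-dimensional, but the weighting mechanism that "increases the weight of keywords" is exactly this normalize-and-sum pattern.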
Keywords/Search Tags: Entity recognition, relation extraction, pre-training model, joint learning