Research On Relation Extraction Based On RoBERTa And Multi-Task Learning

Posted on: 2021-02-10
Degree: Master
Type: Thesis
Country: China
Candidate: Z D Zhu
Full Text: PDF
GTID: 2428330611465588
Subject: Computer technology

Abstract/Summary:
With the continuous development of natural language processing, relation extraction has attracted the attention of many researchers as an important subtask of information extraction. Relation extraction can be divided into open-domain relation extraction and domain-specific relation extraction. As the most classic task in relation extraction, domain-specific relation extraction has long attracted researchers' attention. Traditional domain-specific methods are based on template matching or traditional machine learning; they require matching templates or features constructed manually from the corpus, which is time-consuming and labor-intensive and generalizes poorly. With the development of deep learning, researchers have used deep neural networks to extract features from text and have made breakthroughs in relation extraction. However, most current methods use pre-trained word vectors to map the text, so model performance is limited by the semantic modeling capability of those word vectors. Moreover, these models focus only on the training data and neglect external knowledge.

To address these issues, this paper proposes a relation extraction model that combines RoBERTa with semantic role information and is trained with multi-task learning. Our model uses RoBERTa to encode the input sequence, fully extracting the contextual semantic features of the text, and introduces the semantic role tags of the entities as external features to enhance their semantic information. In addition, to strengthen the model's ability to describe the relationship between the text context and the entities, our model uses entity-aware attention to generate a context vector more relevant to the entities, reducing the impact of noisy words in the text. Finally, based on multi-task learning, our model introduces auxiliary tasks and learns the external knowledge contained in other tasks through parameter sharing, which further improves performance.

Experiments show that our model effectively improves relation extraction. On the SemEval-2010 Task 8 dataset, it achieves a Macro-F1 score of 89.94% without any manually defined rules, second only to the 90.36% of Indicator-aware BERT, which requires complex rules to extract syntactic indicators. The experimental results show that both semantic role information and multi-task learning improve the model's relation extraction performance.
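To make the architecture concrete, the following is a minimal sketch of the encoding and entity-aware attention steps in PyTorch with the Hugging Face transformers library. The pooling scheme, attention scoring function, and layer sizes are illustrative assumptions, not the thesis's exact design.

    import torch
    import torch.nn as nn
    from transformers import RobertaModel

    def span_pool(hidden, span_mask):
        # hidden: (B, T, H); span_mask: (B, T), 1 over an entity's tokens.
        w = span_mask.unsqueeze(-1).float()
        return (hidden * w).sum(1) / w.sum(1).clamp(min=1.0)  # (B, H)

    class EntityAwareRE(nn.Module):
        def __init__(self, num_srl_tags, num_relations, srl_dim=50):
            super().__init__()
            self.encoder = RobertaModel.from_pretrained("roberta-base")
            h = self.encoder.config.hidden_size
            # Semantic role tags of the two entities, embedded as external features.
            self.srl_emb = nn.Embedding(num_srl_tags, srl_dim)
            # Entity-aware attention: score each token against both entity vectors.
            self.attn_score = nn.Linear(3 * h, 1)
            self.classifier = nn.Linear(3 * h + 2 * srl_dim, num_relations)

        def forward(self, input_ids, attention_mask, e1_mask, e2_mask, e1_srl, e2_srl):
            hidden = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
            e1 = span_pool(hidden, e1_mask)  # (B, H)
            e2 = span_pool(hidden, e2_mask)  # (B, H)
            q = torch.cat([e1, e2], dim=-1).unsqueeze(1).expand(-1, hidden.size(1), -1)
            scores = self.attn_score(torch.cat([hidden, q], dim=-1)).squeeze(-1)  # (B, T)
            scores = scores.masked_fill(attention_mask == 0, float("-inf"))
            # Context vector weighted toward tokens relevant to the entities.
            ctx = torch.bmm(torch.softmax(scores, dim=-1).unsqueeze(1), hidden).squeeze(1)
            feats = torch.cat([ctx, e1, e2, self.srl_emb(e1_srl), self.srl_emb(e2_srl)], dim=-1)
            return self.classifier(feats)  # relation logits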
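The multi-task step can be sketched in the same spirit. Below, a hypothetical token-tagging auxiliary task shares the RoBERTa encoder with the relation classifier, so gradients from the auxiliary loss also update the shared parameters. The data loaders, tag-set size, and loss weight are illustrative placeholders, not the thesis's actual auxiliary task.

    num_aux_labels = 10   # placeholder size of the auxiliary tag set
    aux_weight = 0.5      # placeholder weight for the auxiliary loss
    model = EntityAwareRE(num_srl_tags=30, num_relations=19)  # 19 classes in SemEval-2010 Task 8
    aux_head = nn.Linear(model.encoder.config.hidden_size, num_aux_labels)
    optimizer = torch.optim.AdamW(
        list(model.parameters()) + list(aux_head.parameters()), lr=2e-5)
    ce = nn.CrossEntropyLoss()

    # re_loader / aux_loader are assumed DataLoaders whose batches carry an
    # "inputs" dict matching each model's forward signature, plus "labels".
    for re_batch, aux_batch in zip(re_loader, aux_loader):
        optimizer.zero_grad()
        logits = model(**re_batch["inputs"])              # main relation task
        loss = ce(logits, re_batch["labels"])
        # The auxiliary task reuses the shared encoder; its gradient updates it too.
        hidden = model.encoder(**aux_batch["inputs"]).last_hidden_state
        aux_logits = aux_head(hidden)                     # (B, T, num_aux_labels)
        loss = loss + aux_weight * ce(aux_logits.reshape(-1, num_aux_labels),
                                      aux_batch["labels"].reshape(-1))
        loss.backward()
        optimizer.step()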
Keywords/Search Tags: Relation Extraction, Natural Language Processing, RoBERTa, Semantic Role Labeling, Multi-task Learning