
Entity-Relation Joint Extraction Fusing Attention Mechanism And Graph Neural Network

Posted on: 2022-09-11
Degree: Master
Type: Thesis
Country: China
Candidate: D L Gao
Full Text: PDF
GTID: 2518306533972809
Subject: Control Engineering
Abstract/Summary:
With the advent of the big data era, the amount of unstructured text is growing rapidly, and relying on manual effort to extract valuable information from it is no longer realistic. How to automatically extract valuable information from massive unstructured texts has therefore become a research hotspot and a major challenge in natural language processing. Entity recognition and relation extraction, two fundamental tasks in this field, aim to automatically identify entities in unstructured text and accurately determine the relations between them. They underpin higher-level tasks such as question answering and knowledge graph construction, and thus have important research significance. This thesis focuses on entity recognition and relation extraction, analyzes and addresses the shortcomings of current mainstream methods, and builds relation extraction models tailored to the characteristics of Chinese and English corpora. The main work includes the following two aspects:

(1) For the single-task setting of relation extraction on English text, this thesis proposes a new graph neural network structure, the Attention Graph Long Short-Term Memory network (AGLSTM). The model uses an LSTM to extract the initial temporal context features of a sentence, and a graph LSTM (GLSTM) to extract initial semantic structure information from the sentence's dependency parse tree. A soft pruning strategy based on a multi-head attention mechanism is introduced to learn the sentence structure information that is actually useful for relation extraction. By effectively fusing the semantic structure and temporal information of the sentence, the model alleviates the information loss and error propagation caused by existing methods' over-reliance on natural language processing tools. The model is evaluated on the TACRED and SemEval datasets and compared with a range of typical methods, and the experiments verify its effectiveness.

(2) For the multi-task setting of joint entity-relation extraction on Chinese text, an attention graph convolutional network based on an ETL-paradigm decomposition tagging strategy (ETL-Attention Graph Convolutional Network, ETL-AGCN) is proposed. To address the entity-overlap and information-redundancy problems of traditional joint extraction methods, a decomposition tagging strategy based on the ETL paradigm is introduced. To extract mixed features covering characters, words, and sentence structure simultaneously, a new char-word mixing method is designed that combines a word-level attention mechanism with a GCN. In addition, a char-level attention mechanism is applied to the mixed features to capture the temporal context of the sentence from a global perspective. The effective integration of these strategies improves the model's ability to process Chinese text. The model is evaluated on the SKE dataset and compared with a range of typical methods, and the experiments verify its effectiveness.
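To make the soft-pruning idea in contribution (1) concrete, the sketch below shows one common way an attention-guided graph layer can be implemented in PyTorch: multi-head attention scores replace the hard 0/1 dependency-tree adjacency matrix, and a graph layer propagates token states along the resulting soft adjacency. This is a minimal illustration under assumed module and dimension names, not the thesis' actual AGLSTM code.

```python
# Minimal sketch: multi-head attention as "soft pruning" of a dependency graph.
# All class and variable names here are illustrative assumptions.
import torch
import torch.nn as nn


class SoftPruningAttention(nn.Module):
    def __init__(self, hidden_dim: int, num_heads: int):
        super().__init__()
        self.num_heads = num_heads
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: [batch, seq_len, hidden_dim] contextual token states (e.g. from an LSTM)
        b, n, d = h.shape
        dk = d // self.num_heads
        q = self.query(h).view(b, n, self.num_heads, dk).transpose(1, 2)
        k = self.key(h).view(b, n, self.num_heads, dk).transpose(1, 2)
        # scores: [batch, heads, seq_len, seq_len] -- one soft adjacency per head
        scores = torch.matmul(q, k.transpose(-2, -1)) / dk ** 0.5
        return torch.softmax(scores, dim=-1)


class AttentionGuidedGraphLayer(nn.Module):
    """One graph-convolution step driven by the soft adjacency matrices."""

    def __init__(self, hidden_dim: int, num_heads: int):
        super().__init__()
        self.attn = SoftPruningAttention(hidden_dim, num_heads)
        self.linear = nn.Linear(hidden_dim * num_heads, hidden_dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        adj = self.attn(h)                               # [b, heads, n, n]
        # Propagate node states along each head's soft adjacency, then merge heads.
        per_head = torch.matmul(adj, h.unsqueeze(1))     # [b, heads, n, d]
        merged = per_head.transpose(1, 2).reshape(h.size(0), h.size(1), -1)
        return torch.relu(self.linear(merged))


# Example usage with hypothetical dimensions:
# h = torch.randn(2, 10, 128)                  # states from an upstream (Bi)LSTM encoder
# layer = AttentionGuidedGraphLayer(128, 4)
# out = layer(h)                               # [2, 10, 128]
```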
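Similarly, for contribution (2), the following sketch illustrates one possible form of a decomposition tagging head: head-entity (subject) spans are tagged first, and tail-entity spans are then tagged per relation, conditioned on a selected subject. The exact ETL tag scheme, encoder, and char-word mixing used in the thesis may differ; all names below are assumptions for illustration.

```python
# Minimal sketch of a two-stage decomposition tagging head for joint extraction.
import torch
import torch.nn as nn


class DecompositionTagger(nn.Module):
    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        # Stage 1: start/end pointers for head entities (subjects).
        self.subj_start = nn.Linear(hidden_dim, 1)
        self.subj_end = nn.Linear(hidden_dim, 1)
        # Stage 2: per-relation start/end pointers for tail entities (objects).
        self.obj_start = nn.Linear(hidden_dim, num_relations)
        self.obj_end = nn.Linear(hidden_dim, num_relations)

    def forward(self, h: torch.Tensor, subj_repr: torch.Tensor):
        # h:         [batch, seq_len, hidden_dim] mixed char/word features
        # subj_repr: [batch, hidden_dim] pooled representation of one chosen subject
        subj_start = torch.sigmoid(self.subj_start(h)).squeeze(-1)   # [b, n]
        subj_end = torch.sigmoid(self.subj_end(h)).squeeze(-1)       # [b, n]
        # Condition token states on the selected subject before tail tagging.
        h_cond = h + subj_repr.unsqueeze(1)
        obj_start = torch.sigmoid(self.obj_start(h_cond))            # [b, n, R]
        obj_end = torch.sigmoid(self.obj_end(h_cond))                # [b, n, R]
        return subj_start, subj_end, obj_start, obj_end
```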
Keywords/Search Tags:named entity recognition, relation extraction, attention mechanism, graph neural network, joint extraction