
Application Of Relation Extraction Based On Attention And Graph Convolutional Network

Posted on: 2022-06-27    Degree: Master    Type: Thesis
Country: China    Candidate: J Xu    Full Text: PDF
GTID: 2518306557467994    Subject: Software engineering
Abstract/Summary:
With the arrival of the Big Data era, large amounts of information are presented to users in semi-structured or unstructured forms, and providing accurate, valuable information through textual analysis models has become a hot issue. As an important component of information extraction, relation extraction obtains finer-grained semantic information and provides data support for downstream tasks, so it has great value in both research and application. Building on LSTM, this thesis designs two relation extraction methods that combine the attention mechanism with graph convolution, optimizing the model architecture from different perspectives. The main research contents and innovations are as follows:

1. A graph convolutional network for relation extraction based on BiLSTM and the attention mechanism (AL-GCN) is proposed. An attention layer is added at the upper level of the model, combining the BiLSTM output, the GCN output, and the global position features of the sentence to construct a new sentence vector representation. The attention mechanism lets the model focus on the words that contribute most to relation extraction through a reasonable weight distribution strategy, while also providing richer global high-level semantic features. Experimental results show that, compared with traditional machine learning methods, the proposed method effectively integrates the different levels of information extracted by the long short-term memory network and the graph convolutional network.

2. A relation extraction model based on an adaptive graph convolutional network (A-GCN) is proposed. Dependency parses contain rich grammatical structure, yet many works introduce this structural information through a simple 0/1 adjacency matrix, ignoring the influence of different dependency relations between nodes and treating all dependency relations equally. The A-GCN model uses the dependency relations between nodes, the node vector representations, and other features to build a learnable adjacency matrix. By modeling the adjacency matrix, the model can capture the different contributions of different dependencies as feature information propagates between nodes. Experimental results show that, compared with graph convolutional relation extraction models using a 0/1 adjacency matrix, the proposed model learns more grammatical features and effectively improves extraction accuracy.
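The following is a minimal PyTorch sketch of the kind of attention layer described for AL-GCN: it fuses BiLSTM states, GCN states, and position features, then builds a sentence vector from attention weights over tokens. The class name `ALGCNAttention`, the tensor shapes, and the scoring form are illustrative assumptions, not the thesis's exact implementation.

```python
# Sketch of an AL-GCN-style attention layer (assumed shapes and names).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ALGCNAttention(nn.Module):
    """Fuses BiLSTM, GCN and position features into one sentence vector."""

    def __init__(self, lstm_dim, gcn_dim, pos_dim):
        super().__init__()
        fused_dim = lstm_dim + gcn_dim + pos_dim
        # One score per token; softmax over tokens gives the weight distribution.
        self.score = nn.Linear(fused_dim, 1)

    def forward(self, lstm_out, gcn_out, pos_feat, mask):
        # lstm_out: (B, T, lstm_dim), gcn_out: (B, T, gcn_dim),
        # pos_feat: (B, T, pos_dim), mask: (B, T) with 1 for real tokens.
        fused = torch.cat([lstm_out, gcn_out, pos_feat], dim=-1)      # (B, T, fused_dim)
        scores = self.score(torch.tanh(fused)).squeeze(-1)            # (B, T)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)                           # attention over tokens
        # Weighted sum of fused token features -> sentence representation.
        sentence = torch.bmm(weights.unsqueeze(1), fused).squeeze(1)  # (B, fused_dim)
        return sentence, weights


if __name__ == "__main__":
    B, T = 2, 10
    layer = ALGCNAttention(lstm_dim=200, gcn_dim=200, pos_dim=30)
    sent, w = layer(torch.randn(B, T, 200), torch.randn(B, T, 200),
                    torch.randn(B, T, 30), torch.ones(B, T))
    print(sent.shape, w.shape)  # torch.Size([2, 430]) torch.Size([2, 10])
```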
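Likewise, a minimal sketch of the A-GCN idea is given below: instead of a fixed 0/1 adjacency matrix, each dependency edge is scored from the two node representations and a dependency-relation embedding, giving a learnable, edge-weighted adjacency. The class name `AdaptiveGCNLayer` and the concrete scoring function are assumptions for illustration only.

```python
# Sketch of an adaptive GCN layer with a learnable adjacency matrix (assumed form).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveGCNLayer(nn.Module):
    """Replaces the 0/1 adjacency matrix with weights learned from node pairs
    and their dependency-relation embeddings."""

    def __init__(self, node_dim, num_dep_relations, rel_dim=30):
        super().__init__()
        self.rel_emb = nn.Embedding(num_dep_relations, rel_dim)
        # Scores an edge from [head node; tail node; relation embedding].
        self.edge_score = nn.Linear(2 * node_dim + rel_dim, 1)
        self.transform = nn.Linear(node_dim, node_dim)

    def forward(self, h, dep_rel_ids, adj_mask):
        # h: (B, T, node_dim) node representations
        # dep_rel_ids: (B, T, T) dependency-relation id for each node pair
        # adj_mask: (B, T, T) 1 where an edge exists in the dependency parse
        B, T, D = h.shape
        hi = h.unsqueeze(2).expand(B, T, T, D)            # head of each pair
        hj = h.unsqueeze(1).expand(B, T, T, D)            # tail of each pair
        rel = self.rel_emb(dep_rel_ids)                   # (B, T, T, rel_dim)
        score = self.edge_score(torch.cat([hi, hj, rel], dim=-1)).squeeze(-1)  # (B, T, T)
        score = score.masked_fill(adj_mask == 0, float("-inf"))
        # Learned, row-normalised adjacency: each dependency edge gets its own weight.
        adj = torch.nan_to_num(F.softmax(score, dim=-1))  # rows with no edges become zeros
        return F.relu(self.transform(torch.bmm(adj, h)))  # standard GCN propagation
```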
Keywords/Search Tags:Relation Extraction, Deep Learning, Attention Mechanism, Graph Convolutional Network, Long Short-Term Memory