
Relation Extraction with an Attention-Guided Graph LSTM Based on Dependency Trees

Posted on: 2022-08-24    Degree: Master    Type: Thesis
Country: China    Candidate: C L Li    Full Text: PDF
GTID: 2518306332453404    Subject: Computer technology
Abstract/Summary:
With the advent of the Internet era, big data technology has emerged, marking the beginning of an age in which we receive information everywhere and at all times. A central challenge is how to obtain useful information from this flood of data and transform semi-structured or unstructured data into structured data that machines can understand, learn from, and store. The core task of relation extraction is to classify the relationship between a given pair of entities in a sentence, which makes it an important problem in natural language processing.

One line of research on relation extraction exploits the rich structural information contained in dependency trees, and this information has proven very useful for the task. However, effectively using the syntactic relations in the dependency tree while ignoring irrelevant information remains a challenging problem. Most existing methods apply pre-defined pruning strategies to selectively retain part of the dependency structure, but such rule-based hard pruning greatly reduces how much of the dependency tree's information is actually used and does not always produce the best results.

We therefore propose an attention-guided graph LSTM network (AGGLSTM), a novel model that feeds the complete dependency tree structure into a neural network. When processing the dependency tree, the model uses a multi-head self-attention mechanism to transform the original tree into a fully connected, edge-weighted graph. This amounts to a soft pruning strategy: the model can mine the effective information in the dependency tree while filtering out useless information. The graph-based LSTM network converts the input vectors into a document graph that integrates all intra- and inter-sentence dependencies, and automatically learns to focus selectively on the substructures relevant to relation extraction. Without losing information, every word in the sentence can be modeled and learned in parallel, and word states are cyclically enriched during message passing: after each round of state transitions, a word passes information to the words it depends on or is directly connected to, so that after multiple rounds each word captures more contextual information. The model combines both local and non-local dependency features in the dependency tree, further extracts the key information, and finally feeds that information into a classifier to obtain the relation label.

Most existing research on relation extraction focuses on binary relations within a single sentence. Although excellent results have been achieved in that setting, a single sentence sometimes cannot provide enough disambiguating context, leading to unsatisfactory performance in high-value domains such as biomedicine. The main research object of this thesis is therefore the extraction of cross-sentence binary and n-ary relations from instances containing multiple sentences. Experiments show that our model makes better use of the structural information in the dependency tree for cross-sentence n-ary relation extraction and achieves better classification results. It also outperforms other models under various interference conditions, such as reduced sentence length and a reduced maximum number of neighbors.
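To make the soft-pruning idea concrete, the following is a minimal PyTorch sketch of how multi-head self-attention can turn word representations into fully connected, edge-weighted graphs, one per head. All class and parameter names here are illustrative assumptions, not the thesis's actual AGGLSTM code.

```python
import torch
import torch.nn as nn

class AttentionGuidedAdjacency(nn.Module):
    """Soft pruning sketch: replace the hard 0/1 dependency adjacency
    with dense, learned edge weights over all word pairs."""

    def __init__(self, hidden_dim: int, num_heads: int):
        super().__init__()
        assert hidden_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_dim // num_heads
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, n, hidden_dim) contextual word representations
        b, n, _ = h.shape
        q = self.query(h).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.key(h).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product score between every pair of words.
        scores = torch.matmul(q, k.transpose(-2, -1)) / self.head_dim ** 0.5
        # Softmax over the last axis yields, per head, a dense edge-weight
        # matrix: every word pair stays connected, but with learned weights,
        # which is the "soft pruning" of the dependency tree.
        return torch.softmax(scores, dim=-1)  # (batch, heads, n, n)
```

In contrast to rule-based hard pruning, no edge is ever discarded outright; uninformative edges simply receive near-zero weights, so the usable information in the tree is preserved.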
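The message-passing rounds of the graph LSTM can be sketched in the same spirit. Below, each word updates its hidden and cell state in parallel from an edge-weighted aggregate of its neighbors' states; stacking several such rounds lets every word absorb multi-hop context. The gate layout (a standard LSTM cell fed with the neighbor message) is an assumption for illustration, not the thesis's exact formulation; a single (batch, n, n) adjacency can be obtained, e.g., by averaging the attention heads above.

```python
import torch
import torch.nn as nn

class GraphLSTMRound(nn.Module):
    """One round of graph LSTM state transition over an
    edge-weighted word graph (illustrative sketch)."""

    def __init__(self, dim: int):
        super().__init__()
        # Standard LSTM cell whose "previous hidden state" is the
        # aggregated neighbour message rather than the word's own state.
        self.cell = nn.LSTMCell(dim, dim)

    def forward(self, x, h, c, adj):
        # x, h, c: (batch, n, dim); adj: (batch, n, n) edge weights
        # Message passing: each word gathers neighbour states,
        # weighted by the attention-derived adjacency matrix.
        m = torch.bmm(adj, h)  # (batch, n, dim)
        b, n, d = x.shape
        # All n word states are updated in parallel in one batch.
        h_new, c_new = self.cell(
            x.reshape(b * n, d),
            (m.reshape(b * n, d), c.reshape(b * n, d)),
        )
        return h_new.view(b, n, d), c_new.view(b, n, d)
```

After several rounds, the per-word states (combining local and non-local dependency features) would be pooled and passed to a classifier to predict the relation label.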
Keywords/Search Tags: Relation extraction, Multi-head self-attention mechanism, Dependency tree, Graph LSTM