
Research On Relation Extraction Models Based On Neural Networks

Posted on: 2021-04-27
Degree: Doctor
Type: Dissertation
Country: China
Candidate: S F Lv
Full Text: PDF
GTID: 1368330605979416
Subject: Computer application technology
Abstract/Summary:
With the rapid development of the Internet, a large number of unstructured texts are scattered across it. Entity and relation extraction extracts entities and the semantic relations among them from these unstructured texts, effectively transforming unstructured text into computer-friendly structured data. Relation extraction is a sub-task of entity and relation extraction: it extracts the relations between entities after the entities have been recognized from the text. This dissertation focuses on relation extraction over sentence-level text containing two entities, under the supervised learning setting.

In recent years, with the development of deep learning, relation extraction models have made great progress. However, several open questions remain, including but not limited to: (1) how to efficiently exploit dependency trees parsed from sentences, (2) how to effectively handle the special relation, and (3) how to judge whether models recognize the directionality of a relation. These issues are investigated in this dissertation, and our contributions are summarized as follows:

(1) Research on how to efficiently exploit dependency trees parsed from sentences. A novel model, Self-Attention over Tree for Relation Extraction (SATRE), is proposed in this dissertation. SATRE consists of several sub-modules: long short-term memory (LSTM) networks, self-attention over tree (SAT), a linear combination layer, a pooling layer, a feed-forward neural network (FFNN), and a softmax layer. Word vectors are fed into the LSTM and into SAT to capture word-order information and the structured information of dependency trees, respectively. The outputs of the LSTM and of SAT are then combined by a linear layer to obtain word representations, which are fed into a pooling layer followed by the FFNN to obtain the relation representation; this representation is passed to a softmax layer to predict a relation. Experimental results on two public datasets empirically demonstrate the effectiveness of the proposed model in comparison with current models. In particular, extensive experiments indicate the data efficiency of SATRE in exploiting the training data.

(2) Research on how to effectively handle the special relation. A novel auxiliary learning method for relation extraction is introduced in this dissertation. In the model learning phase, an auxiliary learning task is proposed for the special relation (i.e., no_relation). The auxiliary task is a binary classification task that regards the special relation as the negative class and the remaining semantic relations as the positive class. It pays more attention to no_relation through a class-wise cost-sensitive loss, which assigns a higher cost to the misclassification of negative samples than of positive ones. Significant improvements are consistently achieved when eight models are equipped with the auxiliary learning task on two public datasets, demonstrating the effectiveness of the proposed method.

(3) Research on how to judge whether models recognize the directionality of a relation. A novel evaluation task, Relation Direction Recognition (RDR), is proposed in this dissertation. RDR reflects the difference or consistency between the performances a trained model achieves on a pair of paired test sets. Three metrics for RDR, namely Performance Difference (PD), Predictive Immobility Rate (PIR), and Paired Predictive Rate (PPR), are introduced to measure the degree to which models recognize the directionality of a relation. Several state-of-the-art methods are evaluated under RDR. Experimental results show clear gaps among these methods even though they achieve similar scores on the traditional metric (i.e., Macro-F1), demonstrating the effectiveness of the proposed task.
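To make the SATRE pipeline in contribution (1) concrete, the sketch below shows a toy forward pass over per-word representations. It is only an illustration of the data flow described in the abstract; the function name `satre_forward`, the scalar mixing weight `alpha`, and the toy `ffnn` callable are all assumptions, and real LSTM/SAT modules would produce the input vectors.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def satre_forward(lstm_out, sat_out, alpha, ffnn):
    """Toy SATRE-style forward pass (illustrative, not the dissertation's code).

    lstm_out, sat_out: aligned lists of per-word vectors (lists of floats)
        from the LSTM (word order) and SAT (dependency-tree structure).
    alpha: scalar stand-in for the learned linear combination layer.
    ffnn: callable mapping the pooled vector to relation scores.
    """
    # Linear combination of sequential (LSTM) and tree (SAT) information.
    words = [[alpha * l + (1 - alpha) * s for l, s in zip(lv, sv)]
             for lv, sv in zip(lstm_out, sat_out)]
    # Max pooling over words yields a fixed-size relation representation.
    pooled = [max(dim) for dim in zip(*words)]
    # FFNN + softmax predict a distribution over relation labels.
    probs = softmax(ffnn(pooled))
    return probs.index(max(probs)), probs
```

In a real implementation the combination weights, the FFNN, and the SAT attention would all be trained jointly; here they are fixed so the data flow stays visible.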
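The class-wise cost-sensitive loss in contribution (2) can be sketched as a weighted binary cross-entropy in which misclassifying a no_relation (negative) sample costs more than misclassifying a semantic-relation (positive) sample. The function name `cost_sensitive_bce` and the specific cost values are illustrative assumptions; the dissertation's exact loss may differ.

```python
import math

def cost_sensitive_bce(p_positive, is_semantic, neg_cost=2.0, pos_cost=1.0):
    """Class-wise cost-sensitive binary cross-entropy (illustrative sketch).

    p_positive: predicted probability that the sample carries a semantic
        relation (positive class); no_relation is the negative class.
    is_semantic: True if the gold label is a semantic relation.
    neg_cost > pos_cost makes errors on no_relation samples more expensive,
    focusing the auxiliary task on the special relation.
    """
    eps = 1e-12  # guard against log(0)
    if is_semantic:
        return -pos_cost * math.log(p_positive + eps)
    return -neg_cost * math.log(1.0 - p_positive + eps)
```

With the default costs, an equally confident mistake on a no_relation sample incurs twice the loss of the same mistake on a semantic-relation sample.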
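The abstract names the three RDR metrics of contribution (3) but does not give their formal definitions, so the sketch below is one plausible reading based only on the metric names: PD as the performance gap between the paired test sets, PIR as the fraction of predictions that stay unchanged when entity order is reversed, and PPR as the fraction predicted as the correctly direction-swapped relation. The function `rdr_metrics` and the `swap` mapping are hypothetical.

```python
def rdr_metrics(preds_a, preds_b, score_a, score_b, swap):
    """Illustrative RDR metrics on a pair of paired test sets (assumed
    definitions; see the dissertation for the formal ones).

    preds_a / preds_b: model predictions on test set A and on its
        direction-reversed counterpart B, aligned sample-by-sample.
    score_a / score_b: the model's score (e.g. Macro-F1) on each set.
    swap: mapping from a relation label to its direction-reversed label.
    """
    n = len(preds_a)
    # PD: performance difference between the paired sets.
    pd_ = abs(score_a - score_b)
    # PIR: fraction of samples whose prediction does not move at all
    # when the entity order is reversed (the model ignores direction).
    pir = sum(a == b for a, b in zip(preds_a, preds_b)) / n
    # PPR: fraction of samples predicted as a properly paired
    # (direction-swapped) relation across the two sets.
    ppr = sum(swap.get(a) == b for a, b in zip(preds_a, preds_b)) / n
    return pd_, pir, ppr
```

Under this reading, a direction-blind model would show a high PIR and a low PPR even when its Macro-F1 on either set alone looks competitive.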
Keywords/Search Tags:Relation Extraction, Neural Networks, Dependency Tree, Self-Attention over Tree, Special Relation, Auxiliary Learning, Relation Direction Recognition