
Relation Classification Model Of Mutual Learning Via High Dimensional Attention Mechanism

Posted on: 2021-05-25    Degree: Master    Type: Thesis
Country: China    Candidate: L Q Li    Full Text: PDF
GTID: 2428330611965589    Subject: Computer technology
Abstract/Summary:
Relation classification is a key problem in natural language processing and is of great significance to the construction of question answering systems, recommendation systems, and sentiment classification. If the relational information between entities in a text can be captured effectively, the capability of information extraction improves substantially. Previous studies have shown that both the attention mechanism and the shortest dependency path play a positive role in relation classification. To combine the advantages of these two methods, this paper proposes the keyword-attentive sentence mechanism. It fully integrates the prior knowledge of the shortest dependency path into the attention mechanism, thereby reducing the reliance on large numbers of manually labeled features. At the same time, the keyword-attentive sentence mechanism extends the dimensionality of the traditional attention mechanism, so that the generated attention weight matrix can adapt to different sentences, and it compensates for the inability of traditional attention to mine the abstract semantics of a sentence.

Furthermore, this paper proposes a synthetic stimulation loss to alleviate the imbalanced classification problem. A large number of negative samples interfere with model learning, and the learned features cannot effectively predict the positive samples, which causes the model to degrade. To address this, the synthetic stimulation loss introduces the highest misclassification score and a modulating factor to increase the weights of hard-to-classify samples, so that the model learns features that are useful for classification.

Because the noise robustness of a single model is limited, this paper also applies mutual learning, letting two student networks teach each other. Finally, based on the posterior entropy of the most likely relation, the method compensates for the limited expressive ability of a single model.

Experiments on the SemEval-2010 Task 8, CoNLL-3R, and TAC40 datasets show that the proposed model achieves state-of-the-art results.
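The abstract describes the synthetic stimulation loss and the mutual learning step only at a high level. The following is a minimal PyTorch sketch of one plausible reading, assuming the loss behaves like a focal-style re-weighting of cross entropy in which the highest score among the wrong classes and a modulating exponent (here called gamma) raise the weight of hard-to-classify samples, and assuming the standard deep-mutual-learning formulation in which each student adds a KL term toward the other student's predictions. All function and parameter names are illustrative and not taken from the thesis; the posterior-entropy-based combination of the two students is not reproduced here.

import torch
import torch.nn.functional as F


def synthetic_stimulation_loss(logits, targets, gamma=2.0):
    # Hedged sketch: a focal-style re-weighted cross entropy.
    # `gamma` and the exact combination below are assumptions, not the thesis formula.
    probs = F.softmax(logits, dim=-1)                            # (batch, num_classes)
    p_true = probs.gather(1, targets.unsqueeze(1)).squeeze(1)    # score of the gold relation

    # "Highest misclassification score": best score among the wrong classes.
    masked = probs.clone()
    masked.scatter_(1, targets.unsqueeze(1), float('-inf'))
    p_wrong_max = masked.max(dim=1).values

    # Modulating factor: samples with a weak gold score and a strong competing
    # class get larger weights; easy samples are down-weighted.
    modulating = (1.0 - p_true + p_wrong_max).clamp(min=0.0) ** gamma
    ce = F.cross_entropy(logits, targets, reduction='none')
    return (modulating * ce).mean()


def mutual_learning_losses(logits_a, logits_b, targets):
    # Hedged sketch of deep mutual learning: each student is trained on its own
    # classification loss plus a KL term toward the other student's distribution.
    ce_a = synthetic_stimulation_loss(logits_a, targets)
    ce_b = synthetic_stimulation_loss(logits_b, targets)

    log_p_a = F.log_softmax(logits_a, dim=-1)
    log_p_b = F.log_softmax(logits_b, dim=-1)

    # KL(student B || student A) teaches student A, and vice versa.
    kl_a = F.kl_div(log_p_a, log_p_b.exp(), reduction='batchmean')
    kl_b = F.kl_div(log_p_b, log_p_a.exp(), reduction='batchmean')
    return ce_a + kl_a, ce_b + kl_b


# Usage sketch with random logits for a 19-class task (SemEval-2010 Task 8 has
# 19 relation labels including "Other").
logits_a, logits_b = torch.randn(32, 19), torch.randn(32, 19)
targets = torch.randint(0, 19, (32,))
loss_a, loss_b = mutual_learning_losses(logits_a, logits_b, targets)

In this sketch each student would be optimized with its own combined loss, which mirrors the abstract's description of two student networks teaching each other; how the two predictions are finally merged via posterior entropy is left to the thesis itself.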
Keywords/Search Tags: Relation Classification, Attention Mechanism, Shortest Dependency Path, Imbalanced Classification Problem, Mutual Learning