
Research On Semantic Relation Classification Based On Context-Aware Neural Networks

Posted on: 2018-06-18
Degree: Master
Type: Thesis
Country: China
Candidate: Y F Ren
Full Text: PDF
GTID: 2348330512983575
Subject: Computer software and theory
Abstract/Summary:
Relation classification is associated with many potential applications in the artificial intelligence area. Recently, more and more approaches leverage neural networks built on structure features, such as syntactic or dependency features, to solve this problem. However, structure features are costly to obtain, which makes such approaches inconvenient to apply directly. In addition, structure features are probably domain-dependent.

This paper proposes a bi-directional long short-term memory recurrent neural network (BiLSTM) model based on surrounding contexts to address relation classification. Our motivation is that the relation between two target entities can be represented directly by the entities themselves and the contexts surrounding them. In our model, the BiLSTM performs bi-directional recurrent computation along all the tokens of the sentences that the relation spans. The resulting sequence of token representations is then divided into five parts according to the order in which the tokens occur in these sentences. Standard pooling functions are applied over the token representations of each part, yielding five representations corresponding to the five parts. Finally, these representations are concatenated and fed into a softmax layer for relation classification. To avoid the need for structure features, our model uses only low-cost sequence features such as words and part-of-speech (POS) tags.

We evaluate our model on two standard benchmark datasets, SemEval-2010 Task 8 and BioNLP-ST 2016 Task BB3. On the former, our model achieves performance comparable to that of other models using sequence features. On the latter, our model obtains the third-best result among the systems in the official evaluation. Moreover, we find that the context between the two target entities plays the most important role in relation classification. Furthermore, statistical experiments show that the context between the two target entities can serve as an approximate replacement for the shortest dependency path when dependency parsing is not available.
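The following is a minimal sketch of the architecture described above, assuming a PyTorch implementation. The class name FivePartBiLSTM, the argument names e1_pos and e2_pos, the use of max pooling, and the reading of the five parts as before-entity-1 / entity-1 / between / entity-2 / after-entity-2 are illustrative assumptions, not the thesis's own code.

    # Sketch of a context-aware BiLSTM relation classifier: word + POS embeddings,
    # a BiLSTM over the full token sequence, five-part segmentation of the hidden
    # states around the two target entities, per-part pooling, and a linear layer
    # whose logits are normalized by a softmax (applied inside the loss function).
    import torch
    import torch.nn as nn

    class FivePartBiLSTM(nn.Module):
        def __init__(self, vocab_size, pos_vocab_size, emb_dim=100, pos_dim=25,
                     hidden_dim=100, num_classes=19):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, emb_dim)
            self.pos_emb = nn.Embedding(pos_vocab_size, pos_dim)   # POS-tag embeddings
            self.bilstm = nn.LSTM(emb_dim + pos_dim, hidden_dim,
                                  batch_first=True, bidirectional=True)
            # five pooled segments of size 2*hidden_dim each are concatenated
            self.classifier = nn.Linear(5 * 2 * hidden_dim, num_classes)

        def forward(self, words, pos_tags, e1_pos, e2_pos):
            # words, pos_tags: (seq_len,) token and POS-tag ids for one instance;
            # e1_pos, e2_pos: token indices of the two target entities (assumed single-token)
            x = torch.cat([self.word_emb(words), self.pos_emb(pos_tags)], dim=-1)
            h, _ = self.bilstm(x.unsqueeze(0))        # (1, seq_len, 2*hidden_dim)
            h = h.squeeze(0)
            # split hidden states into five parts in token order:
            # before e1, e1 itself, between e1 and e2, e2 itself, after e2
            parts = [h[:e1_pos], h[e1_pos:e1_pos + 1],
                     h[e1_pos + 1:e2_pos], h[e2_pos:e2_pos + 1], h[e2_pos + 1:]]
            pooled = [p.max(dim=0).values if p.numel() else h.new_zeros(h.size(1))
                      for p in parts]                 # max pooling over each part
            features = torch.cat(pooled, dim=-1)
            return self.classifier(features)          # class logits

In this sketch the logits would typically be trained with a cross-entropy loss, which applies the softmax described in the abstract; num_classes=19 reflects the SemEval-2010 Task 8 label set, but it is a constructor parameter and can be set to match any dataset.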
Keywords/Search Tags: Relation classification, Bi-directional long short-term memory recurrent neural network, Sequence features, Structure features