With the development and application of knowledge graphs, relation extraction algorithms, an important means of constructing and completing knowledge graphs, have advanced rapidly in the past few years. Because large-scale labeled data can be obtained without spending much cost or time, distantly supervised relation extraction has gradually become the mainstream task in relation extraction. However, because the assumption underlying distant supervision is too strong, a lot of noise is introduced during data labeling. The main goal of this task is therefore to model the data so as to mitigate the noise in the labeled corpus, while mining more semantic features that capture the relation between entities. With the development of deep learning, researchers have proposed a number of methods to improve the performance of distantly supervised relation extraction. On the one hand, many researchers have analyzed the characteristics of distantly supervised corpora to mine the inherent connections and properties of the data itself. On the other hand, some researchers use external knowledge to improve relation extraction. From these two perspectives, this paper explores problems in distantly supervised relation extraction and proposes solutions.

From the perspective of the data itself, existing approaches treat the instances in the same bag independently and ignore semantic structural information. We propose ICRE, a novel GCN-based model that incorporates instance correlations to improve relation extraction. For each bag, the model first builds a graph from the dependency tree of each instance in the bag; in this way, correlations between instances are established through their common words. The learned node (word) embeddings, which encode bag-level information, are then fed into a sentence encoder, i.e., a text CNN, to obtain better sentence representations. Besides, an instance-level attention mechanism is introduced to select valid instances and learn a textual relation embedding, which is finally used to train our relation classifier. Experiments on two benchmark datasets demonstrate that our model significantly outperforms the compared baselines.

Additionally, existing approaches treat labels as independent and meaningless one-hot vectors, which causes a loss of potential label information for selecting valid instances. We propose RELE, a novel multi-layer attention-based model that improves relation extraction with joint label embedding. The model makes full use of both structural information from Knowledge Graphs (KGs) and textual information from entity descriptions to learn label embeddings through gating integration, while suppressing the introduced noise with an attention mechanism. The learned label embeddings are then used as another attention over the instances (whose embeddings are also enhanced with the entity descriptions) for improving relation extraction. Extensive experiments demonstrate that our model significantly outperforms state-of-the-art methods.
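The graph-convolution step at the heart of ICRE can be illustrated with a minimal NumPy sketch. This is not the paper's actual implementation; the function name, dimensions, and the symmetric normalization with self-loops (the standard Kipf-and-Welling formulation) are illustrative assumptions about how node (word) embeddings would be propagated over the bag-level word graph:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One illustrative graph-convolution step over a bag's word graph.

    A: (n, n) adjacency of the shared word graph (1 where two words
       are linked by a dependency edge or co-occur across instances)
    X: (n, d_in) initial word features
    W: (d_in, d_out) learnable weight matrix (random here)
    """
    A = A + np.eye(A.shape[0])               # add self-loops
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A @ D_inv_sqrt      # symmetric normalization
    return np.maximum(A_hat @ X @ W, 0.0)    # ReLU activation

# Toy graph of 3 words in a bag: word 1 is shared by two instances.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.ones((3, 4))
W = np.ones((4, 2))
H = gcn_layer(A, X, W)   # (3, 2) updated word embeddings
```

Each output row is a word embedding that has mixed in information from its neighbors, so words shared between instances carry bag-level context into the downstream CNN encoder.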
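The instance-level attention used to select valid instances can likewise be sketched in a few lines. This is a generic selective-attention sketch under the assumption that each instance is scored against a relation query vector; the names `bag_attention`, `S`, and `q` are hypothetical, not taken from the paper:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())   # subtract max for numerical stability
    return e / e.sum()

def bag_attention(S, q):
    """Weight the instances in a bag by relevance to a relation query.

    S: (n, d) instance (sentence) encodings
    q: (d,) relation query vector
    Returns the weighted bag representation and the attention weights.
    """
    alpha = softmax(S @ q)    # one weight per instance
    return alpha @ S, alpha
```

Noisy instances that match the query poorly receive low weights, so they contribute little to the bag representation fed to the relation classifier.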
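The gating integration in RELE, which blends structural KG information with textual information from entity descriptions, can be sketched as an element-wise learned gate. This is a minimal illustration, not RELE's exact parameterization; the shapes of `W_g` and `b_g` and the sigmoid gate form are assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_fusion(e_s, e_d, W_g, b_g):
    """Blend a structural embedding e_s and a description embedding e_d.

    e_s, e_d: (d,) the two views of the same label/entity
    W_g: (d, 2d), b_g: (d,) learnable gate parameters (random here)
    """
    g = sigmoid(W_g @ np.concatenate([e_s, e_d]) + b_g)  # gate in (0, 1)
    return g * e_s + (1.0 - g) * e_d                     # convex blend per dimension
```

Because the gate lies in (0, 1), each output dimension is a convex combination of the two views, letting the model decide per dimension how much to trust the KG structure versus the description text.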