
Research On Key Issues In Entity Linking

Posted on: 2021-01-08
Degree: Master
Type: Thesis
Country: China
Candidate: M Y Li
GTID: 2428330605974884
Subject: Software engineering
Abstract/Summary:
Entity linking is a fundamental task in Natural Language Processing (NLP). Accurate linking results help construct high-quality knowledge graphs, and they also play an important role in other NLP applications such as question answering, semantic search, and information extraction. The entity linking task aims to align the mentions in a text with the entities in a knowledge base. It mainly consists of two components: the knowledge base and the entity linking model. On the knowledge base side, increasing scale does not improve linking performance; instead, it introduces more noise and consumes more resources. On the model side, existing models do not sufficiently mine the information in the text itself and cannot keep pace with larger, more informative knowledge bases; moreover, they rely too heavily on algorithms of high complexity, which causes problems such as repetitive computation and excessive resource usage. In this paper, we introduce three approaches to address the above problems.

(1) To strip off domain-independent noise and improve the quality of the knowledge base, we propose a coarse-to-fine knowledge base extraction approach. Experimental results show that our KB extraction approach saves about 70% of storage space and about 60% of model running time without affecting linking performance. Moreover, our KB extraction also improves the accuracy of domain-specific entity linking.

(2) To mine the information of both the KB and the text itself more thoroughly, we propose a neural entity linking model that combines local and global information. In particular, a highway network is employed to bridge keyword information, and a self-attention mechanism is used to capture sequential information. Experimental results show that our approach better captures the latent information in the text and thus improves linking performance. Our neural model achieves the best performance using the latest general KB.

(3) To reduce computational complexity, repetitive calculation, and resource usage, we propose a global optimization linking model based on reinforcement learning. In particular, global disambiguation is cast as a sequential decision task, and a reward-and-punishment mechanism with three different decision strategies is employed to find the optimal global entity linking. Experimental results show the effectiveness of the proposed approach: it improves linking performance while greatly reducing complexity.
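The abstract does not spell out the coarse-to-fine extraction criteria, so the following is only an illustrative sketch under assumed rules: a coarse stage keeps entities whose descriptions contain at least one domain keyword, and a fine stage keeps only entities that actually appear as a candidate for some mention in the target corpus. All names and filtering criteria here are hypothetical.

```python
def extract_kb(entities, domain_keywords, mention_candidates):
    """Two-stage KB extraction sketch; the criteria are illustrative
    assumptions, not the thesis's actual rules."""
    # Coarse stage: keep only entities whose description mentions at
    # least one domain keyword, discarding domain-independent noise.
    coarse = {
        eid: desc for eid, desc in entities.items()
        if any(kw in desc for kw in domain_keywords)
    }
    # Fine stage: further restrict to entities that are reachable as a
    # candidate of at least one mention in the target corpus.
    reachable = {e for cands in mention_candidates.values() for e in cands}
    return {eid: desc for eid, desc in coarse.items() if eid in reachable}
```

Any real extraction pipeline would replace these keyword and reachability tests with the thesis's own coarse and fine criteria; the point is only the two-stage shrinking of the KB.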
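For readers unfamiliar with the two building blocks of the neural model, the following minimal NumPy sketch shows generic scaled dot-product self-attention and a single highway layer. It is a textbook formulation, not the thesis's specific architecture; all weight names are placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over a length-T sequence X (T, d):
    # every position attends to every other position.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

def highway(x, Wh, Wt, bh, bt):
    # Highway layer: a sigmoid transform gate t mixes the transformed
    # signal h with the untouched input x (carry gate is 1 - t).
    h = np.tanh(x @ Wh + bh)
    t = 1.0 / (1.0 + np.exp(-(x @ Wt + bt)))
    return t * h + (1.0 - t) * x
```

When the transform gate saturates near zero, the highway layer passes its input through unchanged, which is what lets it "bridge" information across layers.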
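To make the "sequential decision" framing concrete, here is a toy sketch in which mentions are linked one at a time and each step's reward combines local mention-entity evidence with coherence to the entities chosen so far. A greedy argmax stands in for the learned RL policy; the scoring functions and entity names are invented for illustration.

```python
def link_sequentially(mentions, candidates, local_score, coherence):
    # Global disambiguation cast as a sequence of decisions: for each
    # mention in order, pick the candidate whose immediate reward
    # (local evidence plus coherence with entities linked so far) is
    # highest. A trained policy would replace this greedy argmax.
    chosen, total_reward = [], 0.0
    for m in mentions:
        best_e, best_r = None, float("-inf")
        for e in candidates[m]:
            r = local_score(m, e) + sum(coherence(e, p) for p in chosen)
            if r > best_r:
                best_e, best_r = e, r
        chosen.append(best_e)
        total_reward += best_r
    return chosen, total_reward
```

In this framing, coherence lets an early decision steer later ones: once "jordan" is linked to the basketball player, a "chicago" mention is rewarded for choosing the team rather than the city.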
Keywords/Search Tags:entity linking, knowledge base extraction, self-attention mechanism, highway network, reinforcement learning