
Research On Entity Linking Algorithm By Combining The Attention Mechanism And Hidden Semantic Information

Posted on: 2022-12-22    Degree: Master    Type: Thesis
Country: China    Candidate: H B Zhang    Full Text: PDF
GTID: 2518306779996679    Subject: Computer Software and Computer Application
Abstract/Summary:
The goal of entity linking is to link an ambiguous mention to the correct candidate entity, which underpins the overall quality of a knowledge graph. Existing neural entity linking systems are mainly composed of prior probabilities, entity type modeling, and local and global models, which can effectively capture entity type information as well as local and global features. However, most entity linking systems concentrate on optimizing the inference and training processes while ignoring the latent information in the local and global models. As a result, these systems struggle to capture more comprehensive information and may link mentions incorrectly. In view of these problems, the thesis focuses on hidden semantic information extractors and adaptive features, as follows.

(1) To address the difficulty of capturing the information hidden in local and global models, a problem that has so far received little exploration, the thesis proposes an entity linking system with hidden semantic information extractors, built on the joint learning of an extensible vector space module, a hidden semantic information extractor, and a neural attention module. The local, global, and global (neighbor) models are each equipped with their own hidden semantic information extractor, extensible vector space module, and neural attention module. The system collects latent information in the local and global models from multiple perspectives: the extensible vector space module strengthens the connection between the hidden semantic information extractor and the neural attention module, while the neural attention module computes more precise feature scores for candidates according to the latent information captured by the extractor.

(2) The entity linking system with hidden semantic information extractors still suffers from two problems: error accumulation and entity type information modeling. To address them, the thesis jointly learns three perspectives, extracting contextual weight information together with macro- and micro-level contextual latent semantic information, and puts forward a first-level adaptive feature to capture the latent information in the local and global models. Meanwhile, from the perspective of entity type modeling, it proposes a second-level adaptive feature to describe entity type information for four kinds of entity types. In addition, the joint learning of the two-level adaptive features further reduces the influence of uncertain entity type information.

Experimental results show that the proposed entity linking algorithm with hidden semantic information extractors achieves high scores on three simpler datasets, verifying that the hidden semantic information extractor can effectively improve the performance of the entity linking system. The entity linking algorithm with two-level adaptive features improves performance on the other three, more challenging datasets and achieves the highest average performance on the out-of-domain datasets. Compared with the algorithm using hidden semantic information extractors, the algorithm with two-level adaptive features improves performance on all out-of-domain datasets, verifying that the second-level adaptive feature can adaptively capture latent information, mitigate the problem of uncertain entity type information, and improve the generalization ability of the entity linking system.
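The abstract does not give the exact formulation of how the hidden semantic information extractor and the neural attention module interact, so the following is only a minimal sketch under assumed shapes and module names (HiddenSemanticExtractor, LocalScorer, and the 300-dimensional embeddings are illustrative, not the thesis's implementation). It shows one plausible way a local model could fuse an attention summary of the mention context with a latent vector from an extractor to score candidate entities.

```python
# Minimal sketch (PyTorch): attention over context words fused with a
# hidden-semantic latent vector to produce one score per candidate.
# All names, dimensions, and the fusion-by-addition step are assumptions.
import torch
import torch.nn as nn


class HiddenSemanticExtractor(nn.Module):
    """Hypothetical extractor that distils one latent vector from the context."""
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(dim, dim), nn.Tanh())

    def forward(self, context: torch.Tensor) -> torch.Tensor:
        # context: (num_words, dim) -> latent summary vector (dim,)
        return self.proj(context).mean(dim=0)


class LocalScorer(nn.Module):
    """Scores candidates by attending over context words and combining the
    attention summary with the extractor's latent semantic vector."""
    def __init__(self, dim: int):
        super().__init__()
        self.extractor = HiddenSemanticExtractor(dim)
        self.att = nn.Linear(dim, dim, bias=False)    # bilinear attention weights
        self.score = nn.Linear(dim, dim, bias=False)  # candidate-context scoring

    def forward(self, context: torch.Tensor, candidates: torch.Tensor) -> torch.Tensor:
        # context: (num_words, dim); candidates: (num_candidates, dim)
        att_logits = candidates @ self.att(context).t()      # (cands, words)
        att = torch.softmax(att_logits, dim=-1)
        summary = att @ context                               # (cands, dim)
        latent = self.extractor(context)                      # (dim,)
        fused = summary + latent                              # fuse both signals
        return (candidates * self.score(fused)).sum(dim=-1)   # (cands,)


if __name__ == "__main__":
    torch.manual_seed(0)
    scorer = LocalScorer(dim=300)
    scores = scorer(torch.randn(20, 300), torch.randn(5, 300))
    print(scores)  # one feature score per candidate entity
```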
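Likewise, the abstract only states that joint learning of the two-level adaptive features reduces the influence of uncertain entity type information; a natural reading is a gating mixture, sketched below under that assumption. The gate, the embedding of the four entity types, and all dimensions are placeholders rather than the thesis's actual design.

```python
# Minimal sketch: a learned gate mixes the first-level latent feature with a
# second-level entity-type feature, down-weighting type information when it
# is uncertain. Gating formulation and names are assumptions.
import torch
import torch.nn as nn

NUM_TYPES = 4  # the abstract mentions four kinds of entity types


class TwoLevelAdaptiveFeature(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.type_embed = nn.Embedding(NUM_TYPES, dim)          # second-level feature
        self.gate = nn.Sequential(nn.Linear(2 * dim, 1), nn.Sigmoid())

    def forward(self, latent_feat: torch.Tensor, type_logits: torch.Tensor) -> torch.Tensor:
        # latent_feat: (cands, dim) first-level feature from local/global models
        # type_logits: (cands, NUM_TYPES) predicted entity-type scores
        type_probs = torch.softmax(type_logits, dim=-1)
        type_feat = type_probs @ self.type_embed.weight          # (cands, dim)
        g = self.gate(torch.cat([latent_feat, type_feat], dim=-1))  # (cands, 1)
        # Adaptive mixture: low gate values suppress uncertain type information.
        return g * type_feat + (1.0 - g) * latent_feat


if __name__ == "__main__":
    torch.manual_seed(0)
    mixer = TwoLevelAdaptiveFeature(dim=300)
    out = mixer(torch.randn(5, 300), torch.randn(5, NUM_TYPES))
    print(out.shape)  # torch.Size([5, 300])
```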
Keywords/Search Tags: Entity Linking, local and global features, hidden semantic information, adaptive features, entity type information