
Representation Learning, Automatic Acquisition, and Computational Application for Large-Scale Structured Knowledge

Posted on: 2020-09-13 | Degree: Doctor | Type: Dissertation
Country: China | Candidate: Y K Lin | Full Text: PDF
GTID: 1368330626964466 | Subject: Computer Science and Technology
Abstract/Summary:
In the 21st century, with the deep integration of artificial intelligence into fields such as the home, medicine, education, finance, and law, there is strong demand for large-scale knowledge graphs and their applications, and knowledge intelligence has become one of the most active areas of artificial intelligence. Knowledge graphs can be viewed as knowledge systems that store structured human knowledge. They lie at the core of intelligent services, giving agents the ability to answer precise queries, understand deeply, and reason logically, and they are widely used in search engines, question answering, dialogue systems, and recommendation systems. To date, deep-learning-based natural language processing, lacking background knowledge, can only mechanically learn task-specific semantic patterns from data; it is neither robust nor interpretable, and it cannot truly understand natural language. We believe that achieving genuine natural language understanding requires integrating knowledge graph information into deep learning. Combining natural language processing with knowledge graphs is not trivial, and several key issues must be addressed:

(1) Knowledge Representation. Deep-learning-based natural language processing models usually use distributed representations, so to exploit large-scale knowledge graphs in such models we must first learn representations of the graphs. In this part, my work consists of: a. Knowledge graph representation considering complex relations: because the information in different scenarios is complex, a single unified entity representation greatly limits the modeling ability of TransE and its extensions. To solve this problem, we propose the TransR model, which models entities and relations in distinct spaces and performs the translation in the corresponding relation space. b. Knowledge graph representation considering complex relational paths: most existing knowledge representation learning models consider only the direct relations between entities. We believe that multi-step relational paths between entities also contain rich reasoning information, and we propose a path-based TransE model. c. Most existing knowledge representation learning models cannot distinguish relations between entities from attributes of entities, and therefore cannot accurately represent the interactions among entities, relations, and attributes. To solve this problem, we propose a knowledge graph representation learning model that simultaneously learns representations of entities, relations, and attributes.

(2) Knowledge Acquisition. Automatically acquiring relational facts from the large-scale structured, semi-structured, and unstructured data on the Internet is the only practical way to build a large-scale knowledge graph. In this part, my work consists of: a. Neural relation extraction with selective attention: to address the noise in distantly supervised relation extraction data, we propose a neural relation extraction model with a sentence-level selective attention mechanism that down-weights incorrectly annotated sentences. b. Neural relation extraction with multi-lingual attention: most existing relation extraction systems extract relational facts from monolingual data and cannot exploit the diverse information hidden in data across languages. To address this problem, we propose a multi-lingual neural relation extraction system that employs multi-lingual attention.

(3) Knowledge Application. For different natural language processing tasks, we explore how to integrate knowledge into task-specific deep learning models to achieve knowledge-driven natural language understanding. In this part, my work consists of: a. Knowledge-driven entity typing: we propose a neural entity typing model based on a knowledge attention mechanism that considers both the relationship between named entities and their contexts and the rich information in knowledge graphs. b. Knowledge-driven open-domain question answering: we propose an open-domain question answering system based on a "skimming, intensive reading, summarizing" pattern. In addition, we use knowledge representation learning to enhance the representations of the question and its relevant articles, and we perform multi-task learning with relation extraction to introduce the relations between entities in knowledge graphs into the model.

Our work addresses the key problems in knowledge representation, knowledge acquisition, and knowledge application, laying a foundation for genuine natural language understanding.
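The TransR idea in part (1)a can be sketched roughly as follows: head and tail entity vectors are first projected into a relation-specific space, and the relation vector should then translate the projected head onto the projected tail. The dimensions, the random projection matrix, and the function name below are illustrative assumptions for a toy example, not the dissertation's implementation.

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """TransR-style energy: project entities into the relation space
    with M_r, then measure ||M_r h + r - M_r t||.
    Lower energy indicates a more plausible triple (h, r, t)."""
    h_proj = M_r @ h   # head projected into the relation space
    t_proj = M_r @ t   # tail projected into the relation space
    return np.linalg.norm(h_proj + r - t_proj)

# Toy setup (values are random, not learned embeddings):
rng = np.random.default_rng(0)
dim_entity, dim_relation = 4, 3
M_r = rng.normal(size=(dim_relation, dim_entity))  # relation-specific projection
h = rng.normal(size=dim_entity)
t = rng.normal(size=dim_entity)
# Construct a relation vector that exactly translates h to t in relation space,
# so this triple has zero energy by construction:
r = M_r @ (t - h)
assert np.isclose(transr_score(h, t, r, M_r), 0.0)
```

A corrupted triple (for example, a perturbed tail) yields a strictly positive energy, which is what the margin-based training objective exploits.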
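The path-based TransE of part (1)b builds on TransE's assumption that h + r ≈ t for a true triple (h, r, t); a multi-step relational path r1, ..., rk can then be composed (for instance, by vector addition) and compared against the direct relation. The sketch below illustrates that additive composition only; the relation names and vectors are toy values assumed for the example, not learned embeddings.

```python
import numpy as np

def path_energy(relation_path, direct_relation):
    """Additive composition of a relation path compared with the
    direct relation: E = ||(r1 + ... + rk) - r||.
    Low energy suggests the path expresses the same relation."""
    composed = np.sum(relation_path, axis=0)
    return np.linalg.norm(composed - direct_relation)

# Toy vectors: a "born_in" step followed by a "city_of" step should
# approximate the direct "nationality" relation.
born_in = np.array([1.0, 0.0, 2.0])
city_of = np.array([0.0, 1.0, -1.0])
nationality = born_in + city_of   # perfectly consistent by construction
assert path_energy([born_in, city_of], nationality) == 0.0
```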
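The sentence-level selective attention of part (2)a can be pictured as a softmax over how well each sentence representation in a bag matches a query relation, producing a bag representation in which noisy sentences receive small weights. The dot-product scoring, dimensions, and vectors below are assumptions made for a minimal illustration.

```python
import numpy as np

def selective_attention(sentence_vecs, relation_query):
    """Weight each sentence s_i in a bag by softmax(s_i . q) and return
    the attention weights plus the weighted bag representation."""
    scores = sentence_vecs @ relation_query     # one matching score per sentence
    weights = np.exp(scores - scores.max())     # numerically stable softmax
    weights /= weights.sum()
    bag = weights @ sentence_vecs               # attention-weighted sum
    return weights, bag

# Toy bag: two sentences consistent with the relation, one noisy sentence.
S = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [-1.0, 0.0]])   # the third sentence contradicts the relation
q = np.array([1.0, 0.0])
w, bag = selective_attention(S, q)
assert w[2] < w[0] and w[2] < w[1]   # the noisy sentence is down-weighted
```

The bag representation, rather than any single sentence, is then fed to the relation classifier, which is how the mechanism filters incorrectly annotated sentences.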
Keywords/Search Tags: Knowledge Computing, Knowledge Representation, Knowledge Acquisition, Knowledge Application