
Research On Enhancing Natural Language Inference Through Knowledge Graph Embedding And Cross-lingual Transfer

Posted on: 2021-05-30
Degree: Master
Type: Thesis
Country: China
Candidate: K X Qi
Full Text: PDF
GTID: 2428330626459681
Subject: Management Science and Engineering

Abstract/Summary:
Recognizing textual entailment, also known as natural language inference (NLI), is a fundamental and challenging task in natural language processing. NLI is now widely used in many NLP applications, including question answering, information retrieval, and machine translation, and it plays a crucial role in evaluating natural language understanding. In recent years, with the emergence of large-scale NLI corpora, deep neural models have become dominant in this task and achieve outstanding performance. However, existing deep neural models for NLI still suffer from two limitations.

On the one hand, existing deep learning models do not make full use of external knowledge resources. Most existing deep neural networks are based on word embeddings, yet knowledge beyond word embeddings is essential for recognizing the inference relationship between two texts. For example, the antonymy relation between two words plays a crucial role in recognizing a contradiction between two texts. Although some previous work enhances neural models with external knowledge to improve NLI performance, these methods usually extract numerical features from the knowledge base, or adopt a pipeline strategy that first represents the knowledge as real-valued vectors and then integrates them into the neural model. Such approaches ignore the interaction between the knowledge representation and the neural model, so the model cannot make full use of the auxiliary knowledge to recognize the inference relationship between two texts.

On the other hand, most previous studies focus on modeling NLI in English. To adapt NLI applications to multilingual scenarios, NLI should be extended to cross-lingual NLI. Existing neural models either align sentence embeddings between the source and target languages, relying heavily on annotated parallel corpora, or exploit pre-trained cross-lingual language models that are fine-tuned on a single language and transfer knowledge poorly to other languages.

To address these limitations, this thesis enhances existing NLI methods in two main aspects:
1. A joint learning framework is proposed that integrates knowledge graph embeddings, which represent the entities and relations of the external knowledge base WordNet, into the neural model for NLI. Extensive experiments on three datasets (SNLI, MultiNLI, and SciTail) evaluate the effectiveness of the proposed framework.
2. An adversarial training framework for cross-lingual NLI is proposed that enhances both a classical neural NLI model and a pre-trained language model. Extensive experiments on the XNLI benchmark, covering 15 languages, evaluate the effectiveness of the proposed framework.
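The joint learning idea in the first contribution can be illustrated with a minimal sketch: word-pair features for premise/hypothesis alignment are enriched with a relation embedding (e.g., for a WordNet antonymy link) that is learned jointly with the word embeddings, rather than precomputed in a separate pipeline. The toy vocabulary, relation table, and dimensions below are illustrative assumptions, not the thesis's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary and WordNet-style relations (assumed for illustration)
vocab = {"hot": 0, "cold": 1, "warm": 2}
relations = {"antonym": 0, "synonym": 1, "none": 2}

d_word, d_rel = 4, 3
word_emb = rng.normal(size=(len(vocab), d_word))     # trainable word embeddings
rel_emb = rng.normal(size=(len(relations), d_rel))   # trainable KG relation embeddings

# Lookup table of knowledge-base relations between premise/hypothesis word pairs
rel_of = {("hot", "cold"): "antonym", ("hot", "warm"): "synonym"}

def pair_features(p_word, h_word):
    """Concatenate both word embeddings with the jointly learned relation embedding."""
    r = rel_of.get((p_word, h_word), "none")
    return np.concatenate([word_emb[vocab[p_word]],
                           word_emb[vocab[h_word]],
                           rel_emb[relations[r]]])

f = pair_features("hot", "cold")
print(f.shape)  # (11,) = 4 + 4 + 3
```

Because `rel_emb` sits inside the same computation graph as the word embeddings, gradients from the entailment loss would update both, which is the interaction a pipeline approach misses.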
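The adversarial training idea in the second contribution can likewise be sketched numerically: a shared encoder feeds both a task classifier (entailment / neutral / contradiction) and a language discriminator, and the encoder is trained against the discriminator so that its features become language-invariant. All weights, dimensions, and the trade-off coefficient below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy shared-encoder output for one sentence pair
feat = rng.normal(size=8)

W_task = rng.normal(size=(3, 8))  # entailment / neutral / contradiction classifier
W_disc = rng.normal(size=(2, 8))  # language discriminator: source vs. target language

def losses(feat, task_label, lang_label, lam=0.1):
    task_loss = -np.log(softmax(W_task @ feat)[task_label])
    disc_loss = -np.log(softmax(W_disc @ feat)[lang_label])
    # Adversarial objective: the discriminator minimizes disc_loss, while the
    # encoder minimizes task_loss - lam * disc_loss (i.e., it tries to fool the
    # discriminator), pushing the features toward language invariance.
    encoder_obj = task_loss - lam * disc_loss
    return task_loss, disc_loss, encoder_obj

t, d, e = losses(feat, task_label=0, lang_label=0)
print(t > 0 and d > 0)  # cross-entropy terms are positive
```

In practice this min-max objective is usually implemented with a gradient reversal layer in an autodiff framework; the sketch only shows how the two losses combine.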
Keywords/Search Tags: Recognizing textual entailment, Natural language inference, Deep learning, Cross-lingual natural language inference