
Research On Textual Entailment Recognition Based On Semantic Alignment

Posted on: 2023-09-26
Degree: Master
Type: Thesis
Country: China
Candidate: J. Y. Wu
Full Text: PDF
GTID: 2568306800960299
Subject: Computer technology
Abstract/Summary:
Natural language processing is an important research direction in artificial intelligence. One of its key tasks is textual entailment recognition: determining whether a given hypothesis sentence can be inferred from a premise sentence. The task helps a computer interpret the deep semantic information carried by sentences and words and identify the logical inference relationships between texts, and is therefore a step toward understanding natural language. The main work of this thesis is to improve the inference ability of deep-learning-based textual entailment recognition models through text alignment. The specific contributions are as follows:

1) Recurrent-neural-network encoders in the traditional Siamese-network entailment recognition model suffer from vanishing gradients, difficulty with long-distance dependencies, and low computational efficiency. To address these problems, this thesis redesigns and improves the model structure on top of the Transformer encoder and introduces a predictive inference contribution mechanism: through attention alignment, the model attends more strongly to the words that contribute most to the inference when encoding the semantic information of a sentence. To obtain deeper semantic information, an information interaction mechanism is also proposed, in which the two sentences each obtain semantic information aligned with the other; the interacted word vectors are then fed to the next layer to produce a deeper inference prediction. With ESIM as the baseline model, the proposed methods outperform the baseline on the SNLI, MultiNLI, and SciTail datasets while running inference faster.

2) Adding external semantic role labeling (SRL) knowledge to a model is a common way to enrich sentence semantics and improve entailment recognition performance. However, SRL analysis of the same text may yield multiple different predicate-structure sentences, which makes it difficult to integrate several SRL annotations into the model at once. To solve this problem, this thesis proposes a method that aligns and fuses different predicate-structure sentences through multi-head attention. The attention mechanism computes the similarity of the predicate vectors output by BERT, the highly similar predicate-structure sentences of the premise and hypothesis are aligned, and a Transformer encoder then serves as the fusion encoder, distributing the highly similar predicate-structure sentences across different attention heads so as to obtain distinct semantic information from multiple subspaces and improve the model's inference performance.
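The attention alignment and information interaction mechanism described in contribution 1) follows the soft-alignment pattern popularized by ESIM. The sketch below is a minimal NumPy illustration of that pattern, not the thesis's actual implementation; the function names and the choice of interaction features (concatenation, difference, element-wise product) are assumptions based on the standard ESIM formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_align(premise, hypothesis):
    """ESIM-style soft attention alignment.

    premise: (m, d) word vectors; hypothesis: (n, d) word vectors.
    Each word in one sentence is re-expressed as an attention-weighted
    mixture of the words in the other sentence.
    """
    scores = premise @ hypothesis.T                     # (m, n) similarity matrix
    p_aligned = softmax(scores, axis=1) @ hypothesis    # (m, d)
    h_aligned = softmax(scores.T, axis=1) @ premise     # (n, d)
    return p_aligned, h_aligned

def interact(x, x_aligned):
    """Interaction features fed to the next layer: original vectors,
    aligned vectors, their difference, and their element-wise product."""
    return np.concatenate([x, x_aligned, x - x_aligned, x * x_aligned], axis=-1)
```

Each row of the softmax-normalized score matrix sums to one, so words with higher similarity (higher inference contribution, in the thesis's terminology) dominate the aligned representation.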
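The predicate alignment step in contribution 2) can be sketched in the same spirit: compute pairwise similarities between the premise's and hypothesis's predicate-structure vectors, pair each premise predicate with its most similar hypothesis predicate, and distribute the pairs across attention heads. This is a hypothetical NumPy sketch under assumed shapes; the thesis computes similarities over BERT predicate vectors inside a multi-head attention layer, and the greedy argmax pairing and round-robin head assignment here are illustrative simplifications.

```python
import numpy as np

def cosine_sim(a, b):
    """Pairwise cosine similarity between row vectors of a (P, d) and b (H, d)."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T                                      # (P, H)

def align_predicates(prem_preds, hyp_preds):
    """Greedily pair each premise predicate-structure vector with the most
    similar hypothesis predicate-structure vector."""
    sim = cosine_sim(prem_preds, hyp_preds)
    return [(i, int(sim[i].argmax()), float(sim[i].max()))
            for i in range(sim.shape[0])]

def assign_to_heads(pairs, num_heads):
    """Round-robin distribution of aligned pairs across attention heads,
    so each head fuses a different subset of predicate structures."""
    return {h: [p for i, p in enumerate(pairs) if i % num_heads == h]
            for h in range(num_heads)}
```

In the full model each head would then attend over its assigned predicate-structure sentences, yielding distinct semantic views from different subspaces before fusion.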
Keywords/Search Tags: Natural Language Processing, Textual Entailment Recognition, Attention Alignment Mechanism, Transformer, Semantic Role Labeling