
Research On Dynamic Coding-driven Conversational Question Answering Method Based On BERT

Posted on: 2022-12-30
Degree: Master
Type: Thesis
Country: China
Candidate: S Zhou
Full Text: PDF
GTID: 2518306788956769
Subject: Journalism and Media
Abstract/Summary:
Conversational question answering is a multi-round question answering task and an important part of conversational artificial intelligence. Given a passage and a corresponding dialogue, the task is to answer the next question in the dialogue so that the whole conversation remains fluent and coherent. Machine learning models for this task must consider not only the current question but also the contextual conversation history. Because the context is long, efficiently extracting features from this complex information has been a persistent problem in conversational question answering. Existing methods usually process the context with stacked LSTM layers and concatenated feature vectors, which tends to produce redundant information and context bias, ultimately degrading model performance.

To address these problems, this thesis proposes a dynamic coding-driven conversational question answering model based on BERT, built on the Encoder-Decoder framework. At the word embedding layer, the strong language understanding ability of the pre-trained BERT model is used to fine-tune, in a supervised manner, on the long sequence inputs of the conversational question answering task and to extract semantic information across paragraphs. On this basis, a dynamic encoding mechanism is introduced that better understands the passage content and the session history, discards irrelevant content, and continuously generates new encoded representations.

The research content of this thesis is divided into the following two aspects.

(1) A BERT-based conversational question answering method that fuses multiple rounds of historical information is proposed. In context encoding, BERT independently encodes each question and answer in the conversation history together with the passage, and the resulting feature vectors are sent to the decoder. This extracts the interaction between the dialogue history and the passage more effectively and alleviates the information loss that long-sequence inputs cause in the model.

(2) A dynamic coding-driven conversational question answering method is proposed. In the dynamic encoding layer, the encoding mechanism iteratively reads the dialogue history, and the output of each iteration is dynamically combined with the previous encoded representation through a decider P_d to generate a new encoding. Through this multi-layer dynamic coding procedure, reasonable weights are assigned when generating the answer to the current question, and irrelevant information is discarded, yielding a fluent and coherent dialogue.

Finally, this thesis designs a set of experiments; results on the recently released CoQA dataset, compared against multiple baselines and model variants, verify that the proposed method is effective.
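The abstract does not give the exact form of the dynamic encoding layer, but the described behavior — iteratively re-reading the history and combining each new reading with the previous representation through a decider P_d — can be sketched as a gated update. The sketch below is a minimal NumPy illustration under stated assumptions: the gating formula, the re-reading transform, and all parameter names (`W_read`, `w`, `b`) are invented for illustration and are not taken from the thesis.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DynamicEncoder:
    """Iteratively re-reads context encodings and gates each new reading
    against the previous representation, loosely following the 'decider
    P_d' idea in the abstract. All shapes, the gating formula, and the
    parameter names are illustrative assumptions, not the thesis model."""

    def __init__(self, dim, n_iters=3, seed=0):
        rng = np.random.default_rng(seed)
        self.n_iters = n_iters
        # Decider parameters: a score computed from [new; previous].
        self.w = rng.normal(scale=0.1, size=2 * dim)
        self.b = 0.0
        # Stand-in for a learned re-reading transform (e.g. attention).
        self.W_read = rng.normal(scale=0.1, size=(dim, dim))

    def step(self, h_prev):
        # Produce a fresh reading of the context from the current state.
        h_new = np.tanh(h_prev @ self.W_read)
        # Decider P_d: per-position weight on the new reading.
        p = sigmoid(np.concatenate([h_new, h_prev], axis=-1) @ self.w + self.b)
        p = p[..., None]  # broadcast the gate over the feature dimension
        # Keep p of the new encoding, (1 - p) of the previous one.
        return p * h_new + (1.0 - p) * h_prev

    def encode(self, h0):
        h = h0
        for _ in range(self.n_iters):
            h = self.step(h)
        return h

# Example: refine token encodings for a 5-token context of width 8.
enc = DynamicEncoder(dim=8, n_iters=3)
h0 = np.random.default_rng(1).normal(size=(5, 8))
h = enc.encode(h0)
print(h.shape)
```

In a real system `h0` would come from BERT and the update would be trained end-to-end; the point of the sketch is only the gated combination, which lets the model down-weight (discard) irrelevant history at each iteration, as the abstract describes.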
Keywords/Search Tags:machine learning, natural language processing, conversational question answering, dynamic coding