
Research On Dialogue Consistency And Continuity Of Chat-bot

Posted on: 2019-03-28
Degree: Master
Type: Thesis
Country: China
Candidate: S K Zhu
Full Text: PDF
GTID: 2428330566976626
Subject: Engineering

Abstract/Summary:
In recent years, the rapid advance of deep learning has made chat-bots a heavily researched topic. Because retrieval-based chat-bots transfer poorly across domains and adapt badly to open-domain conversation, they have gradually been replaced by generative chat-bots, which suit the open domain better; chat-bot research has thus shifted from template-matching and information-retrieval models to generative ones. This thesis focuses on open-domain dialogue. Unlike closed-domain dialogue, which concentrates on the accuracy of the reply content, open-domain dialogue cares more about the consistency and continuity of that content.

The thesis first introduces the basic seq2seq model and describes its components in detail. It then presents an improved seq2seq model that modifies the basic one in two respects: it adds a bidirectional encoder and an attention mechanism. This model markedly improves the accuracy and fluency of the generated responses. Because the responses nevertheless remain inconsistent and lack continuity, the thesis proposes building further models on top of the improved one to strengthen the consistency and continuity of the dialogue content.

To improve the consistency of the dialogue content, the thesis extends the model with a character background model. It then uses the character background vectors learned by that model to build a character relationship model. Experiments show that both models perform comparably well; compared with the character background model, however, the character relationship model better preserves response fluency when trained on a small-scale corpus.

Because the seq2seq model is trained by maximum likelihood estimation and selects the highest-probability reply at decoding time, it tends to choose replies that are meaningless but appear frequently in the corpus. To address this problem and ensure the continuity of the dialogue, the thesis modifies the loss function, proposing three revisions based on maximum mutual information (a standard form of this objective is sketched below); the three revisions are compared with one another, and the effectiveness of the resulting model is verified by analyzing the continuity and fluency of the response content.

Through modifying the model, reconstructing the corpus, and replacing the loss function, the thesis not only preserves the fluency of the response content but also improves the consistency and continuity of the dialogue.
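For concreteness, the following is a minimal sketch of the maximum-mutual-information decoding objective as it commonly appears in the neural dialogue literature; the abstract does not spell out the three loss variants proposed in the thesis, so this is only the standard anti-language-model form, with the penalty weight λ shown as an illustrative parameter rather than the thesis's own choice:

\[
\hat{T} \;=\; \arg\max_{T}\;\bigl\{\, \log p(T \mid S) \;-\; \lambda \log p(T) \,\bigr\}
\]

where S is the input utterance, T a candidate reply, p(T | S) the seq2seq likelihood, p(T) a language-model prior over replies, and λ ∈ [0, 1] weights the penalty that discounts generic, high-frequency replies that plain maximum-likelihood decoding would otherwise favor.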
Keywords/Search Tags: chat-bot, generative model, character background model, character relation model, maximum mutual information