Research On Emotion Prediction In Multi-Turn Textual Conversations

Posted on: 2022-07-27    Degree: Doctor    Type: Dissertation
Country: China    Candidate: D Y Li    Full Text: PDF
GTID: 1488306722990929    Subject: Computer Science and Technology
Abstract/Summary:
In recent years, emotion prediction in multi-turn textual conversations, one of the key tasks in research on machine emotional intelligence, has gradually attracted researchers' attention. The task aims to predict the future emotions of the participants in an ongoing conversation from the conversation history, participant information, and external commonsense knowledge. Perceiving and reasoning about participants' future feelings is, however, a severe challenge for machines, because no utterance information from the future is available. This dissertation uses sequence models, graph neural networks, knowledge graphs, and ensemble models to improve machines' emotion prediction ability from three aspects: emotion propagation characteristics, the multiple sources of emotional information, and the diversity of emotional reactions. The main research contents and innovations are as follows:

(1) Formal definition of emotion prediction in multi-turn textual conversations. Emotion is an indispensable factor in human-machine dialogue systems, and endowing machines with the ability to perceive, understand, and express emotions is a long-term goal of machine emotional intelligence. Existing conversational emotion analysis techniques, however, mainly focus on emotion recognition in conversation, that is, identifying the emotion category of a given utterance. This dissertation instead addresses emotion prediction in multi-turn textual conversations, that is, predicting the interlocutors' possible future emotional states. It analyzes how this task differs from existing related work and gives a formal definition of the task.

(2) Interactive double-states emotion cell model based on emotion propagation characteristics. Multi-turn textual conversations exhibit three notable emotion propagation characteristics: context dependence, persistence, and contagiousness. By modeling these characteristics, an interactive double-states emotion cell model (IDS-ECM) is proposed. The model simulates the input, storage, interaction, and output of the interlocutors' emotional states during a conversation through an emotion input gate, a double emotion memory unit, an emotion interaction gate, and an emotion output gate. Experimental results on two manually annotated datasets show that the model outperforms the baselines on the macro-averaged F1 metric and also reveal differences in how emotion categories spread in conversation: positive emotions are more contagious than negative ones.
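The abstract describes IDS-ECM only at the level of its named gates and memory units; the exact formulation appears in the dissertation itself. The following is a minimal, hedged sketch of such a gated cell with two per-speaker emotional memories, written in PyTorch. Every layer name, dimension, and update rule here is an illustrative assumption, not the published architecture.

```python
import torch
import torch.nn as nn

class InteractiveDualEmotionCell(nn.Module):
    """Illustrative gated cell with two per-speaker emotion memories.

    NOT the published IDS-ECM; it only mirrors the components the abstract
    names: emotion input gate, double emotion memory unit, emotion
    interaction gate, and emotion output gate.
    """

    def __init__(self, utter_dim: int, emo_dim: int):
        super().__init__()
        self.input_gate = nn.Linear(utter_dim + emo_dim, emo_dim)
        self.candidate = nn.Linear(utter_dim + emo_dim, emo_dim)
        self.interaction_gate = nn.Linear(2 * emo_dim, emo_dim)
        self.output_gate = nn.Linear(emo_dim, emo_dim)

    def forward(self, utter, mem_self, mem_other):
        # Emotion input gate: how strongly the new utterance updates the
        # current speaker's emotional memory (context dependence, persistence).
        x = torch.cat([utter, mem_self], dim=-1)
        i = torch.sigmoid(self.input_gate(x))
        cand = torch.tanh(self.candidate(x))
        mem_self = (1 - i) * mem_self + i * cand

        # Emotion interaction gate: the listener's memory absorbs part of the
        # speaker's state (contagiousness).
        g = torch.sigmoid(self.interaction_gate(torch.cat([mem_self, mem_other], dim=-1)))
        mem_other = mem_other + g * (mem_self - mem_other)

        # Emotion output gate: emits the emotional state used for prediction.
        out = torch.tanh(self.output_gate(mem_self))
        return out, mem_self, mem_other
```

In this reading, the interaction gate carries contagiousness between speakers, while the input gate controls how much a new utterance overrides the persistent emotional memory.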
(3) Addressee-aware model based on multi-source information fusion. The conversation history available for emotion prediction is rich: it records not only the speakers' content, topics, and styles, but also their identities. An emotion prediction model therefore needs to mine both long-term and short-term conversation history and to exploit interlocutor identity information; external commonsense knowledge is a further key to improving prediction ability. This dissertation combines long-term and short-term conversation information, interlocutor identity information, and external commonsense knowledge, and proposes an addressee-aware model based on multi-source information fusion (an illustrative sketch appears after the summary below). Specifically, sequence-based and graph-based models mine long-term and short-term conversation history, an addressee-aware module embeds the interlocutors' identity information, and a commonsense knowledge integration module injects external commonsense knowledge to enhance emotion prediction. The model's effectiveness is verified on three multi-turn conversation datasets, and the results show that long-term and short-term conversation history, interlocutor identity information, and external commonsense knowledge all improve the performance of emotion prediction in multi-turn textual conversations.

(4) Adaptive ensemble model based on the diversity of emotional reactions. The emotion prediction task must predict a participant's future emotional reaction to a message without knowing the participant's actual response, and more than one emotional reaction may be appropriate. To address this issue, this dissertation proposes an adaptive ensemble model based on the diversity of emotional reactions (also sketched after the summary below). First, multiple basic prediction models with different structures and parameters are designed to simulate the diversity of emotional reactions and to generate multiple candidate prediction results. An adaptive decision-maker is then trained to automatically select, from these candidates, the final result that best fits the current context and dialogue scene. In addition, although integrating external commonsense knowledge does improve performance, using external knowledge effectively remains a challenge for the emotion prediction task; a selective knowledge integration strategy is therefore proposed to integrate external commonsense knowledge selectively and to reduce the negative impact of redundant or erroneous information on the model. Experimental results on three multi-turn textual conversation datasets show that the adaptive ensemble model is significantly better than any single emotion prediction model and also confirm the effectiveness of the selective knowledge integration strategy.

To sum up, this dissertation focuses on the challenging problems in emotion prediction in multi-turn textual conversations and constructs multiple emotion prediction models from the perspectives of emotion propagation characteristics, the multiple sources of emotional information, and the diversity of emotional reactions. The proposed models achieve state-of-the-art performance and can provide technical support for applications such as emotional chatbots, intelligent customer service, and mental health counseling.
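Contribution (3) names four information sources but, being an abstract, does not specify how they are fused. Below is a hedged sketch assuming a simple concatenation-based fusion: a GRU summarizes the long-term history, an attention layer stands in for the graph-based short-term encoder, an embedding carries addressee identity, and a precomputed vector carries commonsense knowledge. All module names, dimensions, and the fusion-by-concatenation choice are illustrative assumptions, not the published design.

```python
import torch
import torch.nn as nn

class MultiSourceFusionPredictor(nn.Module):
    """Illustrative fusion of the information sources named in contribution (3)."""

    def __init__(self, utter_dim, hidden, n_speakers, know_dim, n_emotions):
        super().__init__()
        # utter_dim must be divisible by num_heads for MultiheadAttention.
        self.history_rnn = nn.GRU(utter_dim, hidden, batch_first=True)
        self.short_attn = nn.MultiheadAttention(utter_dim, num_heads=4, batch_first=True)
        self.addressee_emb = nn.Embedding(n_speakers, hidden)
        self.classifier = nn.Linear(hidden + utter_dim + hidden + know_dim, n_emotions)

    def forward(self, history, recent, addressee_id, knowledge):
        # history: (B, T_long, utter_dim) long-term context;
        # recent:  (B, T_short, utter_dim) short-term context;
        # knowledge: (B, know_dim) precomputed commonsense vector.
        _, h_long = self.history_rnn(history)      # (1, B, hidden)
        h_long = h_long.squeeze(0)
        pooled, _ = self.short_attn(recent, recent, recent)
        h_short = pooled.mean(dim=1)               # (B, utter_dim)
        h_addr = self.addressee_emb(addressee_id)  # (B, hidden) addressee identity
        fused = torch.cat([h_long, h_short, h_addr, knowledge], dim=-1)
        return self.classifier(fused)              # emotion logits
```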
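Similarly, contribution (4) describes the adaptive ensemble only as base predictors plus a learned decision-maker. The sketch below, again an assumption-laden illustration rather than the published model, scores each base model's candidate prediction against a context representation and returns the highest-scoring candidate.

```python
import torch
import torch.nn as nn

class AdaptiveEnsemble(nn.Module):
    """Illustrative adaptive ensemble in the spirit of contribution (4)."""

    def __init__(self, base_models, ctx_dim, n_emotions):
        super().__init__()
        self.base_models = nn.ModuleList(base_models)
        # Scores how well one candidate prediction fits the current context.
        self.decision_maker = nn.Sequential(
            nn.Linear(ctx_dim + n_emotions, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, ctx, *inputs):
        # Each base model emits one candidate emotion distribution (B, n_emotions).
        candidates = [m(*inputs) for m in self.base_models]
        scores = torch.cat(
            [self.decision_maker(torch.cat([ctx, c], dim=-1)) for c in candidates],
            dim=-1,
        )                                          # (B, n_candidates)
        choice = scores.argmax(dim=-1)             # index of the best-fitting model
        stacked = torch.stack(candidates, dim=1)   # (B, n_candidates, n_emotions)
        return stacked[torch.arange(stacked.size(0)), choice]
```

The hard argmax selection shown here is only for inference; how the decision-maker is trained is not specified by the abstract and is left out of this sketch.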
Keywords/Search Tags:Multi-turn textual conversations, Emotion prediction in conversations, Emotion propagation characteristics, Multi-source information fusion, Commonsense knowledge