
Research On Sentiment Analysis Based On Multimodal Fusion

Posted on: 2024-02-12
Degree: Master
Type: Thesis
Country: China
Candidate: J B Nie
Full Text: PDF
GTID: 2568307091997209
Subject: Computer technology
Abstract/Summary:
Human beings are emotional creatures: emotions pervade our daily lives and can influence, even determine, our judgments and decisions. With the rapid development of multimedia technology and the surge of affective data generated on social platforms, sentiment analysis research is no longer confined to single-modality data; more researchers now approach the problem from the perspective of multiple emotional modalities. Fusing emotional information across modalities serves two purposes: first, it clarifies which modal features affect recognition performance; second, it overcomes the inability of a single modality to represent the emotional information of the data as a whole, thereby strengthening sentiment analysis.

By studying the characteristics of multimodal data, this thesis designs an attention-based method for the complementary fusion and enhancement of multimodal affective data. It mainly analyzes the fusion of text with audio and of text with images using modality-specific features, then enhances the emotional information carried by the fused features, and finally validates the approach through experiments. The results show that the method is effective for multimodal sentiment analysis. The main research content is as follows.

(1) Addressing the fact that emotional features of different modalities contribute unequally to model recognition, this thesis verifies the importance of each modality for sentiment analysis, selects the important modalities, and resolves how data of differing importance across modalities affects recognition models on multimodal datasets. A set of empirical comparison methods is established: recognition performance is analyzed separately on data from each of the three modalities, and an appropriate modality weight ratio is selected as a reference for the effectiveness and efficiency of model fusion. This analysis determines the fusion strategy.

(2) Addressing the weak correlation between modal features in traditional multimodal data fusion, this thesis proposes a modality fusion method based on the attention mechanism. Taking text features as the base, the other two modalities are each fused with text in pairs: information from the other modality is obtained through a cross-modal attention mechanism, and a module is designed to capture the dependency between the two modalities. The effectiveness of the method is verified by experiments on several datasets; on the MOSI dataset, the classification accuracy of multimodal sentiment analysis with this method reaches 82.88%.

(3) Addressing the insufficiency of emotional information in modal features after fusion, this thesis proposes a self-attention-based method for enhancing modal emotional information. A feature-enhancement layer, which takes both the fused-modality features and the single-modality features as input, updates the complementary multimodal features over multiple iterations, so as to obtain features carrying richer emotional information.
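The cross-modal attention step described in contribution (2) can be sketched as follows. This is a minimal single-head NumPy illustration under our own assumptions, not the thesis's actual implementation: the function name `cross_modal_attention`, the dot-product scoring, and the concatenation-based pairing of text with the attended modality are all illustrative choices.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(text, other):
    """Text features act as queries; the other modality (audio or image
    regions) supplies keys/values, so each text step gathers the most
    relevant information from the other modality."""
    d_k = text.shape[-1]
    scores = text @ other.T / np.sqrt(d_k)   # (T_text, T_other) alignment scores
    weights = softmax(scores, axis=-1)       # attention over the other modality
    return weights @ other                   # text-aligned other-modality features

# Toy shapes: 4 text steps and 6 audio frames, both projected to dimension 8.
rng = np.random.default_rng(0)
text_feats = rng.standard_normal((4, 8))
audio_feats = rng.standard_normal((6, 8))

attended_audio = cross_modal_attention(text_feats, audio_feats)  # (4, 8)
# Pairwise fusion: concatenate text with the text-aligned audio features.
fused = np.concatenate([text_feats, attended_audio], axis=-1)    # (4, 16)
```

The same call with image-region features in place of `audio_feats` gives the text-image pair; a learned module for capturing cross-modal dependencies would replace the plain concatenation here.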
Keywords/Search Tags:multimodal sentiment analysis, attention mechanism, emotional data features, multimodal feature fusion