The key difference between the cloud service model and the traditional ERP model lies in the "platform + ecosystem" approach. Supported by the cloud platform, cloud ERP integrates data from production, sales, service, and other processes, and brings together cloud ERP providers, upstream and downstream supply-chain entities, and developers to form a cloud ERP ecological community. "Openness, exchange, and sharing" are the core content of the cloud ERP ecological community and an important trend in its development. Improving the community's service quality is a key link in building the ecosystem and an important guarantee of community culture and of good communication between users and managers, and understanding the emotional information contained in user reviews is a prerequisite for improving service quality. However, the cloud ERP ecosystem community contains massive interaction and review data in the form of text, images, and audio, which cannot be processed in a timely manner by manual means alone. Moreover, single-modal sentiment analysis models cannot accurately classify the emotion types of the multimodal data in the ecosystem. Therefore, based on multimodal review data from the Kingdee Cloud ERP ecological community, this paper focuses on multimodal sentiment analysis methods for the ecological community. The main contributions are as follows:

(1) To address the problem that single text-modal sentiment classification does not fully exploit the information in multiple modalities, an overall technical scheme for multimodal sentiment analysis of the cloud ERP ecological community is studied and proposed.

(2) To address the characteristics of the community's text review data, such as large variation in review length, the coexistence of professional and colloquial terms, and high noise, a text preprocessing scheme for the cloud ERP ecological community is designed, which mainly includes text cleaning, denoising, normalization, word segmentation, and vectorization.

(3) To address the difficulty of reflecting modal importance in current multimodal fusion, a multimodal emotion classification method with the text modality as the primary modality is proposed. Experiments on two large-scale multimodal public datasets show that the emotion classification accuracy reaches 83.09%, verifying the effectiveness of the proposed method.

(4) To address the problems that the pooling step of CNN-based image feature extraction in multimodal fusion tends to lose key information and that the extracted features are single-grained, a feature extraction method based on window attention and a hierarchical attention mechanism is proposed to achieve multi-granularity image feature extraction.

(5) To address the lack of targeted and sufficient inter-modal fusion in current methods, a dual-modality multi-granularity fusion method based on an interactive attention mechanism is proposed, and its effectiveness is verified through ablation experiments.

(6) The proposed model is applied to the multimodal review dataset of the cloud ERP ecological community, where the classification accuracy reaches 70%. The results show that the proposed scheme significantly outperforms the current single-modal emotion classification schemes for the cloud ERP ecological community and can effectively improve the accuracy of its emotion classification.
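The preprocessing pipeline of contribution (2) can be sketched as follows. This is a minimal illustration only: the function names, regex rules, and vocabulary scheme are assumptions, and a real pipeline for this community would use a proper Chinese segmenter (e.g. jieba) rather than whitespace tokenization.

```python
import re
from collections import Counter

def clean_text(text):
    """Cleaning and denoising: strip URLs, HTML remnants, and punctuation,
    then normalize whitespace and case."""
    text = re.sub(r"https?://\S+", " ", text)   # strip URLs
    text = re.sub(r"<[^>]+>", " ", text)        # strip HTML tags
    text = re.sub(r"[^\w\s]", " ", text)        # strip punctuation noise
    return re.sub(r"\s+", " ", text).strip().lower()

def tokenize(text):
    """Whitespace tokenization; stands in for real word segmentation."""
    return clean_text(text).split()

def build_vocab(corpus):
    """Map each token to an integer id by frequency (0 reserved for
    padding and out-of-vocabulary tokens)."""
    counts = Counter(tok for doc in corpus for tok in tokenize(doc))
    return {tok: i + 1 for i, (tok, _) in enumerate(counts.most_common())}

def vectorize(text, vocab, max_len=8):
    """Convert a review to a fixed-length id sequence, padded with 0."""
    ids = [vocab.get(tok, 0) for tok in tokenize(text)][:max_len]
    return ids + [0] * (max_len - len(ids))
```

For example, `vectorize("Great ERP!!!", build_vocab(["great erp service", "slow erp support"]), max_len=4)` yields a fixed-length id sequence ready to feed an embedding layer.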
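The interactive attention mechanism of contribution (5) can be understood as bidirectional cross-attention between the two modalities. The sketch below is an assumption about the general shape of such a mechanism, not the thesis's actual model: it shares one matrix for keys and values and omits learned projection weights for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values):
    """Scaled dot-product attention: `queries` from one modality attend
    over `keys_values` from the other modality."""
    d = queries.shape[-1]
    scores = queries @ keys_values.T / np.sqrt(d)   # (n_q, n_kv)
    return softmax(scores, axis=-1) @ keys_values   # (n_q, d)

def interactive_fusion(text_feats, image_feats):
    """Interactive (bidirectional) attention: text attends to image and
    image attends to text; the pooled attended features are concatenated
    into one fused representation for the classifier."""
    text_ctx = cross_attention(text_feats, image_feats).mean(axis=0)
    img_ctx = cross_attention(image_feats, text_feats).mean(axis=0)
    return np.concatenate([text_ctx, img_ctx])      # shape (2 * d,)
```

With text features of shape `(n_tokens, d)` and image features of shape `(n_patches, d)`, the fused vector has dimension `2 * d`; keeping the text modality as the primary one, as in contribution (3), could be realized by weighting the two halves before classification.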