With the rapid development of the Internet and social networking services, a large volume of text carrying distinct emotions is published on various platforms every day, and mining the implicit characteristics hidden in such massive data has become an active research area. Sentiment analysis is a major research topic in natural language processing. Its task is to identify the emotional polarity (positive or negative) behind words, helping users quickly acquire, organize, and analyze relevant information and make better decisions. Existing work on improving sentiment analysis algorithms focuses mainly on the vectorization of text data and on building high-quality deep learning classifiers, and a more effective sentence embedding method may further improve the performance of existing text sentiment classification models. Therefore, based on contrastive learning and the BERT pre-trained language model, this paper proposes BiSeCSE, a model for text-level sentiment classification. The contributions of this paper are as follows:

(1) We propose the model framework BiSeCSE, which builds on contrastive learning. First, following the simple contrastive learning framework SimCSE, we train the BERT model in a self-supervised manner, using each input sentence as its own prediction target in the contrastive objective, which yields a SimCSE-BERT model. Then, a back-translation method converts the original text-level sentiment analysis dataset into a dataset of sentence pairs, and two copies of the trained SimCSE-BERT model are assembled into a siamese network (BERTs) whose two sides share the same structure and parameters. The two differently back-translated sentences of each pair are fed into the BERTs to obtain a sentence representation vector for each sentence; a loss function over these representations is then minimized by back-propagation to optimize the model. Finally, the single-sided BERT network of the trained siamese BERTs is transferred to a supervised classification module for Chinese text sentiment classification.

(2) We evaluate the proposed BiSeCSE model on three Chinese datasets, waimai_10k, ChnSentiCorp_htl_all, and online_shopping_10_cats, and compare it with several state-of-the-art text-level sentiment classification models. The experimental results verify the effectiveness and superiority of our method.
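As a rough illustration of the contrastive objective outlined above, the following NumPy sketch computes a SimCSE-style InfoNCE loss: two embeddings of the same sentence (such as the outputs of the two sides of the siamese network for a back-translated pair) form a positive pair, while the other sentences in the batch serve as in-batch negatives. The embedding dimension, temperature, and toy data here are illustrative assumptions, not the actual configuration used in the paper.

```python
import numpy as np

def cosine_sim_matrix(a, b):
    """Pairwise cosine similarity between rows of a and rows of b."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def simcse_infonce_loss(z1, z2, temperature=0.05):
    """SimCSE-style InfoNCE loss.

    For sentence i, z2[i] (the second view of the same sentence) is the
    positive; all other sentences in the batch act as negatives.  The loss
    is the cross-entropy of the row-wise softmax against the diagonal.
    """
    sim = cosine_sim_matrix(z1, z2) / temperature          # (N, N) logits
    sim = sim - sim.max(axis=1, keepdims=True)             # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

# Toy demo: two slightly perturbed "views" of the same 4 sentence embeddings,
# standing in for the two sides of the siamese encoder (purely illustrative).
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
z1 = emb + 0.01 * rng.normal(size=emb.shape)
z2 = emb + 0.01 * rng.normal(size=emb.shape)
loss = simcse_infonce_loss(z1, z2)
```

Minimizing this loss pulls the two views of each sentence together while pushing apart representations of different sentences, which is what makes the resulting sentence embeddings useful for the downstream classification module.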