
Implicit Sentiment Analysis Based On Deep Learning

Posted on: 2022-02-07    Degree: Master    Type: Thesis
Country: China    Candidate: Y G Liu    Full Text: PDF
GTID: 2518306485466334    Subject: Computer technology
Abstract/Summary:
With the advent of the Web 2.0 era, platforms such as social media and e-commerce websites have developed rapidly, producing large volumes of text data that carry sentiment information. How to quickly and effectively extract and analyze the sentiment contained in this massive text data, so as to help governments, individuals, and businesses, is the subject of text sentiment analysis research. At the level of language expression, text sentiment analysis can be divided into explicit and implicit sentiment analysis, according to whether the text contains explicit sentiment words. The former, as the foundation of sentiment analysis, has been studied by a large number of experts and scholars; the latter is still in its infancy. When Internet users write comments, they often use implicit and oblique expressions (such as metaphor, irony, and imperatives), and implicit sentiment analysis of such text has become one of the core difficulties of sentiment analysis. This thesis focuses on three-class (commendatory, derogatory, neutral) implicit sentiment analysis and, starting from the characteristics of implicit sentiment expression, contextual information, and knowledge transfer, carries out the following three lines of research.

(1) To address the inability of existing methods to effectively extract the deep features of implicit sentiment sentences, this thesis proposes an implicit sentiment analysis model based on attention and multi-channel features (ISA-MFNCA). First, shallow feature representations of a sentence are constructed, such as the character-word hybrid sequence, part-of-speech tags, and dependency relations. These are then vectorized with Word2Vec and fed into a dual-channel network layer of CNN and Bi-LSTM to extract deep features. Next, an attention mechanism computes an importance distribution over the features, and finally softmax predicts the sentiment category. The model is tested on the SMP Evaluation of Chinese Implicit Sentiment Analysis (SMP-ECISA), where the macro-averaged precision (Pmacro), recall (Rmacro), and F1 (F1macro) reach 77.6%, 77.0%, and 77.3%, respectively.

(2) To address the small size of existing implicit sentiment annotated corpora, explicit sentiment corpus data is introduced, and an implicit sentiment analysis model based on instance transfer (ISA-IT) is proposed. The ISA-IT model takes the TrAdaBoost algorithm as its framework and ISA-MFNCA as its base classifier. During iterative training, boosting is used to increase the weights of misclassified implicit sentiment corpus samples, while a decay factor, as in TrAdaBoost, is used to reduce the weights of explicit sentiment corpus samples that the base classifier misclassifies. The model is tested on SMP-ECISA, where Pmacro, Rmacro, and F1macro reach 82.8%, 79.5%, and 81.1%, respectively.

(3) In the course of this research it was observed that a single sentence often cannot express a sentiment tendency on its own, yet its sentiment becomes recognizable once the sentence is placed in context. This thesis therefore proposes an implicit sentiment analysis model with multi-feature information and interactive attention. First, the document context is built with the help of Hierarchical Attention Networks. Then the word embedding layer and semantic learning layer of ISA-MFNCA are used to mine deep feature representations of the implicit sentiment sentence. Finally, an interactive attention mechanism attends jointly to the context and to the deep feature representations of the sentence, and softmax predicts the sentiment category. The model is tested on SMP-ECISA, where Pmacro, Rmacro, and F1macro reach 80.7%, 78.8%, and 79.7%, respectively.
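The attention step used in (1) and (3) — scoring each feature vector, normalizing the scores with softmax into an importance distribution, and taking the weighted sum — can be sketched in plain Python. This is a minimal illustration assuming a dot-product scoring function; the thesis does not specify the exact scoring form, and the function names here are illustrative.

```python
import math

def softmax(scores):
    """Normalize raw importance scores into a probability distribution."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(features, query):
    """Score each feature vector against a query vector (dot product),
    then return the attention-weighted sum as the pooled representation."""
    scores = [sum(f_i * q_i for f_i, q_i in zip(f, query)) for f in features]
    weights = softmax(scores)
    dim = len(features[0])
    return [sum(w * f[d] for w, f in zip(weights, features)) for d in range(dim)]
```

In the models above, the pooled vector would be the sentence representation passed to the final softmax classifier.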
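The instance-transfer weight updates in (2) can be sketched with the standard TrAdaBoost rules (Dai et al., 2007): target-domain (implicit sentiment) sample weights are boosted when misclassified, while source-domain (explicit sentiment) sample weights decay by a fixed factor when misclassified. This is a plain-Python sketch under those standard assumptions; the thesis's exact update factors may differ, and the function name is illustrative.

```python
import math

def tradaboost_update(source_w, source_err, target_w, target_err, eps_t, n_iters):
    """One TrAdaBoost-style reweighting step.

    source_err / target_err hold per-sample losses |h(x) - y| in [0, 1];
    eps_t is the weighted error of the current base learner on the target
    (implicit sentiment) data; n_iters is the total number of iterations.
    """
    # Fixed decay factor for source (explicit sentiment) samples.
    beta = 1.0 / (1.0 + math.sqrt(2.0 * math.log(len(source_w)) / n_iters))
    # Boosting factor for target (implicit sentiment) samples.
    beta_t = eps_t / (1.0 - eps_t)
    new_source = [w * beta ** e for w, e in zip(source_w, source_err)]
    new_target = [w * beta_t ** -e for w, e in zip(target_w, target_err)]
    return new_source, new_target
```

The effect is that explicit-domain samples the base classifier keeps getting wrong are gradually down-weighted (treated as less transferable), while hard implicit-domain samples gain influence.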
Keywords/Search Tags:Implicit Sentiment Analysis, Two-Channel Network, Transfer Learning, Context, Attention over Attention