
Research On Improved Attention Mechanism Based On Short Text Sentiment Analysis

Posted on: 2020-09-14
Degree: Master
Type: Thesis
Country: China
Candidate: F Yang
Full Text: PDF
GTID: 2428330599459612
Subject: Information and Communication Engineering
Abstract/Summary:
With the rapid development of social media, more and more users post comments and share opinions on the Internet. These user-generated data carry great commercial value and social information for consumers, enterprises, and government departments. A large number of researchers therefore focus on text sentiment analysis, aiming to extract the subjective components of texts, including user opinions and emotional tendencies. Traditional sentiment analysis operates at the document level and the sentence level and has achieved good results on some datasets. However, when the entities or aspects targeted by an emotional expression matter, sentiment analysis requires a more fine-grained treatment.

In this thesis, we propose an improved attention-mechanism-based algorithm for aspect-level sentiment analysis. First, to compensate for the sparse content of short texts, we propose a word-co-occurrence-based text feature extraction method: we construct a word co-occurrence matrix weighted by semantic similarity and use it to extract the co-occurrence features of texts. Second, we propose an Aspect-Attention memory network model: a Bi-LSTM first learns aspect-level representations of the text, and an Aspect-Attention mechanism is then constructed to extract the latent semantic connection between the context features and the aspect features of the text. Finally, we introduce part-of-speech (POS) features and propose a multi-feature-fusion tensor neural network, which models the semantic relationships among the different features of the text, namely the POS features, context features, and aspect features. In addition, to remove redundant information from the tensor weights and reduce the number of weight parameters, we apply tensor decomposition to the tensor weights. The dimensionality-reduced feature weights not only represent the latent associations among the multiple features but also keep the training scale of the model under control.
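The abstract does not include code, but the first step, a co-occurrence matrix whose counts are weighted by semantic similarity, can be sketched roughly as follows. The function names, the sliding-window scheme, and the toy embeddings are my own illustrative assumptions, not the thesis's actual formulation:

```python
import math
from collections import defaultdict

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def cooccurrence_matrix(docs, embeddings, window=2):
    """Count word pairs co-occurring within a sliding window, weighting
    each count by the cosine similarity of the two word embeddings,
    so semantically related pairs contribute more than unrelated ones."""
    matrix = defaultdict(float)
    for doc in docs:
        for i, w in enumerate(doc):
            for j in range(i + 1, min(i + window + 1, len(doc))):
                u, v = sorted((w, doc[j]))  # canonical pair order
                matrix[(u, v)] += cosine(embeddings[u], embeddings[v])
    return dict(matrix)
```

With identical embeddings for two words, each co-occurrence adds weight 1.0; with orthogonal embeddings it adds 0.0, so the matrix emphasizes semantically coherent pairs.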
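The Aspect-Attention step scores each context position against an aspect representation and pools the context by those scores. A minimal dot-product-attention sketch, assuming the Bi-LSTM hidden states and the aspect vector are already computed (the scoring function here is plain dot product; the thesis's actual attention parameterization may differ):

```python
import math

def aspect_attention(hidden_states, aspect_vec):
    """Score each context hidden state against the aspect vector,
    softmax-normalize the scores, and return (weights, weighted sum)."""
    scores = [sum(h_d * a_d for h_d, a_d in zip(h, aspect_vec))
              for h in hidden_states]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(hidden_states[0])
    context = [sum(w * h[d] for w, h in zip(weights, hidden_states))
               for d in range(dim)]
    return weights, context
```

Positions whose hidden states align with the aspect vector receive larger weights, which is how aspect-relevant context words dominate the pooled representation.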
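The parameter saving from tensor decomposition can be made concrete with a parameter-count comparison. Assuming a bilinear tensor layer with weight tensor of shape (d1, d2, k) and a low-rank factorization of each slice into U_i V_i^T with rank r (the specific decomposition scheme and dimensions are illustrative, not taken from the thesis):

```python
def tensor_params(d1, d2, k):
    """Parameters in a full bilinear tensor weight of shape (d1, d2, k)."""
    return d1 * d2 * k

def decomposed_params(d1, d2, k, rank):
    """Parameters after factorizing each of the k slices as U_i V_i^T,
    with U_i of shape (d1, rank) and V_i of shape (d2, rank)."""
    return k * rank * (d1 + d2)
```

For example, with d1 = d2 = 300, k = 50, and rank 10, the full tensor has 4,500,000 parameters while the factorized form has 300,000, a 15x reduction, which is the kind of saving that keeps the training scale under control.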
Keywords/Search Tags: Word co-occurrence feature, Aspect-Attention, Tensor neural network, Tensor decomposition