
Research on Aspect-Level Sentiment Analysis Based on Deep Learning

Posted on: 2022-05-25    Degree: Master    Type: Thesis
Country: China    Candidate: W L Wu    Full Text: PDF
GTID: 2518306326994869    Subject: Master of Engineering
Abstract/Summary:
With the development of social networks, a large amount of text data has accumulated on the Internet. Analyzing the sentiment polarity of this text is of great significance to governments, enterprises, and individuals. Early sentiment analysis aimed to judge the overall sentiment polarity of a paragraph or sentence and could not identify the sentiment polarities of the different aspects mentioned in a sentence. Aspect-level sentiment analysis aims to judge the sentiment polarity of each aspect in a sentence and has received extensive attention from researchers in recent years. Aspect-level sentiment analysis methods based on deep learning overcome the dependence of traditional machine learning methods on feature engineering and achieve strong classification performance. However, existing deep-learning-based work still has the following problems: a single network type is often used, the hierarchical structure of sentences is not modeled, and feature extraction is difficult for short sentences whose emotional expression is unclear. This thesis focuses on these problems, and the main research contents are as follows:

(1) Using a single convolutional neural network or recurrent neural network has drawbacks: feature extraction is insufficient, it is difficult to focus on the words that most influence the classification result, and the hierarchical structure of sentences is ignored. In addition, traditional language models cannot represent polysemous words. To address these problems, a multi-channel model based on multi-head attention and BERT is proposed. The model introduces BERT to generate dynamic word embeddings, solving the polysemy problem of traditional language models, and concatenates the word embeddings with an aspect embedding as the model input. A multi-head attention mechanism makes the model focus on the words that have a significant impact on the classification result. An ordered-neurons long short-term memory network (ON-LSTM) models the hierarchical structure of sentences and extracts the contextual semantic information of the text, while a convolutional neural network extracts local features and strengthens the model's feature extraction ability. Three-class classification experiments on Chinese and English datasets show that the proposed model outperforms the selected baseline models in both accuracy and macro-average.
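As a rough illustration of the multi-channel architecture described in (1), the sketch below concatenates BERT token embeddings with a pooled aspect embedding, applies multi-head attention, and feeds the result to a recurrent channel and a convolutional channel before classification. All class names, dimensions, and the input interface are assumptions made for exposition rather than the thesis's actual implementation, and a standard nn.LSTM stands in for the ordered-neurons LSTM (ON-LSTM) used in the thesis.

```python
# Hedged sketch of a multi-channel aspect-level classifier (assumed names and
# dimensions; a plain nn.LSTM stands in for the ordered-neurons LSTM).
import torch
import torch.nn as nn
from transformers import BertModel

class MultiChannelABSA(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=128,
                 num_heads=8, num_classes=3, kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)  # dynamic (contextual) embeddings
        d = self.bert.config.hidden_size                  # 768 for BERT-base
        # Multi-head attention highlights words that matter for the given aspect.
        self.attn = nn.MultiheadAttention(embed_dim=2 * d, num_heads=num_heads,
                                          batch_first=True)
        # Recurrent channel (ON-LSTM in the thesis) models sentence-level context.
        self.rnn = nn.LSTM(2 * d, hidden, batch_first=True, bidirectional=True)
        # Convolutional channel extracts local n-gram features.
        self.convs = nn.ModuleList(
            [nn.Conv1d(2 * d, hidden, k, padding=k // 2) for k in kernel_sizes])
        self.fc = nn.Linear(2 * hidden + hidden * len(kernel_sizes), num_classes)

    def forward(self, sent_ids, sent_mask, aspect_ids, aspect_mask):
        # Contextual embeddings for the sentence and a mean-pooled aspect vector.
        ctx = self.bert(sent_ids, attention_mask=sent_mask).last_hidden_state
        asp = self.bert(aspect_ids, attention_mask=aspect_mask).last_hidden_state.mean(dim=1)
        # Concatenate the aspect embedding onto every token embedding.
        x = torch.cat([ctx, asp.unsqueeze(1).expand(-1, ctx.size(1), -1)], dim=-1)
        x, _ = self.attn(x, x, x, key_padding_mask=~sent_mask.bool())
        # Channel 1: last forward/backward hidden states of the recurrent layer.
        _, (h, _) = self.rnn(x)
        rnn_feat = torch.cat([h[-2], h[-1]], dim=-1)
        # Channel 2: convolution + max pooling over time for each kernel size.
        conv_in = x.transpose(1, 2)
        cnn_feat = torch.cat([torch.relu(c(conv_in)).max(dim=-1).values
                              for c in self.convs], dim=-1)
        return self.fc(torch.cat([rnn_feat, cnn_feat], dim=-1))
```

The two channels are kept independent here so that the recurrent features (sentence-level context) and the convolutional features (local n-grams) can be concatenated before the final classifier, mirroring the multi-channel idea described above.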
(2) Sentences that are short and express emotion unclearly carry limited feature information, and it is difficult for traditional models to extract sufficient features from them. In response, this thesis further studies the characteristics of such text and proposes a model based on multiple features and an attention mechanism, together with a distance-sensitive position encoding strategy. The model adds sentiment embeddings and part-of-speech embeddings to the word embeddings, compensating for the limitation of using word embeddings alone as input. A bidirectional gated recurrent unit network extracts the contextual semantic information of the text. To address the insufficient use of position information in traditional models, a distance-sensitive position encoding strategy is designed, and an attention mechanism learns the relationship between the aspect words and their context. Two-class and three-class classification experiments on three public English datasets show that the proposed model outperforms the selected baseline models in accuracy and macro-average, and that it classifies short sentences with unclear emotional expression well.
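As a companion sketch for (2), the code below concatenates word, sentiment, and part-of-speech embeddings, runs a bidirectional GRU, applies a simple linear distance decay as a stand-in for the thesis's distance-sensitive position encoding, and uses an attention layer between the aspect representation and the context. The embedding sizes, the exact decay formula, and all names below are illustrative assumptions only.

```python
# Illustrative sketch of a multi-feature model with distance-sensitive position
# weighting (assumed formulation, not the thesis's exact design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiFeatureABSA(nn.Module):
    def __init__(self, vocab, pos_tags=50, hidden=100, num_classes=3,
                 word_dim=300, senti_dim=50, pos_dim=50):
        super().__init__()
        # Word, sentiment, and part-of-speech embeddings are concatenated so the
        # input carries more than lexical identity alone.
        self.word_emb = nn.Embedding(vocab, word_dim, padding_idx=0)
        self.senti_emb = nn.Embedding(5, senti_dim)   # e.g. 5 coarse sentiment tags (assumed)
        self.pos_emb = nn.Embedding(pos_tags, pos_dim)
        in_dim = word_dim + senti_dim + pos_dim
        self.gru = nn.GRU(in_dim, hidden, batch_first=True, bidirectional=True)
        self.attn_w = nn.Linear(4 * hidden, 1)        # scores [token; aspect] pairs
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, words, senti, pos, aspect_idx, lengths):
        x = torch.cat([self.word_emb(words), self.senti_emb(senti),
                       self.pos_emb(pos)], dim=-1)
        h, _ = self.gru(x)                            # (B, T, 2*hidden)
        # Distance-sensitive weighting: tokens close to the aspect keep most of
        # their weight, distant ones are attenuated (linear decay as a stand-in).
        idx = torch.arange(h.size(1), device=h.device).unsqueeze(0)
        dist = (idx - aspect_idx.unsqueeze(1)).abs().float()
        w = (1.0 - dist / lengths.unsqueeze(1).float()).clamp(min=0.0)
        h = h * w.unsqueeze(-1)
        # Attention between the aspect representation and each context token.
        asp = h.gather(1, aspect_idx.view(-1, 1, 1).expand(-1, 1, h.size(-1)))
        scores = self.attn_w(torch.cat([h, asp.expand_as(h)], dim=-1)).squeeze(-1)
        alpha = F.softmax(scores, dim=-1)
        sent_rep = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)
        return self.fc(sent_rep)
```

The linear decay used here is only one plausible choice; the point it illustrates is that tokens far from the aspect contribute less to the attended sentence representation.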
Keywords/Search Tags: Aspect-level sentiment analysis, Attention mechanism, Deep learning, Natural Language Processing