
Research And Application On Aspect-Based Sentiment Analysis Methods For Course Comments

Posted on: 2024-09-29
Degree: Master
Type: Thesis
Country: China
Candidate: J S Hu
Full Text: PDF
GTID: 2568307112476834
Subject: Electronic information
Abstract/Summary:
With the rapid development of Internet technology, the Internet has penetrated many fields, and the "Internet + education" model has become an important trend in online education. With the rise of online teaching, more and more learners are choosing online course platforms, generating a huge amount of course review text. These reviews contain key information about learners’ emotional experience and evaluation of online courses, as well as their mastery of different knowledge points, providing important reference value for the development of online education and for educators’ teaching practice. Sentiment analysis of online course reviews has therefore become an increasingly important research area. Aspect-level sentiment analysis extracts specific aspect terms from a text and then performs sentiment analysis for each aspect term. Compared with sentence-level sentiment analysis, it can more accurately analyze the sentiment toward each aspect in a course review, giving a more precise picture of learners’ emotional responses to different aspects and helping to better understand students’ learning needs. In this paper, aspect-level sentiment analysis is studied with deep learning techniques, using a self-built course review dataset and publicly available datasets as the data basis. The study consists of two subtasks: aspect term extraction and aspect-level sentiment classification. The specific research work is as follows:

(1) Because annotation is extremely time-consuming and costly, relatively few public datasets exist for the aspect term extraction task, which hinders effective training of neural network models. Some researchers alleviate this with domain adaptation methods that bridge domains through common syntactic relations between aspect and opinion words, but such approaches depend heavily on external linguistic resources. To address these issues, this paper proposes a domain-adaptive aspect term extraction method based on dual-memory interaction networks. The method extracts coarse-grained aspect categories from rich source-domain data and then applies these categories to the fine-grained aspect term extraction task in the target domain to improve its performance. To alleviate inter-domain granularity inconsistency and feature mismatch, the local memory of each word interacts with global aspect-word and aspect-category memories through a dual-memory interaction network, which iterates continuously to obtain a correlation vector for each word, capturing both the interconnection between aspect words and aspect categories and the internal correlations within each. Experiments on three public datasets and a constructed real dataset of course reviews show that the proposed model outperforms several state-of-the-art models on the cross-domain aspect term extraction task.

(2) Most existing work combines recurrent neural networks with attention mechanisms for aspect-level sentiment classification. However, recurrent neural networks cannot compute in parallel, and traditional attention mechanisms may assign low weights to important sentiment words in a sentence, failing to fully capture and fuse the interaction between aspect terms and their context. To address these problems, this paper proposes an aspect-level sentiment classification model that combines a Transformer with interactive attention networks. The model first initializes word embeddings with the pre-trained BERT model, then generates hidden sentence representations through a Transformer encoding layer and an interactive attention layer. The Transformer encoding layer encodes input sentences in parallel, reducing training time while preserving long-range sentiment relations. The interactive attention layer incorporates CDM and CDW mechanisms to focus on local context that is semantically important to a particular aspect term, based on the syntactic structure of the sentence and the syntactic distance between words, thereby reducing the influence of noisy words. Experiments on nine public datasets and a constructed real dataset of course reviews show that the proposed model achieves the best classification results in less training time than several state-of-the-art models.

(3) A personalized adaptive learning system for aspect-level sentiment analysis of course reviews is designed and implemented. Built on the proposed aspect term extraction and aspect-level sentiment classification algorithms, the system can accurately analyze learners’ mastery of different knowledge points from the sentiment tendencies they express toward those knowledge points in review text.
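The dual-memory interaction described in (1) can be sketched as follows. This is a hypothetical simplification, not the thesis implementation: the function name, the memory sizes, the dot-product attention, and the averaging update rule are all assumptions made for illustration. Each word's local representation attends over a global aspect-word memory and a global aspect-category memory, and the two read vectors are folded back into the local memory over several iterations.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def dual_memory_step(word_vec, aspect_mem, category_mem):
    """One interaction round (simplified): the word representation attends
    over the global aspect-word memory and the global aspect-category
    memory, and the two read vectors are merged back into the word's
    local memory. The averaging merge is an assumed placeholder for a
    learned fusion."""
    a_scores = softmax(aspect_mem @ word_vec)    # attention over aspect-word slots
    c_scores = softmax(category_mem @ word_vec)  # attention over category slots
    a_read = a_scores @ aspect_mem               # correlation with aspect words
    c_read = c_scores @ category_mem             # correlation with aspect categories
    return (word_vec + a_read + c_read) / 3.0    # updated local memory

rng = np.random.default_rng(0)
word = rng.standard_normal(8)                 # one word's local memory
aspect_mem = rng.standard_normal((6, 8))      # 6 global aspect-word slots (assumed)
category_mem = rng.standard_normal((4, 8))    # 4 coarse aspect-category slots (assumed)
for _ in range(3):                            # iterate to refine the correlation vector
    word = dual_memory_step(word, aspect_mem, category_mem)
```

After the iterations, `word` serves as the correlation vector that links the fine-grained aspect word to the coarse-grained categories transferred from the source domain.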
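The CDM and CDW mechanisms mentioned in (2) are, respectively, a masking and a weighting scheme over token hidden states based on each token's distance to the aspect term. A minimal sketch follows, treating `distances` as precomputed syntactic distances; the function names and the exact linear down-weighting formula are assumptions for illustration, not the thesis's definitions.

```python
import numpy as np

def cdm_mask(hidden, distances, alpha):
    """Context Dynamic Masking (sketch): zero out the hidden vectors of
    tokens whose distance to the aspect term exceeds the threshold alpha."""
    mask = (distances <= alpha).astype(hidden.dtype)  # (seq_len,) of 0/1
    return hidden * mask[:, None]

def cdw_weight(hidden, distances, alpha):
    """Context Dynamic Weighting (sketch): keep nearby tokens at full
    weight and linearly down-weight tokens beyond the threshold."""
    n = len(distances)
    w = np.where(distances <= alpha,
                 1.0,
                 1.0 - (distances - alpha) / n)       # assumed weighting rule
    return hidden * w[:, None]

# Toy example: 5 tokens with 4-dim hidden states; distances to the aspect word.
hidden = np.ones((5, 4))
distances = np.array([0, 1, 2, 3, 4], dtype=float)
masked = cdm_mask(hidden, distances, alpha=2)    # tokens 3 and 4 are zeroed
weighted = cdw_weight(hidden, distances, alpha=2)  # tokens 3 and 4 are down-weighted
```

Either transformed representation can then be fed to the downstream classification layer, so that sentiment cues far from the aspect term contribute less to its polarity.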
Keywords/Search Tags: course comments, aspect term extraction, domain adaptation, dual-memory interaction networks, aspect-level sentiment classification, recurrent neural network, Transformer, interactive attention network