Aspect-level sentiment analysis models an attribute of an entity mentioned in a text in order to determine the reviewer's sentiment toward that aspect. Existing research mostly uses attention mechanisms or graph convolutional networks to align aspect words with their related contexts and obtain more accurate sentiment representations. However, these methods ignore the dependencies between words, so the model may associate aspect words with irrelevant contexts. To address this, this thesis obtains the latent syntactic information between words by injecting syntactic dependency relations into the language model, and then conducts research on aspect-level sentiment classification and triple extraction. The main contents of the thesis include:

1. To address the problem that graph convolution models assign equal attention weights to syntactically adjacent words, which causes existing models to incorrectly associate syntactically irrelevant words with target aspects through multiple iterations of graph convolution propagation, this thesis proposes an aspect-level text sentiment analysis method based on a graph neural network that embeds syntactic dependencies. The method assigns appropriate weights to syntactically adjacent contexts and blocks the interference of irrelevant contexts (see the first sketch below). Experimental results show that the proposed method effectively aligns aspect terms with opinion contexts and outperforms attention-based and graph-convolution-based methods in classification accuracy.

2. In view of the global attention used by current pre-trained models, this thesis proposes a method for embedding syntactic relations into the pre-trained model. Compared with the traditional approach, in which syntactic information is stacked as an extra layer on top of the hidden-layer representations, the proposed method incorporates syntactic information directly into the attention distributions and the token representations of scaled dot-product attention, so the proposed model requires no additional pre-training (see the second sketch below). Experimental results on aspect-level sentiment datasets show that the proposed method effectively injects syntactic dependency information into the pre-trained knowledge and achieves better results than existing methods.

3. To solve the problem that existing aspect-level triple extraction models lack syntactic dependencies, this thesis proposes a method based on a context-aware, syntax-dependent language model. This method incorporates syntactic dependencies into contextual representations to learn the latent relationships between words. To enrich the syntactic representations, the proposed method uses information from adjacent edges (see the third sketch below). Compared with existing end-to-end methods, the proposed method effectively learns the latent semantic relationships between word pairs and achieves better performance.
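The first sketch below is a minimal, hypothetical illustration of contribution 1: a graph convolution layer that propagates information only along dependency edges and learns a separate weight for each edge, instead of treating all syntactic neighbors equally. The layer name, edge-scoring scheme, and tensor shapes are assumptions for illustration, not the exact architecture of the thesis.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DependencyWeightedGCN(nn.Module):
    """One GCN layer over a dependency graph. Instead of averaging all
    syntactic neighbors equally, each edge receives a learned weight so
    that irrelevant neighbors can be suppressed (illustrative sketch,
    not the thesis's exact model)."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.linear = nn.Linear(hidden_dim, hidden_dim)
        # Scores each (word i, word j) pair to produce per-edge weights.
        self.edge_scorer = nn.Linear(2 * hidden_dim, 1)

    def forward(self, h, adj):
        # h:   (batch, seq_len, hidden_dim) token representations
        # adj: (batch, seq_len, seq_len) 0/1 dependency adjacency matrix
        b, n, d = h.shape
        hi = h.unsqueeze(2).expand(b, n, n, d)   # representation of word i
        hj = h.unsqueeze(1).expand(b, n, n, d)   # representation of word j
        scores = self.edge_scorer(torch.cat([hi, hj], dim=-1)).squeeze(-1)
        # Mask pairs with no dependency edge before normalizing, so
        # attention cannot leak to syntactically unrelated words.
        scores = scores.masked_fill(adj == 0, float('-inf'))
        weights = torch.softmax(scores, dim=-1)
        weights = torch.nan_to_num(weights)      # rows with no edges at all
        return F.relu(self.linear(torch.matmul(weights, h)))
```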
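The second sketch illustrates one way to realize contribution 2: adding a bias derived from the dependency tree to the logits of scaled dot-product attention, so that syntax reshapes the attention distribution without any additional pre-training. The function name and the form of `syn_bias` are assumptions; the thesis also injects syntax into the token representations, which is not shown here.

```python
import math
import torch

def syntax_biased_attention(q, k, v, syn_bias):
    """Scaled dot-product attention with an additive syntactic bias.

    q, k, v:  (batch, heads, seq_len, head_dim)
    syn_bias: (batch, 1, seq_len, seq_len), e.g. the negative distance
              between tokens in the dependency tree, broadcast over
              heads, so syntactically distant tokens get less attention.
    """
    d = q.size(-1)
    logits = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d)
    logits = logits + syn_bias      # inject syntax before the softmax
    attn = torch.softmax(logits, dim=-1)
    return torch.matmul(attn, v)
```

Because the bias is added to existing logits rather than stacked as a new layer, the pre-trained weights of the backbone can be reused unchanged, which is the point of the no-additional-pre-training claim.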
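The third sketch is a rough, hypothetical illustration of contribution 3: enriching each word-pair representation with embeddings of the dependency labels on edges adjacent to the two words before classifying the pair for triple extraction. The class name, pooling scheme, and tagging formulation are illustrative assumptions, not the thesis's exact method.

```python
import torch
import torch.nn as nn

class EdgeEnrichedPairScorer(nn.Module):
    """Scores word pairs for aspect-level triple extraction. Each pair
    representation is enriched with the dependency labels on edges
    adjacent to the two words (illustrative sketch only)."""

    def __init__(self, hidden_dim, num_dep_labels, num_tags):
        super().__init__()
        self.dep_embed = nn.Embedding(num_dep_labels, hidden_dim)
        self.classifier = nn.Linear(3 * hidden_dim, num_tags)

    def forward(self, h, dep_labels, adj):
        # h:          (batch, seq_len, hidden) contextual token vectors
        # dep_labels: (batch, seq_len, seq_len) dependency label ids
        # adj:        (batch, seq_len, seq_len) 0/1 adjacency matrix
        edge = self.dep_embed(dep_labels) * adj.unsqueeze(-1)
        # Mean-pool the labels on edges touching each word to obtain a
        # per-word "adjacent edge" context vector.
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        edge_ctx = edge.sum(dim=2) / deg                 # (b, n, hidden)
        b, n, d = h.shape
        hi = h.unsqueeze(2).expand(b, n, n, d)           # word i
        hj = h.unsqueeze(1).expand(b, n, n, d)           # word j
        ec = (edge_ctx.unsqueeze(2) + edge_ctx.unsqueeze(1)) / 2
        return self.classifier(torch.cat([hi, hj, ec], dim=-1))
```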