Research On Reading Comprehension Method Based On Graph Neural Network

Posted on: 2024-05-14  Degree: Master  Type: Thesis
Country: China  Candidate: Z X Liu  Full Text: PDF
GTID: 2568307181950879  Subject: Electronic Information (Computer Technology) (Professional Degree)
Abstract/Summary:
Recently, with the continuous development of natural language understanding within natural language processing, machine reading comprehension has emerged as a key challenge in the field. With the availability of large-scale datasets and models, machine reading comprehension is commonly decomposed into two sub-tasks: text classification and passage extraction. Although significant progress has been made in related technologies, several issues remain: (1) although the [CLS] token of BERT achieves good classification results, it does not effectively capture local information in the text, such as phrases and sentences; (2) existing models cannot effectively integrate long-distance semantic information and syntactic relationships as features.

To address these issues in text classification and reading comprehension, this study carries out the following work:

(1) We propose a text classification algorithm based on pre-trained language models and multi-scale convolutional neural networks. The method retains the [CLS] classification embedding from BERT and further extracts features with a multi-scale convolutional network, fusing the rich semantic information produced by BERT across different dimensions. After concatenation with the [CLS] vector, the resulting sequence is dimensionally reduced with a combination of max-pooling and average-pooling strategies, extracting key features while reducing noise. Extensive experiments demonstrate that this method further exploits the semantic information in BERT and achieves significantly higher accuracy than traditional methods on different datasets.

(2) This study proposes a reading comprehension model that incorporates syntactic relationships via graph attention networks. Constrained by the structure of BERT-based reading comprehension models, existing models struggle to integrate crucial features such as inter-sentence syntactic relationships and long-distance semantics; this limitation hampers the understanding of texts and the identification of the inherent connections between questions and answers. The proposed model leverages the properties of graph neural networks to model features at two granularities (inter-sentence dependency syntactic relationships and entity relationships) separately. Through methods such as threshold-based node filtering, the graph neural network is integrated with pre-trained language models to form a more precise vectorized representation of passages and questions. Experimental results show that the proposed method outperforms the traditional BERT baseline on the SQuAD dataset, with a significant increase in F1 score across multiple model variants, substantiating the efficacy of the proposed approach.

(3) This thesis presents the design of a reading comprehension question-answering system to verify the effectiveness and practicality of the proposed methods. The system's algorithms are based on the convolutional-network-based classification model and the graph-attention-based reading comprehension model described above. By integrating front-end and back-end web technologies for page design and presentation, a system capable of generating answers from related texts and questions has been developed.
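The first contribution — a multi-scale convolutional head over BERT's token outputs, concatenated with the [CLS] vector and reduced via combined max- and average-pooling — can be sketched as below. This is a minimal illustration, not the thesis implementation: the kernel sizes, channel count, hidden size (768), and class count are assumptions.

```python
import torch
import torch.nn as nn


class MultiScaleBertClassifier(nn.Module):
    """Sketch of a multi-scale CNN classification head over BERT outputs.

    Hyperparameters (kernel sizes, channels, hidden=768, n_classes=2) are
    illustrative assumptions, not values taken from the thesis.
    """

    def __init__(self, hidden=768, n_classes=2, kernel_sizes=(2, 3, 4), channels=128):
        super().__init__()
        # One 1-D convolution per scale, sliding over the token dimension,
        # to capture local phrase-level features of different widths.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, channels, k, padding=k // 2) for k in kernel_sizes
        )
        # [CLS] vector is concatenated with max- and average-pooled conv features.
        feat_dim = hidden + 2 * channels * len(kernel_sizes)
        self.fc = nn.Linear(feat_dim, n_classes)

    def forward(self, token_embeddings):
        # token_embeddings: (batch, seq_len, hidden) from a BERT encoder
        cls_vec = token_embeddings[:, 0]            # [CLS] embedding
        x = token_embeddings.transpose(1, 2)        # (batch, hidden, seq_len)
        pooled = []
        for conv in self.convs:
            fmap = torch.relu(conv(x))              # (batch, channels, seq_len')
            pooled.append(fmap.max(dim=2).values)   # max pooling: salient features
            pooled.append(fmap.mean(dim=2))         # average pooling: smooths noise
        features = torch.cat([cls_vec] + pooled, dim=1)
        return self.fc(features)
```

In use, `token_embeddings` would be the last hidden states of a fine-tuned BERT; the dual pooling keeps the strongest activation per channel while the average term damps spurious peaks.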
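The second contribution combines threshold-based node filtering with graph attention over syntactic/entity graphs. A minimal single-head graph attention layer in the style of the original GAT, plus a filtering helper, is sketched below; the scoring function, threshold value, and layer shapes are assumptions for illustration, not the thesis configuration.

```python
import torch
import torch.nn as nn


def filter_nodes(adj, scores, threshold=0.5):
    """Threshold-based node filtering: drop graph nodes whose relevance
    score falls below `threshold`. The score source and threshold value
    are illustrative assumptions."""
    keep_idx = (scores >= threshold).nonzero(as_tuple=True)[0]
    return adj[keep_idx][:, keep_idx], keep_idx


class SimpleGATLayer(nn.Module):
    """Minimal single-head graph attention layer, a stand-in for the
    syntactic-dependency and entity graphs described in the abstract."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scorer

    def forward(self, h, adj):
        # h: (n_nodes, in_dim); adj: (n_nodes, n_nodes) 0/1 edge mask
        z = self.W(h)                                     # (n, out_dim)
        n = z.size(0)
        zi = z.unsqueeze(1).expand(n, n, -1)              # source node features
        zj = z.unsqueeze(0).expand(n, n, -1)              # neighbor node features
        e = torch.nn.functional.leaky_relu(
            self.a(torch.cat([zi, zj], dim=-1))
        ).squeeze(-1)
        e = e.masked_fill(adj == 0, float("-inf"))        # attend only along edges
        alpha = torch.softmax(e, dim=-1)                  # normalize over neighbors
        alpha = torch.nan_to_num(alpha)                   # isolated nodes -> zeros
        return torch.relu(alpha @ z)                      # aggregated node features
```

The filtered node representations would then be fused with the BERT token embeddings (e.g., by gating or concatenation) to form the final passage/question representation; that fusion step is omitted here.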
Keywords/Search Tags:reading comprehension model, graph neural networks, pre-trained language models, convolutional neural network, text classification