
Research On Extraction Technology Of Knowledge Unit In Teaching Resources

Posted on: 2022-05-27
Degree: Master
Type: Thesis
Country: China
Candidate: W T Zhang
Full Text: PDF
GTID: 2518306551970739
Subject: Master of Engineering

Abstract/Summary:
"personalized learning" has always been an enduring topic in education.However,personalized learning has not been widely used in practice due to the limitation of technology level and teacher resources.Recently,personalized learning has been reshaped by rapidly developing artificial intelligence and internet technologies,providing an opportunity for its widespread application in practice.The two foundations of personalized learning are domain knowledge models and user feature models.In practice,domain knowledge models are generally constructed with knowledge graphs."Entity identification" is the first step in building knowledge graphs.In the education field,entity identification is also called "knowledge unit extraction".Based on the existing research,this paper focuses on the extraction method of knowledge unit in teaching resources.Since there is no public dataset on education,we created a "Python knowledge unit dataset",conducted a large number of experiments,and validated the effectiveness of the methods.The main works of this study are shown here:Firstly,we proposed a sequence-based knowledge unit extraction model,named "Lattice-TF model".The Lattice LSTM network was used in this model to encode a series of input characters and their matching potential words.The Transformer encoder was used to extract crucial parts from the existing information to enhance the ability of capturing long-range dependencies.By experimenting on "Python knowledge unit dataset",Lattice-TF model performed better than any of the Lattice LSTM model and Transformer model,verifying the effectiveness of combining two neural networks for knowledge unit extraction.Secondly,based on the Lattice-TF model proposed above,the LGN-TF model was developed by replacing the "sequence-based LatticeLSTM network" to "lexical graph neural network LGN".The results showed that the graph neural network was slightly better than the LatticeLSTM network.Finally,we proposed a collaborative multi-graph-based 
knowledge unit extraction model--TF-CGN model.Based on the LGN-TF model,the "collaborative graph network" was initially used to capture lexical information instead of the "lexical-based graph neural network",and then the usage level of Transformer encoder was adjusted.Our results indicated that"collaborative graph network" was slightly better than "lexical-based graph neural network" in integrating lexical information.Besides,it seemed that using Transformer encoder at a lower level could improve the performance of the model.To conclude,our results illustrate that,the method,which uses Transformer encoder to extract character context information initially,and then uses the collaborative graph neural network to integrate lexical features,enable to improve the model performance.Further studies in integrating character glyph information,multi-task learning,and cross-domain learning will be needed in the future.
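To make the lattice idea concrete, the following is a minimal sketch of the lexicon-matching step that lattice-style models build on: for each character position, collect the potential multi-character words from an external lexicon that span it. The function name, the toy sentence, and the toy lexicon are illustrative assumptions, not the thesis's actual implementation.

```python
def match_lexicon(chars, lexicon, max_word_len=4):
    """Return all (start, end, word) spans (end inclusive) whose
    characters form a word found in the external lexicon."""
    matches = []
    n = len(chars)
    for i in range(n):
        # Try every candidate span starting at i, up to max_word_len chars.
        for j in range(i, min(i + max_word_len, n)):
            word = "".join(chars[i:j + 1])
            if len(word) > 1 and word in lexicon:
                matches.append((i, j, word))
    return matches

# Toy example: a Python-teaching sentence segmented into characters.
sentence = list("列表推导式是Python的特性")
lexicon = {"列表", "推导式", "列表推导式", "特性"}
spans = match_lexicon(sentence, lexicon, max_word_len=5)
# Overlapping words such as "列表" and "列表推导式" are both kept;
# the lattice/graph encoder later decides how to weight them.
```

In a full model, each matched word would contribute an embedding that is fused into the representations of the characters it covers, which is exactly where the Lattice LSTM, LGN, and collaborative graph network variants differ.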
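The long-range-dependency role of the Transformer encoder can also be sketched in a few lines. Below is single-head scaled dot-product self-attention over character representations, written in NumPy and simplified by omitting the learned query/key/value projections, multi-head structure, and positional encodings of a real Transformer encoder; it only illustrates that every character directly attends to every other character regardless of distance.

```python
import numpy as np

def self_attention(X):
    """Single-head scaled dot-product self-attention (no learned
    projections) over X of shape (seq_len, d). Returns the
    context-mixed outputs and the attention weight matrix."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                  # pairwise similarity
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ X, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))    # e.g. 6 characters, 8-dim embeddings
out, attn = self_attention(X)  # attn[i, j]: how much char i attends to char j
```

Because `attn[i, j]` is nonzero for every pair of positions, dependencies between distant characters are captured in a single layer, which is the property the abstract credits the Transformer encoder with adding on top of the recurrent lattice encoder.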
Keywords/Search Tags: Named entity recognition, Convolutional neural network, Attention mechanism, Graph neural network