
Universal Text Generation Transfer Learning Based On Advanced Semantics

Posted on: 2020-10-15
Degree: Master
Type: Thesis
Country: China
Candidate: H Li
Full Text: PDF
GTID: 2428330572996873
Subject: Computer Science and Technology
Abstract/Summary:
Natural language processing (NLP) is a core part of human-computer interaction, and it is one of the most rapidly developing directions in the field of artificial intelligence in recent years, attracting wide attention from researchers. Since deep neural networks were proposed, the sequence-to-sequence (Seq2Seq) model has been widely used in text generation tasks. With the help of the Seq2Seq model, text generation tasks such as machine translation and summarization have achieved great improvements. However, since neural networks rely on large datasets, these models can only be built on a large corpus; once a pre-trained model is applied to other corpora, its effectiveness weakens significantly. Therefore, this thesis improves the Seq2Seq model by adding advanced semantic encoders that can utilize transfer learning, enabling the model to use prior knowledge or pre-trained models across different tasks. Compared with the original Seq2Seq model, the transfer learning model can achieve better results even on small datasets. In this thesis, experiments are carried out on several well-known public datasets as well as datasets collected by web crawler. The experimental results show that the proposed transfer learning Seq2Seq model makes good use of prior knowledge and general knowledge learned in NLP tasks, and achieves the universality of transfer learning across different datasets and different tasks.
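The core idea of the abstract, reusing representations pre-trained on a large corpus so that a downstream task with little data benefits from prior knowledge, can be sketched as follows. This is a minimal illustrative assumption, not the thesis implementation: the word vectors, labels, and prototype classifier are all hypothetical stand-ins for the pre-trained "advanced semantic" encoder and the fine-tuned Seq2Seq model described above.

```python
# Toy word vectors standing in for an encoder pre-trained on a large corpus.
# (Illustrative values only; a real system would load learned weights.)
PRETRAINED = {
    "good":  [0.9, 0.1],
    "great": [0.8, 0.2],
    "bad":   [0.1, 0.9],
    "awful": [0.2, 0.8],
}

def encode(tokens, table=PRETRAINED, dim=2):
    """Mean-pool pretrained vectors into a fixed-size sentence representation,
    playing the role of the frozen pre-trained semantic encoder."""
    vecs = [table[t] for t in tokens if t in table]
    if not vecs:
        return [0.0] * dim
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

# A "small dataset" downstream task: with the encoder frozen, a single
# labelled example per class suffices to build class prototypes.
prototypes = {"positive": encode(["good"]), "negative": encode(["bad"])}

def classify(tokens):
    """Assign the label whose prototype is nearest in the pretrained space."""
    rep = encode(tokens)
    return max(prototypes, key=lambda label: cosine(rep, prototypes[label]))

# Words never seen in the downstream training data are still handled
# correctly, because their semantics come from the pretrained encoder.
print(classify(["great"]))
print(classify(["awful"]))
```

The design choice this illustrates is the one the abstract argues for: the task-specific part (here, two prototypes) stays tiny, while generalization comes from representations learned elsewhere, which is why performance holds up on small corpora.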
Keywords/Search Tags:Natural Language Processing, Transfer Learning, Language Model, Seq2Seq