
A Conditional Text Generation Framework Based On Pre-Trained Model

Posted on: 2020-03-16
Degree: Master
Type: Thesis
Country: China
Candidate: Y Duan
GTID: 2428330590976536
Subject: Cyberspace security

Abstract/Summary:
Text generation is an important task in both academic research and real-world applications, and conditional text generation is one of its essential subtasks. However, almost all deep-learning-based methods for conditional text generation require large amounts of labeled data and must be retrained whenever a new "condition" appears, which is inefficient in the Internet era. A human, by contrast, can learn a brand-new task from only a few examples, because he or she already has sufficient background knowledge. Motivated by this observation, we introduce a new framework for conditional text generation. Our main idea is to transform the original word-level learning procedure into a learning procedure in a high-level feature space with the help of a pre-trained model. Only a few labeled examples are then needed for effective conditional text generation, and when a new condition appears the pre-trained model does not have to be retrained, which is more efficient for realistic applications. In experiments on a real-world dataset, the proposed method achieves better performance than the baselines on both single-condition and multi-condition generation tasks. The experimental results show that generating text in a high-level latent space is feasible and effective, which is beneficial for further research in the text generation field.
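The core idea above, mapping texts into a pre-trained model's feature space and handling new conditions there with only a few labeled examples, can be sketched as follows. This is a minimal illustrative toy, not the thesis's actual architecture: the hand-crafted word vectors stand in for a frozen pre-trained encoder, and nearest-prototype matching over candidates stands in for the full conditional generation framework.

```python
import numpy as np

# Toy hand-crafted word vectors standing in for a frozen pre-trained
# encoder (a real system would use something like BERT; this vocabulary
# and these vectors are illustrative assumptions, not the thesis's).
# dim 0 ~ sentiment, dim 1 ~ topic.
EMBED = {
    "good":  np.array([ 1.0, 0.1]),
    "great": np.array([ 1.2, 0.1]),
    "bad":   np.array([-1.0, 0.1]),
    "awful": np.array([-1.2, 0.1]),
    "movie": np.array([ 0.0, 1.0]),
    "food":  np.array([ 0.0, 0.9]),
}

def encode(text):
    """Mean-pooled latent feature of a text (the 'high-level' space)."""
    return np.mean([EMBED[w] for w in text.split()], axis=0)

def build_prototypes(labeled):
    """One prototype vector per condition, from a few labeled examples each."""
    return {c: np.mean([encode(t) for t in texts], axis=0)
            for c, texts in labeled.items()}

def pick_conditional(candidates, condition, protos):
    """Select the candidate whose latent feature best matches the
    condition's prototype (cosine similarity)."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(candidates, key=lambda t: cos(encode(t), protos[condition]))

# Two labeled examples per condition are enough to form prototypes.
protos = build_prototypes({
    "positive": ["good movie", "great food"],
    "negative": ["bad movie", "awful food"],
})
print(pick_conditional(["great movie", "awful movie"], "positive", protos))
# -> great movie
```

Note the efficiency property the abstract argues for: supporting a new condition amounts to computing one more prototype from a handful of examples; the encoder itself is never retrained.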
Keywords/Search Tags:conditional text generation, few labeled data, pre-trained model, deep learning