
Attention-based Adversarial Multi-task Review Text Classification

Posted on: 2020-01-10    Degree: Master    Type: Thesis
Country: China    Candidate: R L Chen    Full Text: PDF
GTID: 2428330596982653    Subject: Control engineering
Abstract/Summary:
Machine learning has become an important research topic with broad practical application, and text classification is a research direction within it that has clear practical value. Multi-task learning lets several related learning tasks share useful information, which alleviates data sparsity and improves the learning ability and accuracy of every task. Both theoretical analysis and practical application show that multi-task learning outperforms learning each task in isolation. However, existing multi-task text classification models cannot extract text features effectively and tend to ignore the relative importance of the input information.

This thesis combines the attention mechanism with multi-task learning: attention is introduced at the text input, and an attention-based adversarial multi-task text classification model is proposed. The attention mechanism uses learned weights to focus on the portions of the text that best reflect its characteristics, so the model can extract text features effectively. Attention was previously applied mainly to image processing and has only in recent years been introduced into tasks such as text processing; by training weights and computing weighted combinations, it represents text information according to its degree of importance, which significantly benefits the subsequent learning tasks.

Experiments compare four multi-task text classification models. Comparing their highest accuracy demonstrates the correctness and advantage of the model designed in this thesis, and comparing their average accuracy across multiple tasks shows its generality on multi-task text classification problems. The results show that the attention-based adversarial multi-task text classification model overcomes the two shortcomings of traditional multi-task models, namely ineffective feature extraction and neglect of the importance of the input information, and markedly improves multi-task text classification performance.
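The abstract does not give implementation details, but the architecture it describes (attention applied to the text input, information shared across tasks, and adversarial learning, as named in the keywords) is commonly realized with a shared encoder, task-specific encoders, attention-weighted pooling, and a task discriminator. The sketch below is a minimal illustration of that pattern in PyTorch; every module name, layer choice (GRU encoders, a single attention scoring vector), and hyper-parameter is an assumption made for illustration, not the thesis's actual design.

```python
# Hedged sketch: attention-weighted pooling feeding shared and task-specific
# encoders, with an adversarial task discriminator on the shared features.
# All names and hyper-parameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionPool(nn.Module):
    """Scores each token with a learned vector and returns the
    softmax-weighted sum of the token representations."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, token_states):                 # (batch, seq_len, hidden_dim)
        weights = F.softmax(self.score(token_states), dim=1)
        return (weights * token_states).sum(dim=1)   # (batch, hidden_dim)


class AdversarialMultiTaskClassifier(nn.Module):
    """One shared encoder plus per-task private encoders; a task
    discriminator is trained adversarially so that the shared
    features remain task-invariant."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tasks, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.shared_rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.private_rnns = nn.ModuleList(
            [nn.GRU(embed_dim, hidden_dim, batch_first=True) for _ in range(num_tasks)])
        self.shared_attn = AttentionPool(hidden_dim)
        self.private_attn = nn.ModuleList(
            [AttentionPool(hidden_dim) for _ in range(num_tasks)])
        self.classifiers = nn.ModuleList(
            [nn.Linear(2 * hidden_dim, num_classes) for _ in range(num_tasks)])
        self.task_discriminator = nn.Linear(hidden_dim, num_tasks)

    def forward(self, token_ids, task_id):
        emb = self.embed(token_ids)
        shared_states, _ = self.shared_rnn(emb)
        private_states, _ = self.private_rnns[task_id](emb)
        shared_vec = self.shared_attn(shared_states)
        private_vec = self.private_attn[task_id](private_states)
        logits = self.classifiers[task_id](
            torch.cat([shared_vec, private_vec], dim=-1))
        task_logits = self.task_discriminator(shared_vec)  # adversarial head
        return logits, task_logits


# Usage: the classification loss is taken on `logits`; the discriminator is
# trained to predict the task from `task_logits`, while the shared encoder is
# trained to fool it (for example via a gradient reversal layer).
model = AdversarialMultiTaskClassifier(vocab_size=20000, embed_dim=128,
                                       hidden_dim=64, num_tasks=4, num_classes=2)
tokens = torch.randint(0, 20000, (8, 50))        # a dummy batch of 8 reviews
class_logits, task_logits = model(tokens, task_id=0)
```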
Keywords/Search Tags:Text Classification, Attention, Adversarial Learning, Multi-task Learning