
Rumor Detection Based On Deep Bidirectional Transformer Encoder

Posted on: 2021-02-28
Degree: Master
Type: Thesis
Country: China
Candidate: X Y Ju
Full Text: PDF
GTID: 2428330623958904
Subject: Information and Communication Engineering
Abstract/Summary:
The development of social network platforms such as Weibo, Twitter, and WeChat has completely changed the way people communicate. While these platforms make it convenient to obtain the latest information, the massive spread of rumors and false information on them is increasingly harmful to individuals, to society, and even to the country. Because information spreads extremely fast, detecting rumors or false information manually in a timely manner is unrealistic, so automatic rumor detection has become a research hotspot in recent years. Existing rumor detection methods mainly extract multiple features for classification, but this approach is poorly suited to early rumor detection, and for long texts the commonly used recurrent and convolutional neural networks cannot capture the semantics well.

To address these problems, this thesis proposes a new rumor detection technique based on an analysis of the characteristics of text content in early rumor detection. Pre-training is used to improve the timeliness of the model, and a deep bidirectional Transformer encoder is used for feature extraction, which effectively captures long-distance dependencies in long texts and thus improves detection accuracy. To further improve detection performance, the original data is also processed with data augmentation. Experiments on Twitter rumor datasets and the FakeNewsNet fake-news dataset show that the proposed rumor detection model outperforms state-of-the-art models in both accuracy and F1-score.
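The key mechanism behind the encoder's handling of long-distance dependencies is bidirectional self-attention: every token position attends directly to every other position, so distance between tokens does not degrade the connection the way it does in a step-by-step recurrent network. Below is a minimal pure-Python sketch of (single-head, unprojected) scaled dot-product self-attention; the abstract does not give the model's exact architecture, so the dimensions and the omission of learned Q/K/V projections, multiple heads, and layer stacking are illustrative simplifications, not the thesis's implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(X):
    """Bidirectional scaled dot-product self-attention.

    X is a list of token vectors (seq_len x d). Each output position is a
    weighted average over ALL input positions, so the first token can draw
    on the last one in a single step, regardless of sequence length.
    """
    d = len(X[0])
    out = []
    for q in X:
        # Attention scores of this query against every key (here keys = inputs).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        w = softmax(scores)  # weights over all positions, summing to 1
        # Weighted sum of all value vectors (here values = inputs).
        out.append([sum(w[j] * X[j][i] for j in range(len(X))) for i in range(d)])
    return out

# Toy sequence of three 2-dimensional token embeddings.
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
contextualized = self_attention(tokens)
```

In a full BERT-style encoder this operation is repeated with learned projections, multiple heads, residual connections, and feed-forward sublayers across many stacked layers, but the all-positions-to-all-positions connectivity shown here is what distinguishes it from recurrent and convolutional feature extractors on long texts.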
Keywords/Search Tags: Network platform, Rumor detection, Transformer encoder, Pre-training