24 ESWA_Enhancing rumor detection with data augmentation and generative pre-trained transformer

  • data augmentation
  • generative pre-trained transformer

GAP

However, existing methods cannot learn the deeper semantics of rumor text needed for detection. In addition, imbalanced datasets in the rumor domain reduce the effectiveness of these algorithms.

Idea

leveraging the Generative Pre-trained Transformer 2 (GPT-2) model to generate rumor-like texts, thus creating a balanced dataset (i.e., GPT-2 + data augmentation).

  • GPT-2 captures rich semantic information and can produce diverse, high-quality synthetic text samples.
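The balancing step can be sketched as follows. The generator below is a placeholder for brevity: in the paper's setup it would sample from a GPT-2 model (e.g. via Hugging Face transformers' `GPT2LMHeadModel.generate()`), and all function names here are illustrative, not the authors' code.

```python
# Sketch: oversample the minority (rumor) class with synthetic texts
# until class counts are equal. generate_rumor_like() is a stub for
# GPT-2 sampling conditioned on minority-class seed texts.
from collections import Counter
import random

def generate_rumor_like(seed_texts, n, rng):
    # Placeholder generator so the sketch runs; a real pipeline would
    # decode n samples from a GPT-2 model fine-tuned on seed_texts.
    return [rng.choice(seed_texts) + " [synthetic]" for _ in range(n)]

def balance_with_synthetic(texts, labels, minority_label, rng=None):
    """Pad the minority class with generated texts until the dataset
    has as many minority examples as the largest class."""
    rng = rng or random.Random(0)
    counts = Counter(labels)
    deficit = max(counts.values()) - counts[minority_label]
    seeds = [t for t, y in zip(texts, labels) if y == minority_label]
    synthetic = generate_rumor_like(seeds, deficit, rng)
    return texts + synthetic, labels + [minority_label] * deficit

texts = ["claim A", "claim B", "claim C", "rumor X"]
labels = ["non-rumor"] * 3 + ["rumor"]
aug_texts, aug_labels = balance_with_synthetic(texts, labels, "rumor")
print(Counter(aug_labels))  # both classes now have equal counts
```

The detector is then trained on the augmented set; because generation only touches the minority class, the majority-class distribution is left unchanged.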

Datasets

PHEME, Twitter15, and Twitter16 datasets.



Experimental Results
