
Item Details

  Propagation Structure-Aware Graph Transformer for Robust and Interpretable Fake News Detection

Zhu, J., Gao, C., Yin, Z., Li, X., & Kurths, J. (2024). Propagation Structure-Aware Graph Transformer for Robust and Interpretable Fake News Detection. In R. Baeza-Yates & F. Bonchi (Eds.), KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (pp. 4652-4663). New York: Association for Computing Machinery. doi:10.1145/3637528.3672024.


Basic Information

Resource Type: Book Chapter

Files


Related URLs


Creators

Creators:
Zhu, Junyou1, Author
Gao, Chao1, Author
Yin, Ze1, Author
Li, Xianghua1, Author
Kurths, Jürgen2, Author
Affiliations:
1External Organizations, ou_persistent22              
2Potsdam Institute for Climate Impact Research, ou_persistent13              

Content Description

Keywords: -
Abstract: The rise of social media has intensified fake news risks, prompting a growing focus on leveraging graph learning methods such as graph neural networks (GNNs) to understand post-spread patterns of news. However, existing methods often produce less robust and interpretable results as they assume that all information within the propagation graph is relevant to the news item, without adequately eliminating noise from engaged users. Furthermore, they inadequately capture intricate patterns inherent in long-sequence dependencies of news propagation due to their use of shallow GNNs aimed at avoiding the over-smoothing issue, consequently diminishing their overall accuracy. In this paper, we address these issues by proposing the Propagation Structure-aware Graph Transformer (PSGT). Specifically, to filter out noise from users within propagation graphs, PSGT first designs a noise-reduction self-attention mechanism based on the information bottleneck principle, aiming to minimize or completely remove the noise attention links among task-irrelevant users. Moreover, to capture multi-scale propagation structures while considering long-sequence features, we present a novel relational propagation graph as a position encoding for the graph Transformer, enabling the model to capture both propagation depth and distance relationships of users. Extensive experiments demonstrate the effectiveness, interpretability, and robustness of our PSGT.
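The abstract describes two mechanisms: a noise-reduction self-attention that attenuates or removes attention links to task-irrelevant users, and a relational propagation encoding (depth/distance) used as a position bias in a graph Transformer. The sketch below is a minimal, hypothetical PyTorch illustration of how such a layer could be wired, not the authors' implementation; the module name, the sigmoid gating form, the distance-bias embedding, and all hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch (not the PSGT reference code) of an attention layer that
# (1) biases logits with a propagation-distance encoding and (2) gates links
# so that contributions from low-information users can be suppressed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PropagationAwareAttention(nn.Module):
    """Single-head self-attention over a news-propagation graph with a
    learnable per-distance bias and a noise-reduction gate (an information-
    bottleneck-inspired simplification, chosen here for illustration)."""

    def __init__(self, dim: int, max_dist: int = 8):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # One learnable scalar bias per hop distance in the propagation tree.
        self.dist_bias = nn.Embedding(max_dist + 1, 1)
        # Gate scoring how informative each user/post node is.
        self.gate = nn.Linear(dim, 1)
        self.max_dist = max_dist

    def forward(self, x: torch.Tensor, dist: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features; dist: (N, N) integer hop distances.
        q, k, v = self.q(x), self.k(x), self.v(x)
        logits = q @ k.t() / x.size(-1) ** 0.5
        # Relational position encoding: add a bias that depends on distance.
        logits = logits + self.dist_bias(dist.clamp(max=self.max_dist)).squeeze(-1)
        # Noise reduction: scale each source node's outgoing contribution by a
        # sigmoid gate, so links from low-information users fade toward zero.
        keep = torch.sigmoid(self.gate(x)).squeeze(-1)          # (N,)
        attn = F.softmax(logits, dim=-1) * keep.unsqueeze(0)    # (N, N)
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp_min(1e-9)
        return attn @ v


if __name__ == "__main__":
    # Toy propagation graph: 5 posts with random features and hop distances.
    x = torch.randn(5, 16)
    dist = torch.randint(0, 4, (5, 5))
    layer = PropagationAwareAttention(dim=16)
    print(layer(x, dist).shape)  # torch.Size([5, 16])
```

In the paper the noise reduction is derived from the information bottleneck principle and the position encoding is built from a relational propagation graph; the gate and distance-bias above stand in for those components only to show where they act in the attention computation.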

Details

Language: eng - English
Date: 2024-08-24
Publication Status: Finally published
Pages: -
Publishing info: -
Table of Contents: -
Review: Peer review
Identifiers (DOI, ISBN, etc.): DOI: 10.1145/3637528.3672024
MDB-ID: No data to archive
Model / method: Machine Learning
PIKDOMAIN: RD4 - Complexity Science
Organisational keyword: RD4 - Complexity Science
Degree: -

Related Events


Legal Case


Project information


Publication 1

Publication Title: KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Type: Book
Authors / Editors:
Baeza-Yates, Ricardo 1, Editor
Bonchi, Francesco 1, Editor
Affiliations:
1 External Organizations, ou_persistent22
Publisher, Place: New York : Association for Computing Machinery
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 4652 - 4663
Identifier (ISBN, ISSN, DOI, etc.): ISBN: 979-8-4007-0490-1