
Released

Book chapter

Propagation Structure-Aware Graph Transformer for Robust and Interpretable Fake News Detection

Authors

Zhu, Junyou
External Organizations

Gao, Chao
External Organizations

Yin, Ze
External Organizations

Li, Xianghua
External Organizations

Kurths, Jürgen
Potsdam Institute for Climate Impact Research

External Resources
No external resources are available
Full Texts (freely accessible)
No freely accessible full texts are available in PIKpublic
Supplementary Material (freely accessible)
No freely accessible supplementary materials are available
Citation

Zhu, J., Gao, C., Yin, Z., Li, X., Kurths, J. (2024): Propagation Structure-Aware Graph Transformer for Robust and Interpretable Fake News Detection. - In: Baeza-Yates, R., Bonchi, F. (Eds.), KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, New York: Association for Computing Machinery, 4652-4663.
https://doi.org/10.1145/3637528.3672024


Citation link: https://publications.pik-potsdam.de/pubman/item/item_30719
Abstract
The rise of social media has intensified fake news risks, prompting a growing focus on leveraging graph learning methods such as graph neural networks (GNNs) to understand post-spread patterns of news. However, existing methods often produce less robust and interpretable results as they assume that all information within the propagation graph is relevant to the news item, without adequately eliminating noise from engaged users. Furthermore, they inadequately capture intricate patterns inherent in long-sequence dependencies of news propagation due to their use of shallow GNNs aimed at avoiding the over-smoothing issue, consequently diminishing their overall accuracy. In this paper, we address these issues by proposing the Propagation Structure-aware Graph Transformer (PSGT). Specifically, to filter out noise from users within propagation graphs, PSGT first designs a noise-reduction self-attention mechanism based on the information bottleneck principle, aiming to minimize or completely remove the noise attention links among task-irrelevant users. Moreover, to capture multi-scale propagation structures while considering long-sequence features, we present a novel relational propagation graph as a position encoding for the graph Transformer, enabling the model to capture both propagation depth and distance relationships of users. Extensive experiments demonstrate the effectiveness, interpretability, and robustness of our PSGT.
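The noise-reduction self-attention idea from the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the hard threshold `tau`, the function names, and the renormalization step are illustrative assumptions standing in for PSGT's information-bottleneck-based mechanism. The sketch computes standard scaled dot-product self-attention over user nodes of a propagation graph, then zeroes out weak attention links (modeling the removal of noise links to task-irrelevant users) and renormalizes the remaining weights.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def noise_reduced_attention(X, Wq, Wk, Wv, tau=0.1):
    """Scaled dot-product self-attention over n user nodes, with small
    attention weights pruned to zero. A hard threshold `tau` is an
    illustrative stand-in for PSGT's learned, information-bottleneck-based
    noise reduction.

    X: (n, d) node features; Wq, Wk, Wv: (d, d) projection matrices.
    Returns the updated node features and the sparsified attention matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d))        # (n, n) dense attention weights
    A = np.where(A < tau, 0.0, A)            # drop weak (noisy) attention links
    A = A / A.sum(axis=-1, keepdims=True)    # renormalize surviving links per row
    return A @ V, A

# Toy example: 5 users in a propagation graph, 8-dimensional features.
rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, A = noise_reduced_attention(X, Wq, Wk, Wv)
```

After pruning, each row of `A` still sums to one, but links whose weight fell below the threshold are exactly zero, so those users contribute nothing to the aggregated representation. In PSGT the noise links are identified by the information bottleneck objective rather than a fixed cutoff, and the attention operates alongside the relational propagation-graph position encoding, which encodes each user's propagation depth and distance.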