
GraphCCL: Causal Graph Contrastive Learning with Self-attention Augmentation

    Abstract: When using graph neural networks for graph-level or subgraph-level tasks, identifying important causal subgraph structures in the graph data can give the model better generalization and interpretability. However, existing methods do not account for the differences between causal subgraphs and environmental subgraphs when disentangling them, which limits the model's ability to mine causal subgraphs. To address this, a model is proposed that applies contrastive learning to sharpen the distinction between causal and environmental subgraphs. Experimental results show that this model achieves higher prediction accuracy on graph classification tasks than conventional methods, mines more salient causal subgraph structures, and offers good generalization and explainability.
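The abstract does not give the model's exact objective, but the core idea it describes — a contrastive term that keeps causal and environmental subgraph embeddings distinguishable — can be sketched as a simple hinge loss on cosine similarity. The function name, the margin parameter, and the hinge form below are illustrative assumptions, not the paper's published loss.

```python
import numpy as np

def causal_env_contrast_loss(causal_emb, env_emb, margin=0.5):
    """Hinge-style contrastive term (illustrative sketch, not the paper's
    exact objective): penalize causal/environment embedding pairs whose
    cosine similarity exceeds `margin`, pushing the two subgraph
    representations apart.

    causal_emb, env_emb: (n_graphs, dim) embeddings from a subgraph encoder.
    """
    # L2-normalize so the dot product below is cosine similarity.
    c = causal_emb / np.linalg.norm(causal_emb, axis=1, keepdims=True)
    e = env_emb / np.linalg.norm(env_emb, axis=1, keepdims=True)
    sim = np.sum(c * e, axis=1)  # per-graph cosine similarity
    # Only pairs more similar than the margin contribute to the loss.
    return float(np.mean(np.maximum(0.0, sim - margin)))

# Orthogonal causal/environment embeddings incur zero loss;
# identical embeddings incur (1 - margin).
c = np.array([[1.0, 0.0], [0.0, 1.0]])
e_orth = np.array([[0.0, 1.0], [1.0, 0.0]])
print(causal_env_contrast_loss(c, e_orth))  # 0.0
print(causal_env_contrast_loss(c, c))       # 0.5
```

In a full training setup such a term would typically be added to the graph classification loss, so the encoder is rewarded both for predicting labels from the causal subgraph and for keeping the environment representation separate from it.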

       
