Document detail
ID

oai:arXiv.org:2407.10688

Topic
Computer Science - Machine Learnin...
Author
Wang, Ziyan; He, Yaxuan; Liu, Bin
Category

Computer Science

Year

2024

Listing date

9/25/2024

Keywords
neural features node graph
Abstract

Graph Neural Networks (GNNs) have achieved notable success in the analysis of non-Euclidean data across a wide range of domains.

However, their applicability is constrained by the dependence on the observed graph structure.

To address this limitation, Latent Graph Inference (LGI) infers a task-specific latent structure by computing similarities or edge probabilities from node features, and then applies a GNN to produce predictions.

Even so, existing approaches neglect noise in the node features, which degrades the generated graph structure and downstream performance.

In this work, we introduce a novel method called Probability Passing, which refines the generated graph structure by aggregating the edge probabilities of neighboring nodes based on the observed graph.

We then retain the LGI framework, feeding the refined graph structure together with the node features into a GNN to obtain predictions.

We call the proposed scheme the Probability Passing-based Graph Neural Network (PPGNN).

Moreover, an anchor-based technique is employed to reduce complexity and improve efficiency.

Experimental results demonstrate the effectiveness of the proposed method.
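The core refinement step described above, smoothing each node's latent edge probabilities over its observed neighbors, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the choice of a row-normalized observed adjacency with self-loops as the aggregation operator is an assumption, and the function name `probability_passing` is hypothetical.

```python
import numpy as np

def probability_passing(P, A_obs):
    """Sketch of the probability-passing idea: refine the latent
    edge-probability matrix P (n x n) by averaging each node's row of
    probabilities with those of its neighbors in the observed graph
    A_obs (n x n, binary). The exact aggregation in PPGNN may differ."""
    n = A_obs.shape[0]
    # Add self-loops so each node keeps its own edge probabilities.
    A_hat = A_obs + np.eye(n)
    # Row-normalize the observed adjacency (assumed mean aggregation).
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)
    # Each refined row is the mean over the node's closed neighborhood.
    return D_inv * (A_hat @ P)
```

The refined matrix can then be sparsified or sampled and passed, with the node features, to a downstream GNN, as in the LGI pipeline the abstract describes.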

Wang, Ziyan; He, Yaxuan; Liu, Bin, 2024, Probability Passing for Graph Neural Networks: Graph Structure and Representations Joint Learning
