TheaperDeng / GNN-Generalization-Fairness

Subgroup Generalization and Fairness of Graph Neural Networks

Is the theoretical bound just for MLP not GNN? #1

Open ghost opened 2 years ago

ghost commented 2 years ago

Hi, I really like this paper, but I have a question. According to the paper, 'f is a ReLU-activated L-layer Multi-Layer Perceptron (MLP) with W1, ..., WL as parameters'. Does the theoretical bound take the graph structure into account, or does it only cover the MLP?

TheaperDeng commented 1 year ago

g is an aggregation function that aggregates the features over local neighbors, and h_i(X, G) = f(g_i(X, G); W1, W2, ..., WL), where f is only the "decoder" applied to the learned representations.
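
A minimal sketch of this decomposition, assuming a simple mean aggregator for g and a ReLU-activated L-layer MLP for f (the aggregator choice and names here are illustrative, not the repository's actual code):

```python
# Sketch of h_i(X, G) = f(g_i(X, G); W1, ..., WL):
# g aggregates node features over local neighbors, f is an MLP decoder.
import torch
import torch.nn as nn


def aggregate(X, adj):
    """g: mean-aggregate each node's features over itself and its neighbors.

    X   : (num_nodes, in_dim) node feature matrix
    adj : (num_nodes, num_nodes) binary adjacency matrix
    """
    adj_with_self = adj + torch.eye(adj.size(0))   # add self-loops
    deg = adj_with_self.sum(dim=1, keepdim=True)   # neighborhood sizes
    return adj_with_self @ X / deg                 # mean over the neighborhood


class MLPDecoder(nn.Module):
    """f: ReLU-activated L-layer MLP applied to the aggregated features."""

    def __init__(self, in_dim, hidden_dim, out_dim, num_layers=2):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * (num_layers - 1) + [out_dim]
        self.layers = nn.ModuleList(
            [nn.Linear(dims[l], dims[l + 1]) for l in range(num_layers)]
        )

    def forward(self, z):
        for layer in self.layers[:-1]:
            z = torch.relu(layer(z))
        return self.layers[-1](z)                  # no activation on the output


# h(X, G): aggregation g followed by the MLP decoder f
X = torch.randn(5, 8)                              # 5 nodes, 8 features each
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.T) > 0).float().fill_diagonal_(0)  # symmetric, no self-loops
f = MLPDecoder(in_dim=8, hidden_dim=16, out_dim=3)
h = f(aggregate(X, adj))                           # (5, 3) node predictions
```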