ClaudMor / gravity_gae_torch_geometric

Pytorch Geometric implementation of the "Gravity-Inspired Graph Autoencoders for Directed Link Prediction" paper.
MIT License

Why using "mean" aggregation? #4

Closed yuxuesong1995 closed 1 month ago

yuxuesong1995 commented 1 month ago

In Convolution.py, the author uses 'mean' aggregation, which means the transformed feature W^T x is normalized by the node degree (+1). But why not use deg_i^-(0.5) * deg_j^-(0.5) as in the standard GCN?

I have tried changing the forward method with the standard normalization:

# Step 3: Compute normalization.

row, col = edge_index
deg = degree(col, x.size(0), dtype=x.dtype)
deg_inv_sqrt = deg.pow(-0.5)
deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0
norm = deg_inv_sqrt[row] * deg_inv_sqrt[col]

but the performance was worse than:

# Step 3: Compute normalization.

row, col = edge_index
deg = degree(col, x.size(0), dtype=x.dtype)
deg_inv = deg.pow(-1)
deg_inv[deg_inv == float('inf')] = 0
norm = deg_inv[col]

which I believe is the same as what the author has done in Convolution.py.
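To make the difference between the two schemes concrete, here is a small dependency-free sketch (plain Python, no torch) that computes both normalization coefficients on a toy directed graph; the edge list and node count are made up for illustration:

```python
# Toy directed graph: edges stored as (row, col) = (source, target),
# mirroring the edge_index convention in the snippets above.
# This 3-node graph is hypothetical, chosen only to show the coefficients.
edges = [(0, 1), (0, 2), (1, 2), (2, 0)]
num_nodes = 3

# Degree of each target node (what degree(col, ...) counts in the snippets).
deg = [0] * num_nodes
for _, col in edges:
    deg[col] += 1

# "Mean"-style normalization: 1 / deg(target), one coefficient per edge.
norm_mean = [1.0 / deg[col] if deg[col] > 0 else 0.0 for _, col in edges]

# Symmetric GCN normalization: deg(source)^-1/2 * deg(target)^-1/2.
def inv_sqrt(d):
    return d ** -0.5 if d > 0 else 0.0

norm_sym = [inv_sqrt(deg[row]) * inv_sqrt(deg[col]) for row, col in edges]

print(norm_mean)  # each edge weighted only by its target's degree
print(norm_sym)   # each edge weighted by both endpoints' degrees
```

The mean scheme weights every incoming edge of a node identically (so aggregation is a plain average), while the symmetric scheme also discounts messages coming from high-degree sources.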

It seems that normalizing the features by the degree alone works better here, but why? I'm confused.

yuxuesong1995 commented 1 month ago

Sorry, I just noticed that in the original paper, Gravity-Inspired Graph Autoencoders for Directed Link Prediction, the authors state that for directed graphs they use out-degree normalization rather than the usual symmetric normalization.
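In matrix form, the out-degree normalization described in the paper amounts to row-normalizing the self-looped adjacency, A_norm = D_out^-1 (A + I), so each node averages over its out-neighbors (plus itself). A minimal pure-Python sketch, using a hypothetical 3-node adjacency matrix:

```python
# Sketch of out-degree normalization A_norm = D_out^-1 (A + I),
# where D_out is the diagonal out-degree matrix of (A + I).
# The adjacency matrix below is a made-up example.
A = [
    [0, 1, 1],
    [0, 0, 1],
    [1, 0, 0],
]
n = len(A)

# Add self-loops: A_hat = A + I.
A_hat = [[A[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]

# Out-degree of each node = row sum of A + I (never zero thanks to the loop).
d_out = [sum(row) for row in A_hat]

# Row-normalize: every row of A_norm sums to 1, i.e. a mean over out-neighbors.
A_norm = [[A_hat[i][j] / d_out[i] for j in range(n)] for i in range(n)]

print(A_norm)
```

Each row summing to 1 is exactly the "mean" behavior the first comment observed, just expressed with out-degrees instead of the symmetric deg^-1/2 factors.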