snap-stanford / ogb

Benchmark datasets, data loaders, and evaluators for graph machine learning
https://ogb.stanford.edu
MIT License

GCN precompute_norm #157

Closed eujhwang closed 3 years ago

eujhwang commented 3 years ago

In the ogb-examples link-prediction code, I noticed that the GCN model for ogbl-ppa uses precompute_norm and sets normalize=False, cached=False in GCNConv, while the GCN models for the other datasets skip precompute_norm and set normalize=True, cached=True. Is there a particular reason for this, rather than setting normalize=True, cached=True in the GCNConv layers for all datasets?

rusty1s commented 3 years ago

The (normalize=True, cached=True) option will compute and cache normalized edge weights in every GCN layer, which is convenient but only feasible when that cached data still fits into GPU memory. For datasets/models where GPU memory is a major limitation, we instead pre-compute the normalization coefficients once and set (normalize=False, cached=False).
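To make the trade-off concrete, here is a minimal, framework-agnostic sketch of what such a pre-computation does: it derives the symmetric GCN normalization coefficients D^{-1/2} A D^{-1/2}, one weight per edge, which can then be passed as edge weights to a conv layer configured with normalize=False. This is an illustrative sketch (the helper name precompute_gcn_norm is mine, not the ogb-examples function; self-loop handling is omitted for brevity), not the exact ogb implementation.

```python
import numpy as np

def precompute_gcn_norm(edge_index, num_nodes):
    """Return one symmetric-normalization weight per edge:
    w_ij = deg(i)^{-1/2} * deg(j)^{-1/2}, i.e. the entries of D^{-1/2} A D^{-1/2}.
    edge_index: array of shape (2, num_edges) with source/target node ids.
    Self-loops are not added here, unlike the full GCN formulation."""
    row, col = edge_index
    # Node degrees from the source side of each edge.
    deg = np.bincount(row, minlength=num_nodes).astype(float)
    # deg^{-1/2}, with isolated nodes mapped to 0 instead of inf.
    dinv = np.zeros_like(deg)
    nz = deg > 0
    dinv[nz] = deg[nz] ** -0.5
    return dinv[row] * dinv[col]

# Triangle graph, both directions per edge: every node has degree 2,
# so each edge weight is (1/sqrt(2)) * (1/sqrt(2)) = 0.5.
edge_index = np.array([[0, 1, 1, 2, 0, 2],
                       [1, 0, 2, 1, 2, 0]])
weights = precompute_gcn_norm(edge_index, num_nodes=3)
```

Computed once up front, this array can be reused across all layers and epochs, so no per-layer cache has to live on the GPU.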

eujhwang commented 3 years ago

That makes sense! Thanks for the clarification.