THUDM / GraphMAE

GraphMAE: Self-Supervised Masked Graph Autoencoders in KDD'22

GraphMAE for unsupervised learning #37

Closed Hesham-Aliy closed 1 year ago

Hesham-Aliy commented 1 year ago

Are there any benefits to using GraphMAE instead of other graph autoencoders for unsupervised graph representation problems?

THINK2TRY commented 1 year ago

@Hesham-Aliy Thanks for your attention to our work, and very sorry for the late response! GraphMAE is indeed a special type of graph autoencoder, but it improves the training paradigm in several ways, including masked feature reconstruction and the use of a GNN as the decoder. Both GraphMAE and other GAEs can be used for unsupervised graph representation learning, but GraphMAE enjoys some advantages:

  1. Better performance. As reported in our paper, GraphMAE outperforms the baselines on various benchmarks.
  2. Since GraphMAE reconstructs node features instead of links, it circumvents the need for negative sampling and scales better (see the sketch below).
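
To make the contrast with link-reconstruction GAEs concrete, here is a minimal PyTorch sketch of masked feature reconstruction with a GNN encoder and GNN decoder. This is not the repository's actual API: the class names, the dense normalized adjacency `a_hat`, the mask rate, and the loss exponent are all illustrative assumptions, and GraphMAE's re-masking step before decoding is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGCNLayer(nn.Module):
    """One dense GCN-style propagation step: act(A_hat @ X @ W). Illustrative only."""

    def __init__(self, in_dim, out_dim, act=True):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = act

    def forward(self, a_hat, x):
        h = self.linear(a_hat @ x)
        return F.relu(h) if self.act else h


class MaskedGraphAutoencoder(nn.Module):
    """Sketch of GraphMAE-style training: mask node features, encode,
    decode with a GNN, and reconstruct only the masked features."""

    def __init__(self, in_dim, hidden_dim, mask_rate=0.5):
        super().__init__()
        self.mask_rate = mask_rate
        # Learnable [MASK] token that replaces the features of masked nodes.
        self.mask_token = nn.Parameter(torch.zeros(1, in_dim))
        self.encoder = SimpleGCNLayer(in_dim, hidden_dim)            # GNN encoder
        self.decoder = SimpleGCNLayer(hidden_dim, in_dim, act=False)  # GNN decoder

    def forward(self, a_hat, x):
        n = x.size(0)
        # Randomly select nodes whose input features are masked out.
        mask = torch.rand(n, device=x.device) < self.mask_rate
        if not mask.any():          # guard: ensure at least one masked node
            mask[0] = True
        x_masked = x.clone()
        x_masked[mask] = self.mask_token
        z = self.encoder(a_hat, x_masked)
        x_rec = self.decoder(a_hat, z)
        # Scaled cosine error on masked nodes only; exponent 2 is an assumed
        # setting. No edge scoring, hence no negative sampling is needed.
        cos = F.cosine_similarity(x_rec[mask], x[mask], dim=-1)
        loss = ((1.0 - cos) ** 2).mean()
        return loss, z


# Toy usage: 5 nodes with 8-dim features; the identity matrix stands in
# for a symmetrically normalized adjacency matrix.
x = torch.randn(5, 8)
a_hat = torch.eye(5)
model = MaskedGraphAutoencoder(in_dim=8, hidden_dim=16)
loss, embeddings = model(a_hat, x)
loss.backward()
```

A link-reconstruction GAE would instead score node pairs and require sampling non-edges as negatives; reconstructing features avoids that pairwise step entirely, which is where the scalability advantage comes from.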

Hope this helps!