EdisonLeeeee / MaskGAE

[KDD 2023] What’s Behind the Mask: Understanding Masked Graph Modeling for Graph Autoencoders
https://arxiv.org/abs/2205.10053

Config of link prediction on ogbl-collab for MaskGAE and several baselines #7

Closed Newiz430 closed 11 months ago

Newiz430 commented 1 year ago

Hi Dr. Li,

Loved your work promoting self-supervised masked structural modeling! I am currently reproducing the results reported in Table 3 and have several questions.

I would much appreciate a timely reply so that I can cite your paper. Thank you so much!

EdisonLeeeee commented 1 year ago

Sorry for the late reply; I was on vacation.

  • Thank you for bringing that to our attention. We did not include the ogbl-collab experiments initially; we will fix these errors in the arXiv version.
  • I recommend adding batch normalization to the encoder and using learnable embeddings as the input node features (see the encoder sketch below).
  • The learning process consists of two stages: first, we pre-train GraphMAE to obtain node representations; then, we train the edge decoder on the learned representations for link prediction (see the second sketch below).
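
Something like the following encoder, for illustration only (the hidden size, depth, and activation below are placeholders, not the exact configuration used for the paper):

```python
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv


class EmbeddingGCNEncoder(nn.Module):
    """GCN encoder with batch normalization and learnable node embeddings
    used in place of raw input features (illustrative hyperparameters)."""

    def __init__(self, num_nodes, hidden_dim=256, num_layers=2):
        super().__init__()
        # Learnable embeddings serve as the input node features.
        self.embedding = nn.Embedding(num_nodes, hidden_dim)
        self.convs = nn.ModuleList()
        self.bns = nn.ModuleList()
        for _ in range(num_layers):
            self.convs.append(GCNConv(hidden_dim, hidden_dim))
            self.bns.append(nn.BatchNorm1d(hidden_dim))

    def forward(self, node_ids, edge_index):
        x = self.embedding(node_ids)
        for conv, bn in zip(self.convs, self.bns):
            # Batch normalization after each graph convolution.
            x = torch.relu(bn(conv(x, edge_index)))
        return x


# Usage (hypothetical): z = EmbeddingGCNEncoder(num_nodes)(torch.arange(num_nodes), edge_index)
```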

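And a rough sketch of how the two-stage pipeline is wired together (the pre-trained encoder, decoder architecture, and hyperparameters here are placeholders rather than the exact code used for the baseline):

```python
import torch
import torch.nn as nn


class EdgeDecoder(nn.Module):
    """Scores a candidate edge from the representations of its two endpoints."""

    def __init__(self, dim, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, z, edges):
        src, dst = edges
        return self.mlp(torch.cat([z[src], z[dst]], dim=-1)).squeeze(-1)


def train_edge_decoder(encoder, x, edge_index, pos_edges, neg_edges, epochs=100):
    # Stage 1 is assumed done: `encoder` was pre-trained with the GraphMAE
    # feature-reconstruction objective. Here it is frozen.
    encoder.eval()
    with torch.no_grad():
        z = encoder(x, edge_index)

    # Stage 2: train only the edge decoder on the frozen representations.
    decoder = EdgeDecoder(z.size(-1))
    optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        pos_logits = decoder(z, pos_edges)
        neg_logits = decoder(z, neg_edges)
        loss = loss_fn(pos_logits, torch.ones_like(pos_logits)) + \
               loss_fn(neg_logits, torch.zeros_like(neg_logits))
        loss.backward()
        optimizer.step()
    return decoder
```
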
Newiz430 commented 11 months ago

That's not quite the answer I was looking for on the third question. I was asking about the pre-training configs and workflow of GraphMAE without node labels: for example, were the edges split during the node-attribute-prediction pre-training, and how did you validate the pre-trained model without the original logistic regression classifier? Still, thanks a lot for replying!