EdisonLeeeee / MaskGAE

[KDD 2023] What’s Behind the Mask: Understanding Masked Graph Modeling for Graph Autoencoders
https://arxiv.org/abs/2205.10053

About the performance on graph classification #6

Closed cxw-droid closed 1 year ago

cxw-droid commented 1 year ago

Hi, thank you for the interesting paper! The experiments in the paper do not include a performance comparison on the graph classification task. Is there any reason for this? Thanks.

EdisonLeeeee commented 1 year ago

Hi

Thank you for your interest. As mentioned in the paper, GAEs are essentially contrastive learning models that aim to maximize the mutual information between the paired subgraph views associated with a connected edge. However, in many graph-level datasets the graphs are small and much sparser. Applying masking to these graph structures can significantly limit message propagation between nodes, although it can still achieve decent performance.
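To make the sparsity issue concrete, here is a minimal sketch (not the repository's code) of random edge masking on a small, path-like graph, assuming a `[2, E]` COO `edge_index` as used in PyTorch Geometric. With only a handful of edges, masking even a moderate fraction can leave almost nothing for message passing:

```python
import torch


def mask_edges(edge_index: torch.Tensor, p: float = 0.7):
    """Randomly hold out a fraction ``p`` of edges.

    ``edge_index`` is a [2, E] edge list; returns (visible_edges, masked_edges).
    This is an illustrative sketch, not MaskGAE's edge-wise/path-wise masking.
    """
    num_edges = edge_index.size(1)
    mask = torch.rand(num_edges) < p          # True = masked (held out)
    return edge_index[:, ~mask], edge_index[:, mask]


# A tiny, sparse molecule-like graph: 5 nodes connected in a path (4 edges).
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 2, 3, 4]])

visible, masked = mask_edges(edge_index, p=0.7)
print(f"{visible.size(1)} of {edge_index.size(1)} edges left for message passing")
```

On graphs this small, the visible subgraph easily becomes disconnected, which is why the reconstruction signal is much weaker than on large, denser node- or link-level benchmarks.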

As a result, MaskGAE is more suitable for node-level tasks and, more specifically, link-level tasks.