ChandlerBang / GCond

[ICLR'22] [KDD'22] [IJCAI'24] Implementation of "Graph Condensation for Graph Neural Networks"
https://www.cs.emory.edu/~wjin30/files/GCond.pdf

Question about the task #7

Open Wicknight opened 11 months ago

Wicknight commented 11 months ago

Hello! I am new to dataset distillation and my question may be shallow. It seems to me that dataset distillation is generally used for classification tasks, synthesizing condensed data for each class for efficient training, just like the node classification task in your work. I'm wondering if your work can also be applied to the link prediction task on graphs?

rockcor commented 10 months ago

Common node classification methods follow the AXW scheme, which facilitates graph structure learning (fix W and update A/X). Following the same idea for the pairwise link prediction task (MLP(v1, v2)), you can fix the W in the MLP and update v1 and v2. IMO it's pretty trivial to handle the conflict in v1 when simultaneously updating (v1, v2) and (v1, v3). For subgraph-based link prediction, you can refer to DosCond. For pairwise link prediction, you can simply select fewer positive samples by some strategy.
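To make the "fix W, update v1/v2" idea above concrete, here is a minimal sketch (not from the GCond codebase; all names and shapes are illustrative assumptions). A frozen linear scorer plays the role of the MLP with fixed W, and only the synthetic endpoint features of condensed positive pairs are updated by gradient descent:

```python
# Hypothetical sketch: condense node pairs for link prediction by
# freezing the scorer's weights W and optimizing only the synthetic
# endpoint features v1, v2 (analogous to "fix W, update A/X").
import numpy as np

rng = np.random.default_rng(0)
d = 16       # feature dimension (arbitrary choice)
n = 8        # number of condensed positive pairs
lr = 0.1

# Frozen scorer: logit = [v1 ; v2] @ w  (stands in for MLP(v1, v2)).
w = rng.normal(size=2 * d)

# Learnable synthetic endpoints for the condensed positive pairs.
v1 = rng.normal(size=(n, d))
v2 = rng.normal(size=(n, d))
y = np.ones(n)  # all condensed pairs are positives here

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(100):
    z = np.concatenate([v1, v2], axis=1) @ w
    p = sigmoid(z)
    # Binary cross-entropy on the condensed pairs.
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    losses.append(loss)
    g = (p - y)[:, None] / n   # dL/dz for each pair
    v1 -= lr * g * w[:d]       # w stays fixed; only features move
    v2 -= lr * g * w[d:]

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Note that when the same node appears in several pairs (the (v1, v2) / (v1, v3) case), its per-pair gradients would simply be accumulated before the update, which is where the conflict handling mentioned above comes in.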