Xi-yuanWang / GLASS

GLASS: GNN with Labeling Tricks for Subgraph Representation Learning

Query regarding pretraining in GLASS vs SubGNN #5

Closed shhs29 closed 1 year ago

shhs29 commented 1 year ago

Hi,

I had a question regarding the pretraining in GLASS. Is there any difference between the pretraining strategies of GLASS and SubGNN? I believe both GLASS and SubGNN use edge-level tasks to create pretrained node embeddings.

Also, are the results in Table 6 of GLASS obtained with or without the use of pretrained node embeddings?

Thanks and Regards, Shweta Ann Jacob

Xi-yuanWang commented 1 year ago

Dear Shweta,

We have provided the pretraining script.

python GNNEmb.py --use_nodeid --device $gpu_id --dataset $dataset --name $dataset

The embeddings we used are in ./Emb/.

You can check the pretraining script of SubGNN here. We agree that both GLASS and SubGNN use edge-level tasks only, and there is no difference in general.
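Conceptually, this edge-level pretraining amounts to link prediction on dot products of trainable node embeddings. A minimal NumPy sketch of the idea (a toy illustration only, not the actual GNNEmb.py code; the graph, embedding size, negative pairs, and learning rate here are made up):

```python
import numpy as np

# Toy graph: two 4-cliques. Positives are the intra-clique edges;
# negatives are a fixed set of cross-clique pairs.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7)]
negatives = [(0, 4), (1, 5), (2, 6), (3, 7)]
pairs = [(e, 1.0) for e in edges] + [(e, 0.0) for e in negatives]

rng = np.random.default_rng(0)
n, d = 8, 8
Z = rng.normal(0.0, 0.1, size=(n, d))  # trainable node embeddings

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

lr = 0.5
for _ in range(300):
    grad = np.zeros_like(Z)
    for (u, v), y in pairs:
        # binary cross-entropy on the dot-product link score:
        # dL/dscore = sigmoid(score) - label
        g = sigmoid(Z[u] @ Z[v]) - y
        grad[u] += g * Z[v]
        grad[v] += g * Z[u]
    Z -= lr * grad / len(pairs)

# After pretraining, true edges should score higher than negatives.
pos = np.mean([sigmoid(Z[u] @ Z[v]) for u, v in edges])
neg = np.mean([sigmoid(Z[u] @ Z[v]) for u, v in negatives])
```

The resulting `Z` plays the role of the pretrained node embeddings that the downstream subgraph model consumes.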

Results in Table 6 use pretrained node embeddings.

Sincerely, Xiyuan Wang

shhs29 commented 1 year ago

Hi Xiyuan,

Thanks a lot for the quick reply.

I was wondering what Table 7 shows. Does it use both node- and edge-level tasks for pretraining?

Also, the values in the GLASS column of Table 7 seem similar to the Table 6 values. Does this mean the GLASS column (without SSL) in Table 7 still uses some pretraining?

Thanks and Regards, Shweta Ann Jacob

Xi-yuanWang commented 1 year ago

Dear Shweta,

In Table 7, GLASS+SSL uses node-level plus edge-level SSL, while GLASS uses edge-level SSL only. Table 7 shows that the additional SSL is also helpful on some datasets.
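As a toy illustration of what combining the two levels might look like (my own NumPy sketch, not the repository's code; the node-level task here is assumed to be feature reconstruction, and all sizes and rates are made up), the combined objective is just the sum of an edge-level link-prediction loss and a node-level loss:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, f = 8, 4, 3
Z = rng.normal(0.0, 0.1, size=(n, d))   # node embeddings
W = rng.normal(0.0, 0.1, size=(d, f))   # head for the node-level task
X = rng.normal(size=(n, f))             # node attributes to reconstruct

edges = [(0, 1), (1, 2), (2, 3), (4, 5), (5, 6), (6, 7)]
negatives = [(0, 4), (1, 5), (2, 6), (3, 7)]
pairs = [(e, 1.0) for e in edges] + [(e, 0.0) for e in negatives]

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def combined_loss(Z, W):
    # edge-level: binary cross-entropy on dot-product link scores
    le = sum(-y * np.log(sigmoid(Z[u] @ Z[v]) + 1e-9)
             - (1 - y) * np.log(1 - sigmoid(Z[u] @ Z[v]) + 1e-9)
             for (u, v), y in pairs) / len(pairs)
    # node-level: mean-squared feature reconstruction
    ln = ((Z @ W - X) ** 2).mean()
    return le + ln

before = combined_loss(Z, W)
lr = 0.3
for _ in range(100):
    gZ = np.zeros_like(Z)
    for (u, v), y in pairs:
        g = (sigmoid(Z[u] @ Z[v]) - y) / len(pairs)
        gZ[u] += g * Z[v]
        gZ[v] += g * Z[u]
    R = 2.0 * (Z @ W - X) / (n * f)     # grad of the mean-squared term
    gZ += R @ W.T
    gW = Z.T @ R
    Z -= lr * gZ
    W -= lr * gW
after = combined_loss(Z, W)
```

Optimizing the sum trains one set of embeddings on both self-supervised signals at once, which is the intuition behind the GLASS+SSL column.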

Sincerely, Xiyuan Wang

shhs29 commented 1 year ago

Hi Xiyuan,

Thanks a lot. That clarifies my question.

Closing this issue as it is resolved.