ZJU-Fangyin / KCL

Code and Data for the paper: Molecular Contrastive Learning with Chemical Element Knowledge Graph [AAAI 2022]
MIT License

The file 'pretrain.py' and 'Set2Set_0910_2302_78000th_epoch.pkl' are missing. #7

Closed ZillaRU closed 2 years ago

ZillaRU commented 2 years ago

The key Python script pretrain.py is missing. I found a file named pretrain.py under code/data/, but it is clearly not the one used for pre-training the model.

In addition, could you provide the pre-trained model weights Set2Set_0910_2302_78000th_epoch.pkl so that I can start with fine-tuning? https://github.com/ZJU-Fangyin/KCL/blob/004f5681b77e4e75c791c909696fdb8a208501a2/code/script/finetune.sh#L8

ZillaRU commented 2 years ago

BTW, I failed to locate where the positive and negative pairs for contrastive learning are constructed. Correspondingly, the loss in finetune.py has no contrastive component; it is a plain BCE loss. This confuses me.

ZJU-Fangyin commented 2 years ago

Hi, you can now find pretrain.py under code/. We also added Set2Set_0910_2302_78000th_epoch.pkl. (It is just one example; you can use any version of the pre-trained model under code/dump/Pretrain/gnn-kmpnn-model for your fine-tuning.)

As for your second question: we only used contrastive learning in the pre-training phase, not in the fine-tuning phase. The original molecular graph and its augmented graph form a positive pair, while all other molecular graphs (and their augmented graphs) in the same batch are treated as its negatives.
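To make the pairing scheme concrete, here is a minimal NumPy sketch of an in-batch contrastive (NT-Xent-style) loss over graph embeddings. It is an illustration under assumed choices (cosine similarity, a temperature hyperparameter, and the function name `nt_xent_loss`), not the actual KCL implementation, which may differ in details:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """In-batch contrastive loss (NT-Xent-style sketch).

    z1[i] and z2[i] are embeddings of a molecular graph and its
    augmented view (the positive pair); all other graphs in the
    batch, and their augmented views, serve as negatives.
    """
    # L2-normalize so dot products become cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    z = np.concatenate([z1, z2], axis=0)          # shape (2N, d)
    sim = z @ z.T / temperature                   # pairwise similarities
    np.fill_diagonal(sim, -np.inf)                # exclude self-similarity
    n = len(z1)
    # the positive of row i is row i+n, and vice versa
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # cross-entropy of each row's positive against all other entries
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

During fine-tuning this term is dropped and only the supervised (e.g. BCE) loss on the downstream labels remains, which is why finetune.py contains no contrastive component.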

Hope this clears up your confusion!