legendzzy / ProAffinity-GNN


Missing key(s) in state_dict #1

Open QUEST2179 opened 6 days ago

QUEST2179 commented 6 days ago

I got the following error message after running test.py. Please help, thanks!

model.load_state_dict(torch.load('model/model_trained.pkl'))

File "/home/miniconda/envs/torch11/lib/python3.11/site-packages/torch/nn/modules/module.py", line 2215, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for GraphNetwork:

Missing key(s) in state_dict: "graph1.model.atom_convs.0.lin.weight", "graph1.model.atom_convs.1.lin.weight", "graph1.model.mol_conv.lin.weight", "graph2.model.atom_convs.0.lin.weight", "graph2.model.atom_convs.1.lin.weight", "graph2.model.mol_conv.lin.weight", "graph3.model.atom_convs.0.lin.weight", "graph3.model.atom_convs.1.lin.weight", "graph3.model.mol_conv.lin.weight".

Unexpected key(s) in state_dict: "graph1.model.atom_convs.0.lin_src.weight", "graph1.model.atom_convs.0.lin_dst.weight", "graph1.model.atom_convs.1.lin_src.weight", "graph1.model.atom_convs.1.lin_dst.weight", "graph1.model.mol_conv.lin_src.weight", "graph1.model.mol_conv.lin_dst.weight", "graph2.model.atom_convs.0.lin_src.weight", "graph2.model.atom_convs.0.lin_dst.weight", "graph2.model.atom_convs.1.lin_src.weight", "graph2.model.atom_convs.1.lin_dst.weight", "graph2.model.mol_conv.lin_src.weight", "graph2.model.mol_conv.lin_dst.weight", "graph3.model.atom_convs.0.lin_src.weight", "graph3.model.atom_convs.0.lin_dst.weight", "graph3.model.atom_convs.1.lin_src.weight", "graph3.model.atom_convs.1.lin_dst.weight", "graph3.model.mol_conv.lin_src.weight", "graph3.model.mol_conv.lin_dst.weight".

Versions I used: pytorch 2.4, pytorch_geometric 2.6.1, transformers 4.46.2
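For future readers, this kind of mismatch can be spotted before calling load_state_dict by diffing the model's expected parameter names against the checkpoint's. A minimal sketch (illustrated with plain key lists rather than real tensors; the key names are taken from the traceback above):

```python
def diff_state_dict_keys(model_keys, ckpt_keys):
    """Return (missing, unexpected) parameter names, mirroring the
    two lists PyTorch reports in its load_state_dict RuntimeError."""
    model_keys, ckpt_keys = set(model_keys), set(ckpt_keys)
    missing = sorted(model_keys - ckpt_keys)     # model expects, checkpoint lacks
    unexpected = sorted(ckpt_keys - model_keys)  # checkpoint has, model lacks
    return missing, unexpected

# Example using key names from the error above:
missing, unexpected = diff_state_dict_keys(
    ["graph1.model.mol_conv.lin.weight"],
    ["graph1.model.mol_conv.lin_src.weight",
     "graph1.model.mol_conv.lin_dst.weight"],
)
```

In the real setting, `model_keys` would be `model.state_dict().keys()` and `ckpt_keys` would be the keys of the loaded checkpoint dict.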

legendzzy commented 5 days ago

Hi, this issue is due to the PyTorch-Geometric version. Please use PyTorch-Geometric 2.3.0.
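A small guard like the following could make test.py fail fast with a clearer message when the installed version differs from the one the checkpoint was saved with. This is only a sketch, not part of the repo; `has_compatible_version` is a hypothetical helper, and the required version is an assumption based on the fix above:

```python
import importlib.metadata

def has_compatible_version(package, required="2.3.0"):
    """Return True if `package` is installed at the required
    major.minor version; False if absent or mismatched."""
    try:
        installed = importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        return False
    return installed.split(".")[:2] == required.split(".")[:2]

# e.g. before loading the checkpoint:
# if not has_compatible_version("torch_geometric"):
#     raise RuntimeError("Please install torch-geometric 2.3.x")
```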

QUEST2179 commented 5 days ago

Great, it works after downgrading PyTorch-Geometric. Thanks a lot for the quick response.

May I ask how to interpret the result? Is the L1 Test Loss the value I should pay attention to? Does a bigger L1 Test Loss mean a tighter interaction?

legendzzy commented 5 days ago

The L1 loss shown is the loss for the binding affinity value prediction, not a measure of interaction strength. To judge the interaction, look at the predicted binding affinity value itself: the lower the binding affinity value, the stronger the interaction.
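To illustrate the two quantities with a toy sketch (hypothetical values, not real predictions; whether test.py computes L1 loss exactly this way is an assumption): L1 loss here is the mean absolute error between predicted and true affinities, while the predicted affinity itself is what ranks complexes, with lower meaning tighter binding per the explanation above.

```python
def l1_loss(preds, targets):
    """Mean absolute error -- one common definition of L1 loss."""
    return sum(abs(p - t) for p, t in zip(preds, targets)) / len(preds)

# Hypothetical predicted binding affinities for three complexes
# (lower value = stronger interaction, as described above):
predicted = {"complexA": -9.2, "complexB": -6.1, "complexC": -11.5}
tightest = min(predicted, key=predicted.get)
```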