aqlaboratory / openfold

Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
Apache License 2.0

Unexpected key(s) in state_dict: "aux_heads.tm.linear.weight", "aux_heads.tm.linear.bias" #319

Open srayan00 opened 1 year ago

srayan00 commented 1 year ago

I am trying to run inference with OpenFold and found that finetuning_ptm_2.pt was missing from openfold/resources/openfold_params. I found a similar file on Hugging Face and used that instead, but I am getting the following error:

Traceback (most recent call last):
  File "/opt/openfold/run_pretrained_openfold.py", line 391, in <module>
    main(args)
  File "/opt/openfold/run_pretrained_openfold.py", line 203, in main
    for model, output_directory in model_generator:
  File "/opt/openfold/openfold/utils/script_utils.py", line 101, in load_models_from_command_line
    model.load_state_dict(d)
  File "/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1605, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for AlphaFold:
    Unexpected key(s) in state_dict: "aux_heads.tm.linear.weight", "aux_heads.tm.linear.bias".

The Hugging Face file is from 11 months ago. Is there a more recent file I can use, or should I remove those keys altogether?
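For what it's worth, stripping those keys before loading is simple. A minimal sketch (the `strip_tm_head` helper is my own invention, not part of OpenFold; in practice `state_dict` would come from `torch.load("finetuning_ptm_2.pt", map_location="cpu")`):

```python
def strip_tm_head(state_dict):
    """Return a copy of state_dict without the aux_heads.tm.* entries."""
    return {k: v for k, v in state_dict.items()
            if not k.startswith("aux_heads.tm.")}

# Toy demonstration with placeholder values standing in for tensors:
ckpt = {
    "evoformer.blocks.0.weight": 0,
    "aux_heads.tm.linear.weight": 1,
    "aux_heads.tm.linear.bias": 2,
}
print(sorted(strip_tm_head(ckpt)))  # only the evoformer key remains
```

That said, dropping the keys discards the pTM head; if you want pTM/predicted-TM outputs, matching the config to the checkpoint is the better fix.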

roivant-matts commented 1 year ago

You may need to add the --config_preset flag. In the README it is used in the non-Docker example but not in the Docker example, and adding it resolved this issue for me: --config_preset "model_1_ptm"
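For context, an invocation along the lines of the README's non-Docker example might look like this (all paths here are placeholders, not values from this thread):

```shell
# Sketch only: match the config preset to the checkpoint's pTM variant.
python3 run_pretrained_openfold.py \
    fasta_dir/ \
    data/pdb_mmcif/mmcif_files/ \
    --config_preset "model_1_ptm" \
    --openfold_checkpoint_path openfold/resources/openfold_params/finetuning_ptm_2.pt \
    --output_dir ./predictions \
    --model_device "cuda:0"
```

The key point is that a *_ptm checkpoint contains the aux_heads.tm.* weights, so it must be paired with a *_ptm config preset.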

lijxgit commented 1 year ago

I don't use Docker to run the code, but I get the same issue, and the --config_preset option didn't work for me. Has anybody solved this problem? Please give me a hand.

lijxgit commented 1 year ago

I found that using "finetuning_ptm_1.pt" in place of "finetuning_ptm_2.pt" works well.

vetmax7 commented 6 months ago

Hello!

I've got the following error when I tried to use my own checkpoint after training OpenFold: RuntimeError: Error(s) in loading state_dict for AlphaFold: Missing key(s) in state_dict: "aux_heads.tm.linear.weight", "aux_heads.tm.linear.bias".

I pass --openfold_checkpoint_path /checkpoints/14my.ckpt when running the run_pretrained_openfold.py script.

Can anyone please explain how to solve this problem?
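Note this is the mirror image of the original issue: here the config expects the pTM head but the checkpoint lacks it, rather than the other way around. A quick way to see which side is out of sync is to diff the two key sets; the helper below is purely illustrative (in PyTorch you can also pass strict=False to load_state_dict to get a report of mismatches instead of a RuntimeError):

```python
def diff_keys(model_keys, ckpt_keys):
    """Return (missing, unexpected) key lists, as load_state_dict reports them."""
    missing = sorted(set(model_keys) - set(ckpt_keys))     # expected by model, absent from ckpt
    unexpected = sorted(set(ckpt_keys) - set(model_keys))  # present in ckpt, unknown to model
    return missing, unexpected

# Toy key sets mirroring this thread's error: the model (a *_ptm config)
# expects the tm head, but the self-trained checkpoint does not have it.
model_keys = ["evoformer.blocks.0.weight",
              "aux_heads.tm.linear.weight", "aux_heads.tm.linear.bias"]
ckpt_keys = ["evoformer.blocks.0.weight"]

missing, unexpected = diff_keys(model_keys, ckpt_keys)
print(missing)     # the aux_heads.tm.* entries -> the "Missing key(s)" error
print(unexpected)  # empty in this case
```

If the checkpoint was trained without the pTM head, switching to a non-ptm config preset at inference time should make the key sets agree.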