bytedance / Protenix

A trainable PyTorch reproduction of AlphaFold 3.

Missing key(s) in state_dict: #10

Open QUEST2179 opened 1 week ago

QUEST2179 commented 1 week ago

I tried bash inference_demo.sh but got the following error message. The checkpoint is from wget https://af3-dev.tos-cn-beijing.volces.com/release_model/model_v1.pt. Please help, thanks!

raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(

RuntimeError: Error(s) in loading state_dict for Protenix: Missing key(s) in state_dict: "input_embedder.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.0.attention_pair_bias.layernorm_a.layernorm_s.bias", "input_embedder.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.0.conditioned_transition_block.adaln.layernorm_s.bias", "input_embedder.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.1.attention_pair_bias.layernorm_a.layernorm_s.bias", "input_embedder.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.1.conditioned_transition_block.adaln.layernorm_s.bias", "input_embedder.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.2.attention_pair_bias.layernorm_a.layernorm_s.bias", "input_embedder.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.2.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.0.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.0.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.1.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.1.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.2.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.atom_attention_encoder.atom_transformer.diffusion_transformer.blocks.2.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.0.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.0.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.1.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.1.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.2.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.2.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.3.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.3.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.4.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.4.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.5.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.5.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.6.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.6.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.7.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.7.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.8.attention_pair_bias.layernorm_a.layernorm_s.bias", 
"diffusion_module.diffusion_transformer.blocks.8.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.9.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.9.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.10.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.10.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.11.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.11.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.12.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.12.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.13.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.13.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.14.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.14.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.15.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.15.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.16.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.16.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.17.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.17.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.18.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.18.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.19.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.19.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.20.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.20.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.21.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.21.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.22.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.22.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.23.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.diffusion_transformer.blocks.23.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.atom_attention_decoder.atom_transformer.diffusion_transformer.blocks.0.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.atom_attention_decoder.atom_transformer.diffusion_transformer.blocks.0.conditioned_transition_block.adaln.layernorm_s.bias", 
"diffusion_module.atom_attention_decoder.atom_transformer.diffusion_transformer.blocks.1.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.atom_attention_decoder.atom_transformer.diffusion_transformer.blocks.1.conditioned_transition_block.adaln.layernorm_s.bias", "diffusion_module.atom_attention_decoder.atom_transformer.diffusion_transformer.blocks.2.attention_pair_bias.layernorm_a.layernorm_s.bias", "diffusion_module.atom_attention_decoder.atom_transformer.diffusion_transformer.blocks.2.conditioned_transition_block.adaln.layernorm_s.bias".

zhangyuxuann commented 1 week ago

Hi @QUEST2179, can you re-download the checkpoint and try again? I cannot reproduce the error message. Otherwise, could you give more details on how to reproduce it?

QUEST2179 commented 1 week ago

Still the same error after re-downloading the checkpoint file. What details do you need in order to troubleshoot this issue? I am more than happy to provide them.

yangyanpinghpc commented 1 week ago

It might be a PyTorch version issue. The code comment indicates that this line requires PyTorch 2.1 or newer:

self.layernorm_s = nn.LayerNorm(c_s, bias=False)  # requires PyTorch 2.1 or newer (the bias argument was added in 2.1)
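
A minimal sketch of why this produces "Missing key(s)" (toy code, not Protenix's actual modules): if the checkpoint was saved from LayerNorm layers built with bias=False, but the locally constructed model ends up with bias parameters, strict loading fails.

import torch.nn as nn

src = nn.LayerNorm(8, bias=False)  # like a checkpoint saved without bias terms (needs torch >= 2.1)
dst = nn.LayerNorm(8)              # default LayerNorm also registers a bias parameter
dst.load_state_dict(src.state_dict())
# RuntimeError: Error(s) in loading state_dict for LayerNorm:
#     Missing key(s) in state_dict: "bias".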

zhangyuxuann commented 1 week ago

Hi @QUEST2179, I think the reason is exactly as @yangyanpinghpc pointed out. You can use our Docker image, or pip install the dependencies as described in https://github.com/bytedance/Protenix/issues/5#issuecomment-2468283405. For reference, the PyTorch 2.0 docs show that LayerNorm had no bias argument: https://pytorch.org/docs/2.0/generated/torch.nn.LayerNorm.html?highlight=layernorm#torch.nn.LayerNorm
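
To verify the environment after upgrading, a minimal check (illustrative; any PyTorch 2.1+ build should behave this way):

import torch

print(torch.__version__)                   # should report 2.1 or newer
ln = torch.nn.LayerNorm(128, bias=False)   # the bias keyword only exists in torch >= 2.1
print(list(ln.state_dict().keys()))        # ['weight'] only, matching the released checkpoint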

QUEST2179 commented 1 week ago

Indeed, it was a PyTorch version issue. Thank you very much. The model loaded successfully after I updated PyTorch, although I then ran into another error.

zhangyuxuann commented 1 week ago

@QUEST2179 Feel free to discuss the other error here as well.