wustl-cig / DOLCE

DOLCE, ICCV2023. Pytorch Implementation.
MIT License

model state_dict mismatch #2

Closed jason5306 closed 8 months ago

jason5306 commented 8 months ago

Thanks for your amazing work! When I tried to run limtied_ct_samply.py with the settings given in evaluation_medicalCT.sh, I got the following model state_dict mismatch error:

raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for ConditionalModel:
        Missing key(s) in state_dict: "input_blocks.3.0.op.weight", "input_blocks.3.0.op.bias", "input_blocks.6.0.op.weight", "input_blocks.6.0.op.bias", "input_blocks.9.0.op.weight", "input_blocks.9.0.op.bias", "input_blocks.12.0.op.weight", "input_blocks.12.0.op.bias", "input_blocks.15.0.op.weight", "input_blocks.15.0.op.bias", 
"input_blocks.18.0.op.weight", "input_blocks.18.0.op.bias", "output_blocks.2.4.conv.weight", "output_blocks.2.4.conv.bias", "output_blocks.5.4.conv.weight", "output_blocks.5.4.conv.bias", "output_blocks.8.3.conv.weight", "output_blocks.8.3.conv.bias", "output_blocks.11.3.conv.weight", "output_blocks.11.3.conv.bias", "output_blocks.14.1.conv.weight", "output_blocks.14.1.conv.bias", "output_blocks.17.1.conv.weight", "output_blocks.17.1.conv.bias".
        Unexpected key(s) in state_dict: "input_blocks.3.0.in_layers.0.weight", "input_blocks.3.0.in_layers.0.bias", "input_blocks.3.0.in_layers.2.weight", "input_blocks.3.0.in_layers.2.bias", "input_blocks.3.0.emb_layers.1.weight", "input_blocks.3.0.emb_layers.1.bias", "input_blocks.3.0.out_layers.0.weight", "input_blocks.3.0.out_layers.0.bias", "input_blocks.3.0.out_layers.3.weight", "input_blocks.3.0.out_layers.3.bias", "input_blocks.6.0.in_layers.0.weight", "input_blocks.6.0.in_layers.0.bias", "input_blocks.6.0.in_layers.2.weight", "input_blocks.6.0.in_layers.2.bias", "input_blocks.6.0.emb_layers.1.weight", "input_blocks.6.0.emb_layers.1.bias", "input_blocks.6.0.out_layers.0.weight", "input_blocks.6.0.out_layers.0.bias", "input_blocks.6.0.out_layers.3.weight", "input_blocks.6.0.out_layers.3.bias", "input_blocks.9.0.in_layers.0.weight", "input_blocks.9.0.in_layers.0.bias", "input_blocks.9.0.in_layers.2.weight", "input_blocks.9.0.in_layers.2.bias", "input_blocks.9.0.emb_layers.1.weight", "input_blocks.9.0.emb_layers.1.bias", "input_blocks.9.0.out_layers.0.weight", "input_blocks.9.0.out_layers.0.bias", "input_blocks.9.0.out_layers.3.weight", "input_blocks.9.0.out_layers.3.bias", "input_blocks.12.0.in_layers.0.weight", "input_blocks.12.0.in_layers.0.bias", "input_blocks.12.0.in_layers.2.weight", "input_blocks.12.0.in_layers.2.bias", "input_blocks.12.0.emb_layers.1.weight", "input_blocks.12.0.emb_layers.1.bias", "input_blocks.12.0.out_layers.0.weight", "input_blocks.12.0.out_layers.0.bias", "input_blocks.12.0.out_layers.3.weight", "input_blocks.12.0.out_layers.3.bias", "input_blocks.13.3.norm.weight", "input_blocks.13.3.norm.bias", "input_blocks.13.3.qkv.weight", "input_blocks.13.3.qkv.bias", "input_blocks.13.3.proj_out.weight", "input_blocks.13.3.proj_out.bias", "input_blocks.14.3.norm.weight", 
"input_blocks.14.3.norm.bias", "input_blocks.14.3.qkv.weight", "input_blocks.14.3.qkv.bias", "input_blocks.14.3.proj_out.weight", "input_blocks.14.3.proj_out.bias", "input_blocks.15.0.in_layers.0.weight", "input_blocks.15.0.in_layers.0.bias", "input_blocks.15.0.in_layers.2.weight", "input_blocks.15.0.in_layers.2.bias", "input_blocks.15.0.emb_layers.1.weight", "input_blocks.15.0.emb_layers.1.bias", "input_blocks.15.0.out_layers.0.weight", "input_blocks.15.0.out_layers.0.bias", "input_blocks.15.0.out_layers.3.weight", "input_blocks.15.0.out_layers.3.bias", "input_blocks.18.0.in_layers.0.weight", "input_blocks.18.0.in_layers.0.bias", "input_blocks.18.0.in_layers.2.weight", "input_blocks.18.0.in_layers.2.bias", "input_blocks.18.0.emb_layers.1.weight", "input_blocks.18.0.emb_layers.1.bias", "input_blocks.18.0.out_layers.0.weight", "input_blocks.18.0.out_layers.0.bias", "input_blocks.18.0.out_layers.3.weight", "input_blocks.18.0.out_layers.3.bias", "output_blocks.2.4.in_layers.0.weight", "output_blocks.2.4.in_layers.0.bias", "output_blocks.2.4.in_layers.2.weight", "output_blocks.2.4.in_layers.2.bias", "output_blocks.2.4.emb_layers.1.weight", "output_blocks.2.4.emb_layers.1.bias", "output_blocks.2.4.out_layers.0.weight", "output_blocks.2.4.out_layers.0.bias", "output_blocks.2.4.out_layers.3.weight", "output_blocks.2.4.out_layers.3.bias", "output_blocks.5.4.in_layers.0.weight", "output_blocks.5.4.in_layers.0.bias", "output_blocks.5.4.in_layers.2.weight", "output_blocks.5.4.in_layers.2.bias", "output_blocks.5.4.emb_layers.1.weight", "output_blocks.5.4.emb_layers.1.bias", "output_blocks.5.4.out_layers.0.weight", "output_blocks.5.4.out_layers.0.bias", "output_blocks.5.4.out_layers.3.weight", "output_blocks.5.4.out_layers.3.bias", "output_blocks.6.3.norm.weight", "output_blocks.6.3.norm.bias", "output_blocks.6.3.qkv.weight", "output_blocks.6.3.qkv.bias", "output_blocks.6.3.proj_out.weight", "output_blocks.6.3.proj_out.bias", "output_blocks.7.3.norm.weight", 
"output_blocks.7.3.norm.bias", "output_blocks.7.3.qkv.weight", "output_blocks.7.3.qkv.bias", "output_blocks.7.3.proj_out.weight", "output_blocks.7.3.proj_out.bias", "output_blocks.8.4.in_layers.0.weight", "output_blocks.8.4.in_layers.0.bias", "output_blocks.8.4.in_layers.2.weight", "output_blocks.8.4.in_layers.2.bias", "output_blocks.8.4.emb_layers.1.weight", "output_blocks.8.4.emb_layers.1.bias", "output_blocks.8.4.out_layers.0.weight", "output_blocks.8.4.out_layers.0.bias", "output_blocks.8.4.out_layers.3.weight", "output_blocks.8.4.out_layers.3.bias", "output_blocks.8.3.norm.weight", "output_blocks.8.3.norm.bias", "output_blocks.8.3.qkv.weight", "output_blocks.8.3.qkv.bias", "output_blocks.8.3.proj_out.weight", "output_blocks.8.3.proj_out.bias", "output_blocks.11.3.in_layers.0.weight", "output_blocks.11.3.in_layers.0.bias", "output_blocks.11.3.in_layers.2.weight", "output_blocks.11.3.in_layers.2.bias", "output_blocks.11.3.emb_layers.1.weight", "output_blocks.11.3.emb_layers.1.bias", "output_blocks.11.3.out_layers.0.weight", "output_blocks.11.3.out_layers.0.bias", "output_blocks.11.3.out_layers.3.weight", "output_blocks.11.3.out_layers.3.bias", "output_blocks.14.1.in_layers.0.weight", "output_blocks.14.1.in_layers.0.bias", "output_blocks.14.1.in_layers.2.weight", "output_blocks.14.1.in_layers.2.bias", "output_blocks.14.1.emb_layers.1.weight", "output_blocks.14.1.emb_layers.1.bias", "output_blocks.14.1.out_layers.0.weight", "output_blocks.14.1.out_layers.0.bias", "output_blocks.14.1.out_layers.3.weight", "output_blocks.14.1.out_layers.3.bias", "output_blocks.17.1.in_layers.0.weight", "output_blocks.17.1.in_layers.0.bias", "output_blocks.17.1.in_layers.2.weight", "output_blocks.17.1.in_layers.2.bias", "output_blocks.17.1.emb_layers.1.weight", "output_blocks.17.1.emb_layers.1.bias", "output_blocks.17.1.out_layers.0.weight", "output_blocks.17.1.out_layers.0.bias", "output_blocks.17.1.out_layers.3.weight", "output_blocks.17.1.out_layers.3.bias".

I wonder whether the pretrained weights I downloaded match this model version. Thank you!
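For context, the error above is PyTorch's standard key-set check in `load_state_dict`: it fails when a checkpoint was saved from a different architecture variant than the one the script builds. A minimal sketch with toy modules (not DOLCE's real ConditionalModel) reproduces the same "Missing key(s)" / "Unexpected key(s)" pattern, e.g. a conv downsample named `op` versus a res-block style module with `in_layers`/`out_layers`:

```python
import torch.nn as nn

# Toy stand-ins: the module names mirror the mismatch in the traceback,
# but these are illustrative only, not the real DOLCE architecture.
class DownsampleConv(nn.Module):          # what the script builds
    def __init__(self):
        super().__init__()
        self.op = nn.Conv2d(8, 8, 3, stride=2, padding=1)

class DownsampleResBlock(nn.Module):      # what the checkpoint came from
    def __init__(self):
        super().__init__()
        self.in_layers = nn.Conv2d(8, 8, 3, padding=1)
        self.out_layers = nn.Conv2d(8, 8, 3, padding=1)

model = DownsampleConv()
ckpt = DownsampleResBlock().state_dict()

# The same set difference load_state_dict reports in its RuntimeError:
missing = sorted(set(model.state_dict()) - set(ckpt))
unexpected = sorted(set(ckpt) - set(model.state_dict()))
print("Missing key(s):", missing)        # ['op.bias', 'op.weight']
print("Unexpected key(s):", unexpected)
```

Comparing the two key sets this way, before calling `load_state_dict`, is a quick check that a downloaded checkpoint matches the model the script instantiates.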

JiamingLiu-Jeremy commented 8 months ago

Hi, I just downloaded and tested the code. It works well with [MODEL_PATH="--model_path ./model_zoo/model512_all.pt"] in my settings.
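For anyone hitting the same error, a loading sketch along these lines (the helper name, the `map_location` choice, and the demo with a toy `nn.Linear` are my own assumptions, not the repo's code) raises the mismatch error early and otherwise loads the weights:

```python
import os
import tempfile

import torch
import torch.nn as nn

def load_checkpoint(model, path):
    """Load a .pt checkpoint onto CPU; raises RuntimeError on key mismatch."""
    ckpt = torch.load(path, map_location="cpu")
    # Some checkpoints nest the weights under a "state_dict" key; this
    # fallback is an assumption, not something the DOLCE repo guarantees.
    state_dict = ckpt["state_dict"] if isinstance(ckpt, dict) and "state_dict" in ckpt else ckpt
    model.load_state_dict(state_dict)
    return model

# Demo with a toy module standing in for the real model/checkpoint:
net = nn.Linear(4, 2)
with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "toy.pt")
    torch.save(net.state_dict(), p)
    net2 = load_checkpoint(nn.Linear(4, 2), p)
```

In the actual setting, `path` would be `./model_zoo/model512_all.pt` as above.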

jason5306 commented 8 months ago

Resolved! Thank you.