dailenson / SDT

This repository is the official implementation of Disentangling Writer and Character Styles for Handwriting Generation (CVPR 2023).

The given English checkpoint is not compatible with the model config #20

Closed · staghado closed this 1 year ago

staghado commented 1 year ago

When I tried to load the checkpoint, I got the error below.

I'm running this:

python test.py --cfg configs/English_CASIA.yml --pretrained_model model_zoo/iter91999_trainloss-0.33127.pth --store_type online --sample_size 500 --dir Generated/English

I have modified the PATH in the config file to point to the data that I downloaded from the given link. Thank you in advance for your help.
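(For anyone debugging a similar failure: before comparing against the model, it can help to look at what the checkpoint file itself contains. This is a minimal sketch, not code from this repo, and the idea that the weights might be nested under a "model" entry is an assumption:)

import torch

# Load the checkpoint on CPU; no GPU is needed just to inspect it.
ckpt = torch.load("model_zoo/iter91999_trainloss-0.33127.pth", map_location="cpu")

# Some training scripts wrap the weights, e.g. under a "model" key
# (an assumption here); fall back to the raw dict otherwise.
state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt

# The top-level module prefixes show which architecture the file
# was saved from (e.g. writer_head vs. global_encoder).
print(sorted({k.split(".")[0] for k in state_dict}))

The full traceback: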

RuntimeError: Error(s) in loading state_dict for SDT_Generator: Missing key(s) in state_dict: "writer_head.layers.0.self_attn.in_proj_weight", "writer_head.layers.0.self_attn.in_proj_bias", "writer_head.layers.0.self_attn.out_proj.weight", "writer_head.layers.0.self_attn.out_proj.bias", "writer_head.layers.0.linear1.weight", "writer_head.layers.0.linear1.bias", "writer_head.layers.0.linear2.weight", "writer_head.layers.0.linear2.bias", "writer_head.layers.0.norm1.weight", "writer_head.layers.0.norm1.bias", "writer_head.layers.0.norm2.weight", "writer_head.layers.0.norm2.bias", "writer_head.norm.weight", "writer_head.norm.bias", "glyph_head.layers.0.self_attn.in_proj_weight", "glyph_head.layers.0.self_attn.in_proj_bias", "glyph_head.layers.0.self_attn.out_proj.weight", "glyph_head.layers.0.self_attn.out_proj.bias", "glyph_head.layers.0.linear1.weight", "glyph_head.layers.0.linear1.bias", "glyph_head.layers.0.linear2.weight", "glyph_head.layers.0.linear2.bias", "glyph_head.layers.0.norm1.weight", "glyph_head.layers.0.norm1.bias", "glyph_head.layers.0.norm2.weight", "glyph_head.layers.0.norm2.bias", "glyph_head.norm.weight", "glyph_head.norm.bias", "content_encoder.encoder.layers.2.self_attn.in_proj_weight", "content_encoder.encoder.layers.2.self_attn.in_proj_bias", "content_encoder.encoder.layers.2.self_attn.out_proj.weight", "content_encoder.encoder.layers.2.self_attn.out_proj.bias", "content_encoder.encoder.layers.2.linear1.weight", "content_encoder.encoder.layers.2.linear1.bias", "content_encoder.encoder.layers.2.linear2.weight", "content_encoder.encoder.layers.2.linear2.bias", "content_encoder.encoder.layers.2.norm1.weight", "content_encoder.encoder.layers.2.norm1.bias", "content_encoder.encoder.layers.2.norm2.weight", "content_encoder.encoder.layers.2.norm2.bias", "wri_decoder.layers.0.self_attn.in_proj_weight", "wri_decoder.layers.0.self_attn.in_proj_bias", "wri_decoder.layers.0.self_attn.out_proj.weight", "wri_decoder.layers.0.self_attn.out_proj.bias", "wri_decoder.layers.0.multihead_attn.in_proj_weight", "wri_decoder.layers.0.multihead_attn.in_proj_bias", "wri_decoder.layers.0.multihead_attn.out_proj.weight", "wri_decoder.layers.0.multihead_attn.out_proj.bias", "wri_decoder.layers.0.linear1.weight", "wri_decoder.layers.0.linear1.bias", "wri_decoder.layers.0.linear2.weight", "wri_decoder.layers.0.linear2.bias", "wri_decoder.layers.0.norm1.weight", "wri_decoder.layers.0.norm1.bias", "wri_decoder.layers.0.norm2.weight", "wri_decoder.layers.0.norm2.bias", "wri_decoder.layers.0.norm3.weight", "wri_decoder.layers.0.norm3.bias", "wri_decoder.layers.1.self_attn.in_proj_weight", "wri_decoder.layers.1.self_attn.in_proj_bias", "wri_decoder.layers.1.self_attn.out_proj.weight", "wri_decoder.layers.1.self_attn.out_proj.bias", "wri_decoder.layers.1.multihead_attn.in_proj_weight", "wri_decoder.layers.1.multihead_attn.in_proj_bias", "wri_decoder.layers.1.multihead_attn.out_proj.weight", "wri_decoder.layers.1.multihead_attn.out_proj.bias", "wri_decoder.layers.1.linear1.weight", "wri_decoder.layers.1.linear1.bias", "wri_decoder.layers.1.linear2.weight", "wri_decoder.layers.1.linear2.bias", "wri_decoder.layers.1.norm1.weight", "wri_decoder.layers.1.norm1.bias", "wri_decoder.layers.1.norm2.weight", "wri_decoder.layers.1.norm2.bias", "wri_decoder.layers.1.norm3.weight", "wri_decoder.layers.1.norm3.bias", "wri_decoder.norm.weight", "wri_decoder.norm.bias", "gly_decoder.layers.0.self_attn.in_proj_weight", "gly_decoder.layers.0.self_attn.in_proj_bias", "gly_decoder.layers.0.self_attn.out_proj.weight", 
"gly_decoder.layers.0.self_attn.out_proj.bias", "gly_decoder.layers.0.multihead_attn.in_proj_weight", "gly_decoder.layers.0.multihead_attn.in_proj_bias", "gly_decoder.layers.0.multihead_attn.out_proj.weight", "gly_decoder.layers.0.multihead_attn.out_proj.bias", "gly_decoder.layers.0.linear1.weight", "gly_decoder.layers.0.linear1.bias", "gly_decoder.layers.0.linear2.weight", "gly_decoder.layers.0.linear2.bias", "gly_decoder.layers.0.norm1.weight", "gly_decoder.layers.0.norm1.bias", "gly_decoder.layers.0.norm2.weight", "gly_decoder.layers.0.norm2.bias", "gly_decoder.layers.0.norm3.weight", "gly_decoder.layers.0.norm3.bias", "gly_decoder.layers.1.self_attn.in_proj_weight", "gly_decoder.layers.1.self_attn.in_proj_bias", "gly_decoder.layers.1.self_attn.out_proj.weight", "gly_decoder.layers.1.self_attn.out_proj.bias", "gly_decoder.layers.1.multihead_attn.in_proj_weight", "gly_decoder.layers.1.multihead_attn.in_proj_bias", "gly_decoder.layers.1.multihead_attn.out_proj.weight", "gly_decoder.layers.1.multihead_attn.out_proj.bias", "gly_decoder.layers.1.linear1.weight", "gly_decoder.layers.1.linear1.bias", "gly_decoder.layers.1.linear2.weight", "gly_decoder.layers.1.linear2.bias", "gly_decoder.layers.1.norm1.weight", "gly_decoder.layers.1.norm1.bias", "gly_decoder.layers.1.norm2.weight", "gly_decoder.layers.1.norm2.bias", "gly_decoder.layers.1.norm3.weight", "gly_decoder.layers.1.norm3.bias", "gly_decoder.norm.weight", "gly_decoder.norm.bias", "pro_mlp_writer.0.weight", "pro_mlp_writer.0.bias", "pro_mlp_writer.2.weight", "pro_mlp_writer.2.bias", "pro_mlp_character.0.weight", "pro_mlp_character.0.bias", "pro_mlp_character.2.weight", "pro_mlp_character.2.bias". Unexpected key(s) in state_dict: "global_encoder.layers.0.self_attn.in_proj_weight", "global_encoder.layers.0.self_attn.in_proj_bias", "global_encoder.layers.0.self_attn.out_proj.weight", "global_encoder.layers.0.self_attn.out_proj.bias", "global_encoder.layers.0.linear1.weight", "global_encoder.layers.0.linear1.bias", "global_encoder.layers.0.linear2.weight", "global_encoder.layers.0.linear2.bias", "global_encoder.layers.0.norm1.weight", "global_encoder.layers.0.norm1.bias", "global_encoder.layers.0.norm2.weight", "global_encoder.layers.0.norm2.bias", "global_encoder.norm.weight", "global_encoder.norm.bias", "local_encoder.layers.0.self_attn.in_proj_weight", "local_encoder.layers.0.self_attn.in_proj_bias", "local_encoder.layers.0.self_attn.out_proj.weight", "local_encoder.layers.0.self_attn.out_proj.bias", "local_encoder.layers.0.linear1.weight", "local_encoder.layers.0.linear1.bias", "local_encoder.layers.0.linear2.weight", "local_encoder.layers.0.linear2.bias", "local_encoder.layers.0.norm1.weight", "local_encoder.layers.0.norm1.bias", "local_encoder.layers.0.norm2.weight", "local_encoder.layers.0.norm2.bias", "local_encoder.norm.weight", "local_encoder.norm.bias", "cls_decoder.layers.0.self_attn.in_proj_weight", "cls_decoder.layers.0.self_attn.in_proj_bias", "cls_decoder.layers.0.self_attn.out_proj.weight", "cls_decoder.layers.0.self_attn.out_proj.bias", "cls_decoder.layers.0.multihead_attn.in_proj_weight", "cls_decoder.layers.0.multihead_attn.in_proj_bias", "cls_decoder.layers.0.multihead_attn.out_proj.weight", "cls_decoder.layers.0.multihead_attn.out_proj.bias", "cls_decoder.layers.0.linear1.weight", "cls_decoder.layers.0.linear1.bias", "cls_decoder.layers.0.linear2.weight", "cls_decoder.layers.0.linear2.bias", "cls_decoder.layers.0.norm1.weight", "cls_decoder.layers.0.norm1.bias", "cls_decoder.layers.0.norm2.weight", 
"cls_decoder.layers.0.norm2.bias", "cls_decoder.layers.0.norm3.weight", "cls_decoder.layers.0.norm3.bias", "cls_decoder.layers.1.self_attn.in_proj_weight", "cls_decoder.layers.1.self_attn.in_proj_bias", "cls_decoder.layers.1.self_attn.out_proj.weight", "cls_decoder.layers.1.self_attn.out_proj.bias", "cls_decoder.layers.1.multihead_attn.in_proj_weight", "cls_decoder.layers.1.multihead_attn.in_proj_bias", "cls_decoder.layers.1.multihead_attn.out_proj.weight", "cls_decoder.layers.1.multihead_attn.out_proj.bias", "cls_decoder.layers.1.linear1.weight", "cls_decoder.layers.1.linear1.bias", "cls_decoder.layers.1.linear2.weight", "cls_decoder.layers.1.linear2.bias", "cls_decoder.layers.1.norm1.weight", "cls_decoder.layers.1.norm1.bias", "cls_decoder.layers.1.norm2.weight", "cls_decoder.layers.1.norm2.bias", "cls_decoder.layers.1.norm3.weight", "cls_decoder.layers.1.norm3.bias", "cls_decoder.norm.weight", "cls_decoder.norm.bias", "decoder.layers.0.self_attn.in_proj_weight", "decoder.layers.0.self_attn.in_proj_bias", "decoder.layers.0.self_attn.out_proj.weight", "decoder.layers.0.self_attn.out_proj.bias", "decoder.layers.0.multihead_attn.in_proj_weight", "decoder.layers.0.multihead_attn.in_proj_bias", "decoder.layers.0.multihead_attn.out_proj.weight", "decoder.layers.0.multihead_attn.out_proj.bias", "decoder.layers.0.linear1.weight", "decoder.layers.0.linear1.bias", "decoder.layers.0.linear2.weight", "decoder.layers.0.linear2.bias", "decoder.layers.0.norm1.weight", "decoder.layers.0.norm1.bias", "decoder.layers.0.norm2.weight", "decoder.layers.0.norm2.bias", "decoder.layers.0.norm3.weight", "decoder.layers.0.norm3.bias", "decoder.layers.1.self_attn.in_proj_weight", "decoder.layers.1.self_attn.in_proj_bias", "decoder.layers.1.self_attn.out_proj.weight", "decoder.layers.1.self_attn.out_proj.bias", "decoder.layers.1.multihead_attn.in_proj_weight", "decoder.layers.1.multihead_attn.in_proj_bias", "decoder.layers.1.multihead_attn.out_proj.weight", "decoder.layers.1.multihead_attn.out_proj.bias", "decoder.layers.1.linear1.weight", "decoder.layers.1.linear1.bias", "decoder.layers.1.linear2.weight", "decoder.layers.1.linear2.bias", "decoder.layers.1.norm1.weight", "decoder.layers.1.norm1.bias", "decoder.layers.1.norm2.weight", "decoder.layers.1.norm2.bias", "decoder.layers.1.norm3.weight", "decoder.layers.1.norm3.bias", "decoder.norm.weight", "decoder.norm.bias", "pro_head_global.0.weight", "pro_head_global.0.bias", "pro_head_global.2.weight", "pro_head_global.2.bias", "pro_head_local.0.weight", "pro_head_local.0.bias", "pro_head_local.2.weight", "pro_head_local.2.bias".

dailenson commented 1 year ago

Sorry, I carelessly uploaded the wrong checkpoint yesterday. The correct one has been re-uploaded. You can download it from Google Drive.
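After replacing the file, a strict load is a quick sanity check: load_state_dict is strict by default and raises on any remaining key mismatch, so reaching the final print means the checkpoint matches the model. (A sketch; model is assumed to be constructed as in test.py, and the filename is the one from the original command, since the re-uploaded file's name isn't stated here.)

import torch

ckpt = torch.load("model_zoo/iter91999_trainloss-0.33127.pth", map_location="cpu")
state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt

# strict=True (the default) raises a RuntimeError listing missing and
# unexpected keys if anything still disagrees.
model.load_state_dict(state_dict)
print("checkpoint loaded cleanly")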