jiupinjia / stylized-neural-painting

Official PyTorch implementation of the CVPR 2021 paper "Stylized Neural Painting".
https://jiupinjia.github.io/neuralpainter/

RuntimeError: Error(s) in loading state_dict for ZouFCNFusionLight: #27

Open · yuyusmile opened this issue 3 years ago

yuyusmile commented 3 years ago

Environment: python 3.6.2, torch 1.2.0, NVIDIA 2080 Ti (11 GB), CUDA 10.0, cuDNN 7.6.4.38

Command:

    python demo_prog.py --img_path ./test_images/apple.jpg --canvas_color 'white' --max_m_strokes 500 --max_divide 5 --renderer oilpaintbrush --renderer_checkpoint_dir checkpoints_G_oilpaintbrush

Output:

    initialize network with normal
    loading renderer from pre-trained checkpoint...
    Traceback (most recent call last):
      File "demo_prog.py", line 113, in <module>
        optimize_x(pt)
      File "demo_prog.py", line 49, in optimize_x
        pt._load_checkpoint()
      File "/home/banana/GAN/stylized-neural-painting-main12/painter.py", line 71, in _load_checkpoint
        self.net_G.load_state_dict(checkpoint['model_G_state_dict'])
      File "/home/banana/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 845, in load_state_dict
        self.__class__.__name__, "\n\t".join(error_msgs)))
    RuntimeError: Error(s) in loading state_dict for ZouFCNFusionLight:
        Unexpected key(s) in state_dict: "huangnet.fc4.weight", "huangnet.fc4.bias", "huangnet.conv3.weight", "huangnet.conv3.bias", "huangnet.conv4.weight", "huangnet.conv4.bias", "huangnet.conv5.weight", "huangnet.conv5.bias", "huangnet.conv6.weight", "huangnet.conv6.bias", "dcgan.main.10.weight", "dcgan.main.10.bias", "dcgan.main.10.running_mean", "dcgan.main.10.running_var", "dcgan.main.10.num_batches_tracked", "dcgan.main.12.weight", "dcgan.main.13.weight", "dcgan.main.13.bias", "dcgan.main.13.running_mean", "dcgan.main.13.running_var", "dcgan.main.13.num_batches_tracked", "dcgan.main.15.weight".
        size mismatch for huangnet.conv1.weight: copying a param with shape torch.Size([32, 16, 3, 3]) from checkpoint, the shape in current model is torch.Size([64, 8, 3, 3]).
        size mismatch for huangnet.conv1.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
        size mismatch for huangnet.conv2.weight: copying a param with shape torch.Size([32, 32, 3, 3]) from checkpoint, the shape in current model is torch.Size([12, 64, 3, 3]).
        size mismatch for huangnet.conv2.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([12]).
        size mismatch for dcgan.main.3.weight: copying a param with shape torch.Size([512, 512, 4, 4]) from checkpoint, the shape in current model is torch.Size([512, 256, 4, 4]).
        size mismatch for dcgan.main.4.weight: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
        size mismatch for dcgan.main.4.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
        size mismatch for dcgan.main.4.running_mean: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
        size mismatch for dcgan.main.4.running_var: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
        size mismatch for dcgan.main.6.weight: copying a param with shape torch.Size([512, 256, 4, 4]) from checkpoint, the shape in current model is torch.Size([256, 128, 4, 4]).
        size mismatch for dcgan.main.7.weight: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([128]).
        size mismatch for dcgan.main.7.bias: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([128]).
        size mismatch for dcgan.main.7.running_mean: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([128]).
        size mismatch for dcgan.main.7.running_var: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([128]).
        size mismatch for dcgan.main.9.weight: copying a param with shape torch.Size([256, 128, 4, 4]) from checkpoint, the shape in current model is torch.Size([128, 6, 4, 4]).
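A quick way to see which renderer architecture a checkpoint was saved from is to load it and print its state_dict keys and tensor shapes: the extra huangnet.conv3-conv6 / fc4 and deeper dcgan layers in the error above suggest the downloaded checkpoint belongs to the full zou-fusion-net renderer rather than the ZouFCNFusionLight model the script constructed. A minimal diagnostic sketch (the file name last_ckpt.pt is an assumption; point it at whatever .pt file sits in your checkpoints_G_oilpaintbrush folder):

    import torch

    # NOTE: 'last_ckpt.pt' is a guess at the file name; use the actual .pt file
    # inside the renderer checkpoint folder you downloaded.
    ckpt = torch.load('checkpoints_G_oilpaintbrush/last_ckpt.pt', map_location='cpu')
    print(ckpt.keys())                    # expect something like dict_keys(['model_G_state_dict', ...])

    state = ckpt['model_G_state_dict']    # key name taken from the traceback above
    for name, tensor in state.items():
        print(name, tuple(tensor.shape))  # compare against the keys/shapes of the model you built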

new-cainiao commented 3 years ago

I also encountered the same problem. Have you solved it?

jiupinjia commented 3 years ago

@new-cainiao @yuyusmile Can you follow the README instructions and add --net_G zou-fusion-net to your command? Thanks for your feedback. Please let me know whether it solves your problem.
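For reference, the reported command with that flag added (all other arguments unchanged from the original post) would look like:

    python demo_prog.py --img_path ./test_images/apple.jpg --canvas_color 'white' --max_m_strokes 500 --max_divide 5 --renderer oilpaintbrush --renderer_checkpoint_dir checkpoints_G_oilpaintbrush --net_G zou-fusion-net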