ohhhyeahhh / PointAttN

Code for the paper "PointAttN: You Only Need Attention for Point Cloud Completion"
https://ojs.aaai.org/index.php/AAAI/article/view/28356

Size Mismatch for test_c3d.py #7

Closed ShenZheng2000 closed 2 years ago

ShenZheng2000 commented 2 years ago

Hello, authors. I am trying to test on the Completion3D dataset using your pretrained weights, which are specified in PointAttN.yaml as load_model: ./log/PointAttN_cd_debug_c3d/model_c3d.pth

However, after I run python test_c3d.py -c PointAttN.yaml in the terminal, the following error occurs.

1184
INFO - 2022-05-19 20:07:08,133 - test_c3d - Length of test dataset:1184
Loaded compiled 3D CUDA chamfer distance
Traceback (most recent call last):
  File "test_c3d.py", line 81, in <module>
    test()
  File "test_c3d.py", line 37, in test
    net.module.load_state_dict(torch.load(args.load_model)['net_state_dict'])
  File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1406, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for Model:
        size mismatch for refine.sa3.multihead_attn1.in_proj_weight: copying a param with shape torch.Size([384, 128]) from checkpoint, the shape in current model is torch.Size([1536, 512]).
        size mismatch for refine.sa3.multihead_attn1.in_proj_bias: copying a param with shape torch.Size([384]) from checkpoint, the shape in current model is torch.Size([1536]).
        size mismatch for refine.sa3.multihead_attn1.out_proj.weight: copying a param with shape torch.Size([128, 128]) from checkpoint, the shape in current model is torch.Size([512, 512]).
        size mismatch for refine.sa3.multihead_attn1.out_proj.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine.sa3.linear11.weight: copying a param with shape torch.Size([1024, 128]) from checkpoint, the shape in current model is torch.Size([1024, 512]).
        size mismatch for refine.sa3.linear12.weight: copying a param with shape torch.Size([128, 1024]) from checkpoint, the shape in current model is torch.Size([512, 1024]).
        size mismatch for refine.sa3.linear12.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine.sa3.norm12.weight: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine.sa3.norm12.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine.sa3.norm13.weight: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine.sa3.norm13.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine.sa3.input_proj.weight: copying a param with shape torch.Size([128, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 512, 1]).
        size mismatch for refine.sa3.input_proj.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine.conv_ps.weight: copying a param with shape torch.Size([128, 128, 1]) from checkpoint, the shape in current model is torch.Size([512, 512, 1]).
        size mismatch for refine.conv_ps.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
        size mismatch for refine1.sa3.multihead_attn1.in_proj_weight: copying a param with shape torch.Size([1536, 512]) from checkpoint, the shape in current model is torch.Size([3072, 1024]).
        size mismatch for refine1.sa3.multihead_attn1.in_proj_bias: copying a param with shape torch.Size([1536]) from checkpoint, the shape in current model is torch.Size([3072]).
        size mismatch for refine1.sa3.multihead_attn1.out_proj.weight: copying a param with shape torch.Size([512, 512]) from checkpoint, the shape in current model is torch.Size([1024, 1024]).
        size mismatch for refine1.sa3.multihead_attn1.out_proj.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).
        size mismatch for refine1.sa3.linear11.weight: copying a param with shape torch.Size([1024, 512]) from checkpoint, the shape in current model is torch.Size([1024, 1024]).
        size mismatch for refine1.sa3.linear12.weight: copying a param with shape torch.Size([512, 1024]) from checkpoint, the shape in current model is torch.Size([1024, 1024]).
        size mismatch for refine1.sa3.linear12.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).
        size mismatch for refine1.sa3.norm12.weight: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).
        size mismatch for refine1.sa3.norm12.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).
        size mismatch for refine1.sa3.norm13.weight: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).
        size mismatch for refine1.sa3.norm13.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).
        size mismatch for refine1.sa3.input_proj.weight: copying a param with shape torch.Size([512, 512, 1]) from checkpoint, the shape in current model is torch.Size([1024, 512, 1]).
        size mismatch for refine1.sa3.input_proj.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).
        size mismatch for refine1.conv_ps.weight: copying a param with shape torch.Size([512, 512, 1]) from checkpoint, the shape in current model is torch.Size([1024, 1024, 1]).
        size mismatch for refine1.conv_ps.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([1024]).

Since this error does not occur with the PCN dataset, I wonder whether you have uploaded the wrong pretrained weights, or I have made some mistake placing the pretrained model.

Hope you can clarify it. Thanks!
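For what it's worth, this kind of mismatch can be spotted before calling load_state_dict by comparing parameter shapes between the checkpoint and the instantiated model. A minimal sketch of that comparison, using plain tuples as stand-ins for tensor shapes (the helper below is hypothetical, not part of the PointAttN code base; the parameter names and shapes are taken from the traceback above):

```python
# Sketch of the shape check that load_state_dict performs internally.
# In practice the shapes would come from tensor.shape on both sides, e.g.
#   ckpt = {k: tuple(v.shape) for k, v in torch.load(path)['net_state_dict'].items()}
#   model = {k: tuple(v.shape) for k, v in net.state_dict().items()}

def find_shape_mismatches(checkpoint_shapes, model_shapes):
    """Return (name, ckpt_shape, model_shape) for every parameter whose
    shape differs between the checkpoint and the current model."""
    mismatches = []
    for name, ckpt_shape in checkpoint_shapes.items():
        model_shape = model_shapes.get(name)
        if model_shape is not None and model_shape != ckpt_shape:
            mismatches.append((name, ckpt_shape, model_shape))
    return mismatches

# Two shapes lifted from the traceback: C3D checkpoint vs. a PCN-sized model.
ckpt = {"refine.sa3.input_proj.weight": (128, 512, 1),
        "refine.conv_ps.bias": (128,)}
model = {"refine.sa3.input_proj.weight": (512, 512, 1),
         "refine.conv_ps.bias": (512,)}

for name, c, m in find_shape_mismatches(ckpt, model):
    print(f"size mismatch for {name}: checkpoint {c}, current model {m}")
```

Running a check like this before loading makes it obvious when the config instantiates a model for one dataset while the checkpoint was trained on another.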

WangJun-ZJUT commented 2 years ago

Please make sure the C3D model is used: the error report shows that the shapes in the current model correspond to the PCN dataset. Note that the dataset parameters in PointAttN.yaml need to be modified at the same time.
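Concretely, the two settings have to agree with each other in PointAttN.yaml. A sketch of the relevant fragment (the load_model path appears verbatim in this thread; the dataset key's exact value is an assumption based on the discussion, not copied from the repo):

```yaml
# PointAttN.yaml -- checkpoint and dataset must match.
load_model: ./log/PointAttN_cd_debug_c3d/model_c3d.pth  # C3D checkpoint (from this thread)
dataset: c3d                                            # illustrative value; must select Completion3D
```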

ShenZheng2000 commented 2 years ago

I have found the problem: in PointAttN.yaml, both load_model and dataset must be specified. The problem is now solved, so I will close this issue. Thanks for your kind help!