ethanhe42 / epipolar-transformers

Epipolar Transformers (best paper award, CVPR 2020 workshop)
https://yihui.dev/epipolar-transformers
MIT License

Loading pretrained model #22

Closed. CheungBH closed this issue 2 years ago.

CheungBH commented 3 years ago

Hi. I am trying to load your provided pre-trained models "resnet50-19c8e357.pth" and "pose_resnet_4.5_pixels_human36m.pth" for testing, but it fails:

```
2021-07-08 10:47:24,038 checkpointer INFO: Loading checkpoint from datasets/resnet50-19c8e357.pth
Traceback (most recent call last):
  File "main.py", line 75, in <module>
    main()
  File "main.py", line 68, in main
    test(cfg)
  File "/media/hkuit155/Windows/research/epipolar-transformers/engine/tester.py", line 34, in test
    = checkpointer.load(cfg.WEIGHTS)
  File "/media/hkuit155/Windows/research/epipolar-transformers/utils/checkpoint.py", line 63, in load
    self._load_model(checkpoint, prefix=prefix, prefix_replace=prefix_replace)
  File "/media/hkuit155/Windows/research/epipolar-transformers/utils/checkpoint.py", line 102, in _load_model
    load_state_dict(self.model, checkpoint.pop("model"), prefix=prefix, prefix_replace=prefix_replace)
KeyError: 'model'
```

I wonder what's happening, and what are the corresponding config files?
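A quick way to see why the `KeyError: 'model'` occurs is to inspect the checkpoint's top-level keys; the ImageNet `resnet50-19c8e357.pth` file from torchvision is a bare state dict with no `'model'` wrapper. A minimal diagnostic sketch (the path is the one from the log above):

```python
import torch

# Load the checkpoint on CPU; the torchvision ResNet-50 file is a plain
# state dict (layer name -> tensor), not a dict with a "model" entry.
checkpoint = torch.load("datasets/resnet50-19c8e357.pth", map_location="cpu")

# If "model" is missing here, checkpoint.pop("model") in
# utils/checkpoint.py raises exactly the KeyError seen above.
print(list(checkpoint.keys())[:10])
```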

ethanhe42 commented 3 years ago

what's the command you used?

zxk19981227 commented 2 years ago

Same problem occurs with the command:

```
python main.py --cfg configs/epipolar/keypoint_h36m_zresidual_fixed.yaml DOTRAIN False DOTEST True VIS.VIDEO True DATASETS.H36M.TEST_SAMPLE 2
```

zxk19981227 commented 2 years ago

I found a probable solution. Since the problem is caused by the format of the .pth file, we can simply use `checkpoint` itself instead of `checkpoint.pop('model')`.
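A guarded version of that idea, so that checkpoints both with and without a `'model'` wrapper load, might look like this (a sketch against `_load_model` in utils/checkpoint.py; the surrounding signature is assumed from the traceback above):

```python
# In utils/checkpoint.py, inside _load_model (sketch; context assumed
# from the traceback above). Fall back to the raw dict when there is
# no "model" wrapper, e.g. for the torchvision resnet50-19c8e357.pth.
state_dict = checkpoint.pop("model") if "model" in checkpoint else checkpoint
load_state_dict(self.model, state_dict, prefix=prefix, prefix_replace=prefix_replace)
```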

zxk19981227 commented 2 years ago

Another problem occurs when loading the model `pose_resnet_4.5_pixels_human36m.pth`.

```
RuntimeError: Error(s) in loading state_dict for Modelbuilder:
    size mismatch for reference.module.final_layer.weight: copying a param with shape torch.Size([33, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([20, 256, 1, 1]).
    size mismatch for reference.module.final_layer.bias: copying a param with shape torch.Size([33]) from checkpoint, the shape in current model is torch.Size([20]).
    size mismatch for backbone.module.final_layer.weight: copying a param with shape torch.Size([33, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([20, 256, 1, 1]).
    size mismatch for backbone.module.final_layer.bias: copying a param with shape torch.Size([33]) from checkpoint, the shape in current model is torch.Size([20]).
```

How can I fix this error?
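For what it's worth, the shape mismatch says the checkpoint's `final_layer` was trained for 33 keypoints while the current config builds a 20-keypoint head, so the config and checkpoint disagree; the cleaner fix is to make the keypoint setting in the YAML config match the checkpoint. If that is not an option, one hedged workaround is to skip the mismatched tensors and load the rest non-strictly (a sketch, not the repo's loader; `load_matching_weights` is a hypothetical helper):

```python
import torch
from torch import nn

def load_matching_weights(model: nn.Module, ckpt_path: str) -> None:
    """Load only the tensors whose names and shapes match the model.

    Workaround sketch (not the repo's official loader): the mismatched
    final_layer tensors (33 vs. 20 keypoints) are dropped, so that head
    stays randomly initialized and must be retrained or reconfigured.
    """
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    state_dict = checkpoint.get("model", checkpoint)  # unwrap if wrapped
    model_state = model.state_dict()
    filtered = {k: v for k, v in state_dict.items()
                if k in model_state and v.shape == model_state[k].shape}
    model.load_state_dict(filtered, strict=False)
```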