Comet2dh closed this issue 2 years ago
@Comet2dh
Good catch here. This is a corner case I didn't think about. You tried to resume training with the old model, which contains BN stats in the optimizer. In the latest version, the BN stats are removed, hence this exception.
I fixed this in the latest PR #59. Instead of throwing the ValueError, I now raise an exception that reminds people this is unfortunately not possible, because optimizer params are not "named".
If you intend to run inference or finetune, you have probably forgotten some args.
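For anyone who only needs inference or finetuning with an old checkpoint, a minimal sketch of the workaround is below: load just the network weights and build a fresh optimizer instead of restoring its state. The checkpoint keys (`state_dict`, `optimizer`) and the AdamW settings are placeholders for illustration, not necessarily what this repo uses.

```python
import torch

def load_old_checkpoint_for_finetune(model, checkpoint_path, device='cpu'):
    """Restore only the network weights from an old checkpoint.

    Sketch under assumptions: the keys 'state_dict'/'optimizer' and the
    optimizer settings are placeholders; adapt them to the repo's own code.
    """
    checkpoint = torch.load(checkpoint_path, map_location=device)

    # strict=False guards against key mismatches between the old checkpoint
    # and the current model; missing/unexpected keys are returned for inspection.
    result = model.load_state_dict(checkpoint['state_dict'], strict=False)
    if result.missing_keys or result.unexpected_keys:
        print('missing keys:', result.missing_keys)
        print('unexpected keys:', result.unexpected_keys)

    # Deliberately skip optimizer.load_state_dict(checkpoint['optimizer']):
    # the saved optimizer state tracks BN params that the current optimizer
    # no longer has, so its parameter groups don't match.
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    return model, optimizer
```

A full training resume has no clean workaround, as noted above, since optimizer params are not named; for inference or finetuning the repo's own arguments are the intended route and this is only a manual fallback.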
When I tried to use the pretrained model sceneflow_pretrained_model.pth.tar, it reported the error below:
```
Traceback (most recent call last):
  File "/media/data1/dh/CODE/stereo-transformer/main.py", line 258, in <module>
    main(args_)
  File "/media/data1/dh/CODE/stereo-transformer/main.py", line 189, in main
    optimizer.load_state_dict(checkpoint['optimizer'])
  File "/media/data1/dh/anaconda3/lib/python3.7/site-packages/torch/optim/optimizer.py", line 123, in load_state_dict
    raise ValueError("loaded state dict contains a parameter group "
ValueError: loaded state dict contains a parameter group that doesn't match the size of optimizer's group
```