QVPR / Patch-NetVLAD

Code for the CVPR2021 paper "Patch-NetVLAD: Multi-Scale Fusion of Locally-Global Descriptors for Place Recognition"

Move checkpoint loading before DataParallel #22

Closed michaelschleiss closed 2 years ago

michaelschleiss commented 2 years ago

Currently, feature_extract.py fails with the following error when nGPU > 1:

=> loading checkpoint '/home/michael.schleiss/repos/Patch-NetVLAD/patchnetvlad/./pretrained_models/pittsburgh_WPCA128.pth.tar'
Traceback (most recent call last):
  File "feature_extract.py", line 174, in <module>
    main()
  File "feature_extract.py", line 160, in main
    model.load_state_dict(checkpoint['state_dict'])
  File "/home/michael.schleiss/miniconda3/envs/netvlad/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1223, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for Module:
    Missing key(s) in state_dict: "encoder.module.0.weight", "encoder.module.0.bias", "encoder.module.2.weight", "encoder.module.2.bias", "encoder.module.5.weight", "encoder.module.5.bias", "encoder.module.7.weight", "encoder.module.7.bias", "encoder.module.10.weight", "encoder.module.10.bias", "encoder.module.12.weight", "encoder.module.12.bias", "encoder.module.14.weight", "encoder.module.14.bias", "encoder.module.17.weight", "encoder.module.17.bias", "encoder.module.19.weight", "encoder.module.19.bias", "encoder.module.21.weight", "encoder.module.21.bias", "encoder.module.24.weight", "encoder.module.24.bias", "encoder.module.26.weight", "encoder.module.26.bias", "encoder.module.28.weight", "encoder.module.28.bias", "pool.module.centroids", "pool.module.conv.weight".
    Unexpected key(s) in state_dict: "encoder.0.weight", "encoder.0.bias", "encoder.2.weight", "encoder.2.bias", "encoder.5.weight", "encoder.5.bias", "encoder.7.weight", "encoder.7.bias", "encoder.10.weight", "encoder.10.bias", "encoder.12.weight", "encoder.12.bias", "encoder.14.weight", "encoder.14.bias", "encoder.17.weight", "encoder.17.bias", "encoder.19.weight", "encoder.19.bias", "encoder.21.weight", "encoder.21.bias", "encoder.24.weight", "encoder.24.bias", "encoder.26.weight", "encoder.26.bias", "encoder.28.weight", "encoder.28.bias", "pool.centroids", "pool.conv.weight".
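
The mismatch comes from nn.DataParallel inserting a "module." level into the parameter names of whatever it wraps. A minimal, self-contained sketch of that renaming (toy module, not the repository's model code):

import torch.nn as nn

model = nn.Module()
model.encoder = nn.Sequential(nn.Conv2d(3, 8, 3))
print(list(model.state_dict()))   # ['encoder.0.weight', 'encoder.0.bias']

# Wrapping the sub-module renames its parameters, so checkpoint keys saved
# from the bare model no longer match.
model.encoder = nn.DataParallel(model.encoder)
print(list(model.state_dict()))   # ['encoder.module.0.weight', 'encoder.module.0.bias']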

Fixed by moving model.load_state_dict(checkpoint['state_dict']) before wrapping the model in nn.DataParallel.
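
A minimal, self-contained sketch of the proposed ordering (toy model and in-memory checkpoint dict for illustration, not the actual feature_extract.py code):

import torch
import torch.nn as nn

model = nn.Module()
model.encoder = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())

# Checkpoint saved from the bare (un-wrapped) model: keys like "encoder.0.weight".
checkpoint = {'state_dict': model.state_dict()}

# 1) Load first, while the parameter names still match the checkpoint keys.
model.load_state_dict(checkpoint['state_dict'])

# 2) Wrap afterwards; DataParallel renames the parameters to "encoder.module.*",
#    which no longer matters because the weights are already loaded.
if torch.cuda.device_count() > 1:
    model.encoder = nn.DataParallel(model.encoder)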

Tobias-Fischer commented 2 years ago

Many thanks for the catch @michaelschleiss! Looks good to me - @oravus @StephenHausler let's merge?

StephenHausler commented 2 years ago

Yep looks good, feel free to merge