QVPR / Patch-NetVLAD

Code for the CVPR2021 paper "Patch-NetVLAD: Multi-Scale Fusion of Locally-Global Descriptors for Place Recognition"
MIT License

Remove the WPCA layer before training #76

Closed longxiangjingyun closed 1 year ago

longxiangjingyun commented 1 year ago

Hello, professors! Could you explain in detail how to remove the WPCA layer before training? (This refers to README.md --> Quick start --> Training.)

Tobias-Fischer commented 1 year ago

Can you try:

if 'WPCA' in model._modules:
    del model._modules['WPCA']
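
For reference, a minimal, self-contained toy sketch of that deletion (the module names and sizes below are placeholders, not the real Patch-NetVLAD architecture; in the actual code, model would be the network built by train.py with the WPCA layer attached):

import torch.nn as nn

# Stand-in model: a dummy backbone plus a dummy 'WPCA' head, just to show the deletion.
model = nn.Module()
model.add_module('backbone', nn.Conv2d(3, 64, kernel_size=3))  # placeholder encoder
model.add_module('WPCA', nn.Linear(32768, 128))                # placeholder whitening/PCA head

# Remove the WPCA layer before training starts.
if 'WPCA' in model._modules:
    del model._modules['WPCA']

print('WPCA' in model._modules)  # False: the layer is gone

Note that if you load a pretrained state_dict that still contains WPCA weights, load it before deleting the layer (or load with strict=False), otherwise load_state_dict will complain about unexpected keys.
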
Tobias-Fischer commented 1 year ago

Closing for now - feel free to reopen if the above doesn't work.

longxiangjingyun commented 1 year ago

First, I'm glad to get your answer. I have already solved that problem, but I ran into a new one, which I will describe. In train.py, several keys (line 130 -- 'epoch', line 197 -- 'optimizer', lines 223/224 -- 'not_improved', 'best_score') are not present in the mapillary_WPCA128.pth pretrained model; these pretrained checkpoints only include 'num_pcs' and 'state_dict'. So I can't resume training from the downloaded checkpoints. If possible, could you share pretrained models that are compatible with train.py? I would like to express my sincere gratitude here.

Tobias-Fischer commented 1 year ago

Simply comment out these lines.
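
If it helps, a hedged sketch of an alternative to commenting the lines out: wrap the checkpoint loading so the missing keys fall back to defaults. load_pretrained is a hypothetical helper (not part of the repo), and the key names ('num_pcs', 'state_dict', 'epoch', 'optimizer', 'not_improved', 'best_score') are the ones mentioned above:

import torch

def load_pretrained(path, model, optimizer=None):
    # The downloaded checkpoints only contain 'num_pcs' and 'state_dict', so
    # read the training-state keys with defaults instead of indexing directly.
    checkpoint = torch.load(path, map_location='cpu')
    model.load_state_dict(checkpoint['state_dict'])

    start_epoch = checkpoint.get('epoch', 0)
    best_score = checkpoint.get('best_score', 0.0)
    not_improved = checkpoint.get('not_improved', 0)
    if optimizer is not None and 'optimizer' in checkpoint:
        optimizer.load_state_dict(checkpoint['optimizer'])
    return start_epoch, best_score, not_improved

Using something like this in place of the direct checkpoint['epoch'] / checkpoint['optimizer'] reads in train.py should let training start from the downloaded weights without the missing-key errors.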