ignacio-rocco / weakalign

End-to-end weakly-supervised semantic alignment
MIT License

demo.py #6

Open hlee-git opened 6 years ago

hlee-git commented 6 years ago

    Traceback (most recent call last):
      File "/home/weakalign-master/demo.py", line 77, in <module>
        model.FeatureExtraction.state_dict()[name].copy_(checkpoint['state_dict']['FeatureExtraction.' + name])
    KeyError: 'FeatureExtraction.model.1.num_batches_tracked'

I tried your code, but I got the KeyError above: the OrderedDict checkpoint['state_dict'] has no key 'FeatureExtraction.model.1.num_batches_tracked'. I'd appreciate it if you could look into this error.
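For anyone debugging this, here is a minimal sketch for inspecting which keys the checkpoint actually contains (the checkpoint path is a placeholder; use whichever trained model file you downloaded):

    import torch

    # Load the checkpoint on CPU; the filename here is hypothetical.
    checkpoint = torch.load('trained_models/weakalign_model.pth.tar', map_location='cpu')

    # List the FeatureExtraction keys stored in the checkpoint.
    stored = [k for k in checkpoint['state_dict'] if k.startswith('FeatureExtraction.')]
    print(stored[:5])

    # For checkpoints saved with PyTorch 0.2/0.3 this prints an empty list,
    # because the num_batches_tracked buffers did not exist yet.
    print([k for k in stored if 'num_batches_tracked' in k])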

bluedream1121 commented 6 years ago

I also have this issue after updating PyTorch from 0.3 to 0.4.

Has anybody solved this problem?

siavashk commented 6 years ago

I ran into the same issue. The underlying reason is that PyTorch changed BatchNorm between versions 0.2 and 0.4: newer versions register an extra num_batches_tracked buffer in each BatchNorm layer, so the model's state_dict contains keys that the old checkpoint does not (see https://github.com/pytorch/pytorch/issues/8481). This pull request is supposed to fix it, but it did not fix it for me. I managed to get the demo running by downgrading to 0.2:

    conda install pytorch=0.2 torchvision -c pytorch
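A quick sketch illustrating the mismatch siavashk describes (to my understanding, the buffer was introduced around PyTorch 0.4.1):

    import torch.nn as nn

    # Inspect the state_dict keys of a single BatchNorm layer.
    bn = nn.BatchNorm2d(64)
    print(list(bn.state_dict().keys()))
    # Newer PyTorch (>= 0.4.1): ['weight', 'bias', 'running_mean', 'running_var', 'num_batches_tracked']
    # PyTorch 0.2/0.3:          ['weight', 'bias', 'running_mean', 'running_var']
    # Copying key-by-key from an old checkpoint therefore raises a KeyError
    # on the num_batches_tracked entries.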

AziziShekoofeh commented 5 years ago

I also have the same issue, and yes, the underlying reason is pytorch/pytorch#8481. However, downgrading is not a practical solution in the long run. Does anybody plan to address this?

chenyuZha commented 3 years ago

I found a solution: instead of downgrading PyTorch to 0.2, you can filter the keys, ignoring num_batches_tracked:

    # Skip the num_batches_tracked buffers, which exist in the freshly
    # constructed model (newer PyTorch) but not in the old checkpoint.
    feature_extraction_dict = {k: v for k, v in self.model_aff.FeatureExtraction.state_dict().items()
                               if 'num_batches_tracked' not in k}
    for name in feature_extraction_dict:
        self.model_aff.FeatureExtraction.state_dict()[name].copy_(
            checkpoint['state_dict']['FeatureExtraction.' + name])

    feature_reg_dict = {k: v for k, v in self.model_aff.FeatureRegression.state_dict().items()
                        if 'num_batches_tracked' not in k}
    for name in feature_reg_dict:
        self.model_aff.FeatureRegression.state_dict()[name].copy_(
            checkpoint['state_dict']['FeatureRegression.' + name])

    feature_reg2_dict = {k: v for k, v in self.model_aff.FeatureRegression2.state_dict().items()
                         if 'num_batches_tracked' not in k}
    for name in feature_reg2_dict:
        self.model_aff.FeatureRegression2.state_dict()[name].copy_(
            checkpoint['state_dict']['FeatureRegression2.' + name])
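An alternative sketch that avoids the per-key copy loop entirely: Module.load_state_dict accepts strict=False, which skips keys missing on either side. This assumes the same 'FeatureExtraction.'-prefixed checkpoint layout as in demo.py and reuses the checkpoint variable from above:

    # Gather the checkpoint entries belonging to one sub-module and strip the prefix.
    prefix = 'FeatureExtraction.'
    sub_state = {k[len(prefix):]: v
                 for k, v in checkpoint['state_dict'].items()
                 if k.startswith(prefix)}

    # strict=False tolerates the num_batches_tracked buffers that exist in the
    # model but not in the old checkpoint; they keep their default values.
    self.model_aff.FeatureExtraction.load_state_dict(sub_state, strict=False)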