AlirezaMorsali closed this issue 1 year ago.
Oh, CPU inference may not be supported. The checkpoint was trained and saved with nn.DataParallel, and when loading, if no GPU is specified, the code will not wrap the model in nn.DataParallel.
A quick fix is to run inference on a GPU. An alternative is to remove the module. prefix from the keys in the state_dict, like state_dict = {k.replace('module.', ''): v for k, v in state_dict.items()}.
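To illustrate the workaround, here is a minimal, self-contained sketch of the key-renaming trick. The tiny nn.Linear model and the simulated checkpoint are placeholders standing in for the repo's real network and saved weights, not code from infer.py:

```python
import torch
import torch.nn as nn

# Stand-in for the real network (the actual model lives in the repo).
model = nn.Linear(4, 2)

# Simulate a checkpoint saved under nn.DataParallel: every parameter key
# gets a 'module.' prefix when the wrapped model's state_dict is saved.
saved = {'module.' + k: v for k, v in model.state_dict().items()}

# Strip the leading 'module.' prefix so the keys match a plain,
# non-parallel model; this is what lets the checkpoint load on CPU.
state_dict = {k.replace('module.', '', 1): v for k, v in saved.items()}
model.load_state_dict(state_dict)
```

Passing the count argument 1 to replace only strips the leading prefix, which avoids mangling any parameter name that happens to contain 'module.' elsewhere.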
Thank you for the response. I will try it with a GPU. In the meantime, your suggestion fixed the issue on CPU, so I will be closing this issue.
Just as a suggestion, it would be very helpful to include your requirements.txt with pinned library versions if possible, as some libraries have had breaking changes since this code was written.
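For illustration, a pinned requirements.txt could look like the following; the package names and version numbers here are hypothetical placeholders, not the repo's actual dependencies:

```
torch==1.13.1
torchvision==0.14.1
numpy==1.24.4
```

Pinning exact versions (== rather than >=) is what protects users from the breaking changes mentioned above.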
Thank you for open-sourcing your work. I'm trying to familiarize myself with the repo and tried inference with the provided pretrained model, but I get an error when loading the model. I would greatly appreciate your help with this issue.
Here are the arguments I pass to infer.py:

And here is the error: