Open vdorbala opened 5 months ago
It might be related to the transformers version. I solved the problem by adding strict=False
to load_state_dict,
as proposed in the following link, which basically ignores non-matching keys. I hope it works for you.
How can this be done in Colab?
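A minimal sketch of the strict=False workaround on a toy model, to show what the flag does. The same call should apply to MDETR: build the model with pretrained=False, load the checkpoint's state dict yourself, and pass strict=False (the toy module and key names below are illustrative, not from the MDETR repo):

```python
import torch
import torch.nn as nn

# Simulate the mismatch: the checkpoint carries an extra key (a stale
# "position_ids" buffer registered by an older transformers version)
# that the current model class no longer has.
model = nn.Linear(4, 2)
state = model.state_dict()
state['position_ids'] = torch.arange(4)  # extra key not present in the model

try:
    model.load_state_dict(state)  # strict=True by default -> raises
except RuntimeError:
    print('strict load failed on the unexpected key')

# strict=False skips non-matching keys and reports them instead of raising.
result = model.load_state_dict(state, strict=False)
print(result.unexpected_keys)
```

load_state_dict with strict=False returns a named tuple whose missing_keys and unexpected_keys fields list anything it skipped, so you can check that only the harmless position_ids entry was ignored.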
When running the demo script, I get:
Unexpected key(s) in state_dict: "transformer.text_encoder.embeddings.position_ids".
while trying to load the model using:
model, postprocessor = torch.hub.load('ashkamath/mdetr:main', 'mdetr_efficientnetB5', pretrained=True, return_postprocessor=True)
Does the model not work anymore?