younesr1 opened this issue 2 years ago
Hi @younesr1, did you ever manage to solve this problem? I'm facing the same issue.
@jaween Unfortunately, I did not look deeper into this and did not solve it.
@younesr1 Thanks for letting me know.
Of the two PyTorch model-saving methods, saving just the state_dict requires the model's code in order to restore it, and running print(midas) shows that the model is complicated enough that recreating it in your own code isn't practical. The other method, which might have worked for you, is saving the model and the weights together.
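For reference, a rough sketch of both approaches (the intel-isl/MiDaS repo and MiDaS_small entry point are the usual Torch Hub names for this model and are only examples here):

```python
import torch

# Approach 1: save only the weights (state_dict).
# Restoring them requires code that rebuilds the same architecture first.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
torch.save(midas.state_dict(), "midas_weights.pt")

model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")  # rebuild the architecture
model.load_state_dict(torch.load("midas_weights.pt"))     # then load the weights
model.eval()

# Approach 2: save the whole model object (architecture + weights).
# Loading unpickles the original classes, so their code must still be importable,
# which for a hub model usually means the cached hub repo is still present.
torch.save(midas, "midas_full.pt")
model = torch.load("midas_full.pt")
model.eval()
```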
But in my case I am using Docker, so I ended up loading the model off Torch Hub with torch.hub.load() during creation of the Docker image. That way, every time the Docker container runs, torch.hub.load() uses the locally cached model from the image.
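For what it's worth, the image-build step can be as small as a one-off script run from the Dockerfile (a sketch; the model name and the default cache location are assumptions about a typical setup):

```python
# warm_cache.py - run once during `docker build` (e.g. RUN python warm_cache.py)
# so the MiDaS repo and weights end up in the image's Torch Hub cache
# (~/.cache/torch/hub by default; changeable with torch.hub.set_dir()).
import torch

torch.hub.load("intel-isl/MiDaS", "MiDaS_small")   # downloads and caches the model
torch.hub.load("intel-isl/MiDaS", "transforms")    # the matching input transforms

# At container runtime the same torch.hub.load() calls resolve against the
# cached copies instead of downloading again.
```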
The example code here works fine. However, I would like to avoid loading the model from torch.hub and instead save the model and weights locally. I have been able to save the weights locally with torch.save(midas.state_dict(), PATH), but I am unsure what to load those weights back into with torch.load.
How can I save and use the model locally and avoid using torch.hub altogether?
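A minimal sketch of one way to do this, assuming a local copy of the MiDaS repo (either a git clone or the directory torch.hub cached under ~/.cache/torch/hub); the path and model name below are placeholders, and the pretrained flag follows the MiDaS hubconf's usual signature:

```python
import torch

# One-time, with network access: cache the repo plus weights and save the weights.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
torch.save(midas.state_dict(), "midas_small_weights.pt")

# Afterwards, offline: build the architecture from the local copy of the repo
# and load the saved weights into it, with no call out to GitHub.
midas = torch.hub.load("/path/to/local/MiDaS", "MiDaS_small",
                       source="local", pretrained=False)
midas.load_state_dict(torch.load("midas_small_weights.pt"))
midas.eval()
```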