isl-org / MiDaS

Code for robust monocular depth estimation described in "Ranftl et al., Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-shot Cross-dataset Transfer, TPAMI 2022"
MIT License

How to Save Model + Weights Locally #170

Open younesr1 opened 2 years ago

younesr1 commented 2 years ago

The example code here works fine. However, I would like to avoid loading the model from torch.hub, and just save the model and weights locally.

I have been able to save the weights locally with torch.save(midas.state_dict(), PATH). However, I am unsure what I should load those saved weights back into.

How can I save and use the model locally and avoid using torch.hub altogether?

jaween commented 1 year ago

Hi @younesr1, did you ever manage to solve this problem? I'm facing the same issue.

younesr1 commented 1 year ago

@jaween Unfortunately, I did not look deeper into this and never solved it.

jaween commented 1 year ago

@younesr1 Thanks for letting me know.

Of the two PyTorch model-saving methods, saving just the state_dict means you still need the model's code to restore it, and running print(midas) shows that the architecture is complicated enough that recreating it by hand in your own code isn't really practical.
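Roughly, that pattern would look something like this sketch (file names are illustrative; the architecture here is instantiated via torch.hub, which reuses its local cache after the first download, precisely because rewriting the MiDaS classes by hand is impractical):

```python
import torch

# Load the model once (as in the README example) and save only its weights.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
torch.save(midas.state_dict(), "midas_weights.pt")

# To restore later, an instance of the same architecture is needed first;
# only then can the saved state_dict be copied into it.
model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
model.load_state_dict(torch.load("midas_weights.pt", map_location="cpu"))
model.eval()
```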

The other method, which might have worked for you, is saving the model and the weights together.
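That would be something along these lines (a sketch; note that unpickling the saved object still requires the MiDaS class definitions to be importable, e.g. from the local torch.hub cache, so it does not remove the code dependency entirely):

```python
import torch

# Pickle the whole model object, weights included (file name is illustrative).
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
torch.save(midas, "midas_full.pt")

# Later: load it back without rebuilding the architecture by hand.
# The MiDaS class definitions must still be importable for unpickling;
# on recent PyTorch versions you may also need weights_only=False.
midas = torch.load("midas_full.pt", map_location="cpu")
midas.eval()
```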

But in my case I am using Docker, so I ended up loading the model off Torch Hub with torch.hub.load() while the Docker image is being built. That way, every time the Docker container is run, torch.hub.load() picks up the model already cached inside the image rather than downloading it again.
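For anyone copying this approach, the build-time step can be as simple as the sketch below (run once from the Dockerfile, e.g. via a RUN python download_model.py instruction; the script name and model variant are just examples):

```python
# download_model.py -- executed once while the Docker image is built.
import torch

# Downloads the MiDaS repo code and the pretrained weights into the torch
# hub cache (~/.cache/torch/hub by default), which then ships inside the
# resulting image layer.
torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
```

At container runtime, the same torch.hub.load() call should then find the cached copy and skip the download.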