Stable-X / StableNormal

[SIGGRAPH Asia 2024 (Journal Track)] StableNormal: Reducing Diffusion Variance for Stable and Sharp Normal
Apache License 2.0

How to download weights? #14

Closed CanCanZeng closed 2 months ago

CanCanZeng commented 2 months ago

Hello, I have successfully run the algorithm from the command line, but every run accesses Hugging Face, which makes startup slow. How do I follow the instructions in the README to download the weight files locally?

hugoycj commented 2 months ago

To download the necessary model weights, use the following commands:

huggingface-cli download --repo-type model --local-dir ./weights/stable-normal-v0-1 Stable-X/stable-normal-v0-1
huggingface-cli download --repo-type model --local-dir ./weights/yoso-normal-v0-3 Stable-X/yoso-normal-v0-3

Then you can load the weights from the local directory:

predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True, local_cache_dir='./weights')
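
If startup still reaches out to the network even with cached weights, one option is to force huggingface_hub and transformers into offline mode before loading. HF_HUB_OFFLINE and TRANSFORMERS_OFFLINE are documented environment variables of those libraries; this is a sketch, not part of the StableNormal codebase:

```python
import os

# Tell huggingface_hub (and transformers) to resolve everything from the
# local cache and skip network calls entirely. Set these before importing
# torch / calling torch.hub.load.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"
```

With these set, the torch.hub.load call above should fail fast with a clear error if a file is genuinely missing from the cache, instead of silently retrying the network.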

CanCanZeng commented 2 months ago

Hello, I followed the commands above to download the weights locally, but each time I run it, it still needs to access Hugging Face; without a proxy it fails to run. Is there a way to run it completely offline? Additionally, I would like to obtain the raw network output so I can save all possible results in a custom format. What should I do?

hugoycj commented 2 months ago

We will check torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True, local_cache_dir='./weights') with the right local_cache_dir; it runs offline on our side. By the way, you can run the following command to check whether local mode works:

python hubconf.py --input ./files/image/021-engine.jpg --output ./out.jpg

hugoycj commented 2 months ago

As for the "all possible results", do you mean the intermediate x0 output from multi-step diffusion?

CanCanZeng commented 2 months ago

I made a mistake: it does in fact run normally even with the internet disconnected. But strangely, sometimes it starts quickly, sometimes it freezes for a while during startup, and sometimes it gives an error. (I cannot reproduce the error now.)

By "all possible outputs" I meant two things: float-type outputs, and something like a confidence map. I can get float-type outputs using hubconf.py now, but I don't see anything like a confidence map.

hugoycj commented 2 months ago

Unlike Marigold, we don't have a built-in confidence map in our pipeline, since we don't use an ensemble approach (multiple forward passes whose mean and variance serve as confidence). A similar confidence measure can be achieved by storing all the intermediate predicted x0 values from the denoising process.
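
To illustrate the idea, here is a minimal sketch of turning a set of intermediate x0 normal predictions into a per-pixel confidence map. The function name and the way the predictions are collected are assumptions; StableNormal does not expose this out of the box:

```python
import numpy as np

def confidence_from_x0(x0_preds):
    """Given a list of intermediate x0 normal maps (each H x W x 3),
    return an H x W confidence map: the mean cosine agreement of each
    prediction with the mean normal direction (1.0 = perfectly stable)."""
    # Normalize each prediction to unit vectors, then stack: (T, H, W, 3)
    stack = np.stack(
        [p / (np.linalg.norm(p, axis=-1, keepdims=True) + 1e-8) for p in x0_preds],
        axis=0,
    )
    # Mean direction across denoising steps, re-normalized: (H, W, 3)
    mean = stack.mean(axis=0)
    mean /= np.linalg.norm(mean, axis=-1, keepdims=True) + 1e-8
    # Cosine similarity of each step's prediction with the mean: (T, H, W)
    cos = (stack * mean).sum(axis=-1)
    return cos.mean(axis=0)
```

Pixels where the denoising trajectory keeps changing direction score low, which mimics the ensemble-variance confidence that Marigold computes across forward passes.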

A more practical solution is to train an uncertainty estimation network like this, which can generate a more reliable confidence map. A good confidence map can also be computed by inverse warping, as NeuRIS does.

hugoycj commented 2 months ago

@CanCanZeng Thank you for your patience. I'm closing this issue as the download weight problem appears to be resolved. We're currently setting up a mirror in China for users who can't access Hugging Face. If you have any further questions about confidence levels or StableNormal output, please don't hesitate to open a new issue. We appreciate your feedback and involvement in improving our project.

CanCanZeng commented 1 month ago

A network error occurred again at startup. The cached weights are all present, so I don't understand why this happens. @hugoycj [error screenshot]

hugoycj commented 1 month ago

Seems like another network issue, caused by dinov2 loading. Do you have any suggestions, @lingtengqiu?

WXizhi commented 2 weeks ago

> A network error occurred again at startup. The cached weights are all present, so I don't understand why this happens. @hugoycj [error screenshot]

There is a solution mentioned in https://github.com/facebookresearch/dinov2/issues/91