ddPn08 / Radiata

Stable diffusion webui based on diffusers.
https://ddpn08.github.io/Radiata/
Apache License 2.0

Model loading from other sources. #34

Open hukk06 opened 1 year ago

hukk06 commented 1 year ago

Super satisfied with the speed, thanks for such an amazing tool.

For TensorRT engine building: is it possible to add a feature to load the model from a local HDD instead of Hugging Face? Or, if it's already possible, how? Also, if such a feature is planned, could .safetensors support be added in the future?

Thank you for your consideration.

ddPn08 commented 1 year ago

You can load your local diffusers model. For Docker, the folder must be mounted in advance.
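For reference, a "local diffusers model" here means a diffusers-format directory (containing `model_index.json` plus subfolders such as `unet/`, `vae/`, `text_encoder/`, `tokenizer/`). A minimal sketch of what loading such a folder looks like with the diffusers library itself; the path below is only an example, and for Docker that folder would have to be mounted into the container first (e.g. with Docker's `-v host_path:container_path` option):

```python
from diffusers import StableDiffusionPipeline

# Example path to a diffusers-format model directory on local disk.
# Replace with your own folder; inside Docker, use the mounted path.
local_model_dir = "/models/my-model"

# from_pretrained accepts a local directory as well as a Hugging Face repo id.
pipe = StableDiffusionPipeline.from_pretrained(local_model_dir)
```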

knot2006 commented 1 year ago

Any info on how to actually do it? [screenshot] It gets me this error: [screenshot]

I am using the local version (not Docker).

Stax124 commented 1 year ago

@knot2006 it needs to be converted into diffusers format first. As far as my understanding goes, this is not possible yet.
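For anyone hitting the same error: a single-file checkpoint (.ckpt or .safetensors) can be converted to the diffusers directory layout with the diffusers library itself. A minimal sketch, assuming a recent diffusers version that provides `from_single_file` (older versions ship an equivalent `convert_original_stable_diffusion_to_diffusers.py` script); the paths are placeholders:

```python
from diffusers import StableDiffusionPipeline

# Load a single-file checkpoint (.safetensors or .ckpt) from local disk.
pipe = StableDiffusionPipeline.from_single_file("/path/to/model.safetensors")

# Save it in diffusers format: a folder with model_index.json and
# unet/, vae/, text_encoder/, tokenizer/ subfolders, which can then
# be loaded as a local diffusers model.
pipe.save_pretrained("/path/to/my-model-diffusers")
```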

fantasyz commented 1 year ago

I really hope this can be done. I want to use my custom-trained models directly from my SSD without uploading them to Hugging Face.