Open hukk06 opened 1 year ago
You can load your local diffusers model. For Docker, the folder must be mounted in advance.
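For anyone stuck on the same step, here is a minimal sketch of what "load your local diffusers model" usually means with the `diffusers` library: `from_pretrained` accepts a local folder path as well as a Hub repo id, provided the folder is in diffusers format (it contains a `model_index.json`). The helper names and paths below are my own, not from this repo:

```python
import os

def is_local_diffusers_model(path: str) -> bool:
    # A diffusers-format model is a folder containing a model_index.json
    # (plus unet/, vae/, text_encoder/ subfolders).
    return os.path.isdir(path) and os.path.isfile(
        os.path.join(path, "model_index.json")
    )

def load_pipeline(model: str):
    # diffusers is imported lazily so the path check above
    # works even where diffusers is not installed.
    from diffusers import StableDiffusionPipeline

    # from_pretrained takes either a Hub id ("runwayml/stable-diffusion-v1-5")
    # or a local folder path ("/mnt/ssd/models/my-custom-model").
    return StableDiffusionPipeline.from_pretrained(model)
```

Usage would be something like `pipe = load_pipeline("/mnt/ssd/models/my-custom-model")`; for Docker, that folder has to be mounted into the container first, as noted above.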
Any info on how to actually do it? It gives me this error:
I am using the local version (not Docker).
@knot2006 it needs to be converted to diffusers format first. As far as I understand, this is not possible yet.
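If the model is a single `.ckpt` or `.safetensors` checkpoint, recent versions of `diffusers` can do the conversion themselves via `StableDiffusionPipeline.from_single_file` plus `save_pretrained`. A hedged sketch (function names and paths are mine, and the exact API may differ by diffusers version):

```python
SINGLE_FILE_EXTS = (".ckpt", ".safetensors")

def is_single_file_checkpoint(path: str) -> bool:
    # Monolithic checkpoints need conversion before they can be
    # loaded as a diffusers-format folder.
    return path.lower().endswith(SINGLE_FILE_EXTS)

def convert_to_diffusers(ckpt_path: str, out_dir: str) -> str:
    # Lazy import: the extension check stays usable without diffusers.
    from diffusers import StableDiffusionPipeline

    # from_single_file reads one .ckpt/.safetensors checkpoint;
    # save_pretrained writes the unet/, vae/, text_encoder/ folder layout
    # that from_pretrained (and the loader above) expects.
    pipe = StableDiffusionPipeline.from_single_file(ckpt_path)
    pipe.save_pretrained(out_dir)
    return out_dir
```

After converting once, the resulting folder can be pointed at directly, with no upload to Hugging Face needed.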
I really hope this can be done. I want to use my custom-trained models directly from my SSD without uploading them to Hugging Face.
Super satisfied with the speed, thanks for such an amazing tool.
For TensorRT engine building: is it possible to get a feature to load the model from the HDD instead of from Hugging Face? Or, if it's already possible, how? Also, if this feature is being planned, could .safetensors support be added in the future?
Thank you for your consideration.