Do you mind sharing the extra model entry that caused this error? Was the source a file or a link of some kind?
I recently refactored some of this code, but I didn't change the logic around loading the pipeline. However, I do see one codepath with a missing else.
That should only happen if the path exists but is not a directory or file. If that code is not handling symlinks/hardlinks correctly, I should be able to fix that.
Edit: I adjusted the logic to load everything that is not a directory as a tensor/checkpoint file. That might be an issue if you have a link from one folder to another, so I'll have to set up some tests for that.
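Roughly, the adjusted dispatch amounts to the sketch below. This is not the actual onnx-web loader; load_diffusers_dir and load_checkpoint_file are stub names used here only for illustration.

```python
from os import path


def load_diffusers_dir(source: str):
    # stub for illustration; the real loader reads model_index.json, unet/, vae/, etc.
    return ("diffusers-dir", source)


def load_checkpoint_file(source: str):
    # stub for illustration; the real loader parses the tensor/checkpoint file
    return ("checkpoint", source)


def load_model_source(source: str):
    if not path.exists(source):
        raise FileNotFoundError(f"model source not found: {source}")

    if path.isdir(source):
        # diffusers-style folder; isdir() follows symlinks, so a link to a
        # folder also takes this branch
        return load_diffusers_dir(source)

    # everything else that exists (regular files, symlinks, hardlinks) is
    # treated as a single tensor/checkpoint file; the earlier version
    # checked isfile() here and could silently skip paths that failed it
    return load_checkpoint_file(source)
```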
@ssube
Well, after the latest git pull and installing huggingface-hub==0.19.4, I can't download from civitai, even if I have the files in .cache.
Relative paths also don't work on Windows:
"source": "../models/.cache/diffusion-*.safetensors"
Is diffusion-*.safetensors a specific filename, or is it meant to be a wildcard? If the latter, the source field doesn't have any support for wildcards. I'm not sure if Python translates slashes, so on Windows that might also need to use backslashes, like "..\\models\\.cache\\diffusion-model.safetensors".
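For reference on the slashes: Python's os.path functions accept forward slashes on Windows, but nothing expands the * unless it goes through glob. A quick check from a Python shell, reusing the path from the example above:

```python
import glob
import os

pattern = "../models/.cache/diffusion-*.safetensors"

# exists() happily takes forward slashes on Windows, but it does not expand
# wildcards, so this is False unless a file is literally named
# "diffusion-*.safetensors"
print(os.path.exists(pattern))

# glob() is what actually expands the * into matching filenames
print(glob.glob(pattern))
```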
I tested downloading from civitai and converting, and that seems to work correctly.
Are you getting an error message in the logs, does the file not download, or what is happening? If you are getting an error, please include that here.
@ssube
[2023-12-16 20:25:58,291] INFO: MainProcess MainThread onnx_web.convert.diffusion.diffusion: converting Stable Diffusion model diffusion-domesticated-v1-5: civitai://188101?type=Model&format=SafeTensor&size=pruned&fp=fp16 -> ..\models\diffusion-domesticated-v1-5/
[2023-12-16 20:25:58,523] ERROR: MainProcess MainThread __main__: error converting diffusion model diffusion-domesticated-v1-5
Traceback (most recent call last):
  File "D:\@home\fifi\Documents\onnx-web\api\onnx_web\convert\__main__.py", line 443, in convert_models
    convert_model_diffusion(conversion, model)
  File "D:\@home\fifi\Documents\onnx-web\api\onnx_web\convert\__main__.py", line 263, in convert_model_diffusion
    converted, dest = converter(
  File "D:\@home\fifi\Documents\onnx-web\api\onnx_env\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "D:\@home\fifi\Documents\onnx-web\api\onnx_web\convert\diffusion\diffusion.py", line 372, in convert_diffusion_diffusers
    pipeline = download_from_original_stable_diffusion_ckpt(
  File "D:\@home\fifi\Documents\onnx-web\api\onnx_env\lib\site-packages\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py", line 1252, in download_from_original_stable_diffusion_ckpt
    checkpoint = safe_load(checkpoint_path_or_dict, device="cpu")
  File "D:\@home\fifi\Documents\onnx-web\api\onnx_env\lib\site-packages\safetensors\torch.py", line 308, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
FileNotFoundError: No such file or directory: "civitai://188101?type=Model&format=SafeTensor&size=pruned&fp=fp16"
Excellent, thank you. I was able to reproduce the error using that stack trace and the right combination of params (it only happens when the options are set to extract torch models).
I pushed a change that fixes it on my machine, please try it again when you have a chance.
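The exact change isn't quoted here, but the failure mode in the traceback is that the raw civitai:// URI reached safetensors' safe_open() instead of a downloaded local file. A rough sketch of that kind of guard, with fetch_model as a hypothetical stand-in for the real download logic:

```python
from os import path


def fetch_model(source: str, cache_dir: str) -> str:
    # hypothetical stand-in for the real download logic; it would fetch the
    # remote file into cache_dir and return the local path
    raise NotImplementedError


def resolve_checkpoint(source: str, cache_dir: str) -> str:
    if source.startswith(("civitai://", "https://", "http://")):
        # remote sources must be downloaded first; passing the raw URI
        # straight into safetensors' safe_open() is what raised the
        # FileNotFoundError in the traceback above
        return fetch_model(source, cache_dir)

    # local paths can be used as-is
    return path.abspath(source)
```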
@ssube tested, problem resolved
Rolling back to e91e084 fixed it for me.