YikaPanic opened this issue 4 months ago
I'm hitting the same problem. I followed https://github.com/bmaltais/kohya_ss/issues/548, but no luck.
These files get created under `C:/Users/<user>/.cache/huggingface/hub/`, but the model directory is always empty:
```
lk@DESKTOP-DRABRR1 MINGW64 ~/.cache/huggingface/hub
$ ll
total 2
drwxr-xr-x 1 lk 197121 0 Jul 30 21:21 models--openai--clip-vit-large-patch14/
-rw-r--r-- 1 lk 197121 1 Jul 30 21:18 version.txt
-rw-r--r-- 1 lk 197121 1 Jul 30 21:21 version_diffusers_cache.txt

lk@DESKTOP-DRABRR1 MINGW64 ~/.cache/huggingface/hub
$ ll models--openai--clip-vit-large-patch14/
total 0
```
It is strange that the process doesn't even create the Hugging Face cache directory structure described in https://huggingface.co/docs/huggingface_hub/main/guides/manage-cache. I honestly don't think this is purely a network issue.
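For anyone debugging the same thing: per the manage-cache guide linked above, each repo folder in the hub cache (named `models--<org>--<name>`) should contain `refs/`, `snapshots/`, and `blobs/` subdirectories. A quick sketch to report which parts are missing on your machine (`inspect_hub_cache` is a hypothetical helper I wrote for this, not part of `huggingface_hub`):

```python
import os

def inspect_hub_cache(cache_root, repo_id, repo_type="model"):
    """Check whether a repo's cache folder has the layout documented in
    the huggingface_hub manage-cache guide (refs/, snapshots/, blobs/).

    Folder naming follows the documented scheme:
    '<type>s--<org>--<name>', i.e. '/' in the repo id becomes '--'.
    """
    folder = f"{repo_type}s--" + repo_id.replace("/", "--")
    repo_dir = os.path.join(cache_root, folder)
    if not os.path.isdir(repo_dir):
        # Repo folder itself was never created.
        return {"exists": False, "missing": ["refs", "snapshots", "blobs"]}
    missing = [d for d in ("refs", "snapshots", "blobs")
               if not os.path.isdir(os.path.join(repo_dir, d))]
    return {"exists": True, "missing": missing}

# Example (path assumed; adjust to your cache location):
# inspect_hub_cache(os.path.expanduser("~/.cache/huggingface/hub"),
#                   "openai/clip-vit-large-patch14")
```

In my case the folder exists but all three subdirectories are missing, which suggests the download is aborted before any snapshot is written.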
Any help is appreciated!
```
Can't load tokenizer for 'openai/clip-vit-large-patch14'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'openai/clip-vit-large-patch14' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.
```