d8ahazard / sd_dreambooth_extension


[Bug]: OSError: Can't load tokenizer for 'laion/CLIP-ViT-bigG-14-laion2B-39B-b160k'. #1461

Closed yincangshiwei closed 3 months ago

yincangshiwei commented 4 months ago

Is there an existing issue for this?

What happened?

An error occurred while creating the model

Steps to reproduce the problem

[screenshot of the error]

Commit and libraries

No

Command Line Arguments

No

Console logs

Wrote /root/训练/huge_in/db.json: 289 interrogations, 0 tags.
Extracting config from /root/stable-diffusion-webui/extensions/sd_dreambooth_extension/dreambooth/../configs/sdxl-training-unfrozen.yaml
Extracting checkpoint from /root/stable-diffusion-webui/models/Stable-diffusion/sdxl/通用/juggernautXL_v9Rundiffusionphoto2.safetensors
Something went wrong, removing model directory
Traceback (most recent call last):
  File "/root/miniconda3/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 1691, in download_from_original_stable_diffusion_ckpt
    tokenizer_2 = CLIPTokenizer.from_pretrained(
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1788, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'laion/CLIP-ViT-bigG-14-laion2B-39B-b160k'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'laion/CLIP-ViT-bigG-14-laion2B-39B-b160k' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/stable-diffusion-webui/extensions/sd_dreambooth_extension/dreambooth/sd_to_diff.py", line 167, in extract_checkpoint
    pipe = StableDiffusionXLPipeline.from_single_file(
  File "/root/miniconda3/lib/python3.10/site-packages/diffusers/loaders.py", line 2822, in from_single_file
    pipe = download_from_original_stable_diffusion_ckpt(
  File "/root/miniconda3/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 1695, in download_from_original_stable_diffusion_ckpt
    raise ValueError(
ValueError: With local_files_only set to False, you must first locally save the tokenizer in the following path: 'laion/CLIP-ViT-bigG-14-laion2B-39B-b160k' with `pad_token` set to '!'.
Couldn't find /root/stable-diffusion-webui/models/dreambooth/huge/working/unet
Unable to extract checkpoint!

Additional information

No response

github-actions[bot] commented 4 months ago

This issue is stale because it has been open for 14 days with no activity. Remove the stale label or comment, or this will be closed in 30 days.

zielony12 commented 2 months ago

So how do I fix this? I can find this tokenizer on Hugging Face, but I'm not sure how to use it. There are many files in the repo, not just one file with the requested name.
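
The ValueError in the log above is what diffusers raises when it fails to fetch the 'laion/CLIP-ViT-bigG-14-laion2B-39B-b160k' tokenizer from the Hugging Face Hub during SDXL checkpoint extraction. A minimal sketch of what the error message asks for, assuming the machine can reach huggingface.co at least once: download the tokenizer with transformers and save it under a local directory named exactly like the repo id. The repo id and pad_token value come straight from the traceback; saving relative to the webui working directory is an assumption, not something confirmed in this thread.

```python
# Sketch only: pre-download the SDXL tokenizer that checkpoint extraction
# needs, then save a local copy so later runs do not have to hit the Hub.
# Repo id and pad_token='!' are taken from the error message above; the
# output directory location (relative to where the webui starts) is assumed.
from transformers import CLIPTokenizer

repo_id = "laion/CLIP-ViT-bigG-14-laion2B-39B-b160k"

# Downloads the tokenizer files into the local Hugging Face cache.
tokenizer = CLIPTokenizer.from_pretrained(repo_id, pad_token="!")

# Also save a plain local copy under a directory named like the repo id,
# which is the path the ValueError says must exist locally.
tokenizer.save_pretrained(repo_id)
```

If the machine cannot reach huggingface.co at all, the same files could be downloaded on another machine and the resulting folder copied over; where diffusers looks first (the local directory versus the HF cache) depends on the working directory of the process, so the exact placement may need adjusting.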