TheLastBen / fast-stable-diffusion

fast-stable-diffusion + DreamBooth

Can no longer load SD1.5 models in tester #1343

Open RoboPhred opened 1 year ago

RoboPhred commented 1 year ago

When trying to load either of my two stable diffusion 1.5 training sessions in the "Test the trained model" step, I get the following error. I have tried deleting the sd folder from my gdrive to get a fresh install of the UI, but the error persists.

LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Downloading 100% 961k/961k [00:00<00:00, 1.88MB/s]
Downloading 100% 4.52k/4.52k [00:00<00:00, 2.93MB/s]
Failed to create model quickly; will retry using slow method.
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
loading stable diffusion model: TypeError
Traceback (most recent call last):
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_models.py", line 338, in load_model
    sd_model = instantiate_from_config(sd_config.model)
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/util.py", line 79, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/models/diffusion/ddpm.py", line 563, in __init__
    self.instantiate_cond_stage(cond_stage_config)
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/models/diffusion/ddpm.py", line 630, in instantiate_cond_stage
    model = instantiate_from_config(config)
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/util.py", line 79, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/modules/encoders/modules.py", line 99, in __init__
    self.tokenizer = CLIPTokenizer.from_pretrained(version)
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils_base.py", line 1777, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils_base.py", line 1932, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/clip/tokenization_clip.py", line 328, in __init__
    with open(merges_file, encoding="utf-8") as merges_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/webui.py", line 74, in initialize
    modules.sd_models.load_model()
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_models.py", line 341, in load_model
    sd_model = instantiate_from_config(sd_config.model)
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/util.py", line 79, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/models/diffusion/ddpm.py", line 563, in __init__
    self.instantiate_cond_stage(cond_stage_config)
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/models/diffusion/ddpm.py", line 630, in instantiate_cond_stage
    model = instantiate_from_config(config)
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/util.py", line 79, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/content/gdrive/MyDrive/sd/stablediffusion/ldm/modules/encoders/modules.py", line 99, in __init__
    self.tokenizer = CLIPTokenizer.from_pretrained(version)
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils_base.py", line 1777, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/tokenization_utils_base.py", line 1932, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/clip/tokenization_clip.py", line 328, in __init__
    with open(merges_file, encoding="utf-8") as merges_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType

Stable diffusion model failed to load, exiting

It seems to work fine when testing my SD v2 models, however.
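For anyone narrowing this down: the failure happens inside CLIPTokenizer.from_pretrained(version), which ends up opening merges_file=None when the tokenizer files cannot be resolved. A minimal check outside the webui (a diagnostic sketch, not part of the notebook) would be:

```python
# Diagnostic sketch: does the CLIP tokenizer load on its own?
# A TypeError here means merges_file resolved to None (missing/incomplete files);
# an OSError means the tokenizer files could not be fetched at all.
from transformers import CLIPTokenizer

try:
    tok = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
    print("tokenizer loaded, vocab size:", tok.vocab_size)
except (TypeError, OSError) as err:
    print("tokenizer files missing or cache incomplete:", err)
```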

TheLastBen commented 1 year ago

It might be a problem with your repos; try with a clean sd folder.

JackG0T commented 1 year ago

I have the same problem

loboere commented 1 year ago

Same here. I deleted the sd folder and used the latest version of the DreamBooth notebook, and the error still appears.

JackG0T commented 1 year ago

Guys, try this colab; it looks like it works fine now:

https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb?authuser=2

beepbeepcat commented 1 year ago

Facing the same issue; deleting the sd folder and installing the new version doesn't help.

TheLastBen commented 1 year ago

You need to use the latest colab: https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb

darwin2k commented 1 year ago

This notebook still throws the error for me: https://github.com/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb

It worked the day before :/

TheLastBen commented 1 year ago

I'm using the same notebook at this very moment and it works; remove your sd folder.

darwin2k commented 1 year ago

I am quite confused. I deleted all data on Google Drive and did a fresh run; now it breaks at training with new images... no custom image.

UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 64: invalid start byte

During handling of the above exception, another exception occurred:
TheLastBen commented 1 year ago

One of your images is probably corrupt; try converting the images to PNG or JPG.
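A quick way to find the offending file (a sketch; the folder path is an assumption, point it at your own instance-images directory):

```python
# Verify every training image with Pillow; corrupt or truncated files
# (the usual cause of a UnicodeDecodeError during training) get printed.
from pathlib import Path
from PIL import Image

IMAGE_DIR = Path("/content/gdrive/MyDrive/instance_images")  # hypothetical path, adjust

for path in sorted(IMAGE_DIR.glob("*")):
    if not path.is_file():
        continue
    try:
        with Image.open(path) as img:
            img.verify()  # raises on unreadable or corrupt data
    except Exception as exc:
        print(f"{path.name}: {exc}")
```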

darwin2k commented 1 year ago

> One of your images is probably corrupt; try converting the images to PNG or JPG.

I will try to figure out if the image files are the problem. Thanks!

Excalibro1 commented 1 year ago

There is definitely an issue here. I've tried multiple times with the latest notebook and by removing the sd folder (which usually does the trick).

I trained a new 1.5 model just to be sure, and it still errors out. When loading that same freshly trained 1.5 model through the fast-stablediffusion notebook instead of the fast-dreambooth one, it works perfectly.

Also, I've seen two different errors: one occurred right after training when trying to test (TypeError: expected str, bytes or os.PathLike object, not NoneType), and the other happens when trying to load an old 1.5 model (Can't load tokenizer for 'openai/clip-vit-large-patch14').

All v2 models working on fast-dreambooth.
No 1.5 models working on fast-dreambooth.
All models working on fast-stablediffusion-auto.

I might still be doing something wrong, but I've tried everything I can think of.
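One thing worth trying for the tokenizer errors (a workaround sketch, under the assumption that the local tokenizer cache is missing or incomplete; not the repo's official fix) is pre-downloading the CLIP tokenizer in a Colab cell before launching the webui:

```python
# Pre-populate the Hugging Face cache with the CLIP tokenizer files
# (vocab.json, merges.txt) so later lookups can resolve them locally.
from transformers import CLIPTokenizer

CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
```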

darwin2k commented 1 year ago

There was something messed up with my Google Drive, I think; somehow the training worked again after using another Google account.

However, I am running into the same issue @Excalibro1 described with Stable Diffusion 1.5:

TypeError: expected str, bytes or os.PathLike object, not NoneType

I can also confirm the error when loading an old model: Can't load tokenizer for 'openai/clip-vit-large-patch14'

TheLastBen commented 1 year ago

working on a fix

Excalibro1 commented 1 year ago

> working on a fix

Just tested the update you pushed and it worked perfectly, cheers.