TheLastBen / fast-stable-diffusion

fast-stable-diffusion + DreamBooth
MIT License
7.54k stars · 1.31k forks

KeyError: "Stable Diffusion XL" #2391

Open mmehrle opened 1 year ago

mmehrle commented 1 year ago

This started happening today - on every single model I tried. Automatic1111 has pushed v1.5.0 (with SD XL support :) to the main branch, so I think it's related:

Traceback (most recent call last):
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/webui.py", line 61, in <module>
    from modules import shared, sd_samplers, upscaler, extensions, localization, ui_tempdir, ui_extra_networks, config_states
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_samplers.py", line 1, in <module>
    from modules import sd_samplers_compvis, sd_samplers_kdiffusion, shared
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_samplers_compvis.py", line 9, in <module>
    from modules import sd_samplers_common, prompt_parser, shared
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_samplers_common.py", line 5, in <module>
    from modules import devices, processing, images, sd_vae_approx, sd_samplers, sd_vae_taesd
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/processing.py", line 16, in <module>
    import modules.sd_hijack
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_hijack.py", line 5, in <module>
    import modules.textual_inversion.textual_inversion
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 16, in <module>
    from modules import shared, devices, sd_hijack, processing, sd_models, images, sd_samplers, sd_hijack_checkpoint, errors, hashes
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_models.py", line 17, in <module>
    from modules import paths, shared, modelloader, devices, script_callbacks, sd_vae, sd_disable_initialization, errors, hashes, sd_models_config, sd_unet, sd_models_xl
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/sd_models_config.py", line 9, in <module>
    sd_xl_repo_configs_path = os.path.join(paths.paths['Stable Diffusion XL'], "configs", "inference")
KeyError: 'Stable Diffusion XL'
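For context, the failing line in `modules/sd_models_config.py` indexes the `paths.paths` dict with a `'Stable Diffusion XL'` key that an out-of-date launcher never registers. A minimal sketch of the failure mode, plus a tolerant lookup (a hypothetical workaround for illustration, not the upstream fix):

```python
import os

# Assumption: an outdated launcher populates paths.paths without the new
# 'Stable Diffusion XL' entry that v1.5.0's paths.py registers.
paths = {"Stable Diffusion": "/content/repositories/stable-diffusion-stability-ai"}

try:
    # the shape of the line that fails in modules/sd_models_config.py
    sd_xl_repo_configs_path = os.path.join(paths["Stable Diffusion XL"], "configs", "inference")
except KeyError as err:
    print(f"KeyError: {err}")  # prints: KeyError: 'Stable Diffusion XL'

# A tolerant lookup would degrade instead of crashing:
sd_xl_repo_configs_path = os.path.join(
    paths.get("Stable Diffusion XL", ""), "configs", "inference"
)
```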
YANSANSEI commented 1 year ago

Same error, and I don't even want XL.

polax007 commented 1 year ago

Same here. Why?

mmehrle commented 1 year ago

> Same error, and I don't even want XL.

Glad to see I'm not alone! I haven't made any changes to my script, and it was working fine two days ago. It's clearly a compatibility issue after Automatic1111 pushed v1.5.0 with SD XL support; the fix probably isn't in the fast-stable-diffusion build yet.

TheLastBen commented 1 year ago

use the latest notebook, it was updated to be compatible with 1.5.0

polax007 commented 1 year ago

https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb — is this the latest one?

Does it support SD XL? I can't see an XL option in the model version dropdown (only 1.5 to 2.1).

thanks

TheLastBen commented 1 year ago

use the model link for now, will add sdxl shortly
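In the meantime, pointing the notebook's MODEL_LINK field at the SDXL weights looks roughly like this. The URL is the official stabilityai release on Hugging Face; treat the field names as assumptions about the notebook cell, which may differ between revisions:

```python
# Assumed notebook fields -- the A1111 notebook exposes a MODEL_LINK string;
# exact names may vary by notebook revision.
MODEL_LINK = (
    "https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0"
    "/resolve/main/sd_xl_base_1.0.safetensors"
)
PATH = ""  # leave the path field empty when using a direct link
```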

MaxTran96 commented 1 year ago

Does that mean you will add support for SDXL lora training on colab notebook :D

TheLastBen commented 1 year ago

@MaxTran96 only for inference in the A1111 notebook. As for training, you'll need to use RunPod or Paperspace

polax007 commented 1 year ago


> use the model link for now, will add sdxl shortly

If I run the notebook with a path to the model, can I load SD 1.5 and XL without restarting the notebook?

TheLastBen commented 1 year ago

yes

mmehrle commented 1 year ago

With the new notebook I am still having an issue when attempting to load a different model:

[screenshot of the error]

Or is that because I didn't select 1.5 on the top?

TheLastBen commented 1 year ago

@mmehrle put it in "MODEL_LINK" not PATH

mmehrle commented 1 year ago

> @mmehrle put it in "MODEL_LINK" not PATH

YES - I figured that out eventually :-)

phephanhoang commented 1 year ago

Same here: it errors out and stops working on Colab when loading the SDXL 1.0 model.

TheLastBen commented 1 year ago

@phephanhoang if you're using colab pro, set the runtime to high-RAM

phephanhoang commented 1 year ago

> @phephanhoang if you're using colab pro, set the runtime to high-RAM

Still fails to load.

phephanhoang commented 1 year ago

I can load SDXL very quickly and use it normally on other custom builds. I'd still prefer to use Fast-SD, but this error is wearing me out.

TheLastBen commented 1 year ago

@phephanhoang did you change any launch arguments? Like adding --no-half?

phephanhoang commented 1 year ago

> @phephanhoang did you change any launch arguments? Like adding --no-half?

No, I didn't change anything. I even tried reinstalling everything, but it still doesn't work.

TheLastBen commented 1 year ago

remove the model and redownload it through the models cell, choose SDXL from the dropdown and run the cell
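Deleting the cached checkpoint before rerunning the models cell can be done from a notebook code cell. A sketch; the checkpoint path below is a guess based on the Drive layout in the traceback, so adjust it to your own setup:

```python
from pathlib import Path

# Hypothetical checkpoint location -- adjust to your own Drive layout.
ckpt = Path(
    "/content/gdrive/MyDrive/sd/stable-diffusion-webui"
    "/models/Stable-diffusion/sd_xl_base_1.0.safetensors"
)
ckpt.unlink(missing_ok=True)  # safe even if the file is already gone
# then rerun the models cell with "SDXL" selected in the dropdown
```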

phephanhoang commented 1 year ago

> remove the model and redownload it through the models cell, choose SDXL from the dropdown and run the cell

[screenshot of the error]

Still failing.

TheLastBen commented 1 year ago

you're using 12GB of RAM, change your runtime to "High-RAM"
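A quick way to confirm from a notebook cell which runtime you actually got. This is a sketch: the ~20 GB cutoff is my own rule of thumb, based on standard Colab runtimes reporting about 12 GB and High-RAM about 25 GB:

```python
import os

# Linux-only RAM check (Colab VMs run Linux); no extra packages needed.
total_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024**3
print(f"Total RAM: {total_gb:.1f} GB")
if total_gb < 20:
    print("Looks like a standard (~12 GB) runtime; SDXL will likely fail to load.")
```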

polax007 commented 1 year ago

Is Colab Pro+ needed for SDXL, or is Colab Pro enough?

TheLastBen commented 1 year ago

even in colab pro, you need to set it manually to high RAM

phephanhoang commented 1 year ago

> even in colab pro, you need to set it manually to high RAM

I always set it to High-RAM by default. I've tried three types of GPU, but it's still not working.

TheLastBen commented 1 year ago

to make sure it's not some extension you installed, try running the notebook without the first cell

YANSANSEI commented 1 year ago

Thank you! 😍

mmehrle commented 1 year ago

> even in colab pro, you need to set it manually to high RAM
>
> I always set it to High-RAM by default. I've tried three types of GPU, but it's still not working.

That's odd: I always seem to do fine with a regular Colab instance. It never worked for me with TTS, but with SD it's fine, go figure...