mmehrle opened this issue 1 year ago
Same error, and I don't want XL.
Same here. Why?
> Same error, and I don't want XL.
Glad to see I'm not alone! I haven't made any changes to my script and it was working fine two days ago. Clearly a compatibility issue after Automatic1111 pushed v1.5.0 with SD XL support; it's probably not reflected in the fast-stable-diffusion build yet.
use the latest notebook, it was updated to be compatible with 1.5.0
Is this the latest one? https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb
Does that one support SD XL? I can't see an XL option in the model version dropdown (only 1.5 to 2.1).
thanks
use the model link for now, will add sdxl shortly
Does that mean you will add support for SDXL LoRA training in the Colab notebook? :D
@MaxTran96 only for inference in the A1111 notebook. As for training, you'll need to use RunPod or Paperspace
> use the model link for now, will add sdxl shortly
If I run the notebook with a path to a model, can I load both SD 1.5 and XL without rebooting the notebook?
yes
With the new notebook I am still having an issue when attempting to load a different model:
Or is that because I didn't select 1.5 at the top?
@mmehrle put it in "MODEL_LINK" not PATH
> @mmehrle put it in "MODEL_LINK" not PATH
YES - I figured that out eventually :-)
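For anyone else who hits this: "MODEL_LINK" expects a direct download URL rather than a local path. Below is only a minimal sketch of what such a download cell presumably does; the function names are hypothetical, not the notebook's actual code:

```python
import os
import urllib.request

def model_filename(model_link: str) -> str:
    """Derive an on-disk filename from a direct download URL."""
    name = model_link.split("?", 1)[0].rsplit("/", 1)[-1]
    return name or "model.safetensors"

def download_model(model_link: str, models_dir: str) -> str:
    """Fetch a checkpoint into the webui models folder,
    skipping the download if the file is already present."""
    os.makedirs(models_dir, exist_ok=True)
    dest = os.path.join(models_dir, model_filename(model_link))
    if not os.path.exists(dest):
        urllib.request.urlretrieve(model_link, dest)
    return dest
```

The point is that the link must resolve to the checkpoint file itself; a page URL won't produce a usable filename or file.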
Same here: I get an error and Colab stops working when loading the SDXL 1.0 model.
@phephanhoang if you're using colab pro, set the runtime to high-RAM
> @phephanhoang if you're using colab pro, set the runtime to high-RAM

Still failed to load.
I load SDXL very fast and use it normally on other custom builds. I still prefer Fast-SD, but this error is wearing me out.
@phephanhoang did you change any launch arguments, like adding --no-half?
> @phephanhoang did you change any launch arguments, like adding --no-half?
No, I didn't change anything. I tried reinstalling everything but it still doesn't work.
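For context on why --no-half matters: A1111 reads extra launch flags from COMMANDLINE_ARGS (in webui-user.sh for local installs). This is a hypothetical example of the kind of setting worth checking, not the notebook's actual config; --no-half forces fp32 weights and roughly doubles memory use, which can push SDXL past what a Colab GPU holds:

```shell
# webui-user.sh (hypothetical example)
# Avoid --no-half with SDXL on Colab: fp32 weights roughly double memory use.
export COMMANDLINE_ARGS="--xformers"
# export COMMANDLINE_ARGS="--xformers --no-half"   # likely to OOM with SDXL
```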
remove the model and redownload it through the models cell, choose SDXL from the dropdown and run the cell
> remove the model and redownload it through the models cell, choose SDXL from the dropdown and run the cell
Still the same.
you're using 12GB of RAM, change your runtime to "High-RAM"
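You can confirm which runtime you actually got from a notebook cell. A quick check using only the standard library (Linux-only, which covers Colab VMs):

```python
import os

def total_ram_gib() -> float:
    """Total physical RAM in GiB, read via POSIX sysconf (Linux)."""
    page_size = os.sysconf("SC_PAGE_SIZE")   # bytes per memory page
    num_pages = os.sysconf("SC_PHYS_PAGES")  # total physical pages
    return round(page_size * num_pages / 2**30, 1)

# A standard Colab runtime reports roughly 12-13 GiB;
# a High-RAM runtime reports roughly 25 GiB or more.
print(total_ram_gib())
```

If this prints ~12, you're still on the standard runtime regardless of what the menu claims.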
Is Colab Pro+ needed for SDXL, or is Colab Pro enough?
Even on Colab Pro, you need to set it to High-RAM manually.
> Even on Colab Pro, you need to set it to High-RAM manually.
By default I always set it to High-RAM. I've tried three types of GPUs but it's still not working.
to make sure it's not some extension you installed, try running the notebook without the first cell
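An alternative way to rule out extensions, assuming a reasonably recent A1111 build (the flag exists upstream; whether the notebook exposes it is an assumption):

```shell
# Launch with every extension disabled for one session.
export COMMANDLINE_ARGS="--disable-all-extensions"
```

If SDXL loads cleanly with this flag, the culprit is an extension rather than the model or the runtime.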
Thank you! 😍
> Even on Colab Pro, you need to set it to High-RAM manually.
>
> By default I always set it to High-RAM. I've tried three types of GPUs but it's still not working.
That's weird, I always seem to do fine with a regular Colab VM. It never worked for me with TTS, but with SD it's fine, go figure...
This started happening today - on every single model I tried. Automatic1111 has pushed v1.5.0 (with SD XL support :) to the main branch, so I think it's related: