TheLastBen / fast-stable-diffusion

Automatic1111 doesn't start after loading model. #370

Closed Neverdusk closed 2 years ago

Neverdusk commented 2 years ago

I seem to keep getting this output log after running every step. I don't receive any link or server after running the Colab. (screenshot attached)

For reference, I inserted a Google Drive folder path into the Model Load step, using a folder that contains multiple models. (screenshot attached)

After this failed a few times, I tried using the shared link method, but that also produced the same result. Every step previously succeeded, and the repo was confirmed to be up to date in the previous step. This is my first time using this Colab, so I'm not quite sure if I'm doing anything wrong, but any help is appreciated.
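
One way to sanity-check the Drive path before the Model Load step — a minimal sketch assuming the standard Colab Drive mount; the folder name below is just a placeholder, not the notebook's own field:

```python
import os
from google.colab import drive  # standard Colab helper for mounting Google Drive

# Mount Drive so the notebook can see the uploaded models
drive.mount('/content/gdrive')

# Placeholder path -- replace with the folder that actually holds the .ckpt files
model_dir = '/content/gdrive/MyDrive/sd_models'

# List candidate checkpoints and their sizes to confirm the path resolves
for name in os.listdir(model_dir):
    if name.endswith('.ckpt'):
        size_gb = os.path.getsize(os.path.join(model_dir, name)) / 1024**3
        print(f'{name}: {size_gb:.1f} GB')
```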

askiiart commented 2 years ago

Do you have a token? If not, you'll need that.

Neverdusk commented 2 years ago

I thought the token was only for downloading from Hugging Face? I've been trying to use the path_to_trained_model or link_to_trained_model methods, since I already have my models uploaded to Google Drive.

askiiart commented 2 years ago

You need it every time, sorry. I honestly don't know why, but you do.

Neverdusk commented 2 years ago

I input a token but received the same error. Does the Colab only work with Stable Diffusion 1.4/1.5 and not other models?

zakpurp commented 2 years ago

I think so. Use Stable Diffusion 1.5; it works well for me.

zakpurp commented 2 years ago

Go on Hugging Face and make sure you accept the terms to use stable-diffusion-v1-5.

Neverdusk commented 2 years ago

I've already accepted the terms, but I'm trying to use Waifu Diffusion 1.2, since it has an interesting hybrid style.

Is the Colab unable to use the alternate models I already have in Google Drive? Or any custom models?

askiiart commented 2 years ago

Only DreamBooth can use different models.

Neverdusk commented 2 years ago

Got it. Thanks for the help.

TheLastBen commented 2 years ago

The error is because you're out of RAM (not VRAM). Use the 4.7GB CKPT version and it'll work; you can use any model you want. You only need the token when you download the original 1.5 model.
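
If only a full-size checkpoint is on hand, one rough way to shrink it is to cast the weights to fp16, which roughly halves the file size and the RAM needed to load it. This is a sketch, not part of the notebook, and the file names are placeholders:

```python
import torch

# Load the full checkpoint on CPU (this step still needs enough RAM for the original file)
ckpt = torch.load('model-full.ckpt', map_location='cpu')
state = ckpt.get('state_dict', ckpt)

# Cast float32 tensors to float16; leave everything else untouched
half_state = {
    k: v.half() if isinstance(v, torch.Tensor) and v.dtype == torch.float32 else v
    for k, v in state.items()
}

# Save the smaller checkpoint for use in the Model Load step
torch.save({'state_dict': half_state}, 'model-fp16.ckpt')
```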

Neverdusk commented 2 years ago

Thank you, I'll try with a smaller model then.