RoboPhred opened this issue 1 year ago
It might be a problem with your repos; try with a clean sd folder.
I have the same problem
Same here. I deleted the sd folder and used the latest version of the DreamBooth notebook, and the error still appears.
Guys, try this colab, looks like it works fine now
Facing the same issue; deleting the sd folder and installing the new version doesn't help.
You need to use the latest Colab: https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb
This notebook still throws the error for me: https://github.com/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb
It worked just a day before :/
I'm using the same notebook at this very moment and it works, remove your sd folder
I am quite confused. I deleted all data on Google Drive and did a fresh run; now it breaks at training with new images... no custom image.
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 64: invalid start byte
During handling of the above exception, another exception occurred:
one of your images is probably corrupt, try converting the image to png or jpg
I will try to figure out if the image files are the problem. thanks!
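If it helps anyone checking their training images: a corrupt or mis-renamed file can often be spotted without extra libraries by looking at its magic bytes. This is a minimal sketch, not part of the notebook; the folder path and helper names are mine:

```python
import os

# Leading bytes that identify the two formats DreamBooth expects.
SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "png",
    b"\xff\xd8\xff": "jpeg",
}


def detect_format(path):
    """Return 'png', 'jpeg', or None based on the file's leading bytes."""
    with open(path, "rb") as f:
        head = f.read(8)
    for magic, name in SIGNATURES.items():
        if head.startswith(magic):
            return name
    return None


def find_suspect_images(folder):
    """List files whose header is unrecognized or disagrees with the extension."""
    ok_ext = {"png": {"png"}, "jpeg": {"jpg", "jpeg"}}
    suspects = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        fmt = detect_format(path)
        ext = os.path.splitext(name)[1].lower().lstrip(".")
        if fmt is None or ext not in ok_ext[fmt]:
            suspects.append(name)
    return suspects
```

Anything this flags is worth re-exporting as a fresh PNG or JPEG before training, as suggested above.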
There is definitely an issue here. I've tried multiple times with the latest notebook and by removing the sd folder (which usually does the trick).
I trained a new 1.5 model just to be sure and it still errors out. When loading that same recently trained 1.5 model through the fast-stablediffusion notebook instead of the fast-dreambooth one, the model works perfectly.
Also, I've seen two different errors. One occurred right after training, when trying to test: TypeError: expected str, bytes or os.PathLike object, not NoneType
The other happens when trying to load an old 1.5 model: Can't load tokenizer for 'openai/clip-vit-large-patch14'.
- All v2 models working on fast-dreambooth
- No 1.5 models working on fast-dreambooth
- All models working on fast-stablediffusion-auto

I might still be doing something wrong, but I've tried everything I can think of.
I think there was something messed up with my Google Drive; somehow the training worked again after switching to another Google account.
However, I am running into the same issue @Excalibro1 described, with Stable Diffusion 1.5:
TypeError: expected str, bytes or os.PathLike object, not NoneType
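For what it's worth, that TypeError usually means some path variable in the notebook (presumably the selected model path) was still None when it was handed to an os.path function. A minimal sketch of a guard that turns the cryptic traceback into an actionable message; resolve_model_path is a hypothetical helper, not something from the notebook:

```python
import os


def resolve_model_path(model_path):
    """Fail with a readable message instead of the bare TypeError that
    os.path functions raise when handed None."""
    if model_path is None:
        # os.path.abspath(None) would raise:
        # TypeError: expected str, bytes or os.PathLike object, not NoneType
        raise ValueError(
            "Model path is unset -- rerun the cell that selects or "
            "downloads the model before running the test cell."
        )
    return os.path.abspath(model_path)
```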
I can confirm the error when loading an old model also.
Can't load tokenizer for 'openai/clip-vit-large-patch14'
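That tokenizer error generally means the files for openai/clip-vit-large-patch14 couldn't be downloaded or found locally. A rough way to check a local tokenizer folder, assuming the usual Hugging Face hub layout for a BPE tokenizer (the required file names and the helper are my assumptions, not from the notebook):

```python
import os

# Files a BPE tokenizer like CLIP's typically loads from disk
# (assumed from the Hugging Face hub layout for this model).
REQUIRED = ["vocab.json", "merges.txt", "tokenizer_config.json"]


def missing_tokenizer_files(folder):
    """Return the required tokenizer files absent from `folder`."""
    return [f for f in REQUIRED if not os.path.isfile(os.path.join(folder, f))]
```

If any of these are missing from the cached tokenizer folder, re-running the download cell (or clearing the cache so it re-fetches) is the likely fix.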
working on a fix
Just tested the update you pushed and it worked perfectly cheers.
When trying to load either of my two stable diffusion 1.5 training sessions in the "Test the trained model" step, I get the following error. I have tried deleting the sd folder from my gdrive to get a fresh install of the UI, but the error persists.
It seems to work fine when testing my SD v2 models, however.