Closed vislupus closed 9 months ago
You seem to have the wrong numpy version. This does not happen if everything is installed exactly as described in the readme.md. I am not sure how Colab handles things.
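One way to confirm the environment matches the readme is to compare the installed numpy version against the pinned one. A minimal sketch (the `EXPECTED` version below is a placeholder, not from this thread; substitute the exact version from the repository's readme.md or requirements file):

```python
# Sanity check: does the installed numpy match the pinned version?
# "1.26.4" is a placeholder; use the version from readme.md.
import numpy as np

EXPECTED = "1.26.4"
print("installed numpy:", np.__version__)
if np.__version__ != EXPECTED:
    print(f"Mismatch: reinstall with `pip install numpy=={EXPECTED}`")
```

On Colab this is worth running in a fresh cell, since preinstalled packages can shadow the versions a `pip install -r requirements.txt` just set up.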
Regarding the VRAM: this example code does not use additional performance tricks, so it requires the full model to be loaded and processed on the GPU at all times. So you might need more than 16 GB of VRAM to run the SDXL model.
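To see whether a given Colab runtime has enough memory before loading the model, you can query the GPU directly. A small sketch using `nvidia-smi` (assumes an NVIDIA GPU; on a CPU-only runtime it just reports that no GPU was found):

```python
# Report total/free GPU memory via nvidia-smi, if a GPU is present.
import shutil
import subprocess

if shutil.which("nvidia-smi"):
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.total,memory.free",
         "--format=csv,noheader,units=MiB"],
        capture_output=True, text=True,
    )
    print("total, free:", out.stdout.strip())
else:
    print("nvidia-smi not found; likely no NVIDIA GPU in this runtime")
```

Free-tier Colab GPUs typically offer around 15–16 GB, which is why the unoptimized example can run out of memory with SDXL.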
cheers
Hi,
Thanks for the release of both the code and models.
I've tried to run the code on Google Colab and got an error. I'm probably missing something, but I'm reporting it in case it is a bug. You can see the Colab notebook.
Also, it looks like it needs too much VRAM for free Colab. Can you tell me if that is true?
Thanks