Closed: Davide-R0 closed this 6 months ago
Won't fix (also can't fix, to be honest). What you can try is manually setting the VAE to FP16 with `--vae-in-fp16`, but this should be the default anyway — see the console log. The experimental flag `--vae-in-cpu` might also work, see https://github.com/lllyasviel/Fooocus/blob/main/ldm_patched/modules/model_management.py#L564

In case this doesn't work: please use the supported default resolutions SDXL has been trained on for best results, see https://github.com/lllyasviel/Fooocus/pull/1617, or get better hardware so the VAE fits in your VRAM.
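For reference, both flags mentioned above are passed on the Fooocus launch command line. A minimal sketch, assuming the standard `entry_with_update.py` entry point (adjust to `launch.py` or your own launcher as needed):

```shell
# Force the VAE to FP16 (halves VAE VRAM use; should already be the default):
python entry_with_update.py --vae-in-fp16

# If VAE decode still runs out of memory, run the VAE on the CPU
# (experimental and slower, but avoids the GPU allocation entirely):
python entry_with_update.py --vae-in-cpu
```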
Checklist
What happened?
Hello!

This error happens when the aspect ratio is set only in the `config.txt` file; if the same aspect ratio is also (or only) set in `modules/config.py`, the problem doesn't occur. So it is probably not a GPU (GTX 1650, 4 GB VRAM) issue. The error that comes out is `torch.cuda.OutOfMemoryError`.

Steps to reproduce the problem
"1920*1080"
Quality
modeWhat should have happened?
Generation should work, with no errors.
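As a minimal sketch of the triggering configuration: the report above says the custom ratio was added only to `config.txt`, which would look roughly like the fragment below (the key names follow Fooocus's config conventions, but any surrounding entries shown here are assumptions):

```json
{
    "available_aspect_ratios": [
        "1920*1080"
    ],
    "default_aspect_ratio": "1920*1080"
}
```

Per the report, mirroring the same ratio in `modules/config.py` instead avoids the error.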
What browsers do you use to access Fooocus?
Brave, Other
Where are you running Fooocus?
Locally
What operating system are you using?
Linux 6.6.21-gentoo
Console logs
Additional information