lllyasviel / stable-diffusion-webui-forge


Blue Screen of Death "memory_management" every time I try to generate #1824

Open Raz0rStorm opened 1 week ago

Raz0rStorm commented 1 week ago

Spec: 16 GB VRAM and 32 GB RAM, trying to load a 22 GB FLUX model; every Forge setting is at its default.

DocShotgun commented 1 week ago

If you're trying to run this model in full (16-bit) precision, you're probably running out of memory. Actually generating requires a lot more memory than just the amount needed to store the model weights.

For reference, I'm using a 4090 (24 GB), and it was still using more than 32 GB of system RAM to do all the offloading needed to make FLUX run.
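
As a rough illustration of why 16 GB of VRAM plus 32 GB of RAM gets tight, here is a back-of-the-envelope estimate in Python. The parameter counts are approximate public figures for FLUX.1, not measurements from Forge, so treat the result as an order-of-magnitude sketch:

```python
# Back-of-the-envelope memory estimate for running FLUX in fp16/bf16.
# Parameter counts are approximate assumptions, not Forge internals.
GiB = 1024 ** 3
BYTES_PER_WEIGHT = 2  # fp16/bf16

transformer_params  = 12e9    # FLUX.1 diffusion transformer (~12B)
text_encoder_params = 5e9     # T5-XXL + CLIP-L combined (~5B)
vae_params          = 0.08e9  # autoencoder (~80M)

weights_bytes = (transformer_params + text_encoder_params + vae_params) * BYTES_PER_WEIGHT
print(f"Weights alone: ~{weights_bytes / GiB:.0f} GiB")  # ~32 GiB

# On top of the weights come activations, the CUDA context, the OS and other
# applications, and a second copy of whatever block is being shuffled between
# VRAM and system RAM while offloading, so total usage climbs well past this.
```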

sw0ad commented 6 days ago

I had the same problem when generating with PDXL models and a LoRA. I didn't find a solution, so I rolled back the update.

Raz0rStorm commented 1 day ago

> If you're trying to run this model in full (16-bit) precision, you're probably running out of memory. Actually generating requires a lot more memory than just the amount needed to store the model weights.
>
> For reference, I'm using a 4090 (24 GB), and it was still using more than 32 GB of system RAM to do all the offloading needed to make FLUX run.

I switched to an NF4 model that is 6 GB in size and it still runs CUDA out of memory. Does the author of Forge have a guide for this?
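
Not an official Forge guide, but a quick sanity check before blaming the model size is to confirm how much VRAM the driver actually reports as free. A minimal sketch, assuming a CUDA-capable PyTorch install in the same environment:

```python
import torch

GiB = 1024 ** 3

# Free and total VRAM as reported by the CUDA driver for the current device.
# "free" accounts for every process on the GPU, not just this script.
free_bytes, total_bytes = torch.cuda.mem_get_info()
print(f"free:  {free_bytes / GiB:.1f} GiB")
print(f"total: {total_bytes / GiB:.1f} GiB")

# VRAM this PyTorch process has reserved in its caching allocator
# (will be ~0 when run standalone, outside Forge).
print(f"reserved by this process: {torch.cuda.memory_reserved() / GiB:.1f} GiB")
```

If "free" is already far below 6 GB before Forge loads anything, something else (another app, a stuck process, the desktop compositor) is holding VRAM.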