jquesnelle / txt2imghd

A port of GOBIG for Stable Diffusion
MIT License

RuntimeError: CUDA out of memory #21

Open alta750 opened 1 year ago

alta750 commented 1 year ago

I've added `model.half()` underneath `model = instantiate_from_config(config.model)` and `init_image = init_image.half()` underneath `init_image = repeat(init_image, '1 ... -> b ...', b=batch_size)`.
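For reference, this is roughly where those two calls end up (function and variable names taken from the stock Stable Diffusion scripts that txt2imghd builds on; exact placement may differ in your copy):

```python
# in load_model_from_config(): cast the weights to fp16 right after the
# model object is created, before it is moved to the GPU
model = instantiate_from_config(config.model)
model.half()

# ...

# where the init image is tiled to the batch size for the img2img passes,
# cast it to fp16 as well so the input dtype matches the half-precision model
init_image = repeat(init_image, '1 ... -> b ...', b=batch_size)
init_image = init_image.half()
```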

I get this error: RuntimeError: CUDA out of memory. Tried to allocate 1024.00 MiB (GPU 0; 4.00 GiB total capacity; 2.56 GiB already allocated; 183.30 MiB free; 2.58 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
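I assume the max_split_size_mb hint refers to the PYTORCH_CUDA_ALLOC_CONF environment variable, i.e. something like this set before any CUDA allocation happens (128 is just an example value):

```python
# e.g. at the very top of txt2imghd.py, before torch touches the GPU
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```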

What should I do?

rokwenelenya commented 1 year ago

Maybe it's because your GPU has only 4 GB of VRAM. What image size were you generating at?

alta750 commented 1 year ago

I was trying with 512 x 512

rav-en commented 1 year ago

I seem to be having the same problem with a 10 GB 3080 card. Even if I boot Linux straight into the command line with zero graphics RAM in use, it still complains about memory, and that's with a 512 x 512 pic...

alta750 commented 1 year ago

I tried this instead: https://github.com/AUTOMATIC1111/stable-diffusion-webui with --medvram --opt-split-attention in the webui-user.bat file, which works fine for me. You might not need those flags though, since you have 10 GB of VRAM.

jblemee commented 1 year ago

Line 62: replace `model.cuda()` with `model.cuda().half()`.
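A minimal sketch of that change, assuming the stock load_model_from_config() (the exact line number may differ between versions):

```python
# before
model.cuda()

# after: move the model to the GPU and cast the weights to fp16 in one step
model.cuda().half()
```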

Skiddoh commented 1 year ago

For me the parameters --medvram and --opt-split-attention are not available in txt2imghd.py - did I miss a step during "installation"? I tried recreating it by copying parts from the optimized/webui, but without success. Any ideas on how to get it running on low-VRAM GPUs yet?

alta750 commented 1 year ago

> For me the parameters --medvram and --opt-split-attention are not available in txt2imghd.py - did I miss a step during "installation"? I tried recreating it by copying parts from the optimized/webui, but without success. Any ideas on how to get it running on low-VRAM GPUs yet?

Those parameters are for a different project; you need to download that one (the AUTOMATIC1111 webui linked above) first. Follow its install instructions, and then in the webui-user.bat file put --medvram --opt-split-attention in COMMANDLINE_ARGS.
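If it helps, the relevant part of webui-user.bat would look roughly like this (flags as discussed above):

```bat
rem webui-user.bat from AUTOMATIC1111's stable-diffusion-webui
set COMMANDLINE_ARGS=--medvram --opt-split-attention
call webui.bat
```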