AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: CUDA out of memory #10403

Open Rihnami opened 1 year ago

Rihnami commented 1 year ago

Is there an existing issue for this?

What happened?

Changed my GTX 1650 to an RTX 2060 Super and got this error

Steps to reproduce the problem

Try to generate an image

What should have happened?

It should work

Commit where the problem happens

64fc936738d296f5eb2ff495006e298c2aeb51bf

What platforms do you use to access the UI?

Windows

What browsers do you use to access the UI?

No response

Command Line Arguments

--medvram --always-batch-cond-uncond

List of extensions

a1111-sd-webui-locon LDSR Lora ScuNET SwinIR prompt-bracket-checker

Console logs

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 3.33 GiB (GPU 0; 8.00 GiB total capacity; 1.67 GiB already allocated; 4.47 GiB free; 1.73 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Additional information

The newer torch 2.0 builds didn't work consistently for me, so I went back to 64fc936738d296f5eb2ff495006e298c2aeb51bf. But after changing the GPU I got this error. Once, after restarting the PC, it worked, but the next day it didn't. "Tried to allocate 3.33 GiB. 4.47 GiB free"? What?
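For anyone who wants to try the allocator hint from the log, PYTORCH_CUDA_ALLOC_CONF can be set before launch, e.g. in webui-user.bat. A minimal sketch only; the max_split_size_mb value of 512 is an assumption, not something recommended in this thread:

    rem webui-user.bat (sketch; 512 is an example value, tune as needed)
    set PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
    set COMMANDLINE_ARGS=--medvram --always-batch-cond-uncond

    call webui.bat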

s-b-repo commented 1 year ago

Try --lowvram; 8 GB is not that much when we are talking about this kind of application.
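If anyone wants to try that, the flag goes on the COMMANDLINE_ARGS line of webui-user.bat. A sketch, assuming the flags from this report with --medvram swapped for --lowvram:

    set COMMANDLINE_ARGS=--lowvram --always-batch-cond-uncond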

s-b-repo commented 1 year ago

You can also try it on Linux or update your graphics drivers. If that doesn't help, it might be a problem in torch. You could also try using 2 GPUs at once, which might give more than 8 GB.

Rihnami commented 1 year ago

Try --lowvram; 8 GB is not that much when we are talking about this kind of application.

But on the GTX 1650, --medvram worked fine and there were no errors

s-b-repo commented 1 year ago

Yeah, that doesn't make sense. This could be a really hard issue to solve. You could run it with verbose output and post the logs on pastebin.

WilliamPatin commented 1 year ago

It seems to be related to the source image's dimensions not being a multiple of 4. In my case, changing the Width and Height parameters of "Resize to" to the nearest multiples of 4 fixes the issue.

Example

Source file dimensions | Resize to params | Result
700 x 1050             | 700 x 1050       | RAM Error
700 x 1050             | 704 x 1048       | Works
700 x 1050             | 696 x 1048       | RAM Error

Happy calculating!

Sakura-Luna commented 1 year ago

It seems to be related to the source image's dimensions not being a multiple of 4. In my case, changing the Width and Height parameters of "Resize to" to the nearest multiples of 4 fixes the issue.

To avoid problems, you should use multiples of 8.

yi commented 1 year ago

2 GPUs at once might give more than 8 GB

@coolst3r Can you point me in the direction of how to get torch to use two 8 GB GPUs at the same time on Linux? Thank you!

WilliamPatin commented 1 year ago

It seems to be related to the source image's dimensions not being a multiple of 4. In my case, changing the Width and Height parameters of "Resize to" to the nearest multiples of 4 fixes the issue.

To avoid problems, you should use multiples of 8.

You're right, I had a slip there. That's why it didn't work with every multiple of 4.

olakase12345987 commented 1 year ago

I have the same problem on a 2070 Super; the only solution I found was restarting the program to clear the cache.
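As a hedged aside: in plain PyTorch the cached allocator can usually be released without a full restart. A minimal sketch of what "clearing the cache" looks like at the torch level (run inside the Python process; no guarantee it frees enough for the next render):

    import gc
    import torch

    gc.collect()              # drop Python references to finished tensors first
    torch.cuda.empty_cache()  # return unused cached blocks to the driver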

AvisP commented 1 year ago

@CreativeSau how did you set the "Resize to" option? I can't find it in the UI. I'm having the same CUDA out of memory issue; hopefully the multiple-of-8 size helps.

Sakura-Luna commented 1 year ago

Multiples of 64 give the smallest VRAM overhead. Multiples of 8 are all the resolutions the WebUI supports.
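For reference, snapping a source size to the nearest supported multiple can be done with a tiny helper. A sketch only; the helper name is made up, 8 matches what the WebUI accepts and 64 matches the lowest-overhead case mentioned above:

    def round_to_multiple(value, multiple=8):
        """Round value to the nearest positive multiple."""
        return max(multiple, round(value / multiple) * multiple)

    print(round_to_multiple(700), round_to_multiple(1050))          # 704 1048
    print(round_to_multiple(700, 64), round_to_multiple(1050, 64))  # 704 1024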

AvisP commented 1 year ago

Thanks @Sakura-Luna, but I don't see any input panel for the resize option. Can you let me know how to set the input image width and height to a multiple of 64?

Sakura-Luna commented 1 year ago

Thanks @Sakura-Luna, but I don't see any input panel for the resize option. Can you let me know how to set the input image width and height to a multiple of 64?

Just enter the value directly into the corresponding text box.

s-b-repo commented 1 year ago

2 GPUs at once might give more than 8 GB

@coolst3r Can you point me in the direction of how to get torch to use two 8 GB GPUs at the same time on Linux? Thank you!

You can just plug 2 GPUs into your motherboard.
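On the torch side, each card shows up as a separate device and VRAM is not pooled, so a single allocation still has to fit on one card; as far as this thread shows, the webui does not split one generation across GPUs. A minimal sketch, assuming two visible CUDA devices:

    import torch

    print(torch.cuda.device_count())  # expect 2 with both cards visible
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(i, props.name, props.total_memory // 2**20, "MiB")

    # Tensors and models live on one device at a time:
    a = torch.zeros(1024, 1024, device="cuda:0")
    b = torch.zeros(1024, 1024, device="cuda:1")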

wagontrader commented 10 months ago

Ran into the same error using UI v1.6.0 while loading 2 checkpoints at the same time. It failed after all sampling completed, just before sending the final render to the UI. It appears that all my shared VRAM was used and CUDA needed 540 MB, which it tried to get from VRAM even though there was available RAM.

Fixed by checking the option: Only keep one model on device

s-b-repo commented 10 months ago

Why is it called CS:GO 2 instead of just releasing it as an update? Why add 2 to the name?