Closed Azaki9 closed 8 months ago
I wonder if it’s related to the recent driver update that offloads VRAM to system RAM. You can now adjust a setting to prevent this and keep allocations in VRAM.
Direct Link: https://nvidia.custhelp.com/app/answers/detail/a_id/5490
Please find the steps in the troubleshooting guide here: https://github.com/lllyasviel/Fooocus/blob/8e62a72a63b30a3067d1a1bc3f8d226824bd9283/troubleshoot.md#i-am-using-nvidia-with-6gb-vram-i-get-cuda-out-of-memory
I downloaded all the files for Fooocus but whenever I try to input an image for a variation or image prompt, I get this error:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 30.00 MiB. GPU 0 has a total capacty of 6.00 GiB of which 1.57 GiB is free. Of the allocated memory 3.23 GiB is allocated by PyTorch, and 178.84 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
Here's the full log for reference:
Side note: I'm using a Lenovo Legion 5 Pro laptop (16ACH16) with 16 GB of RAM and an RTX 3060, and I noticed that my memory usage spikes at the "moving to GPU" step; nearly 100% of RAM is used.
I tried to search for how to change this
max_split_size_mb
setting but couldn't find anything useful. Excuse my lack of knowledge; I barely know how to write HTML.
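For anyone else hitting this, a minimal sketch of how the error message's suggestion can be applied: PyTorch reads the PYTORCH_CUDA_ALLOC_CONF environment variable at startup, so you set it before launching Fooocus. The value 512 below is only an illustrative choice, not an official recommendation, and the Windows line references the stock run.bat launcher.

```shell
# Sketch, assuming a Linux/WSL shell. On Windows, add an equivalent
#   set PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
# line to run.bat before the python call.
# max_split_size_mb:512 is an illustrative value, not an official recommendation.
export PYTORCH_CUDA_ALLOC_CONF="max_split_size_mb:512"
echo "$PYTORCH_CUDA_ALLOC_CONF"   # confirm the variable is set for child processes
# python entry_with_update.py     # then launch Fooocus as usual
```

Note that this only reduces fragmentation of already-allocated memory; if the model genuinely does not fit in 6 GB of VRAM, the low-VRAM flags in the troubleshooting guide linked above are the more likely fix.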