**Closed** — Ratinod closed this issue 2 months ago
As of the latest commit `7596566a8972c325fddfbd9587ed25258fe4e022`, it is no longer possible to use SUPIR with 8 GB of VRAM.

```
torch.cuda.OutOfMemoryError: Allocation on device 0 would exceed allowed memory. (out of memory)
Currently allocated : 7.28 GiB
Requested           : 1.56 MiB
Device limit        : 8.00 GiB
Free (according to CUDA) : 0 bytes
PyTorch limit (set by user-supplied memory fraction) : 17179869184.00 GiB
Exception: Failed to load SDXL model
```
If you go back to commit `49afedaddde27ca1c2092abccb5930a10112d5f0`, everything works as before.
Ok, I've made the change optional. It is off by default, so weights load into RAM the way they used to. The reason for the change was to speed up the very slow model loading time.
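For context, a minimal sketch of the two loading strategies being toggled here, assuming PyTorch (the file path and option name are illustrative, not SUPIR's actual code): staging checkpoint weights in system RAM via `map_location="cpu"` keeps VRAM free during load, while mapping straight to the GPU is faster but front-loads VRAM use, which is what can exhaust an 8 GB card.

```python
import io
import torch

# Stand-in "checkpoint" held in memory; a real one would be SUPIR's SDXL file on disk.
buffer = io.BytesIO()
torch.save({"weight": torch.randn(4, 4)}, buffer)
buffer.seek(0)

# Default (old) behavior: load weights into system RAM first.
state_dict = torch.load(buffer, map_location="cpu")
print(state_dict["weight"].device.type)  # -> cpu

# The optional fast path would instead map directly onto the GPU, e.g.:
#   state_dict = torch.load(checkpoint_path, map_location="cuda")
# trading load speed for immediate VRAM pressure.
```

With the option off, the GPU transfer happens later (per-module), so peak VRAM during loading stays low.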
Problem solved.