[Closed] LL020202 closed 5 months ago
I used the SD 1.5 model in ComfyUI. Without stable-fast, inference consumed about 5.5 GB of VRAM; with stable-fast enabled, it rose to 6 GB or more.
You can disable CUDA graph if your GPU's VRAM isn't sufficient. CUDA graphs capture and replay the whole inference graph, which speeds up launches but holds extra memory for the captured buffers.
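For reference, a minimal sketch of how this looks with stable-fast's Python API (the ComfyUI node exposes an equivalent `enable_cuda_graph` toggle in its options; `pipe` here stands in for whatever diffusers pipeline or model you are compiling):

```python
from sfast.compilers.diffusion_pipeline_compiler import (
    compile,
    CompilationConfig,
)

config = CompilationConfig.Default()
# Turn off CUDA graph capture to reduce VRAM held by stable-fast,
# at the cost of some launch-overhead speedup.
config.enable_cuda_graph = False

# pipe is your loaded SD 1.5 pipeline (placeholder name).
compiled_pipe = compile(pipe, config)
```

This is a configuration sketch, not a drop-in script; the trade-off is slightly slower kernel launches in exchange for a smaller resident memory footprint.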