Mixomo opened 1 month ago
I have the same problem with my 3090: Cogstudio only uses less than 5 GB of VRAM. How can I fix it? Thanks.
Yes. I have a 4090. I want to use all the VRAM available to speed up processing. It would be good if there were a VRAM limit option in the UI.
Yes, same issue: a lot of unused VRAM and CPU cores.
Happy with this script. It's the way, do it yourself at home :) Same issue: only 2-6 GB of VRAM used on my RTX 3060 12 GB.
Easy install with Pinokio. But note that this version is optimized for 8 GB VRAM cards, so there must be parameters somewhere... A sample at default settings (50 steps / guidance 6) took 1:15 to render. :( But a very good result... with very low VRAM 👍
https://github.com/user-attachments/assets/5b83ee26-6e7b-4925-a1a6-146a7c714473
Any updates on this issue?
I don't think so.
Does anyone know of a separate fork that uses all of my GPU's power? The Pinokio install is fast and easy, but it's really slow at this point since it isn't using all my VRAM.
This is ridiculous: why does it only use 3-6 GB for you, while the 8 GB of my 2070 Super isn't enough?
Is it possible to use all the VRAM of my GPU (I have 24 GB) to speed up generation? Even though I check the option to disable CPU offload, it only uses 2 or 3 GB of VRAM.
Thanks in advance.
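For context on why VRAM usage stays so low: CPU offload keeps only the currently active model component on the GPU and parks the rest in system RAM, which caps VRAM at a few GB at the cost of speed. Whether Cogstudio's "disable CPU offload" checkbox actually takes effect is the open question in this thread. Below is a minimal sketch of the underlying trade-off; the `should_offload` helper and the GB figures are our own illustration, not part of Cogstudio.

```python
# Sketch: deciding between CPU offload (low VRAM, slow) and keeping the
# whole pipeline resident on the GPU (high VRAM, fast).
# The helper name, threshold, and example sizes are hypothetical.

def should_offload(free_vram_gb: float, pipeline_vram_gb: float,
                   headroom_gb: float = 1.0) -> bool:
    """Return True if the full pipeline would not fit in free VRAM,
    so CPU offload is required; False if it can stay on the GPU."""
    return free_vram_gb < pipeline_vram_gb + headroom_gb

# A ~12 GB pipeline on a 24 GB RTX 4090: no offload needed.
print(should_offload(24.0, 12.0))  # False -> keep everything on the GPU

# The same pipeline on an 8 GB RTX 2070S: offload is unavoidable.
print(should_offload(8.0, 12.0))   # True -> stream components through VRAM
```

With a Hugging Face `diffusers`-style pipeline, this roughly corresponds to calling `pipe.enable_sequential_cpu_offload()` (or `pipe.enable_model_cpu_offload()`) for the low-VRAM path versus a plain `pipe.to("cuda")` to keep the whole model on the GPU; if the UI toggle is being ignored, the pipeline is likely still taking the offload path regardless of the checkbox.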