[Open] merket opened this issue 1 month ago

I was so excited to see that this app would run with low VRAM, under 10 GB. But I believe we cannot use it unless the GPU is the RTX architecture, right? Can somebody confirm or deny this? When I try to run it, I get an instant "can not allocate 35GB" error on my 8 GB card. Somebody on Discord said this is an issue related to not having RTX.
I don't know, I have an RTX 3090 and it's working fine.
But the 3090 has 24 GB of VRAM, and this only takes less than 5 GB.
I believe you wanted to show that it only uses 5 GB, so it should have worked on mine as well. But as I mentioned in my OP, someone on Discord said that on GPUs without RTX it tries to allocate the wrong amount of VRAM. So if I had an RTX GPU with only 8 GB, maybe it would have tried to allocate only 3-5 GB, but without the RTX architecture the script goes bonkers.
I'm using ComfyUI, so I'm not sure if it applies here, but do you have CPU offloading, tiling, and slicing enabled?
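In case CogStudio drives the same diffusers pipeline underneath (an assumption on my part, I haven't checked its source), this is roughly what those three switches look like there; the model id, dtype, and prompt are just placeholder choices:

```python
# Sketch of the three memory savers I meant, assuming a diffusers
# CogVideoXPipeline underneath (adjust to whatever CogStudio builds).
import torch
from diffusers import CogVideoXPipeline

pipe = CogVideoXPipeline.from_pretrained(
    "THUDM/CogVideoX-2b", torch_dtype=torch.float16
)

# Keep weights in system RAM and stream them to the GPU piece by piece;
# this is what brings peak VRAM down to a few GB, at the cost of speed.
pipe.enable_sequential_cpu_offload()

# Decode latents in tiles/slices instead of one huge tensor, avoiding
# the big VAE-decode allocation at the end of a run.
pipe.vae.enable_tiling()
pipe.vae.enable_slicing()

video = pipe(prompt="a drone shot over a forest", num_frames=49).frames[0]
```

With all three enabled, the VAE decode stops being the peak allocation, which is why a figure like 5 GB is plausible at all.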
Yes, even on my 3090 with 24 GB of VRAM, it still uses less than 5 GB.
Well, thank you for your input, but as I described, there seems to be an allocation problem with cards that do not have the RTX architecture, and that is the whole problem. Even if the job needs less than 8 GB, it tries to allocate much more VRAM than needed on older cards.
At least, that is what they told me on Discord. I was just trying to confirm it, because I would very much like to be able to use CogStudio on my non-RTX, low-VRAM card as well.
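If anyone wants to help confirm the allocation claim, a check like this around a single generation would show the real numbers; it's plain PyTorch, nothing CogStudio-specific:

```python
# Print current and peak CUDA memory for device 0, in GiB, so we can
# compare what RTX and non-RTX cards actually request.
import torch

def report_vram(tag: str) -> None:
    gib = 1024 ** 3
    print(
        f"[{tag}] allocated={torch.cuda.memory_allocated() / gib:.2f} GiB, "
        f"reserved={torch.cuda.memory_reserved() / gib:.2f} GiB, "
        f"peak={torch.cuda.max_memory_allocated() / gib:.2f} GiB"
    )

torch.cuda.reset_peak_memory_stats()
report_vram("before")
# ... run one generation here ...
report_vram("after")
```

If the peak on a non-RTX card really jumps toward the 35 GB from my error message while an RTX card stays under 5 GB, that would confirm it.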
I suggest they add an option to set what percentage of the graphics card's VRAM to use, or an absolute amount, so that everyone can configure it for their own card.
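If the app is PyTorch-based (which I'm assuming), there is already a per-process hook such a setting could be wired to; a rough sketch, where the 0.8 is only an example value:

```python
# Rough sketch of a user-configurable VRAM cap: limit this process to a
# fraction of the card's total memory. The fraction would come from a
# hypothetical user setting, not from any existing CogStudio option.
import torch

vram_fraction = 0.8  # example value, e.g. read from a settings file
if torch.cuda.is_available():
    torch.cuda.set_per_process_memory_fraction(vram_fraction, device=0)
    # Allocations beyond the cap now fail with an OOM error instead of
    # silently trying to grab more than the user allowed.
```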
I have an RTX 2070S, and the part about only 5 GB doesn't hold up for me: it crashes with a CUDA out-of-memory error.