FurkanGozukara closed this 1 week ago
Fixed an error in the code. It should work up to 33.
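For readers unfamiliar with block swapping, here is a minimal conceptual sketch (not the actual kohya-ss implementation; all names such as `SimpleBlockSwapper` and `blocks_to_swap` are illustrative) of why the swappable count has an upper limit: some blocks must stay resident on the GPU, so the usable maximum is less than the total number of blocks.

```python
# Conceptual sketch only: keep a subset of transformer blocks on the GPU and
# park the rest in CPU RAM, moving each one in on demand during the forward pass.
import torch
import torch.nn as nn


class SimpleBlockSwapper:
    def __init__(self, blocks: nn.ModuleList, blocks_to_swap: int, device: torch.device):
        # At least one block has to remain resident on the GPU, so the usable
        # maximum is len(blocks) - 1; asking for more is rejected, which is the
        # kind of limit being discussed in this thread.
        max_swappable = len(blocks) - 1
        if blocks_to_swap > max_swappable:
            raise ValueError(
                f"blocks_to_swap={blocks_to_swap} exceeds maximum {max_swappable}"
            )
        self.blocks = blocks
        self.blocks_to_swap = blocks_to_swap
        self.device = device
        # Blocks at the tail of the list start out on the CPU.
        for i, block in enumerate(blocks):
            block.to("cpu" if i >= len(blocks) - blocks_to_swap else device)

    def forward_block(self, idx: int, x: torch.Tensor) -> torch.Tensor:
        block = self.blocks[idx]
        on_cpu = next(block.parameters()).device.type == "cpu"
        if on_cpu:
            block.to(self.device)   # bring the needed block onto the GPU
        out = block(x)
        if on_cpu:
            block.to("cpu")         # evict it again to free VRAM for the next block
        return out
```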
You are the man, I will test it, thank you so much.
@kohya-ss any chance we could force the limit higher?
I tested and it works great with 33 blocks.
And with CPU offloading enabled, 8 GB GPUs can now also train - it uses 7 GB of VRAM.
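To make the VRAM saving concrete, here is a rough, self-contained illustration (sizes and counts are made up, and this is not the real training loop) of why offloading lets a small card train: parameters live in CPU RAM and only one block at a time is resident on the GPU.

```python
# Illustration only: walk through 32 CPU-resident blocks, loading each onto the
# GPU just long enough to run it, then evicting it again.
import torch
import torch.nn as nn

device = torch.device("cuda")
blocks = nn.ModuleList([nn.Linear(4096, 4096) for _ in range(32)]).to("cpu")

x = torch.randn(2, 4096, device=device)
for block in blocks:
    block.to(device)        # load the single active block
    x = block(x)
    block.to("cpu")         # evict it before the next one
    torch.cuda.empty_cache()

# Peak GPU memory now reflects one block plus activations, not all 32 blocks.
print(f"peak VRAM: {torch.cuda.max_memory_allocated() / 1024**2:.1f} MiB")
```

The trade-off is extra CPU-GPU transfers per step, which is why the iteration times reported below rise as more blocks are swapped.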
When can you merge it into the main branch?
I think this issue is fixed.
Yes, it is working great, thank you so much.
First of all @kohya-ss, such amazing work.
My config that previously ran at 10.2 seconds/it with 23.1 GB of VRAM now runs at 7.08 seconds/it and uses 21.8 GB.
My config that previously ran at 13.8 seconds/it with 15.1 GB is now at 9.06 seconds/it and still uses the same VRAM.
Now the issue: after this change it can swap a maximum of 28 blocks.
Previously the limit was higher, so it was able to train even with 36 blocks swapped and on GPUs with as little as 6 GB.
Now I get this error with a 29-block swap.