Closed mediocreatmybest closed 1 year ago
Awesome. I'll update and give this a shot.
I was hoping that flash attention could make training with prior-preservation loss work on my 12 GB VRAM GPU, but I am still getting out-of-memory errors...
Has anyone had more luck?
> I was hoping that flash attention could make training with prior-preservation loss work on my 12 GB VRAM GPU, but I am still getting out-of-memory errors...
> Has anyone had more luck?
What model version are you using? I don't think it's possible yet with SD 2.x; even with flash attention it still blows past 12 GB. I've tried a few projects with the same result.
I was only using SD 1.4 and 1.5... staying away from 2.x ;) But even with those models I'm getting out-of-memory errors...
Since you already support xFormers, it would be great to see support for flash attention as well, as it can help lower the VRAM requirements.
For example, the sd_dreambooth extension now has flash attention, which enables fine-tuning on lower-end GPUs.
https://github.com/HazyResearch/flash-attention
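For anyone wondering why flash attention saves memory: the trick is that it never materializes the full N×N attention matrix. It processes keys/values in blocks and keeps a running max and softmax denominator (the "online softmax"), so peak memory scales with the block size instead of the sequence length squared. A minimal NumPy sketch of the idea (single head, no masking; block size and shapes here are just illustrative assumptions, not the actual FlashAttention kernel):

```python
import numpy as np

def naive_attention(q, k, v):
    # Materializes the full (N, N) score matrix -- O(N^2) memory.
    s = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def blocked_attention(q, k, v, block=64):
    # Flash-attention-style tiling: iterate over key/value blocks,
    # maintaining a running row max (m) and softmax denominator (l),
    # so only an (N, block) slice of scores exists at any time.
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(q)
    m = np.full(n, -np.inf)   # running row max
    l = np.zeros(n)           # running softmax denominator
    for start in range(0, k.shape[0], block):
        kb = k[start:start + block]
        vb = v[start:start + block]
        s = q @ kb.T * scale                      # (n, block) scores only
        m_new = np.maximum(m, s.max(axis=-1))
        correction = np.exp(m - m_new)            # rescale old partial sums
        p = np.exp(s - m_new[:, None])
        out = out * correction[:, None] + p @ vb
        l = l * correction + p.sum(axis=-1)
        m = m_new
    return out / l[:, None]

rng = np.random.default_rng(0)
q = rng.standard_normal((256, 32))
k = rng.standard_normal((256, 32))
v = rng.standard_normal((256, 32))
# The tiled version is mathematically exact, not an approximation:
assert np.allclose(naive_attention(q, k, v), blocked_attention(q, k, v), atol=1e-6)
```

Both functions return the same result; the memory saving, not the math, is what changes. The actual library implements this as a fused CUDA kernel, which is where the speedup comes from on top of the memory reduction.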
Thanks!