OneTrainer has a feature that lets you train all layers of a LoRA/DoRA; do we have a similar option?
If not, is it something we could consider supporting?
My understanding is that training all layers, combined with DoRA, leads to higher overall quality.