Closed — thavocado closed this issue 11 months ago
Have you added `decouple` as true in the optimizer arguments?
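For context, with kohya's training scripts (which this trainer wraps), optimizer keyword arguments are passed through `--optimizer_args`. A rough sketch of what that invocation could look like — the script name and the rest of the flags are placeholders depending on your setup:

```shell
# Sketch, assuming kohya-ss sd-scripts' train_network.py is the backend.
# decouple=True switches DAdaptAdam to AdamW-style decoupled weight decay.
accelerate launch train_network.py \
  --optimizer_type "DAdaptAdam" \
  --optimizer_args "decouple=True" \
  # ...plus your usual model, dataset, and output arguments
```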
Thanks for the reply.
I didn’t set that option at all.
Could that be the cause of the problem?
Yeah, I know someone who tried using DAdapt without that argument and got a similar problem.
Kohya has default arguments for each optimizer, but this trainer uses the same ones for every optimizer, so you need to apply the needed ones yourself.
Thanks a lot! I’ll give it a test tomorrow!
no problem :) let me know if your issue still persists
I know my scripts work fine with DAdapt, though I personally don't use it. All my UI does is pass args to kohya, do some validation, and generally try to make things a bit easier for people. Somebody else also stated just yesterday that it works, so it's quite possibly down to your args.
So it turns out I was using DAdaptAdam, and setting decouple to true absolutely produces working LoRAs now! Woo! Thank you so much for your help! ❤️🥳
Alright, I'm going to close this issue now, then. I should probably force decouple to be on for dadapt so people don't need to worry about it.
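The idea of forcing `decouple` on for DAdapt optimizers could be sketched like this — a hypothetical helper, not the trainer's actual code; the function name and argument shapes are made up for illustration:

```python
def build_optimizer_args(optimizer_type: str, user_args: dict) -> dict:
    """Merge user-supplied optimizer kwargs, defaulting decouple=True
    for any DAdapt* optimizer unless the user set it explicitly."""
    args = dict(user_args)  # copy so the caller's dict is untouched
    if optimizer_type.lower().startswith("dadapt"):
        # setdefault keeps an explicit user choice, only fills the gap
        args.setdefault("decouple", True)
    return args

print(build_optimizer_args("DAdaptAdam", {}))
print(build_optimizer_args("DAdaptAdam", {"decouple": False}))
print(build_optimizer_args("AdamW", {}))
```

Using `setdefault` rather than overwriting means a user who deliberately wants coupled weight decay can still opt out.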
Great! Nice that you solved it. And yes, I also think enabling decouple by default for DAdapt would be a good thing, so people aren't confused and there aren't a lot of issues opened here because of it :D
(Original report: no problems with Adam, but the output LoRAs had no effect on the generated image. No such issues with Kohya, using default reasonable settings.)