Closed neojam closed 3 months ago
Hi there, 2 points.
1. If "lr_scheduler_type" is REX, it should be fine.
2. The "ss_optimizer" you posted starts with "came_pytorch", which is the library we used in our modification. The library that Derrian's implementation uses is "pytorch_optimizer".
I hope this helps you understand why it's different. If you have any other questions, don't hesitate to ask :)
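In meta terms, the two implementations can be told apart just by the module prefix of the "ss_optimizer" string. A throwaway sketch (the helper name and return strings are made up; only the two package prefixes come from this thread):

```python
def came_source(ss_optimizer: str) -> str:
    """Guess which CAME implementation produced an ss_optimizer meta string,
    based on the module prefix it starts with."""
    if ss_optimizer.startswith("came_pytorch."):
        return "came_pytorch (the standalone CAME package, used by the modified trainer)"
    if ss_optimizer.startswith("pytorch_optimizer."):
        return "pytorch_optimizer (the library Derrian's implementation uses)"
    return "unknown"
```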
Hi thanks for the info. That sounds both interesting and confusing ^^'. So is there another fork of this trainer?
Here is the concept lora that works pretty well, so I wanted to try out the settings for my bakes: https://civitai.com/models/332691/broken-glass-breaking-glass-or-concept-lora-xl
Yup, FallenIncursio used (or still uses) the modified version of the trainer. But don't worry, the dev branch uses REX properly, and the CAME from pytorch_optimizer should be better than the one from came_pytorch.
ok, thanks for the explanation.
IMHO the "lr_scheduler" in the dev branch still needs a change to reflect the fact that REX was used in "lr_scheduler_type". As it currently is (set to the previously selected scheduler in the toml and meta), it seems wrong and will give false info about which scheduler was really used during training to people who check the meta and want to try the settings.
Derrian doesn't want to modify the sd-scripts code; unless there's a way to add that without touching its code, it won't be implemented.
AFAIK, the meta can easily be changed post-training, so maybe use that as a workaround to change "ss_lr_scheduler" to "REX" (and maybe later for other things as well, if more non-standard features get implemented). But for now, I guess the issue can be closed.
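For what it's worth, a post-training patch like that doesn't need any trainer changes: the safetensors layout is an 8-byte little-endian header length, then that many bytes of JSON (which holds the "__metadata__" dict), then the raw tensor data, and the tensor offsets are relative to the data section, so rewriting the header is safe. A minimal stdlib-only sketch (the function name is made up, and it assumes a well-formed file):

```python
import json
import struct

def patch_safetensors_metadata(path: str, key: str, value: str) -> None:
    """Rewrite one "__metadata__" entry of a .safetensors file in place."""
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]  # u64 LE header size
        header = json.loads(f.read(header_len))         # JSON header
        tensor_data = f.read()                          # everything after it

    # e.g. key="ss_lr_scheduler", value="REX"
    header.setdefault("__metadata__", {})[key] = value

    new_header = json.dumps(header, separators=(",", ":")).encode("utf-8")
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(new_header)))
        f.write(new_header)
        f.write(tensor_data)
```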
I don't think it's a worthwhile use of my time to reinvent the wheel, so I'll be closing this issue now.
Hi there,
I just wanted to try out the CAME optimizer + REX scheduler, but it looks like the REX scheduler is not being set correctly (according to the toml and meta of the lora; I'm on the dev branch, btw).
"lr_scheduler" in toml file is always being set to whatever scheduler you choose in the scheduler dropdown menu, before selecting it to "rex". So if you selected "polynomial" and then selected the "rex", the settings are saved as following in the toml-file
If you bake the lora, the meta will say
I got interested in CAME+REX (and so in this project, since apparently other kohya forks don't have that) after checking the meta of a few well-trained loras on civitai. They have the following in the meta:
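The quoted meta itself is missing here; going only by what the thread says (that "ss_optimizer" starts with "came_pytorch"), it would look roughly like this, with the argument string elided:

```json
{ "ss_optimizer": "came_pytorch.CAME.CAME(...)" }
```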
But I can't match the settings, since I can't set the scheduler to "REX".
Any idea what's going on?
Also, maybe as a side request: it would be great if there were a way to set the settings directly from a lora's metadata (a lora-meta-to-toml tool, or just directly opening a lora in the LoRA_Easy_Training_Scripts GUI and having most settings already set).