bmaltais / kohya_ss


when loading saved Preset, Learning Rate always auto-fills #2576

Open rafstahelin opened 3 months ago

rafstahelin commented 3 months ago

Hi @bmaltais I never use the top-most Learning Rate field. For me the text encoder LR is always 1/10th of the U-Net LR. I save my presets, but the Learning Rate field always auto-fills on load and throws off my separate LRs. Is there a way to change this? It is very distracting.

image


Also, when using a preset with a custom SDXL model and a saved SDXL path, loading the preset makes me lose the v2/v_/SDXL checkboxes. I have to choose a SAI model, then re-paste the original custom model path for the checkboxes to reappear. This is unfortunate, as I always train SDXL, and it is not possible to save a custom SDXL model path and keep the SDXL checkbox ticked.

image

Thank you dear sir!

bmaltais commented 3 months ago

Hummm, thank you for reporting. I will check what is going on with those.

bmaltais commented 3 months ago

I just tested loading a json config and everything appears to be right for the SDXL checkbox. You mention presets... you mean you are using user presets and it is failing with those? Let me try that. It also appears to work fine for an SDXL model retaining the SDXL checkbox. Perhaps I don't understand the issue... but I don't think I can reproduce it.

As far as the learning_rate field goes, it needs to be set to a float value; it can't really be empty. The original sd-scripts trainer used only this input... but after the te and unet learning rate inputs were added it sort of became legacy. If you provide values for the te and unet learning rates, those will be used instead of the legacy learning_rate value... so whatever it is set to is irrelevant... Let me check the code just in case...
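As a rough illustration of that precedence, here is a minimal Python sketch with hypothetical names; it is not the actual kohya_ss code, just the behavior described above:

```python
def resolve_learning_rates(config: dict) -> dict:
    """Hypothetical helper: when the per-component rates are provided,
    the legacy top-level learning_rate is effectively ignored."""
    legacy_lr = float(config.get("learning_rate", 0.0))
    te_lr = config.get("text_encoder_lr")
    unet_lr = config.get("unet_lr")

    if te_lr is not None and unet_lr is not None:
        # The separate rates win; the legacy value is irrelevant here.
        return {"text_encoder_lr": float(te_lr), "unet_lr": float(unet_lr)}

    # Fall back to the legacy single rate for both components.
    return {"text_encoder_lr": legacy_lr, "unet_lr": legacy_lr}
```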

OK... for LoRA I have implemented code that will not pass the legacy learning_rate if it is set to 0.
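In spirit, the guard might look something like the following. This is a minimal sketch with a hypothetical helper name, not the actual patch; the flags shown (--learning_rate, --unet_lr, --text_encoder_lr) are the sd-scripts options these fields map to for LoRA training:

```python
def build_trainer_args(config: dict) -> list[str]:
    """Assemble sd-scripts CLI arguments from a GUI config (sketch)."""
    args: list[str] = []

    legacy_lr = float(config.get("learning_rate", 0.0))
    if legacy_lr != 0.0:
        # Only forward the legacy rate when it is actually set; a value
        # of 0 now means "let the te/unet rates stand on their own".
        args += ["--learning_rate", str(legacy_lr)]

    if config.get("unet_lr"):
        args += ["--unet_lr", str(config["unet_lr"])]
    if config.get("text_encoder_lr"):
        args += ["--text_encoder_lr", str(config["text_encoder_lr"])]

    return args
```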

I am not sure this will address your issue... but let's start with that. I will push the code update to the dev branch soon.

bmaltais commented 3 months ago

OK, it is merged... I hope this change will not cause more issues for other users... Worst case I will revert to what it used to be and you might need to endure the issue until a better solution can be implemented... but at the moment I don't have any idea... because this field is used by other trainers and does not have the same "meaning" as in LoRA... hence the complexity...

rafstahelin commented 3 months ago

> OK, it is merged... I hope this change will not cause more issues for other users... Worst case I will revert to what it used to be and you might need to endure the issue until a better solution can be implemented... but at the moment I don't have any idea... because this field is used by other trainers and does not have the same "meaning" as in LoRA... hence the complexity...

Ok got it. Will test in dev. Thanks! 🙏

rafstahelin commented 3 months ago

@bmaltais the changes seem to be working well for me. Thanks, Bernard!