kijai / ComfyUI-FluxTrainer

Apache License 2.0

Prodigy optimizer and weight_decay #15

Closed · bananasss00 closed this issue 2 months ago

bananasss00 commented 2 months ago

Thank you for your work. Could you please add a Prodigy config, similar to how it is done for Adafactor? Its arguments are listed here: https://pytorch-optimizers.readthedocs.io/en/latest/optimizer/#pytorch_optimizer.Prodigy
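For context, a minimal sketch of instantiating Prodigy with the arguments from that page. The keyword names below are taken from the linked pytorch-optimizer docs but should be double-checked, since that port and the original prodigyopt package spell some of them differently (e.g. weight_decouple vs. decouple):

```python
# Hedged sketch: instantiating Prodigy directly from the pytorch-optimizer
# package linked above. Verify keyword names against the docs before wiring
# them into a node.
import torch
from pytorch_optimizer import Prodigy

model = torch.nn.Linear(16, 16)  # stand-in for the LoRA parameters

optimizer = Prodigy(
    model.parameters(),
    lr=1.0,                 # Prodigy adapts the step size itself; lr acts as a multiplier
    betas=(0.9, 0.999),
    weight_decay=0.01,      # the parameter this issue asks to expose
    d0=1e-6,                # initial estimate of the adapted step size
    d_coef=1.0,             # scales the adapted step size
    safeguard_warmup=True,  # often recommended when a warmup schedule is used
)
```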

1) Alternatively, could you add a text field for optimizer_args to the "Optimizer Config" node, similar to the additional_args field on the "Init Flux LoRA Training" node? (See the parsing sketch after this list.)
2) I am also interested in the weight_decay parameter. It would be great if you could expose it for Adafactor and in the "Optimizer Config" node as well.
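For item 1, a field like that typically maps to kohya-style optimizer_args, i.e. a list of key=value strings that get parsed into keyword arguments for the optimizer. A minimal sketch of such parsing, with a hypothetical helper name:

```python
import ast

def parse_optimizer_args(arg_strings):
    """Hypothetical helper: turn ["weight_decay=0.01", "d_coef=2.0"] into kwargs."""
    kwargs = {}
    for arg in arg_strings:
        key, _, value = arg.partition("=")
        try:
            # literal_eval handles numbers, booleans, tuples, etc.
            kwargs[key.strip()] = ast.literal_eval(value.strip())
        except (ValueError, SyntaxError):
            kwargs[key.strip()] = value.strip()  # fall back to a raw string
    return kwargs

print(parse_optimizer_args(["weight_decay=0.01", "safeguard_warmup=True"]))
# -> {'weight_decay': 0.01, 'safeguard_warmup': True}
```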

kijai commented 2 months ago

[screenshot]

This should do it? I think I should separate the lr_scheduler settings into its own node too...
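Assuming the new field accepts the same space-separated key=value syntax as kohya's optimizer_args (an assumption; the screenshot above would confirm the exact format), the weight_decay request could then be covered with something like:

```
weight_decay=0.01 d_coef=2.0
```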

bananasss00 commented 2 months ago

> This should do it? I think I should separate the lr_scheduler settings into its own node too...

Thank you