1) Alternatively, could you add an extra text field for optimizer_args to the "Optimizer Config" node, similar to the additional_args field on the "Init Flux LoRA Training" node? (A rough sketch of what I mean follows after point 2.)
2) I am also interested in the weight_decay parameter. It would be great if you could expose it for Adafactor and in the "Optimizer Config" node as well.
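For illustration only, here is a minimal sketch of how such a free-text optimizer_args field could be parsed into keyword arguments and forwarded to the optimizer constructor. The helper name and parsing rules are hypothetical, not the node's actual API:

```python
import ast

def parse_optimizer_args(text: str) -> dict:
    """Hypothetical helper: turn 'weight_decay=0.01 d_coef=1.0' into kwargs."""
    kwargs = {}
    for pair in text.split():
        key, _, value = pair.partition("=")
        try:
            # literal_eval handles numbers, booleans, tuples, etc.
            kwargs[key] = ast.literal_eval(value)
        except (ValueError, SyntaxError):
            # fall back to the raw string for anything non-literal
            kwargs[key] = value
    return kwargs

print(parse_optimizer_args("weight_decay=0.01 d_coef=1.0"))
# -> {'weight_decay': 0.01, 'd_coef': 1.0}
```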
Thank you for your work. Could you please add a Prodigy config, similar to how it is done for Adafactor? Its arguments are documented here: https://pytorch-optimizers.readthedocs.io/en/latest/optimizer/#pytorch_optimizer.Prodigy
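For reference, a minimal sketch of constructing Prodigy from the linked pytorch-optimizer package with weight_decay set. Only the keyword names shown here are assumed; please verify them against the linked documentation:

```python
import torch
from pytorch_optimizer import Prodigy  # per the docs linked above

# Dummy parameter list just to make the example self-contained.
params = [torch.nn.Parameter(torch.zeros(4))]

# Assumed keyword names: lr, weight_decay, d0, d_coef (check the docs).
optimizer = Prodigy(params, lr=1.0, weight_decay=0.01, d0=1e-6, d_coef=1.0)
```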