When using the "6.1. Visualize loss graph (Optional)" section to watch the learning rate change, I found that the reported d*lr value stays constant.
I'm not sure whether DAdaptation is actually working. Here are some of my parameters:
```toml
[additional_network_arguments]
no_metadata = false
unet_lr = 1.0
text_encoder_lr = 0.5
network_module = "networks.lora"
network_dim = 128
network_alpha = 64
network_train_unet_only = false
network_train_text_encoder_only = false
[optimizer_arguments]
min_snr_gamma = 5
optimizer_type = "DAdaptation"
learning_rate = 1
max_grad_norm = 1.0
optimizer_args = [ "decouple=True", "weight_decay=0.01", "betas=0.9,0.999",]
lr_scheduler = "constant"
lr_warmup_steps = 0
```
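To check whether DAdaptation's `d` estimate is moving at all, independent of the loss-graph UI, you can log it straight from the optimizer. Below is a minimal sketch, assuming the standalone `dadaptation` package and a toy model (the model, data, and loop here are placeholders, not the sd-scripts trainer). `DAdaptAdam` stores its current estimate in each param group under the `'d'` key, and the effective step size is `d * lr`:

```python
# Minimal sketch: log DAdaptation's d*lr each step to see if d adapts.
# Assumes `pip install dadaptation torch`; the Linear model and random
# data are stand-ins for an actual training setup.
import torch
from dadaptation import DAdaptAdam

model = torch.nn.Linear(10, 1)
optimizer = DAdaptAdam(
    model.parameters(),
    lr=1.0,                 # learning_rate = 1, as in the config above
    betas=(0.9, 0.999),
    weight_decay=0.01,
    decouple=True,
)

for step in range(100):
    x = torch.randn(32, 10)
    loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # DAdaptation keeps its running estimate in the param group as 'd';
    # with a constant scheduler, any change in d*lr must come from d.
    group = optimizer.param_groups[0]
    print(f"step {step}: d*lr = {group['d'] * group['lr']:.3e}")
```

Since `learning_rate = 1` and `lr_scheduler = "constant"` keep the `lr` factor fixed, a flat d*lr line means `d` itself has stopped changing; `d` typically ramps up during the first steps and then plateaus once DAdaptation settles on its estimate, so a constant value later in training is not by itself a sign that the optimizer is broken.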