Linaqruf / kohya-trainer

Adapted from https://note.com/kohya_ss/n/nbf7ce8d80f29 for easier cloning

Lr never changes when using D-Adaptation Adam #328

Open TsGrolken opened 10 months ago

TsGrolken commented 10 months ago

When I used the "6.1. Visualize loss graph (Optional)" section to watch the learning rate change, I found that the plotted d*lr value is always the same. (screenshot: 20240110222058)

Not sure if it is actually working or not; here are some of my parameters:

```toml
[additional_network_arguments]
no_metadata = false
unet_lr = 1.0
text_encoder_lr = 0.5
network_module = "networks.lora"
network_dim = 128
network_alpha = 64
network_train_unet_only = false
network_train_text_encoder_only = false

[optimizer_arguments]
min_snr_gamma = 5
optimizer_type = "DAdaptation"
learning_rate = 1
max_grad_norm = 1.0
optimizer_args = [ "decouple=True", "weight_decay=0.01", "betas=0.9,0.999",]
lr_scheduler = "constant"
lr_warmup_steps = 0
```
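For anyone checking the same thing outside the trainer, here is a minimal standalone sketch (not kohya-trainer's own code) of how the effective learning rate can be inspected. It assumes the `dadaptation` package, whose `DAdaptAdam` keeps the adapted scale under the `"d"` key of each param group; with D-Adaptation the configured `lr` stays at 1.0 and `d` is what moves, so a flat `d*lr` means `d` itself is not growing.

```python
# Minimal sketch: watch D-Adaptation's effective learning rate (d * lr).
# Assumes the `dadaptation` package stores the adapted scale as
# param_groups[i]["d"]; this is an assumption about its internals.
import torch
from dadaptation import DAdaptAdam

model = torch.nn.Linear(10, 1)
optimizer = DAdaptAdam(
    model.parameters(),
    lr=1.0,                 # with D-Adaptation, lr is a multiplier; d adapts
    decouple=True,          # decoupled weight decay, as in the config above
    weight_decay=0.01,
    betas=(0.9, 0.999),
)

for step in range(100):
    loss = model(torch.randn(8, 10)).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    group = optimizer.param_groups[0]
    # d starts at a tiny initial value (d0) and may take many steps to grow,
    # so d*lr can look constant early in training.
    print(f"step {step}: d*lr = {group['d'] * group['lr']:.3e}")
```

If `d*lr` stays flat for the entire run rather than just the first steps, the adapted scale may be stuck at its initial `d0`, which is worth ruling out before assuming the logging itself is broken.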