Linaqruf / kohya-trainer

Adapted from https://note.com/kohya_ss/n/nbf7ce8d80f29 for easier cloning
Apache License 2.0

Changing Learning Rate while training? #203

Closed. Bammargiela closed this issue 1 year ago

Bammargiela commented 1 year ago

How can I go about changing the learning rate during training using Lora Dreambooth/Finetune? Can I just press the stop button, change it in the other cell, and then press play again on the Start Training cell? Or is there a way to use the LR scheduler to change the learning rate every n epochs? I want to try changing it multiple times during training, every few epochs, and run some tests like that. I've looked at different optimizers and schedulers, but I'm still a little confused about how to do this. Thanks for your help.

Bammargiela commented 1 year ago

OK, I figured this out, but other people are running into the same issue where loading from weights doesn't work.

Rudy34160 commented 1 year ago

Have you found a solution for applying a progressive learning rate to your training?

Linaqruf commented 1 year ago

Specify `lr_scheduler` to change your learning rate during training.
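
For illustration, here is a minimal sketch of what that choice amounts to, using diffusers' `get_scheduler` helper; the scheduler names and the `--lr_scheduler` / `--lr_warmup_steps` flag names are assumptions based on the upstream sd-scripts, so check the training cell in your copy of the notebook:

```python
# Illustration only: the notebooks sit on top of diffusers, and the scheduler
# selected by the --lr_scheduler option behaves like one built with diffusers'
# get_scheduler helper (flag names here are assumptions; check your notebook).
import torch
from diffusers.optimization import get_scheduler

model = torch.nn.Linear(4, 4)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

scheduler = get_scheduler(
    "cosine",                 # typical choices: constant, constant_with_warmup,
                              # linear, cosine, cosine_with_restarts, polynomial
    optimizer=optimizer,
    num_warmup_steps=100,     # roughly what --lr_warmup_steps controls
    num_training_steps=1000,  # total optimizer steps in the run
)

# In the training loop, the LR is updated after every optimizer step:
# optimizer.step(); scheduler.step()
```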

Rudy34160 commented 1 year ago

> Specify `lr_scheduler` to change your learning rate during training.

How can we modify the learning rate during training with `lr_scheduler`? 🤔 I mean, how can I specify it to stay at LR 1e-3 until epoch 10, then change to 1e-4 until epoch 150, then 1e-5 until epoch 300, for example? 🤔
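
As far as I can tell the built-in `lr_scheduler` choices don't expose an arbitrary per-epoch schedule like that directly, but as a concept sketch, this is what such a schedule looks like in plain PyTorch with `MultiStepLR` (placeholder model and epoch counts, for illustration only):

```python
# Concept sketch in plain PyTorch, not a notebook option: hold 1e-3 until
# epoch 10, drop to 1e-4 until epoch 150, then 1e-5 for the rest of training.
import torch

model = torch.nn.Linear(4, 4)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# gamma=0.1 multiplies the current LR by 0.1 at each milestone epoch
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[10, 150], gamma=0.1
)

for epoch in range(300):
    # ... one epoch of training here ...
    scheduler.step()  # epochs 0-9: 1e-3, epochs 10-149: 1e-4, epochs 150+: 1e-5
```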

Bammargiela commented 1 year ago

So with the polynomial scheduler the learning rate does drop gradually, and I believe you can control how it drops with the power setting, but I can't find any good info on exactly what curve it follows or how much the power changes it. I've also been trying to figure out how to resume from weights, but it doesn't seem to work when I do it. That would be the best way to change learning rates while continuing to train the same model. Unfortunately it's either broken or I'm missing something.
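
For what it's worth, the polynomial schedule seems to follow roughly the curve below (modeled on diffusers' `get_polynomial_decay_schedule_with_warmup`; the default `lr_end` and `power` values here are assumptions, so verify against the version you're running):

```python
# Rough sketch of the polynomial decay curve (mirrors diffusers'
# get_polynomial_decay_schedule_with_warmup as far as I can tell; the defaults
# below are assumptions, not confirmed trainer settings).
def polynomial_lr(step, lr_init=1e-4, lr_end=1e-7, power=1.0,
                  warmup_steps=0, total_steps=1000):
    if step < warmup_steps:
        return lr_init * step / max(1, warmup_steps)   # linear warmup
    if step >= total_steps:
        return lr_end
    remaining = 1 - (step - warmup_steps) / (total_steps - warmup_steps)
    return (lr_init - lr_end) * remaining ** power + lr_end

# power=1.0 is a straight linear drop; power>1 falls faster early on,
# power<1 keeps the LR high for longer and then drops sharply near the end.
for step in (0, 250, 500, 750, 1000):
    print(step,
          f"power=1.0: {polynomial_lr(step):.2e}",
          f"power=2.0: {polynomial_lr(step, power=2.0):.2e}")
```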