Open minamo817 opened 10 months ago
Do you mean something like a scheduler? A scheduler modifies the learning rate (alpha) during training; you can use an existing one or implement your own to adjust the optimizer to your needs. https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.StepLR.html#torch.optim.lr_scheduler.StepLR
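A minimal sketch of what that linked `StepLR` does, with a toy model and no real training step (the model and hyperparameters here are placeholders, not from ESPnet):

```python
import torch

# StepLR decays the learning rate by `gamma` every `step_size` epochs.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

lrs = []
for epoch in range(30):
    # ... one epoch of training would go here ...
    optimizer.step()      # call before scheduler.step(), as PyTorch expects
    scheduler.step()
    lrs.append(optimizer.param_groups[0]["lr"])

# lr stays at 0.1 for the first 10 epochs, then halves every 10 epochs
print(lrs[0], lrs[9], lrs[10], lrs[29])
```

The same pattern generalizes: anything you can express as "recompute a scalar from the epoch counter" fits a scheduler-style hook.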
Now, if you want to adjust the weight of each loss independently, you will need to modify the ESPnet model of the corresponding task. For example, in ASR: https://github.com/espnet/espnet/blob/84b7a2fc7259feeb22e58499e2e90fead0d7e4c1/espnet2/asr/espnet_model.py#L204-L354 You would need to implement a linear or stepwise increment/decrement for each loss weight yourself; currently, there is no clean way to do that.
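Such a per-loss schedule could be sketched as a plain function of the epoch; the function name, the start/end values, and the "ramp over the first N epochs" rule below are all hypothetical choices, not anything ESPnet provides:

```python
def loss_weight(epoch: int, start: float, end: float, ramp_epochs: int) -> float:
    """Linearly interpolate a loss weight from `start` to `end` over the
    first `ramp_epochs` epochs, then hold it at `end`."""
    if epoch >= ramp_epochs:
        return end
    return start + (end - start) * epoch / ramp_epochs

# e.g. a weight ramping down from 0.5 to 0.1 over 20 epochs
weights = [loss_weight(e, 0.5, 0.1, 20) for e in range(25)]
print(weights[0], weights[10], weights[20])
```

You would then multiply each loss term by its scheduled weight inside the model's loss computation.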
Thanks for the reply. I think my requirement is simpler than StepLR. I'm reading the code of enh/espnet_model.py. If I can get the current epoch during training, it will be easy to modify the loss weights by passing it into forward_loss. But how can I get the current epoch during training?
The easy way: You can get the epoch from the reporter and forward it into the model. https://github.com/espnet/espnet/blob/84b7a2fc7259feeb22e58499e2e90fead0d7e4c1/espnet2/train/trainer.py#L587-L588
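The "forward the epoch into the model" idea can be illustrated with stand-in classes; the `Reporter` and `Model` below are simplified placeholders, not the actual ESPnet classes, and the step schedule in `forward` is an arbitrary example:

```python
class Reporter:
    """Stand-in for a trainer-side object that tracks the current epoch."""
    def __init__(self) -> None:
        self._epoch = 0

    def set_epoch(self, epoch: int) -> None:
        self._epoch = epoch

    def get_epoch(self) -> int:
        return self._epoch


class Model:
    def forward(self, batch, epoch: int) -> float:
        # Use the epoch to pick a loss weight, e.g. a simple step schedule.
        return 0.5 if epoch < 10 else 0.1


reporter = Reporter()
reporter.set_epoch(12)
model = Model()
weight = model.forward(batch=None, epoch=reporter.get_epoch())
print(weight)
```

The real change in ESPnet would be threading the epoch from the trainer loop through the model's `forward` signature at the linked location.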
Another way: build a global config in ESPnet and update it each epoch at: https://github.com/espnet/espnet/blob/84b7a2fc7259feeb22e58499e2e90fead0d7e4c1/espnet2/train/trainer.py#L301
Then import it in your model with `from espnet import global_config`,
or something like that.
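A minimal sketch of that global-config pattern in one file (in practice the config would live in its own module, e.g. a hypothetical `espnet/global_config.py`, imported by both the trainer and the model):

```python
class GlobalConfig:
    """Mutable singleton holding training-wide state; any module that
    imports it sees updates made by the trainer."""
    current_epoch: int = 0


def model_forward() -> int:
    # Inside the model, read the shared epoch to pick loss weights, etc.
    return GlobalConfig.current_epoch


def train(max_epoch: int) -> list:
    seen = []
    for epoch in range(1, max_epoch + 1):
        GlobalConfig.current_epoch = epoch  # trainer updates it each epoch
        seen.append(model_forward())        # model reads it during forward
    return seen


print(train(3))
```

Compared with forwarding the epoch explicitly, this avoids touching the model's `forward` signature, at the cost of hidden global state.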
I used the global-config approach; thank you for the instructions.
Hi there,
I'm working on a model and I need to change some hyperparameters as the epochs go by.
Is there a straightforward way to do this? I'm thinking of changing the loss weights over the epochs according to some rule or schedule. Specifically, I want to adjust the weights of different loss functions during training.
Any help or tips on how to set this up would be really cool.
Thanks!