Currently you can't change the learning rate mid-training when using ProbFlow with the PyTorch backend, but that could be implemented pretty easily using torch.optim.lr_scheduler.LambdaLR.
The only trick is that LambdaLR takes a function which should return a multiplicative factor, not the actual learning-rate value, so you'd need to pass it the desired learning rate divided by the original learning rate.
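A minimal sketch of that trick, using a plain torch.optim.Adam optimizer as a stand-in (the schedule dict and parameter names here are hypothetical, not part of ProbFlow's API):

```python
import torch

initial_lr = 1e-3
# Hypothetical schedule of *absolute* learning rates we want per epoch.
desired_lrs = {0: 1e-3, 1: 1e-3, 2: 1e-4, 3: 1e-5}

params = [torch.nn.Parameter(torch.zeros(1))]
opt = torch.optim.Adam(params, lr=initial_lr)

# LambdaLR multiplies the initial lr by the factor returned from lr_lambda,
# so to hit an absolute lr we divide the desired value by the initial one.
sched = torch.optim.lr_scheduler.LambdaLR(
    opt, lr_lambda=lambda epoch: desired_lrs[epoch] / initial_lr
)

for epoch in range(3):
    # ... optimizer steps for this epoch would go here ...
    sched.step()

current_lr = opt.param_groups[0]["lr"]  # now initial_lr * (1e-5 / initial_lr)
```

After three `sched.step()` calls the scheduler is at epoch 3, so the optimizer's learning rate ends up at the scheduled 1e-5 rather than a multiple of it.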