I have trained a base model on 4 female datasets using the following lr:
optD = torch.optim.Adam(netD.parameters(), lr=1e-4, betas=(0.5, 0.9))
Now I want to fine-tune the model on a new, small female dataset. Should I change the lr?
Maybe lr=1e-5 without decay?
Has anyone run these experiments and reached a conclusion?
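For concreteness, here is a minimal sketch of the fine-tuning setup I am asking about. The discriminator definition and the checkpoint path are placeholders, and lr=1e-5 is just the candidate value, not a confirmed recommendation:

import torch
import torch.nn as nn

# Stand-in for the real discriminator architecture (placeholder).
netD = nn.Sequential(nn.Linear(128, 1))

# Load the base model trained on the four datasets (path is hypothetical).
netD.load_state_dict(torch.load("netD_base.pth"))

# Candidate fine-tuning setup: fresh Adam with a ~10x smaller lr (1e-5),
# the same betas as pretraining, and no scheduler (i.e., no decay).
optD = torch.optim.Adam(netD.parameters(), lr=1e-5, betas=(0.5, 0.9))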