ChrisChen1023 / HINT

HINT: High-quality INpainting Transformer with Enhanced Attention and Mask-aware Encoding
MIT License

learning rate #3

Closed mumingerlai closed 8 months ago

mumingerlai commented 8 months ago

Thanks for your code. I see that the experimental settings in your paper read: "The learning rate is initially set to 1e-4 and is halved at the 75% milestone of the training progress." However, I can't find this learning-rate halving in your code. Could you tell me where it is?

ChrisChen1023 commented 8 months ago

> Thanks for your code. I see that the experimental settings in your paper read: "The learning rate is initially set to 1e-4 and is halved at the 75% milestone of the training progress." However, I can't find this learning-rate halving in your code. Could you tell me where it is?

Hi there,

Thanks for your interest. For the learning rate, you could use `torch.optim.lr_scheduler.MultiStepLR()` to set the milestones, or you could simply save a checkpoint and resume training with the smaller learning rate. In our experiments we used the latter approach. Hope this helps.
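For reference, a minimal sketch of the `MultiStepLR` option (the model, optimizer, and `total_iters` below are placeholders, not the repo's actual training setup): it halves the learning rate once training passes the 75% milestone described in the paper.

```python
import torch

# Placeholder model/optimizer standing in for the real training setup.
model = torch.nn.Linear(10, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

total_iters = 100  # stand-in for the real number of training iterations

# Halve (gamma=0.5) the learning rate at the 75% milestone.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[int(total_iters * 0.75)], gamma=0.5
)

for step in range(total_iters):
    optimizer.step()   # actual forward/backward pass elided
    scheduler.step()

print(optimizer.param_groups[0]["lr"])  # 5e-05 after the milestone
```

The checkpoint-and-resume approach the authors actually used is equivalent in effect: stop at 75% of training, then restart from the saved weights with `lr=5e-5`.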

mumingerlai commented 8 months ago

Thank you very much for your answer. I understand.