Kid-Liet / Reg-GAN


are you really using the decay in learning rate? #14

Closed zeeshannisar closed 2 years ago

zeeshannisar commented 2 years ago

Hello Author (@Kid-Liet), Thanks for your clean implementation.

I am curious to know whether you are actually using weight decay here in your implementation (NC+R)? If so, on which line (could you please point me to it)? I can see you have set decay_epoch=20, but I cannot find where it is used in the CycTrainer.py file.

Also, in your paper you mention that the batch size was set to 1 with weight decay 0.0001. What do you mean by this line?

Kid-Liet commented 2 years ago

You are right, there is no weight decay in our code. You can add it yourself if you need it; the results are almost the same.
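For anyone who wants to wire `decay_epoch` into the training loop themselves: a common pattern in CycleGAN-style code is a linear learning-rate decay that keeps the rate constant until `decay_epoch` and then decays it linearly to zero by the final epoch. The sketch below is a minimal, framework-free version of that schedule (the function names and the epoch counts are illustrative, not taken from this repository); in PyTorch the same lambda would typically be passed to `torch.optim.lr_scheduler.LambdaLR`.

```python
def make_lr_lambda(n_epochs, decay_start_epoch):
    """Return a multiplier function for linear LR decay.

    The multiplier is 1.0 up to decay_start_epoch, then decays
    linearly to 0.0 at n_epochs. (Illustrative sketch; not the
    repository's actual code.)
    """
    def lr_lambda(epoch):
        return 1.0 - max(0, epoch - decay_start_epoch) / (n_epochs - decay_start_epoch)
    return lr_lambda


# Example: 40 total epochs, decay starting at epoch 20
lr_lambda = make_lr_lambda(n_epochs=40, decay_start_epoch=20)
print(lr_lambda(0))    # constant phase: multiplier 1.0
print(lr_lambda(30))   # halfway through decay: multiplier 0.5
print(lr_lambda(40))   # end of training: multiplier 0.0
```

Note that this is *learning-rate* decay; the `weight_decay=0.0001` mentioned in the paper would instead be an L2-regularization term passed directly to the optimizer (e.g. `torch.optim.Adam(..., weight_decay=1e-4)`), which is a separate mechanism.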