juntang-zhuang / Adabelief-Optimizer

Repository for NeurIPS 2020 Spotlight "AdaBelief Optimizer: Adapting stepsizes by the belief in observed gradients"

denom = (exp_avg_var.add_(group['eps']).sqrt() / math.sqrt(bias_correction2)).add_(group['eps']) #18

Closed yuanwei2019 closed 3 years ago

yuanwei2019 commented 3 years ago

Hello author, I noticed that line 157 of Adabelief-Optimizer/PyTorch_Experiments/AdaBelief.py reads: `denom = (exp_avg_var.add_(group['eps']).sqrt() / math.sqrt(bias_correction2)).add_(group['eps'])`. Doesn't the in-place `exp_avg_var.add_(eps)` mean that eps gets added into `exp_avg_var` on every bias-corrected update, which differs from the S_t update formula in the paper? Should this be changed to the out-of-place `exp_avg_var.add(group['eps'])`, or does using `add_` actually give better experimental results?
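
Below is a minimal sketch (with made-up tensor values, not taken from the repository) of the behavior being asked about: `add_` mutates `exp_avg_var` in place, so eps leaks into the optimizer state on every step, while `add` returns a new tensor and leaves the state untouched.

```python
import torch

eps = 1e-8

# In-place add_: the optimizer state itself is modified, so eps accumulates
# into exp_avg_var every time the denominator is computed.
exp_avg_var = torch.tensor([1.0, 2.0])
denom_in_place = exp_avg_var.add_(eps).sqrt()
print(exp_avg_var)   # values have been shifted by eps

# Out-of-place add: eps only affects the temporary used for the denominator;
# exp_avg_var keeps the value prescribed by the paper's s_t update.
exp_avg_var = torch.tensor([1.0, 2.0])
denom_out_of_place = exp_avg_var.add(eps).sqrt()
print(exp_avg_var)   # tensor([1., 2.]), unchanged
```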

juntang-zhuang commented 3 years ago

Thanks for pointing this out. I haven't tested `add`; when writing the code I didn't think about this and simply copied the trailing `add_(group['eps'])` to the front. It's possible the results improve after the change, because as the accumulated eps*t grows, the denominator becomes larger and the step size gradually shrinks, which may stop fine-tuning from having any effect in the later stages. I'll test it later.
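
For reference, a minimal sketch of the out-of-place variant discussed above; the state values here are hypothetical stand-ins, whereas in AdaBelief.py they come from the optimizer's per-parameter state and the param group:

```python
import math
import torch

# Hypothetical stand-in values for illustration only.
exp_avg_var = torch.tensor([1e-4, 4e-4])
bias_correction2 = 0.999
group = {'eps': 1e-8}

# Out-of-place variant: eps is added to a temporary copy, so the running
# exp_avg_var no longer accumulates an extra eps on every step; the final
# add_ operates on the freshly created result tensor, so it is safe in place.
denom = (exp_avg_var.add(group['eps']).sqrt()
         / math.sqrt(bias_correction2)).add_(group['eps'])
```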