zhangbinchi / certified-deep-unlearning

Open-source code for "Towards Certified Unlearning for Deep Neural Networks" (ICML 2024).

I applied this method to ResNet-18 with the parameter settings suggested in the paper, but unlearn.py produces NaN values #3

Open NKUShaw opened 2 weeks ago

NKUShaw commented 2 weeks ago

Epoch: 0, Sum: 15.257469927426428
Epoch: 100, Sum: nan
Epoch: 200, Sum: nan
Epoch: 300, Sum: nan
Epoch: 400, Sum: nan
Epoch: 500, Sum: nan

NKUShaw commented 2 weeks ago

I tried fine-tuning various parameters. After lowering scale, only the very first Sum decreased a little... the later ones are still NaN.

zhangbinchi commented 2 weeks ago

Hi, this may be because the inverse Hessian estimator does not converge. You could try increasing the scale value. Hope this helps.

NKUShaw commented 2 weeks ago

After increasing scale to 10000, the Sum value is finite and keeps increasing. Could you explain what this value represents? Is higher better or lower better?

zhangbinchi commented 2 weeks ago

The scale value needs to satisfy the condition on H in Proposition 3.5 (i.e., be large enough) to guarantee that the inverse Hessian estimator converges. The Sum printed here is the norm of the partial sum of the first n terms of the estimator's series, so you can check whether it diverges. The Sum value grows as scale increases; as long as it is not NaN, it has no significant effect.
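For readers following along, here is a minimal sketch of the kind of Neumann-series inverse-Hessian-vector-product estimator being described, and why the printed Sum goes to NaN when scale is too small. The function name, arguments, and defaults below are illustrative assumptions, not the repo's actual API; only the convergence condition (scale large enough that the spectral radius of H/scale is below 1, as in Proposition 3.5) is taken from the discussion.

```python
import torch

def inverse_hvp_neumann(hvp_fn, v, scale=10000.0, damping=0.0,
                        n_terms=500, print_every=100):
    """Sketch: approximate H^{-1} v via the Neumann series
        H^{-1} v ~= (1/scale) * sum_{k=0}^{n} (I - H/scale)^k v,
    which converges only when all eigenvalues of H/scale lie in (0, 1),
    i.e. when `scale` is large enough.

    `hvp_fn(v)` is assumed to return the Hessian-vector product H v.
    """
    term = v.clone()          # current series term: (I - H/scale)^k v
    partial_sum = v.clone()   # running partial sum of the series
    for k in range(1, n_terms + 1):
        hv = hvp_fn(term)
        # next term: apply (I - (H + damping*I)/scale) to the previous term
        term = term - (hv + damping * term) / scale
        partial_sum = partial_sum + term
        if k % print_every == 0:
            # mirrors the "Epoch: k, Sum: ..." printout: if scale is too
            # small the series diverges and this norm blows up to inf/nan
            print(f"Epoch: {k}, Sum: {partial_sum.norm().item()}")
    return partial_sum / scale
```

This also explains the behaviour observed above: the final estimate is the partial sum divided by scale, so the partial sum's norm is roughly scale times the norm of H^{-1} v and naturally grows as scale grows; a larger (non-NaN) Sum is therefore not a problem in itself.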

NKUShaw commented 2 weeks ago

Got it. The unlearning has finished and the results are what I wanted. Thanks for the help.