OPTML-Group / Unlearn-Saliency

[ICLR24 (Spotlight)] "SalUn: Empowering Machine Unlearning via Gradient-based Weight Saliency in Both Image Classification and Generation" by Chongyu Fan*, Jiancheng Liu*, Yihua Zhang, Eric Wong, Dennis Wei, Sijia Liu
https://www.optml-group.com/posts/salun_iclr24
MIT License

Cannot decrease the forget acc #8

Closed: moyudely closed this issue 5 months ago

moyudely commented 5 months ago

Hi, thank you for sharing your great work! I have been trying to reproduce the algorithm recently. I followed the steps in README.md to train and then unlearn, but the forget accuracy did not decrease, or at least not significantly, even when using FT, GA, and RL. Did I overlook any details? If you could clarify, I would be extremely grateful! I ran the experiments on CIFAR-10 with ResNet-18. For example, for GA:

```
python -u main_forget.py --save_dir Unlearn-Saliency-master/Classification/result_GA --mask Unlearn-Saliency-master/Classification/train_model/0model_SA_best.pth.tar --unlearn GA --num_indexes_to_replace 4500 --unlearn_lr 0.0001 --unlearn_epochs 5
```

[screenshot of results]
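For context, GA-style unlearning performs gradient ascent on the forget set, i.e., it steps in the direction that increases the loss on the samples to be forgotten. A minimal PyTorch sketch of one GA epoch, where `model`, `forget_loader`, and the hyperparameters are assumed stand-ins for the repo's actual objects rather than its exact implementation:

```python
import torch
import torch.nn.functional as F

def ga_unlearn_epoch(model, forget_loader, lr=5e-4, device="cuda"):
    """One epoch of gradient-ascent (GA) unlearning on the forget set.

    Sketch only: `model` and `forget_loader` are assumed to be the
    trained ResNet-18 and a DataLoader over the 4500 forget samples.
    """
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for images, labels in forget_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        # Negate the loss so a standard descent step *increases*
        # the cross-entropy on the forget set (gradient ascent).
        loss = -F.cross_entropy(model(images), labels)
        loss.backward()
        optimizer.step()
```

Seen this way, the learning rate directly controls how far each ascent step pushes the model away from the forget samples, which is why it matters for whether the forget accuracy actually drops.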

a-F1 commented 5 months ago

Thank you for your attention and interest in our work! I think this issue is caused by a learning rate that is too small: with such a small step, the unlearning updates barely move the weights, so the forget accuracy stays close to the original. For GA, for example, you can increase the learning rate appropriately, e.g., to 5e-4 or 1e-3.
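Concretely, this would mean rerunning the command from the issue with only `--unlearn_lr` changed, all other flags kept as before:

```
python -u main_forget.py --save_dir Unlearn-Saliency-master/Classification/result_GA --mask Unlearn-Saliency-master/Classification/train_model/0model_SA_best.pth.tar --unlearn GA --num_indexes_to_replace 4500 --unlearn_lr 5e-4 --unlearn_epochs 5
```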