bboylyg / ABL

Anti-Backdoor Learning (NeurIPS 2021)

How many epochs are needed? #3

Closed: htwang14 closed this issue 2 years ago

htwang14 commented 2 years ago

Hi Yige, I noticed that the paper mentions using 20 epochs for pretraining (before isolating a portion of potentially poisoned images), 60 epochs for finetuning afterwards, and finally 20 epochs for unlearning. However, in config.py the defaults are tuning_epochs=10, finetuning_epochs=60, unlearning_epochs=5. I'm wondering which setting achieves better results. Should I change the values to tuning_epochs=20, finetuning_epochs=60, unlearning_epochs=20 (as indicated in the paper)? Thank you in advance!

bboylyg commented 2 years ago

Hi, we believe ABL is robust to the choice of training epochs. In fact, compared with the settings described in our paper, we find that the defaults in the code (tuning_epochs=10, finetuning_epochs=60, unlearning_epochs=5) are enough to achieve considerable defense results against most attacks, with lower computational overhead. You can try other settings to balance clean accuracy and attack success rate against different attacks.
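For reference, here is a minimal sketch of how these three settings could be exposed and overridden, assuming config.py follows a standard argparse pattern (the option names come from this thread; the script name train.py is hypothetical):

```python
# Sketch only: assumes config.py defines the epoch settings as argparse
# options. Option names are taken from this thread; everything else is
# illustrative.
import argparse

parser = argparse.ArgumentParser(description='ABL epoch settings (sketch)')
# Defaults used in the released code, per the reply above:
parser.add_argument('--tuning_epochs', type=int, default=10,
                    help='pretraining epochs before isolating poisoned samples')
parser.add_argument('--finetuning_epochs', type=int, default=60,
                    help='finetuning epochs after isolation')
parser.add_argument('--unlearning_epochs', type=int, default=5,
                    help='unlearning epochs on the isolated samples')
args = parser.parse_args()

# The paper's settings could then be tried from the command line, e.g.
# (train.py is a hypothetical entry point):
#   python train.py --tuning_epochs 20 --finetuning_epochs 60 --unlearning_epochs 20
```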

htwang14 commented 2 years ago

Thank you so much!