alibaba / FederatedScope

An easy-to-use federated learning platform
https://www.federatedscope.io
Apache License 2.0

How do you implement defense algorithm for backdoor attacks #732

Closed tntstation closed 7 months ago

tntstation commented 8 months ago

I want to customize a backdoor attack defense algorithm. Which code should I modify? Can you give me some suggestions?

Osier-Yi commented 8 months ago

Thanks for your interest!

If your defense operates during the training phase, you can refer to how clip_grad(norm_bound, weight_difference, difference_flat) is added to the training procedure in federatedscope/attack/trainer/benign_trainer.py.

Specifically, you first need to define the action (e.g., the clip_grad function). Then hook it into the training procedure: first define the hook function https://github.com/alibaba/FederatedScope/blob/dd5f87bb47bbb1c95302214aeed5d7185692b7ee/federatedscope/attack/trainer/benign_trainer.py#L118 After that, register that hook in the trainer wrapper https://github.com/alibaba/FederatedScope/blob/dd5f87bb47bbb1c95302214aeed5d7185692b7ee/federatedscope/attack/trainer/benign_trainer.py#L44
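The define-hook-register pattern described above can be sketched in a few lines. Note this is a simplified, self-contained illustration: the `Trainer` class, hook names, and `ctx` dictionary below are stand-ins, not the actual FederatedScope API; see benign_trainer.py for the real hook signatures.

```python
# Minimal sketch of the hook-based defense pattern (illustrative only,
# not the FederatedScope Trainer API).
import math

def clip_grad(norm_bound, weight_diff):
    """Scale a weight-difference vector so its L2 norm stays within norm_bound."""
    norm = math.sqrt(sum(w * w for w in weight_diff))
    if norm > norm_bound:
        scale = norm_bound / norm
        return [w * scale for w in weight_diff]
    return weight_diff

class Trainer:
    """Toy trainer that runs registered hooks at the end of each fit."""
    def __init__(self):
        self.hooks_on_fit_end = []

    def register_hook_on_fit_end(self, hook):
        self.hooks_on_fit_end.append(hook)

    def fit(self, ctx):
        # ... the actual training steps would run here ...
        for hook in self.hooks_on_fit_end:
            hook(ctx)

# Step 1: wrap the defense action in a hook function.
def hook_on_fit_end_clip(ctx):
    ctx["weight_diff"] = clip_grad(ctx["norm_bound"], ctx["weight_diff"])

# Step 2: register the hook in the trainer (analogous to the trainer wrapper).
trainer = Trainer()
trainer.register_hook_on_fit_end(hook_on_fit_end_clip)

ctx = {"norm_bound": 1.0, "weight_diff": [3.0, 4.0]}
trainer.fit(ctx)
print(ctx["weight_diff"])  # norm 5.0 clipped to 1.0 -> [0.6, 0.8]
```

The key point is the separation of concerns: the defense action is a plain function, and the trainer only knows that it calls every registered hook at a fixed trigger point in the training loop.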

Please feel free to ask if you have further questions. Thank you!

tntstation commented 8 months ago

Thank you for your answer, but I still have some questions. I would like the server to act as a defender, adding noise to the model after aggregation in each round and clustering the client updates before aggregation. What should I do?