Yangfan-Jiang / Federated-Learning-with-Differential-Privacy

Implementation of dp-based federated learning framework using PyTorch
MIT License

how to preset (epsilon, delta) pair? #4

Closed chenslcool closed 3 years ago

chenslcool commented 3 years ago

If I want to preset an (epsilon, delta) pair as the privacy guarantee, what should I do? Is it enough to modify FLClient.update so that new_param[name] += the Gaussian noise produced by gaussian_noise(), without changing the sensitivity? I want to reuse the code of def gaussian_noise(). What should I do? Recalculate the sensitivity? Thanks!!!

Yangfan-Jiang commented 3 years ago

Yes, that works. You need to compute the sensitivity according to some published papers.

However, def gaussian_noise() calculates the variance according to the basic definition of the Gaussian mechanism, which will NOT provide a tight DP bound.
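For reference, the basic (non-tight) calibration is the classical Gaussian mechanism: sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon, which satisfies (epsilon, delta)-DP for epsilon < 1. A minimal pure-Python sketch of that calibration (the function names and the flat-list update layout are illustrative, not this repo's API):

```python
import math
import random

def classical_gaussian_sigma(epsilon, delta, sensitivity):
    # Classical Gaussian mechanism calibration:
    # sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon,
    # valid for epsilon < 1. Not a tight bound under composition.
    return sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon

def perturb_update(update, epsilon, delta, sensitivity):
    # Add i.i.d. Gaussian noise to a flattened model update.
    sigma = classical_gaussian_sigma(epsilon, delta, sensitivity)
    return [w + random.gauss(0.0, sigma) for w in update]
```

For example, epsilon = 0.5, delta = 1e-5, and sensitivity 1 give sigma of roughly 9.7; the noise scale grows quickly as epsilon shrinks, which is one reason tighter accountants matter.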

Here are some papers on how to obtain a tight upper bound on DP/Rényi DP:
[1] Abadi, Martín, et al. "Deep learning with differential privacy." Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. 2016.
[2] Mironov, Ilya, Kunal Talwar, and Li Zhang. "Rényi Differential Privacy of the Sampled Gaussian Mechanism." arXiv preprint arXiv:1908.10530 (2019).
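As a rough illustration of why Rényi DP accounting gives tighter bounds under composition: a Gaussian mechanism with sensitivity 1 and noise std sigma satisfies (alpha, alpha/(2 sigma^2))-RDP, RDP composes additively over rounds, and an RDP guarantee converts to (epsilon, delta)-DP via epsilon = eps_RDP(alpha) + log(1/delta)/(alpha - 1). The sketch below implements only this simple conversion; it is NOT the full sampled-Gaussian analysis of [1]/[2] (no subsampling amplification):

```python
import math

def rdp_to_eps(sigma, steps, delta):
    """(epsilon, delta)-DP bound for `steps` compositions of a Gaussian
    mechanism with sensitivity 1 and noise std `sigma`, via Renyi DP.
    Ignores subsampling amplification, so it is looser than [1]/[2]."""
    best = float("inf")
    for i in range(11, 1010):  # search Renyi orders alpha in (1.1, 100.9]
        alpha = i / 10.0
        rdp = steps * alpha / (2.0 * sigma ** 2)        # additive composition
        eps = rdp + math.log(1.0 / delta) / (alpha - 1.0)  # RDP -> (eps, delta)-DP
        best = min(best, eps)
    return best
```

With sigma = 5 and delta = 1e-5, one round costs epsilon of about 0.98, while 100 rounds cost about 11.6, far below the roughly 97 that naive sequential composition of the per-round classical bound would give.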

In addition, TensorFlow Privacy (https://github.com/tensorflow/privacy) provides analysis tools for computing privacy guarantees, which may be helpful.

chenslcool commented 3 years ago

Thanks! By the way, what does 'LS' in 'Gaussian noise for CDP-FedAVG-LS Algorithm' mean?

Yangfan-Jiang commented 3 years ago

Low Sensitivity. It comes from an unpublished manuscript.

I've modified the code of def gaussian_noise; please use the latest version. You can also call def gaussian_noise_ls() directly if you know how to compute the variance of the Gaussian noise needed to achieve (epsilon, delta)-DP.

chenslcool commented 3 years ago

Thanks very much!