wenzhu23333 / Differential-Privacy-Based-Federated-Learning

Everything you want about DP-Based Federated Learning, including Papers and Code. (Mechanism: Laplace or Gaussian, Dataset: femnist, shakespeare, mnist, cifar-10 and fashion-mnist. )
GNU General Public License v3.0

Sensitivity calculation of different mechanisms #18

Closed Toby-Kang closed 7 months ago

Toby-Kang commented 9 months ago

Hi wenzhu!

In the file dp_mechanism.py, there are two different sensitivity calculations. I understand how the function cal_sensitivity() works, but I'm a bit confused about the function cal_sensitivity_MA(). I wonder why the coefficient of the output in the latter function is 1 instead of 2.

I'd appreciate it if you can help, thanks!

wenzhu23333 commented 9 months ago

Good question. I'm not sure why it's C; I simply replicated the algorithmic part of the paper "Deep Learning with Differential Privacy." If you know why it's C, please enlighten me, as this question has puzzled me for quite some time.

Toby-Kang commented 9 months ago

One possible explanation that I found on bilibili is as follows:

The sensitivity calculation depends on the definition of the adjacent dataset. If an adjacent dataset is defined by adding/removing one data sample, then the maximal contribution of that one sample to the gradient norm is C; hence, the sensitivity is C. If an adjacent dataset is defined by replacing one sample, then the sensitivity is 2C instead.

The explanation is available in the comment section of this video: https://www.bilibili.com/video/BV1r44y1u7iP/
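The distinction above can be sketched in a few lines. This is not the repository's exact code; the parameter names (`lr` for the learning rate, `clip` for the clipping norm C, `dataset_size` for the number of local samples) are assumptions chosen to mirror the usual DP-SGD setup:

```python
# Sketch of how the two neighbouring-dataset definitions change the L2
# sensitivity of a clipped, averaged gradient update (hypothetical helper
# names; not the repository's exact implementation).

def sensitivity_replace_one(lr, clip, dataset_size):
    """'Bounded' DP: one sample is replaced by another.
    The replaced per-sample gradient can swing from +C to -C,
    so the averaged update shifts by at most 2*C/n (scaled by lr)."""
    return 2 * lr * clip / dataset_size

def sensitivity_add_remove(lr, clip, dataset_size):
    """'Unbounded' DP (the notion used by the moments-accountant analysis
    in "Deep Learning with Differential Privacy"): one sample is added or
    removed, contributing at most its own clipped norm C to the sum,
    i.e. C/n after averaging."""
    return lr * clip / dataset_size
```

Under these assumptions, the replacement-based sensitivity is always exactly twice the add/remove one, which matches the factor-of-2 difference between the two functions in the question.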

wenzhu23333 commented 8 months ago

> One possible explanation that I found on bilibili is as follows:
>
> The sensitivity calculation depends on the definition of the adjacent dataset. If an adjacent dataset is defined by adding/removing one data sample, then the maximal contribution of that one sample to the gradient norm is C; hence, the sensitivity is C. If an adjacent dataset is defined by replacing one sample, then the sensitivity is 2C instead.
>
> The explanation is available in the comment section of this video: https://www.bilibili.com/video/BV1r44y1u7iP/

You mean the adjacent dataset has multiple definitions? Is there any paper supporting this?

Toby-Kang commented 8 months ago

You can refer to the lecture notes in the reference below. On page 4 there is a discussion of the two different types of neighbouring datasets:

> Finally, we will generally use the notion "neighbouring datasets" where one point in X is changed arbitrarily to obtain X′. This is sometimes called "bounded" differential privacy, in contrast to "unbounded" differential privacy, where a point is either added or removed. In theory, these notions are equivalent up to a factor of 2, as an arbitrary change can be performed by removing one point and adding another.

Reference: G. Kamath, "CS860 Lecture 3: Intro to Differential Privacy, Part 1" [Lecture notes]. Available: http://www.gautamkamath.com/CS860notes/lec3.pdf
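The "equivalent up to a factor of 2" remark can be checked numerically. A minimal illustration, assuming values are clipped to [-C, C] before summing (the data and clipping bound here are made up for demonstration):

```python
# Numeric illustration: for a sum of values clipped to [-C, C],
# removing one point changes the sum by at most C ("unbounded" DP),
# while replacing one point changes it by at most 2C ("bounded" DP).
C = 1.0

def clip(x, c=C):
    """Clamp a scalar to the interval [-c, c]."""
    return max(-c, min(c, x))

data = [3.0, -0.5, 0.2, -4.0]
clipped_sum = sum(clip(x) for x in data)  # clipped values: 1.0, -0.5, 0.2, -1.0

# Worst case for add/remove neighbouring: drop the point clipped to +C.
removed_sum = sum(clip(x) for x in data[1:])
assert abs(clipped_sum - removed_sum) <= C

# Worst case for replacement: swap a point clipped to +C for one clipped to -C.
replaced_sum = sum(clip(x) for x in [-5.0] + data[1:])
assert abs(clipped_sum - replaced_sum) <= 2 * C
```

An arbitrary replacement is just a removal followed by an addition, so its worst-case effect is the sum of the two individual worst cases, C + C = 2C.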

wenzhu23333 commented 8 months ago

Thank you very much for the clear explanation; I understand now. It appears there can be two definitions of sensitivity, and I guess the author primarily wanted to emphasize that the sensitivity is "bounded." As for a single, strict definition, it seems there is no unified consensus at the moment.