wenzhu23333 / Differential-Privacy-Based-Federated-Learning

Everything you want to know about DP-based federated learning, including papers and code. (Mechanisms: Laplace or Gaussian. Datasets: FEMNIST, Shakespeare, MNIST, CIFAR-10, and Fashion-MNIST.)
GNU General Public License v3.0

problem of setting #16

Closed nekopalaa closed 10 months ago

nekopalaa commented 11 months ago

Why is the gradient clipping threshold set to 10 by default in the code? Isn't the mainstream choice 0.1 these days?

wenzhu23333 commented 11 months ago

With 0.1 the model is very hard to train; you can try it yourself. The appropriate clipping value depends on the model's architecture and size.
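To make the trade-off concrete, here is a minimal sketch of the standard DP-SGD per-gradient L2-norm clipping (the function name and the NumPy stand-in for real gradients are illustrative, not from this repo). A small threshold like 0.1 rescales almost every gradient aggressively, which is why training can stall, while a larger threshold like 10 leaves typical gradients untouched:

```python
import numpy as np

def clip_gradient(grad, clip_norm):
    """Scale grad so its L2 norm is at most clip_norm (standard DP-SGD clipping)."""
    norm = np.linalg.norm(grad)
    factor = min(1.0, clip_norm / (norm + 1e-12))  # no-op when norm <= clip_norm
    return grad * factor

g = np.array([3.0, 4.0])                        # L2 norm = 5
loose = clip_gradient(g, 10.0)                  # unchanged, norm stays 5.0
tight = clip_gradient(g, 0.1)                   # rescaled down to norm 0.1
print(np.linalg.norm(loose), np.linalg.norm(tight))
```

Note that clipping also fixes the sensitivity of each update, so the threshold interacts with how much noise must be added for a given privacy budget.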

nekopalaa commented 11 months ago

One more question: why can the number of local training epochs only be set to 1? Isn't the local sensitivity 2C*lr/|D_i|? Does running multiple local epochs change the sensitivity?

wenzhu23333 commented 11 months ago

Yes, it does. For the reason, please see this paper: Y. Zhou et al., "Exploring the Practicality of Differentially Private Federated Learning: A Local Iteration Tuning Approach," IEEE Transactions on Dependable and Secure Computing, doi: 10.1109/TDSC.2023.3325889.
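For the single-local-epoch case the thread discusses, the noise scale follows directly from the sensitivity 2C*lr/|D_i|. A minimal sketch, assuming the classic analytic Gaussian-mechanism bound from Dwork and Roth (the numeric values for C, lr, and |D_i| below are made-up examples, not this repo's defaults):

```python
import numpy as np

def gaussian_sigma(sensitivity, epsilon, delta):
    # Classic analytic Gaussian mechanism bound (Dwork & Roth):
    # sigma >= sqrt(2 * ln(1.25 / delta)) * sensitivity / epsilon
    return np.sqrt(2 * np.log(1.25 / delta)) * sensitivity / epsilon

C, lr, n_i = 10.0, 0.01, 600        # clip threshold, learning rate, local dataset size
sensitivity = 2 * C * lr / n_i      # holds only for a single local epoch
sigma = gaussian_sigma(sensitivity, epsilon=1.0, delta=1e-5)
print(sensitivity, sigma)
```

With multiple local epochs the update is no longer a single clipped gradient step, so this closed-form sensitivity no longer applies; that is the issue the cited paper analyzes.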

nekopalaa commented 10 months ago

Why do I keep getting this warning when running the CIFAR dataset?

E:\anaconda\envs\py39\lib\site-packages\torch\nn\modules\module.py:1053: UserWarning: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior. warnings.warn("Using a non-full backward hook when the forward contains multiple autograd Nodes "

wenzhu23333 commented 10 months ago

> Why do I keep getting this warning when running the CIFAR dataset? E:\anaconda\envs\py39\lib\site-packages\torch\nn\modules\module.py:1053: UserWarning: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.

This warning comes from the opacus dependency and does not affect training. The current opacus version is 1.1.1; migrating to a newer version fixes it, and that migration is planned.
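For context on what the warning is asking for, here is a small standalone sketch of `register_full_backward_hook`, the non-deprecated replacement for `register_backward_hook` (available since PyTorch 1.8). The model and hook below are illustrative, unrelated to this repo's code:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

grads = []
def hook(module, grad_input, grad_output):
    # Record the shape of the gradient flowing out of the hooked module.
    grads.append(grad_output[0].shape)

# register_full_backward_hook replaces the deprecated register_backward_hook
# and reliably delivers grad_input/grad_output even when the forward pass
# contains multiple autograd nodes.
model[0].register_full_backward_hook(hook)

out = model(torch.randn(3, 4)).sum()
out.backward()
print(grads)  # [torch.Size([3, 8])]
```

In this repo's case the deprecated call sits inside opacus 1.1.1 itself, so the fix is upgrading the dependency rather than changing user code.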

nekopalaa commented 10 months ago

Can the Shakespeare dataset be partitioned in an i.i.d. way? I noticed that in the code the Shakespeare dataset defaults to a non-i.i.d. split only.

wenzhu23333 commented 10 months ago

> Can the Shakespeare dataset be partitioned in an i.i.d. way? I noticed that in the code the Shakespeare dataset defaults to a non-i.i.d. split only.

Yes. What I provide here is one non-IID partition of the Shakespeare dataset; you can also go to LEAF and re-sample an i.i.d. version of the dataset yourself.
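If you would rather build the i.i.d. split yourself instead of re-sampling through LEAF, the idea is just to pool all samples, shuffle, and deal them out evenly. A minimal sketch (the function name is hypothetical; apply it to the index range of the pooled Shakespeare samples):

```python
import numpy as np

def iid_partition(num_samples, num_clients, seed=0):
    """Shuffle all sample indices and split them evenly across clients,
    so every client draws from the same underlying distribution."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(num_samples)
    return np.array_split(idx, num_clients)

parts = iid_partition(1000, 10)
print([len(p) for p in parts])  # ten chunks of 100 indices each
```

This is the same pattern the repo's MNIST/CIFAR i.i.d. samplers typically follow; the non-i.i.d. Shakespeare split instead keeps each speaking role's lines on one client.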