pytorch / opacus

Training PyTorch models with differential privacy
https://opacus.ai
Apache License 2.0
1.65k stars 328 forks

make_private with clipping parameter #559

Closed lucacorbucci closed 1 month ago

lucacorbucci commented 1 year ago

🐛 Bug

Hi everyone, I was trying to use the make_private function to wrap my PyTorch training objects. I wanted to use the AdaClipDPOptimizer, and I found that make_private has a "clipping" parameter for selecting it. However, passing "adaptive" as the value of that parameter raises an error. Link to colab: https://colab.research.google.com/drive/1VivVsyU31onR1EAePuQQUMRzGGRgAi94?usp=sharing

Please reproduce using our template Colab and post the link here

To Reproduce

:warning: We cannot help you without you sharing reproducible code. Do not ignore this part :) Steps to reproduce the behavior:

  1. Call the make_private function setting the clipping parameter as "adaptive"

The error is: TypeError: __init__() missing 5 required keyword-only arguments: 'target_unclipped_quantile', 'clipbound_learning_rate', 'max_clipbound', 'min_clipbound', and 'unclipped_num_std'. This happens because AdaClipDPOptimizer requires these additional keyword-only arguments, but when the optimizer is instantiated in the _prepare_optimizer function, only the common parameters are passed down.
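The failure mode can be reproduced in isolation: Python raises exactly this kind of TypeError whenever a class declares required keyword-only arguments that the caller omits. A minimal stdlib-only sketch (the stub class and prepare_optimizer function below are stand-ins for illustration, not Opacus's actual AdaClipDPOptimizer or _prepare_optimizer):

```python
# Stand-in class illustrating the failure mode: like the adaptive-clipping
# optimizer described above, it declares keyword-only arguments after `*`.
class AdaptiveClippingOptimizerStub:
    def __init__(
        self,
        optimizer,
        *,
        target_unclipped_quantile,
        clipbound_learning_rate,
        max_clipbound,
        min_clipbound,
        unclipped_num_std,
    ):
        self.optimizer = optimizer
        self.target_unclipped_quantile = target_unclipped_quantile


def prepare_optimizer(optimizer):
    # Forwards only the positional argument, dropping the five required
    # keyword-only arguments -- mirroring the bug described in the report.
    return AdaptiveClippingOptimizerStub(optimizer)


try:
    prepare_optimizer(object())
except TypeError as e:
    print(e)  # names the 5 missing keyword-only arguments
```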

Expected behavior

Environment

Please copy and paste the output from our environment collection script (or fill out the checklist below manually).

You can get the script and run it with:

wget https://raw.githubusercontent.com/pytorch/pytorch/master/torch/utils/collect_env.py
# For security purposes, please check the contents of collect_env.py before running it.
python collect_env.py

Additional context

alexandresablayrolles commented 1 year ago

Thanks for raising this issue! It looks like it's just a matter of passing down the parameters to the optimizer? I can take a look during the week, if you want to make a PR I'm happy to look at it as well.
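The "passing down the parameters" fix could look like the forwarding pattern below. This is a hypothetical stdlib-only sketch of the idea, not the actual Opacus code or PR: the preparation function accepts **kwargs and passes them through to the optimizer's constructor, so caller-supplied adaptive-clipping parameters are no longer dropped.

```python
class AdaptiveClippingOptimizerStub:
    """Stand-in for an optimizer with required keyword-only arguments."""

    def __init__(self, optimizer, *, target_unclipped_quantile, max_clipbound):
        self.optimizer = optimizer
        self.target_unclipped_quantile = target_unclipped_quantile
        self.max_clipbound = max_clipbound


def prepare_optimizer(optimizer, clipping="flat", **kwargs):
    # Forward caller-supplied kwargs so the adaptive parameters reach the
    # optimizer's constructor instead of being silently discarded.
    if clipping == "adaptive":
        return AdaptiveClippingOptimizerStub(optimizer, **kwargs)
    return optimizer


opt = prepare_optimizer(
    object(),
    clipping="adaptive",
    target_unclipped_quantile=0.5,
    max_clipbound=10.0,
)
print(opt.target_unclipped_quantile)  # 0.5
```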

lucacorbucci commented 1 year ago

> Thanks for raising this issue! It looks like it's just a matter of passing down the parameters to the optimizer? I can take a look during the week, if you want to make a PR I'm happy to look at it as well.

Hi @alexandresablayrolles, I'd like to fix this and open a PR. Is there still time?

alexandresablayrolles commented 1 year ago

Absolutely!

sebasrb09 commented 1 year ago

Hi! I'm also trying to use adaptive clipping and am getting the same error. Has this issue been fixed?

OrabeZoher commented 3 months ago

Is there any update on this issue? I would also like to apply adaptive clipping!

eigengravy commented 2 months ago

Hi @HuanyuZhang, I have opened a PR with the fix. Please let me know if any further changes are required.