Closed: lucacorbucci closed this issue 1 month ago
Thanks for raising this issue! It looks like it's just a matter of passing down the parameters to the optimizer? I can take a look during the week, if you want to make a PR I'm happy to look at it as well.
Hi @alexandresablayrolles, I'd like to fix this and make a PR. Am I still in time?
Absolutely!
Hi! I'm also trying to use adaptive clipping and I'm getting the same error. Has this issue been fixed?
Is there any update on this issue? I would also like to apply adaptive clipping!
Hi @HuanyuZhang, I have opened a PR with the fix. Please let me know if any further changes are required.
🐛 Bug
Hi everyone, I was trying to use the make_private function to wrap my PyTorch training objects. I wanted to use the AdaClipDPOptimizer, and I found that make_private has a "clipping" parameter for selecting it. I passed "adaptive" as the value of that parameter, but it doesn't work. Link to colab: https://colab.research.google.com/drive/1VivVsyU31onR1EAePuQQUMRzGGRgAi94?usp=sharing
To Reproduce
The error is TypeError: __init__() missing 5 required keyword-only arguments: 'target_unclipped_quantile', 'clipbound_learning_rate', 'max_clipbound', 'min_clipbound', and 'unclipped_num_std'. This happens because AdaClipDPOptimizer expects these additional parameters, but when the optimizer is instantiated in the _prepare_optimizer function, only a few of them are passed down.
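To make the failure mode concrete, here is a minimal, self-contained sketch (not the real Opacus code; the class and function names below are simplified stand-ins): an optimizer subclass that declares required keyword-only parameters raises exactly this kind of TypeError when the preparing function does not forward them.

```python
class AdaClipLikeOptimizer:
    """Hypothetical stand-in for AdaClipDPOptimizer: it requires extra
    keyword-only arguments beyond the ones a plain DP optimizer needs."""

    def __init__(self, optimizer, *, noise_multiplier, max_grad_norm,
                 target_unclipped_quantile, clipbound_learning_rate,
                 max_clipbound, min_clipbound, unclipped_num_std):
        self.optimizer = optimizer
        self.noise_multiplier = noise_multiplier
        self.max_grad_norm = max_grad_norm
        self.target_unclipped_quantile = target_unclipped_quantile
        self.clipbound_learning_rate = clipbound_learning_rate
        self.max_clipbound = max_clipbound
        self.min_clipbound = min_clipbound
        self.unclipped_num_std = unclipped_num_std


def prepare_optimizer_buggy(optimizer, **kwargs):
    # Mirrors the reported bug: only a subset of the parameters is
    # forwarded, so the keyword-only arguments above are missing and
    # Python raises a TypeError at instantiation time.
    return AdaClipLikeOptimizer(
        optimizer,
        noise_multiplier=kwargs["noise_multiplier"],
        max_grad_norm=kwargs["max_grad_norm"],
    )


def prepare_optimizer_fixed(optimizer, **kwargs):
    # The fix suggested in the thread: pass every keyword argument
    # down to the optimizer.
    return AdaClipLikeOptimizer(optimizer, **kwargs)
```

With all seven parameters supplied, prepare_optimizer_fixed succeeds while prepare_optimizer_buggy raises the TypeError shown above.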
Expected behavior
Calling make_private with clipping="adaptive" should instantiate an AdaClipDPOptimizer without raising a TypeError.
Environment
Please copy and paste the output from our environment collection script (or fill out the checklist below manually).
You can get the script and run it with:
How you installed Opacus (conda, pip, source):

Additional context