pytorch / opacus

Training PyTorch models with differential privacy
https://opacus.ai
Apache License 2.0

GDP Accountant get_epsilon method should not pop from history #535

Closed · kiddyboots216 closed this 1 year ago

kiddyboots216 commented 1 year ago

🐛 Bug

get_epsilon in the GDP accountant calls self.history.pop(), which is a destructive side effect: every call permanently discards the most recent entry from the privacy history. You can't iteratively check epsilon during training and then print it at the end of training, for example.


To Reproduce

1. Create a GDP accountant and take some training steps.
2. Call get_epsilon once to check the privacy budget mid-training.
3. Call get_epsilon again at the end of training: because the first call popped from self.history, the second call sees a mutated history and fails or returns the wrong value, as in the sketch below.
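A minimal sketch of those steps. It assumes the public Opacus accountant API (GaussianAccountant, step(noise_multiplier=..., sample_rate=...), get_epsilon(delta=...)) and that the accountant compresses identical consecutive steps into a single history entry; with varying parameters the second call would instead compute epsilon from a shortened history rather than crash.

```python
from opacus.accountants import GaussianAccountant

# Simulate a run of identical DP-SGD steps. The accountant compresses
# consecutive steps with the same parameters into one history entry:
# (noise_multiplier, sample_rate, num_steps).
accountant = GaussianAccountant()
for _ in range(100):
    accountant.step(noise_multiplier=1.1, sample_rate=0.01)

# Mid-training check: returns epsilon, but self.history.pop() has now
# discarded the accountant's only history entry as a side effect.
print("epsilon during training:", accountant.get_epsilon(delta=1e-5))

# End-of-training report: the history is empty, so this raises
# "IndexError: pop from empty list" instead of returning epsilon.
print("epsilon at end of training:", accountant.get_epsilon(delta=1e-5))
```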

Expected behavior

get_epsilon should be a read-only query: calling it any number of times must not modify self.history or change the result of later calls.


ffuuugor commented 1 year ago

This is very much true, thanks for bringing this to our attention.

@ashkan-software I think it'll be sufficient to replace self.history.pop() with self.history[-1].
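For concreteness, a sketch of that one-line change inside GaussianAccountant.get_epsilon. The surrounding method body is reconstructed from the Opacus source as an assumption (including the compute_eps_poisson / compute_eps_uniform helpers) and may not match the repository exactly:

```python
# Sketch of opacus/accountants/gdp.py; exact body may differ.
from opacus.accountants.analysis.gdp import (
    compute_eps_poisson,
    compute_eps_uniform,
)

def get_epsilon(self, delta: float, poisson: bool = True) -> float:
    compute_eps = compute_eps_poisson if poisson else compute_eps_uniform
    # Before: destructive read that shrinks the history on every call.
    # noise_multiplier, sample_rate, num_steps = self.history.pop()
    # After: non-destructive read of the latest (and, for constant
    # training parameters, only) history entry.
    noise_multiplier, sample_rate, num_steps = self.history[-1]
    return compute_eps(
        steps=num_steps,
        noise_multiplier=noise_multiplier,
        sample_rate=sample_rate,
        delta=delta,
    )
```

Since the entry is only read, repeated calls now return the same epsilon and leave the accountant's state intact.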