pytorch / opacus

Training PyTorch models with differential privacy
https://opacus.ai
Apache License 2.0

reduce logging severity for set_to_none #471

Closed ffuuugor closed 2 years ago

ffuuugor commented 2 years ago

We want to warn users about an unexpected behaviour of the set_to_none flag. Normally, both nn.Module and Optimizer let clients choose whether to remove the .grad attribute altogether or just set it to None. We, on the other hand, never remove the attribute - it's more convenient to assume .grad_sample is always present. That's not an absolute requirement, but it's what we've done historically and I don't see a case for changing it now.

However, the default value of set_to_none is False, which means most users get this noisy log message on every training step.
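A toy sketch of the problem (not Opacus's actual implementation; the class and message here are made up for illustration): zero_grad() runs once per training step, so an unconditional message at WARNING level repeats every iteration, while logging it at DEBUG keeps the default training loop quiet.

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("sketch")


class SketchOptimizer:
    """Toy stand-in for an optimizer that always sets grads to None."""

    def zero_grad(self, set_to_none=False):
        if not set_to_none:
            # Logged at DEBUG so the routine training loop stays quiet;
            # at WARNING this line would fire on every single step.
            logger.debug(
                "set_to_none=False was requested, but grads are set to None anyway"
            )
        # ... set the .grad_sample attributes to None here ...


opt = SketchOptimizer()
for step in range(3):  # every training step calls zero_grad()
    opt.zero_grad()    # effective level is INFO, so nothing is printed
```

Users who do want the diagnostic can opt back in by raising the logger's level to DEBUG.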

facebook-github-bot commented 2 years ago

@ffuuugor has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

ffuuugor commented 2 years ago

QQ: is there a way to make this message appear only once?

I mean, we can always set a flag for this at the module/optimizer level - do you think it's worth it?
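A minimal sketch of that flag idea (names are hypothetical, not Opacus's API): an instance-level boolean suppresses the message after the first emission.

```python
import warnings


class OnceWarningOptimizer:
    """Toy optimizer that emits the set_to_none notice at most once
    per instance, guarded by a plain instance-level flag."""

    def __init__(self):
        self._warned_set_to_none = False  # hypothetical once-only flag

    def zero_grad(self, set_to_none=False):
        if not set_to_none and not self._warned_set_to_none:
            warnings.warn("grads are set to None regardless of set_to_none")
            self._warned_set_to_none = True  # suppress all later repeats
        # ... clear per-sample gradients here ...


opt = OnceWarningOptimizer()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # count every emission
    for _ in range(5):
        opt.zero_grad()
print(len(caught))  # 1: the flag suppressed the four repeats
```

Worth noting that Python's warnings module already deduplicates by (message, category, location) under the "default" filter, so warnings.warn without any flag would also show once per process in many setups; the explicit flag just makes the once-per-instance behaviour deliberate.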