pytorch / opacus

Training PyTorch models with differential privacy
https://opacus.ai
Apache License 2.0

Per sample gradient is not initialized. Not updated in backward pass? Any solution? Can anyone post a link to a Colab notebook with the solution implemented? #560

Closed: Hafizamariaiqbal closed this issue 1 year ago

Hafizamariaiqbal commented 1 year ago

alexandresablayrolles commented 1 year ago

Can you provide a minimal reproducing code?

alexandresablayrolles commented 1 year ago

Thanks. You have to replace this line:

eps, best_alpha = self.privacy_engine.accountant.get_privacy_spent(
    delta=self.delta
)

With this:

eps = self.privacy_engine.accountant.get_epsilon(
    delta=self.delta
)
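
For context, here is a minimal, self-contained sketch of where that call typically sits in an Opacus training setup. This is not the poster's actual code: the toy model, data, and the noise_multiplier, max_grad_norm, and delta values are illustrative placeholders.

import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Toy model and data so the example runs on its own.
model = nn.Linear(10, 2)
optimizer = optim.SGD(model.parameters(), lr=0.1)
dataset = TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
data_loader = DataLoader(dataset, batch_size=8)

# Wrap the model, optimizer, and data loader for DP-SGD training.
privacy_engine = PrivacyEngine()
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.0,  # illustrative value
    max_grad_norm=1.0,     # illustrative value
)

criterion = nn.CrossEntropyLoss()
for x, y in data_loader:
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

# Newer accountant API: returns epsilon directly for a given delta.
delta = 1e-5
eps = privacy_engine.accountant.get_epsilon(delta=delta)
print(f"epsilon = {eps:.2f} at delta = {delta}")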
Hafizamariaiqbal commented 1 year ago

Thank you so much for the help. The code is working now.

One more request, if you can, please: I want to present the results in graph form.

alexandresablayrolles commented 1 year ago

Closing the issue as the problem is solved. For the graph, I think you should keep a list like this:

epsilons = []
for ...:
    eps = self.privacy_engine.accountant.get_epsilon(
        delta=self.delta
    )
    epsilons.append(eps)

And then plot using something like

plt.plot(epsilons)
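
Putting the two pieces together outside the class, a sketch might look like the following. Here privacy_engine and delta are plain variables rather than self attributes, train_one_epoch is a hypothetical training helper, and the epoch count and delta value are illustrative only.

import matplotlib.pyplot as plt

delta = 1e-5       # illustrative target delta
num_epochs = 10    # illustrative epoch count
epsilons = []

for epoch in range(num_epochs):
    train_one_epoch(model, optimizer, data_loader)  # hypothetical training helper
    # Record the privacy budget spent so far after each epoch.
    eps = privacy_engine.accountant.get_epsilon(delta=delta)
    epsilons.append(eps)

# Epsilon grows with the number of optimization steps taken.
plt.plot(range(1, num_epochs + 1), epsilons)
plt.xlabel("epoch")
plt.ylabel("epsilon")
plt.title(f"Privacy budget spent (delta = {delta})")
plt.show()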