pytorch / opacus

Training PyTorch models with differential privacy
https://opacus.ai
Apache License 2.0

Implementing augmentation multiplicity using Functorch [Error: AttributeError: 'Tensor' object has no attribute '_forward_counter'] #626

Closed mshubhankar closed 4 months ago

mshubhankar commented 5 months ago

🐛 Bug

I am trying to implement augmentation multiplicity (as implemented in Paper 1 and Paper 2) using Opacus's new functorch functionality. I am following the exact steps pointed out by @alexandresablayrolles in #455 and #575. However, the predictions = fmodel(params, batch) line fails with AttributeError: 'Tensor' object has no attribute '_forward_counter'. My intuition is that the _forward_counter attribute should be added to the model when make_functional() is called, but some code change may have broken this.
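For reference, here is a minimal sketch of the pattern I am following (the toy model, tensor shapes, and the K-view averaged loss are illustrative assumptions, not my actual notebook; only the fmodel(params, batch) call mirrors the failing line):

import torch
import torch.nn as nn
import torch.nn.functional as F
from functorch import make_functional, grad_and_value, vmap

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 10))   # toy model (assumption)
fmodel, params = make_functional(model)                          # stateless model + parameter tuple

K = 4                                        # augmentation multiplicity (assumption)
images = torch.randn(16, K, 3, 8, 8)         # (batch, K augmented views, C, H, W)
labels = torch.randint(0, 10, (16,))

def compute_loss(params, views, target):
    # views holds the K augmented copies of a single example
    predictions = fmodel(params, views)      # <- fails with the AttributeError once the
                                             #    model carries Opacus hooks
    return F.cross_entropy(predictions, target.expand(views.shape[0]))  # mean over the K views

# vmap over the batch dimension yields per-sample gradients of the K-averaged loss
per_sample_grad_fn = vmap(grad_and_value(compute_loss), in_dims=(None, 0, 0))
per_sample_grads, per_sample_losses = per_sample_grad_fn(params, images, labels)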

Any help is appreciated. Thanks!

Colab reproducible link

To Reproduce

Steps to reproduce the behavior:

  1. Open colab link
  2. Run the notebook until the train() function cell
  3. The failing line is pointed out in the comments

Expected behavior

I would expect the forward function to work with the new functorch functionality.

HuanyuZhang commented 4 months ago

This problem arises when functorch is used on a model that already has Opacus hooks attached. Both functorch and the hooks serve the same purpose of computing per-sample gradients, so they do not need to coexist. A fix is to use the "no_op" grad_sample_mode, which does not add hooks:

model, optimizer, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    noise_multiplier=args.sigma,
    max_grad_norm=max_grad_norm,
    clipping=clipping,
    grad_sample_mode="no_op",  # avoid adding hooks
)
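Note that with grad_sample_mode="no_op", Opacus does not compute per-sample gradients itself, so they have to be supplied (for example via functorch, as in your sketch above) and attached to the parameters before each optimizer step. A rough sketch of that step, where per_sample_grad_fn stands for a functorch vmap(grad(...)) transform and the loop variables are illustrative:

for images, labels in train_loader:
    # per_sample_grad_fn: assumed to return one gradient tensor per parameter,
    # each with a leading per-sample batch dimension (as in the sketch above)
    per_sample_grads, _ = per_sample_grad_fn(list(model.parameters()), images, labels)
    for p, g in zip(model.parameters(), per_sample_grads):
        p.grad_sample = g.detach()  # the DP optimizer clips and noises these
    optimizer.step()
    optimizer.zero_grad()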

We synced offline and confirmed that the fix works.