pytorch / opacus

Training PyTorch models with differential privacy
https://opacus.ai
Apache License 2.0

AttributeError: 'DPMultiheadAttention' object has no attribute 'batch_first' #512

Closed: Simha55 closed this issue 9 months ago

Simha55 commented 1 year ago

🐛 Bug

Hi, there is a bug involving model.eval() and DPMultiheadAttention. I have attached a Colab notebook that reproduces it: https://colab.research.google.com/drive/1fwK-5o5EUJEIWQMRg0m__cIyCy5_J-ij?usp=sharing. Please look at the testing(model, bs, criterion, data) function in the notebook: if the model.eval() call is commented out, the code runs without error; with model.eval() enabled, it fails with AttributeError: 'DPMultiheadAttention' object has no attribute 'batch_first'.

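For reference, here is a minimal sketch of the kind of model that triggers the error. It is an assumption for illustration, not the code from the Colab above: it uses nn.TransformerEncoderLayer plus ModuleValidator.fix, which swaps the layer's nn.MultiheadAttention for Opacus's DPMultiheadAttention.

import torch
import torch.nn as nn
from opacus.validators import ModuleValidator

# A plain encoder layer; ModuleValidator.fix replaces its nn.MultiheadAttention
# submodule with opacus.layers.DPMultiheadAttention.
layer = nn.TransformerEncoderLayer(d_model=16, nhead=2)
model = ModuleValidator.fix(layer)

x = torch.randn(5, 3, 16)  # (seq_len, batch, d_model)

model.train()
model(x)   # forward pass in training mode works

model.eval()
model(x)   # the eval-mode fast-path check reads self_attn.batch_first and raises
           # AttributeError: 'DPMultiheadAttention' object has no attribute 'batch_first'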

karthikprasad commented 1 year ago

Thanks for reporting the bug, @Simha55. I'll take a look at this and get back to you.

Simha55 commented 1 year ago

Hi @karthikprasad, thanks for your message. Has this issue been resolved yet? Thank you.

karthikprasad commented 1 year ago

Hello @Simha55, thanks for following up. The issue is not yet resolved: Opacus's DPMultiheadAttention does not yet support the batch_first attribute, and there is no good reason it shouldn't. I have added this issue to our tracker and we will fix it soon, before the next release.

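Until the fix ships, a possible interim workaround is sketched below. This is untested here and an assumption rather than an official recommendation: it only helps when the error comes from eval-mode code reading self_attn.batch_first (as in nn.TransformerEncoderLayer's fast path on recent PyTorch versions), and it simply gives each pre-fix DPMultiheadAttention module the missing attribute.

from opacus.layers import DPMultiheadAttention

def patch_batch_first(model, value=False):
    # Pre-fix DPMultiheadAttention expects inputs as (seq_len, batch, embed_dim),
    # which matches batch_first=False. Setting the attribute lets the eval-mode
    # fast-path check fall through to the regular forward path instead of
    # raising AttributeError.
    for module in model.modules():
        if isinstance(module, DPMultiheadAttention) and not hasattr(module, "batch_first"):
            module.batch_first = value
    return model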
HuanyuZhang commented 9 months ago

Closing this issue, since the fix landed in PR https://github.com/pytorch/opacus/pull/598. As an alternative, https://github.com/lxuechen/private-transformers also works for Hugging Face transformer models.
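For anyone landing here later, a short usage sketch after the fix, assuming the updated DPMultiheadAttention mirrors nn.MultiheadAttention's batch_first argument as the linked PR intends:

import torch
from opacus.layers import DPMultiheadAttention

# Assumes a post-fix Opacus release where batch_first is accepted and eval() works.
attn = DPMultiheadAttention(embed_dim=16, num_heads=2, batch_first=True)
attn.eval()

x = torch.randn(3, 5, 16)  # (batch, seq_len, embed_dim) because batch_first=True
with torch.no_grad():
    out, _ = attn(x, x, x)
print(out.shape)  # expected: torch.Size([3, 5, 16])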