Closed: Simha55 closed this issue 9 months ago
Thanks for reporting the bug @Simha55, I'll take a look at this and get back to you.
Hi @karthikprasad, thanks for your message. I would like to know whether the issue has been resolved. Thank you.
Hello @Simha55. Thanks for following up.
The issue is not yet resolved: Opacus's DPMultiheadAttention does not yet support the batch_first attribute, and there is no reason it shouldn't. I have added this issue to the tracker, and we will fix it soon (before the next release).
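Until batch_first support lands, the usual workaround (my suggestion, not an official Opacus recipe) is to transpose the batch and sequence dimensions before and after the attention call. The layout change a batch_first flag performs internally is just this transpose, sketched here in plain Python over nested lists so the shapes are explicit:

```python
def to_seq_first(batch_first_data):
    """Transpose (batch, seq, feature) -> (seq, batch, feature).

    Plain-Python stand-in for tensor.transpose(0, 1); illustrates the
    layout change a batch_first=True flag would undo internally before
    handing data to a seq-first attention layer.
    """
    batch = len(batch_first_data)
    seq = len(batch_first_data[0])
    return [[batch_first_data[b][s] for b in range(batch)]
            for s in range(seq)]

# 2 sequences (batch) of length 3, feature dim 1
x = [[[1], [2], [3]],
     [[4], [5], [6]]]
y = to_seq_first(x)
# y[0] now holds timestep 0 of every batch element: [[1], [4]]
```

Applying the same function again restores the original layout, so the same helper can convert the attention output back to batch-first.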
Closing this issue, since the fix landed in PR https://github.com/pytorch/opacus/pull/598. By the way, it is also feasible to use https://github.com/lxuechen/private-transformers for Hugging Face transformers.
🐛 Bug
Hi, there is a bug involving model.eval() and DPMultiheadAttention. I have attached a Colab link: https://colab.research.google.com/drive/1fwK-5o5EUJEIWQMRg0m__cIyCy5_J-ij?usp=sharing Please check the testing(model, bs, criterion, data) function in the code: if model.eval() is commented out, no error occurs; if it is left in, an error is raised. Please take a look.
To Reproduce
1. Open the Colab notebook linked above.
2. In testing(model, bs, criterion, data), leave the model.eval() call uncommented.
3. Run the notebook; the evaluation pass through DPMultiheadAttention raises an error. Comment out model.eval() and the error disappears.
Expected behavior

Calling model.eval() on a model containing DPMultiheadAttention should not raise an error; evaluation mode should work just as it does with torch.nn.MultiheadAttention.
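For context on what model.eval() is expected to do: it only flips a training flag on every submodule, and each layer's forward pass must handle both modes. A toy sketch of that contract (hypothetical class for illustration, not Opacus or PyTorch code):

```python
class TinyDropoutLayer:
    """Minimal sketch of the train()/eval() mode contract that a layer
    like DPMultiheadAttention must honor. Hypothetical illustration only.
    """

    def __init__(self, p=0.5):
        self.p = p
        self.training = True  # nn.Module defaults to training mode

    def eval(self):
        # model.eval() sets training=False on every submodule
        self.training = False
        return self

    def train(self):
        self.training = True
        return self

    def forward(self, x):
        if self.training:
            # a real dropout would zero elements at random; we scale
            # deterministically here to keep the example reproducible
            return [v * (1 - self.p) for v in x]
        # eval mode: identity, no stochastic behavior
        return list(x)


layer = TinyDropoutLayer(p=0.5)
train_out = layer.forward([2.0])         # training path: scaled
eval_out = layer.eval().forward([2.0])   # eval path: unchanged
```

Both branches must produce valid output; the bug reported here is the eval-mode branch failing where the training-mode branch succeeds.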
Environment
Please copy and paste the output from our environment collection script (or fill out the checklist below manually).
You can get the script and run it with:

- How you installed Opacus (conda, pip, source):

Additional context