jfb54 closed this issue 7 months ago
Thanks for reporting this bug. We will fix it, along with #512, to make it work.
I am facing the same issue. In addition to what is described in the issue description, the forward() call also lacks the is_causal parameter.
Closing this issue, since the fixes landed in PR https://github.com/pytorch/opacus/pull/598. As an alternative, https://github.com/lxuechen/private-transformers also supports Hugging Face transformers.
🐛 Bug
Not only is the API missing the batch_first parameter (https://github.com/pytorch/opacus/issues/512), it is also missing the in_proj_weight parameter. This makes it impossible to use Opacus with a transformer.
Expected behavior
Ability to access the in_proj_weight parameter so that it may be initialized.
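For illustration, here is a minimal sketch of the mismatch using stand-in classes rather than the real torch/opacus modules (the attribute names qlinear/klinear/vlinear on Opacus's side are an assumption based on its layer implementation): initialization code written against torch.nn.MultiheadAttention's packed in_proj_weight fails on the drop-in replacement, which splits the projection into separate layers.

```python
class MultiheadAttention:
    """Stand-in for torch.nn.MultiheadAttention, which exposes a single
    packed (3*embed_dim, embed_dim) projection weight."""
    def __init__(self, embed_dim):
        self.in_proj_weight = [[0.0] * embed_dim for _ in range(3 * embed_dim)]

class DPMultiheadAttention:
    """Stand-in for Opacus's replacement layer, which (by assumption)
    stores three separate q/k/v projections and has no in_proj_weight."""
    def __init__(self, embed_dim):
        self.qlinear = [[0.0] * embed_dim for _ in range(embed_dim)]
        self.klinear = [[0.0] * embed_dim for _ in range(embed_dim)]
        self.vlinear = [[0.0] * embed_dim for _ in range(embed_dim)]

def init_weights(mha):
    # Typical transformer init code written against the torch API:
    return mha.in_proj_weight  # AttributeError on DPMultiheadAttention

init_weights(MultiheadAttention(8))  # works on the torch-style API
try:
    init_weights(DPMultiheadAttention(8))
except AttributeError as e:
    print("DPMultiheadAttention:", e)
```

A fix would either expose a compatible in_proj_weight view or require users to initialize the three projection layers individually.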
Environment
Opacus 1.4, PyTorch 2.0.1