Closed reanatom closed 1 year ago
Thanks for reporting this! This happens because Opacus uses Poisson sampling, which is necessary to ensure the privacy guarantees. Under Poisson sampling the batch size is random, so it will not always be 32 (this is why you observe a varying x.shape[0]). You should modify your code to take that into account.
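To see why the batch size varies, here is a toy illustration of Poisson sampling (this is a simplified sketch, not Opacus's actual sampler implementation): each example is included in a batch independently with probability q = expected_batch_size / dataset_size, so the realized batch size fluctuates around the expectation.

```python
import random

def poisson_sample(dataset_size, expected_batch_size, rng):
    # Each index is included independently with probability
    # q = expected_batch_size / dataset_size, so the realized batch
    # size is random (roughly binomial around the expectation).
    q = expected_batch_size / dataset_size
    return [i for i in range(dataset_size) if rng.random() < q]

rng = random.Random(0)
batch_sizes = [len(poisson_sample(1000, 32, rng)) for _ in range(5)]
print(batch_sizes)  # sizes fluctuate around 32 instead of being exactly 32
```

This is exactly why downstream code must not assume that the 0th dimension is always 32.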
Alternatively, you can set poisson_sampling=False in make_private, but this is not recommended because you lose the privacy guarantees (it can still be useful for debugging).
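The recommended fix is to stop assuming a fixed batch size of 32 anywhere in the model or loss code and derive it from the input tensor instead. A minimal sketch (the extract function here is a hypothetical stand-in for the reporter's code, with made-up shapes):

```python
import torch

def extract(x: torch.Tensor) -> torch.Tensor:
    # Derive the batch dimension from the input instead of hard-coding 32,
    # so the reshape still works when Poisson sampling yields e.g. 29 samples.
    batch_size = x.shape[0]
    return x.view(batch_size, -1)

features = extract(torch.zeros(29, 3, 8, 8))  # a Poisson-sampled batch of 29
```

Any target tensors used in the loss should be sliced or built from the same batch, so that both sides of the loss share the same (random) 0th dimension.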
Thanks, you solved my problem perfectly!
🐛 Bug
Hello, when I use Opacus, I get a tensor dimension mismatch while computing the loss in the forward pass. The problem disappears when I do not use the Opacus framework. I tracked it down to the shape of the input images: even though I set the batch size to 32, the 0th dimension of x.shape changes randomly, so the dimensions no longer match when I compute the loss (see the extract function in the code). Of course, there may also be some problem with my network model. Here is my code; I hope to get your answer.
model code:
main function
Please reproduce using our template Colab and post here the link
To Reproduce
1. 2. 3.
Expected behavior
Environment
Please copy and paste the output from our environment collection script (or fill out the checklist below manually).
You can get the script and run it with:
How you installed Opacus / PyTorch (conda, pip, source): pip

Additional context