pytorch / opacus

Training PyTorch models with differential privacy
https://opacus.ai
Apache License 2.0

Invalid combination of arguments related to empty batch #676

Open SoumiDas opened 6 days ago

SoumiDas commented 6 days ago

Hi,

I have been trying to do DP-based finetuning on a dataset with the Pythia 1B model. I receive the following error at epoch 5 when I increase the dataset size to around 1000.

TypeError: zeros() received an invalid combination of arguments - got (tuple, dtype=type), but expected one of:

  • (tuple of ints size, *, tuple of names names, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
  • (tuple of ints size, *, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)

This arises from lines 60-61 of opacus/data_loader.py, which check whether len(batch) > 0 and try to collate. Where am I going wrong, or what would be a workaround?

Please help!

P.S. The configuration I use is: number of epochs = 5, training set size = 1000, batch size = 8, and I am using BatchMemoryManager with max_physical_batch_size = 8.
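
For reference, a minimal sketch of what this TypeError signature suggests (this is only a guess at the trigger, not the actual Opacus code path): the `got (tuple, dtype=type)` wording indicates that `torch.zeros` was handed a Python class (e.g. `str`) as `dtype` instead of a `torch.dtype`, which could happen if the empty-batch collate builds a placeholder for a non-tensor field of the samples.

```python
# Hypothetical minimal reproduction of the error signature above.
# The shape and dtype values are illustrative, not taken from Opacus.
import torch

placeholder_shape = (0, 128)   # shape an empty-batch placeholder might use
bad_dtype = str                # a Python class, not a torch.dtype

try:
    torch.zeros(placeholder_shape, dtype=bad_dtype)
except TypeError as e:
    # Prints: zeros() received an invalid combination of arguments -
    # got (tuple, dtype=type), but expected one of: ...
    print(e)
```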

EnayatUllah commented 5 days ago

Are you using Poisson subsampling? Also, could you use the bug report Colab so that we can look at your code and reproduce the issue?
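
For context, here is a rough back-of-the-envelope sketch of why an occasional empty batch is plausible in this setup (assuming Poisson subsampling, where each of the N examples enters a batch independently with probability sample_rate = batch_size / N; the numbers N = 1000, batch size 8, and 5 epochs are taken from the report above):

```python
# Sketch under the Poisson-subsampling assumption: estimate how likely it is
# that at least one logical batch comes out empty during the whole run.
N = 1000                         # training set size (from the report)
batch_size = 8
sample_rate = batch_size / N
batches_per_epoch = N // batch_size
epochs = 5

# Each example is included independently with probability sample_rate,
# so a single batch is empty with probability (1 - sample_rate) ** N.
p_empty = (1 - sample_rate) ** N
p_hit = 1 - (1 - p_empty) ** (batches_per_epoch * epochs)

print(f"P(a single batch is empty)   ~ {p_empty:.4%}")   # roughly 0.03%
print(f"P(>=1 empty batch in 5 epochs) ~ {p_hit:.1%}")   # roughly 18%
```

Under these assumptions an empty batch is rare per draw but fairly likely to show up at least once over several epochs, which would be consistent with the error only appearing after the dataset was grown to ~1000 examples.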