microsoft / dp-transformers

Differentially-private transformers using HuggingFace and Opacus
MIT License

Bumping supported versions of HF #13

Closed donebydan closed 2 years ago

donebydan commented 2 years ago

Now supporting HF versions >=4.20.1. Minor changes needed:

  1. `use_amp` -> `use_cuda_amp` and `use_cpu_amp`
  2. Additional `compute_loss_context_manager` in training script
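A minimal sketch of the two changes above, assuming code that must run against both old and new HF versions. The helper names (`amp_enabled`, `loss_context`) and the version-dispatch logic are illustrative, not part of dp-transformers; only the attribute names (`use_amp`, `use_cuda_amp`, `use_cpu_amp`, `compute_loss_context_manager`) come from the PR:

```python
import contextlib


def amp_enabled(trainer, hf_version: str) -> bool:
    """Hypothetical compatibility check: HF < 4.20 exposed a single
    Trainer.use_amp flag; HF >= 4.20.1 splits it into use_cuda_amp
    (GPU autocast) and use_cpu_amp (CPU autocast)."""
    major, minor = (int(p) for p in hf_version.split(".")[:2])
    if (major, minor) >= (4, 20):
        return bool(getattr(trainer, "use_cuda_amp", False)
                    or getattr(trainer, "use_cpu_amp", False))
    return bool(getattr(trainer, "use_amp", False))


def loss_context(trainer):
    """Wrap the forward/loss computation: HF >= 4.20.1 Trainers provide
    compute_loss_context_manager(); on older versions fall back to a
    no-op context so the same training script runs on both."""
    ctx = getattr(trainer, "compute_loss_context_manager", None)
    return ctx() if callable(ctx) else contextlib.nullcontext()
```

In a training script the forward pass would then be wrapped as `with loss_context(trainer): outputs = model(**inputs)`, which picks up autocast behavior automatically on new HF versions and degrades to a no-op on old ones.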