microsoft / dp-transformers

Differentially-private transformers using HuggingFace and Opacus
MIT License

Integrating Opacus with HuggingFace #45

Open · SoumiDas opened this issue 3 months ago

SoumiDas commented 3 months ago

Hello,

I was trying to see whether the Opacus library can be used with the HuggingFace Trainer module. The README shows a code snippet that passes the callback dp_transformers.PrivacyEngineCallback to the Trainer. However, this leads to an error saying the callback does not exist, and I could not find it in the source code either.

It would be really helpful if you could point me to the part of the source code that defines this callback, or suggest a way to resolve the error.
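
For reference, here is roughly what I tried, based on my reading of the README snippet (apart from the standard Trainer API, the names here are just my understanding of that snippet):

```python
import dp_transformers
from transformers import Trainer, TrainingArguments

# model and train_dataset are set up as usual for a HF Trainer run;
# privacy_engine is an opacus.PrivacyEngine instance, per the README snippet.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="output"),
    train_dataset=train_dataset,
    # This line fails: the installed dp_transformers module has no
    # attribute named PrivacyEngineCallback.
    callbacks=[dp_transformers.PrivacyEngineCallback(privacy_engine)],
)
```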

Thanks!

huseyinatahaninan commented 3 months ago

Hi @SoumiDas, indeed this repo is all about integrating the Opacus library with the Trainer module of HF. Have you seen these examples by any chance: https://github.com/microsoft/dp-transformers/tree/main/research/fine_tune_llm_w_qlora? There we provide examples of DP training with the HF Trainer module, e.g. https://github.com/microsoft/dp-transformers/blob/main/research/fine_tune_llm_w_qlora/fine-tune-dp.py. Please let us know if you have further questions.
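
To give a quick feel for the intended usage, the core of that script looks roughly like this (an illustrative sketch; the full model/dataset preparation and the argument dataclasses are shown in the linked file):

```python
from dp_transformers.dp_utils import OpacusDPTrainer

# model, dataset, and data_collator are prepared as in the linked script;
# train_args and privacy_args are the dp_transformers TrainingArguments /
# PrivacyArguments dataclasses parsed from the command line there.
trainer = OpacusDPTrainer(
    args=train_args,
    model=model,
    train_dataset=dataset["train"],
    data_collator=data_collator,
    privacy_args=privacy_args,
)
trainer.train()
```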

You can also look at https://github.com/microsoft/dp-transformers/blob/main/src/dp_transformers/dp_utils.py, where we create OpacusDPTrainer from the HF Trainer and where the DPCallback is added.
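
To illustrate the general mechanism (a simplified sketch with a hypothetical class name, not the actual DPCallback code): a HF TrainerCallback can hold a reference to the Opacus PrivacyEngine and query it at trainer hooks, for example to report the privacy budget spent so far:

```python
from transformers import TrainerCallback

class EpsilonLoggingCallback(TrainerCallback):
    """Simplified sketch: attach the (epsilon, delta) privacy budget spent
    so far to the metrics whenever the Trainer logs."""

    def __init__(self, privacy_engine, delta: float = 1e-5):
        self.privacy_engine = privacy_engine  # an opacus.PrivacyEngine
        self.delta = delta

    def on_log(self, args, state, control, logs=None, **kwargs):
        # get_epsilon is part of the public Opacus PrivacyEngine API.
        if logs is not None:
            logs["epsilon"] = self.privacy_engine.get_epsilon(self.delta)
```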