microsoft / dp-transformers

Differentially-private transformers using HuggingFace and Opacus
MIT License

Fine-tune LLMs using QLoRA with and without DP. #40

Closed · huseyinatahaninan closed this 9 months ago

huseyinatahaninan commented 9 months ago

Adds scripts to train LLMs using QLoRA, with and without DP. Also includes sample datasets and example runs to demonstrate a few scenarios.