microsoft / dp-transformers

Differentially-private transformers using HuggingFace and Opacus
MIT License

No module named 'dp_transformers.layers' #46

Open MarkDeng1 opened 3 months ago

MarkDeng1 commented 3 months ago

I was trying to run fine-tune-dp.py under the "research/synthetic-text-generation-with-DP" directory.

The error occurs below:

```
Traceback (most recent call last):
  File "/root/autodl-tmp/dp-transformers/research/synthetic-text-generation-with-DP/fine-tune-dp.py", line 14, in <module>
    from dp_transformers.layers.dp_merged_linear import mark_only_lora_as_trainable
ModuleNotFoundError: No module named 'dp_transformers.layers'
```
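A quick way to check whether the installed dp-transformers build still ships the `layers` submodule is to probe for it with `importlib` before importing. This is a generic helper sketch, not part of the package:

```python
import importlib.util

def has_module(dotted_name: str) -> bool:
    """Return True if the dotted module path can be found on this install.

    find_spec raises ModuleNotFoundError when a parent package is missing,
    so we treat that the same as "not present".
    """
    try:
        return importlib.util.find_spec(dotted_name) is not None
    except ModuleNotFoundError:
        return False

# e.g. has_module("dp_transformers.layers") is False on current releases,
# which is what triggers the ImportError in fine-tune-dp.py.
```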

huseyinatahaninan commented 3 months ago

Yes, when we updated the repo and bumped the package versions, I unfortunately did not have time to update the "research/synthetic-text-generation-with-DP" folder and sync it with the current version of the repo :( You have two options:

1. Use the previous release v1.0.0, which still ships `dp_transformers.layers` and should work with this research folder (https://github.com/microsoft/dp-transformers/tree/v1.0.0/src/dp_transformers/layers), though it is a bit old by now.
2. Update this folder to use the peft library from HF, in the same fashion as https://github.com/microsoft/dp-transformers/blob/main/research/fine_tune_llm_w_qlora/fine-tune-dp.py.

I've been meaning to do the latter but unfortunately have not found the time to do so yet. Hope this helps.
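For context on the missing import: `mark_only_lora_as_trainable` simply freezes every parameter except the LoRA ones before fine-tuning. The sketch below illustrates that freezing logic with a stand-in `Param` class instead of real `torch.nn.Parameter` objects; it is an assumption about the removed helper's behavior (based on the original LoRA reference code), not the package's actual implementation:

```python
class Param:
    """Stand-in for a framework parameter object (illustrative only)."""
    def __init__(self, name: str):
        self.name = name
        self.requires_grad = True

def mark_only_lora_as_trainable(params):
    """Freeze every parameter whose name does not contain 'lora_'.

    Mirrors what the removed dp_transformers helper is expected to do:
    only the injected LoRA matrices stay trainable, so DP-SGD noise is
    applied to a small number of parameters.
    """
    for p in params:
        if "lora_" not in p.name:
            p.requires_grad = False
    return params

params = [
    Param("transformer.h.0.attn.weight"),
    Param("transformer.h.0.attn.lora_A"),
]
mark_only_lora_as_trainable(params)
trainable = [p.name for p in params if p.requires_grad]
# trainable == ["transformer.h.0.attn.lora_A"]
```

With the peft route (option 2), the equivalent effect comes from wrapping the model with `get_peft_model` and a `LoraConfig`, which handles the freezing for you.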