jyhong836 / llm-dp-finetune

End-to-end codebase for finetuning LLMs (LLaMA 2, 3, etc.) with or without DP

About Llama-3 Fine-tune #1

Open Zuo-Lihan opened 2 months ago

Zuo-Lihan commented 2 months ago

[screenshot of the error traceback]

Can you help me fix it? I'll be grateful.

jyhong836 commented 2 months ago

Could you check if all packages are installed with the correct versions?

pip install transformers==4.29.0
pip install pydantic==1.10
pip install deepspeed~=0.8.3
# if fast tokenization is used
pip install tokenizers==0.13.3
pip install fastDP@git+https://github.com/jyhong836/fast-differential-privacy.git  # fixes the `zero grad DP stage3()` error

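A quick way to confirm the environment matches these pins is a small stdlib-only script (a sketch, not part of the repo; the package names and pins simply mirror the pip commands above):

```python
# Check installed package versions against the pins from the install
# commands above, using only the standard library.
from importlib.metadata import version, PackageNotFoundError

PINS = {
    "transformers": "4.29.0",
    "pydantic": "1.10",
    "tokenizers": "0.13.3",
}

def matches(installed: str, pin: str) -> bool:
    """True if the installed version starts with the pinned prefix,
    e.g. pydantic 1.10.13 satisfies the 1.10 pin (component-wise,
    so 1.1 does NOT match a 1.10 pin)."""
    inst = installed.split(".")
    want = pin.split(".")
    return inst[: len(want)] == want

def report():
    for pkg, pin in PINS.items():
        try:
            inst = version(pkg)
        except PackageNotFoundError:
            print(f"{pkg}: NOT INSTALLED (need {pin})")
            continue
        status = "ok" if matches(inst, pin) else f"MISMATCH (need {pin})"
        print(f"{pkg}: {inst} {status}")

if __name__ == "__main__":
    report()
```

Any `MISMATCH` or `NOT INSTALLED` line points at a package worth reinstalling with the pinned version.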
jyhong836 commented 2 months ago

Also, please try loading Llama-3 in a separate script. I believe this is not a problem with our code; it is more likely caused by your transformers version.
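A minimal standalone script along these lines can isolate the load step from the rest of the repo (the model id `meta-llama/Meta-Llama-3-8B` is an assumption, substitute your local checkpoint path; access to the gated Hub repo is required):

```python
def smoke_test(model_id: str = "meta-llama/Meta-Llama-3-8B") -> str:
    """Load the tokenizer and model, then generate a few tokens.

    Requires `transformers` (and `torch`) plus access to the gated
    checkpoint; the default model id is an assumption.
    """
    # Imported inside the function so defining it needs nothing installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
    inputs = tokenizer("Hello", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=5)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

If calling `smoke_test()` on its own reproduces the same traceback, the problem is in the transformers/tokenizers install rather than in this repo's training code.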