hiyouga / LLaMA-Factory

Unify Efficient Fine-Tuning of 100+ LLMs

Does LLaMA-Factory support model parallelism? How should I set up full fine-tuning of llama3-8b on 8× 48GB GPUs? #4596

Closed · qy1026 closed this issue 5 days ago

qy1026 commented 5 days ago

Reminder

System Info

None

Reproduction

None

Expected behavior

No response

Others

No response

hiyouga commented 5 days ago

Use DeepSpeed ZeRO-3 or FSDP.
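
For context on why sharding is required rather than optional: with plain data parallelism, every GPU would hold the full weights, gradients, and AdamW states of the 8B model, which already exceeds a single 48 GB card. Below is a back-of-envelope sketch (not an official calculation; it assumes bf16 weights and gradients plus fp32 AdamW master weights/momentum/variance, roughly 16 bytes per parameter, and an approximate 8.03e9 parameter count for Llama-3-8B):

```python
# Rough memory estimate for full fine-tuning, excluding activations and buffers.
# Assumption: bf16 weights + bf16 gradients + fp32 AdamW states ~= 16 bytes/param.

def training_state_gb(n_params: float, bytes_per_param: int = 16) -> float:
    """GiB needed for weights + gradients + optimizer states on one rank."""
    return n_params * bytes_per_param / 1024**3

llama3_8b_params = 8.03e9   # approximate parameter count of Llama-3-8B
num_gpus = 8                # 8 x 48 GB cards from the question
per_gpu_capacity_gb = 48

total_gb = training_state_gb(llama3_8b_params)
print(f"Unsharded training state: ~{total_gb:.0f} GiB (> {per_gpu_capacity_gb} GiB per card)")

# DeepSpeed ZeRO-3 and FSDP partition parameters, gradients, and optimizer
# states across the data-parallel ranks, so the static state per GPU shrinks
# roughly by the number of GPUs.
sharded_gb = total_gb / num_gpus
print(f"ZeRO-3/FSDP sharded state per GPU: ~{sharded_gb:.0f} GiB, "
      f"leaving ~{per_gpu_capacity_gb - sharded_gb:.0f} GiB for activations and buffers")
```

Under these assumptions the unsharded state is on the order of 120 GiB, while ZeRO-3 or FSDP brings the per-GPU static state down to roughly 15 GiB across 8 cards, leaving headroom for activations. For the concrete setup, the repository ships example DeepSpeed configurations (at the time of this issue, under examples/deepspeed/) that can be referenced from the training config; check the current examples/ directory for the exact file paths, as they may have changed.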