kongds / MoRA

MoRA: High-Rank Updating for Parameter-Efficient Fine-Tuning
https://arxiv.org/abs/2405.12130
Apache License 2.0

How to run the fine-tuning using slurm? #16

Open AlessioQuercia opened 2 months ago

AlessioQuercia commented 2 months ago

Could you provide a slurm script to run the fine-tuning code? Following the provided instructions as-is, I run into issues with deepspeed.

kongds commented 2 months ago

Thank you for your interest in our work.

Our experiments run deepspeed directly across multiple nodes using the provided script (without slurm). You may need some additional configuration to launch it on multiple nodes under slurm. (Alternatively, you can run the script on a single node by changing `--num_nodes=4` to `--num_nodes=1`.)
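A common way to bridge slurm and the deepspeed launcher is to build a deepspeed hostfile from the slurm allocation and launch from the first node. The sketch below is not from the repository: the job name, node/GPU counts, and the `train.py` entry point are placeholders — substitute the actual training command and arguments from the README.

```shell
#!/bin/bash
#SBATCH --job-name=mora-ft        # placeholder job name
#SBATCH --nodes=4                 # match --num_nodes in the training script
#SBATCH --ntasks-per-node=1       # one launcher task; deepspeed spawns the rest
#SBATCH --gres=gpu:8              # adjust to your GPUs per node

# Build a deepspeed hostfile from the nodes slurm allocated.
# "slots" must match the number of GPUs per node.
HOSTFILE="hostfile.${SLURM_JOB_ID}"
scontrol show hostnames "$SLURM_JOB_NODELIST" \
  | while read -r node; do echo "$node slots=8"; done > "$HOSTFILE"

# Launch from the head node; the deepspeed launcher connects to the
# other nodes in the hostfile over ssh (passwordless ssh between the
# allocated nodes is required). Replace train.py and its arguments
# with the actual fine-tuning command from the repository.
deepspeed --hostfile="$HOSTFILE" --num_nodes=4 train.py ...
```

If passwordless ssh between compute nodes is not available on your cluster, an alternative is to skip the deepspeed launcher and start one process per node with `srun`, passing the rendezvous address yourself.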