4AI / LS-LLaMA

A Simple but Powerful SOTA NER Model | Official Code For Label Supervised LLaMA Finetuning
https://arxiv.org/abs/2310.01208
MIT License

How to train with multiple GPUs #2

Closed: xiaohei1001 closed this 9 months ago

xiaohei1001 commented 10 months ago

Thanks for the great work! I've run into an issue: when I train with multiple GPUs (RTX 3090), GPU memory overflows, while training on a single GPU works fine. Could you tell me how to train with multiple GPUs?

SeanLee97 commented 10 months ago

Hi @xiaohei1001, thanks for the suggestion. We will upgrade our code later to support multiple GPUs. In the meantime, you can modify the code yourself by following the transformers multi-GPU training guide: https://huggingface.co/docs/transformers/perf_train_gpu_many.
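One pitfall when moving from one GPU to several with data parallelism (e.g. launching the script via `torchrun --nproc_per_node=N train.py`): each process applies the same per-device batch size, so the effective (global) batch size grows by the GPU count and can push memory or hyperparameters off balance. A minimal sketch, assuming you want to hold the global batch size fixed by trading gradient-accumulation steps for GPUs (the function name and numbers here are illustrative, not part of the repository's code):

```python
def grad_accum_steps(global_batch: int, per_device_batch: int, num_gpus: int) -> int:
    """Gradient-accumulation steps needed to keep the effective
    (global) batch size constant as the number of GPUs changes.

    effective batch = per_device_batch * num_gpus * grad_accum_steps
    """
    per_step = per_device_batch * num_gpus
    if global_batch % per_step != 0:
        raise ValueError(
            "global_batch must be divisible by per_device_batch * num_gpus"
        )
    return global_batch // per_step


# Single RTX 3090: global batch 32 via 4 accumulation steps of 8.
print(grad_accum_steps(global_batch=32, per_device_batch=8, num_gpus=1))  # 4

# Four GPUs under DDP: same global batch, no accumulation needed.
print(grad_accum_steps(global_batch=32, per_device_batch=8, num_gpus=4))  # 1
```

With the `transformers` Trainer, the corresponding knobs are `per_device_train_batch_size` and `gradient_accumulation_steps` in `TrainingArguments`; the multi-GPU guide linked above covers the launch options in detail.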