unslothai / unsloth

Finetune Llama 3.2, Mistral, Phi, Qwen 2.5 & Gemma LLMs 2-5x faster with 80% less memory
https://unsloth.ai
Apache License 2.0

Unable to load unsloth models on just a single GPU in a multi-GPU system #983

Open karthik-codex opened 2 months ago

karthik-codex commented 2 months ago

I have two RTX A6000s and I understand the open-source unsloth version only supports a single GPU. Is there any way I can configure it to use only one GPU? I currently get this error when I import unsloth and load models:

[screenshot of the error message]

coenvdgrinten commented 2 months ago

On the command line, you can try setting the CUDA_VISIBLE_DEVICES environment variable to a single device index before launching your script, e.g.: export CUDA_VISIBLE_DEVICES=0

You can check nvidia-smi to see which GPU index you want to use.
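
For example, a quick sanity check after exporting the variable (assuming PyTorch is installed) is to confirm that only one device is visible:

import torch

print(torch.cuda.device_count())      # should print 1 when CUDA_VISIBLE_DEVICES=0 is set
print(torch.cuda.get_device_name(0))  # name of the selected GPU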

karthik-codex commented 2 months ago

I still get the same error

mahiatlinux commented 2 months ago

Try using the CUDA_VISIBLE_DEVICES environment variable in the script itself.

import os
os.environ['CUDA_VISIBLE_DEVICES'] = '0'  # Use GPU 0

or

import os
os.environ['CUDA_VISIBLE_DEVICES'] = '1'  # Use GPU 1

Put it at the top of the script, before any other imports. By the way, the first GPU is "0" and the second one is "1".
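
As a minimal sketch of the ordering this implies, assuming the standard FastLanguageModel loader (the model name below is only a placeholder; use whichever model you are loading):

import os
os.environ['CUDA_VISIBLE_DEVICES'] = '0'  # set before importing unsloth/torch so CUDA only sees GPU 0

from unsloth import FastLanguageModel  # import after the environment variable is set

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "unsloth/llama-3-8b-bnb-4bit",  # placeholder model name
    max_seq_length = 2048,
    load_in_4bit = True,
)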

Sehyo commented 2 months ago

I have implemented a fix for this here: https://github.com/unslothai/unsloth/pull/974

danielhanchen commented 2 months ago

Will look into @Sehyo's PR and implement a fix - sorry everyone!

jlin816 commented 2 months ago

I'm still getting this issue; is there a fix planned? I patched with @Sehyo's fix locally, but 50% of runs still fail:

trainer.train()
  File "<string>", line 154, in train
  File "<string>", line 219, in _fast_inner_training_loop
RuntimeError: Unsloth currently does not support multi GPU setups - but we are working on it!

giuliabaldini commented 4 weeks ago

Any updates on this? I'm also having the same problem. Would it be possible to make this a warning instead of an error, so it can be ignored?