RupertLuo / Valley

The official repository of "Video assistant towards large language model makes everything easy"

Can I inference with Valley-13b by V100 GPUs? #9

Closed TonyXuQAQ closed 1 year ago

TonyXuQAQ commented 1 year ago

As you mentioned before, I can train Valley-13b by 16 V100 gpus with deepspeed zero3. I wonder whether I can infer with Valley-13b by V100. Does valley support multi-GPU inference?

RupertLuo commented 1 year ago

Yes, you need to set the model's device map to `'auto'`, like this:

```python
import torch

# ValleyLlamaForCausalLM is defined in this repository's model code
model = ValleyLlamaForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map='auto'
)
```
TonyXuQAQ commented 1 year ago

Thanks for the prompt reply!

fightingaaa commented 12 months ago

> Yes, you need to set the model's device map to `'auto'`, like this:
>
> `model = ValleyLlamaForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map='auto')`

Hello, after setting the model's device map to `'auto'`, I get the following error. How can I solve it? Thanks in advance.

`Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0!`
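A common cause of this error when `device_map='auto'` shards a model across GPUs is that the input tensors are created on (or default to) a different GPU than the layer that consumes them. A minimal sketch of one usual workaround, moving the inputs to the device of the model's first parameter before calling `generate`. The helper name here is illustrative and not part of the Valley codebase:

```python
import torch

def move_inputs_to_model_device(inputs: dict, model) -> dict:
    """Place every tensor in `inputs` on the device of the model's first
    parameter (typically the embedding layer when the model is sharded
    with device_map='auto'); non-tensor values are passed through."""
    first_device = next(model.parameters()).device
    return {k: (v.to(first_device) if torch.is_tensor(v) else v)
            for k, v in inputs.items()}
```

Usage would look something like `inputs = move_inputs_to_model_device(tokenizer(prompt, return_tensors='pt'), model)` before `model.generate(**inputs)`; intermediate activations are then transferred between shards automatically by the `accelerate` hooks that `device_map='auto'` installs.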