Open: erjiaxiao opened this issue 2 weeks ago
Hi @erjiaxiao,
I have not tried running llara on multiple GPUs for inference. The error log hints at a compatibility issue or a hardware configuration issue, but I'm not 100% sure. Could you confirm that you are using the same versions of the key packages (e.g. torch, CUDA, ...) as llava? A quick check like the snippet below would help. I would like to test multi-GPU inference as well, but unfortunately I'm traveling right now, so I will try to get back to you before next weekend. Thank you for your understanding.
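For reference, something like this would print the relevant versions to compare (just a sketch; which packages matter most here is my guess, but torch and CUDA are the usual suspects):

```python
# Minimal environment report to compare against the llava setup.
# The set of packages worth checking is an assumption; torch / CUDA are the usual suspects.
import torch
import transformers

print("torch:", torch.__version__)
print("CUDA (torch built with):", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())
print("transformers:", transformers.__version__)
print("GPU count:", torch.cuda.device_count())
for i in range(torch.cuda.device_count()):
    print(f"  GPU {i}: {torch.cuda.get_device_name(i)}")
```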
Best,
OK, thank you! I will take a look at the problem. Have a good trip!
Hello @LostXine, when running llara on multiple GPUs, I encountered the following error:
However, everything works fine when I run llara on a single GPU. Are there any specific configurations required for multi-GPU inference?
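In case it matters, this is roughly how I pin the working single-GPU run (a sketch; I'm assuming the scripts respect CUDA_VISIBLE_DEVICES):

```python
# Hypothetical sketch of how the working single-GPU run restricts device visibility.
# The environment variable must be set before torch initializes CUDA.
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import torch
print("Visible GPUs:", torch.cuda.device_count())  # expected: 1 in the working setup
```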