microsoft / LLaVA-Med

Large Language-and-Vision Assistant for Biomedicine, built towards multimodal GPT-4 level capabilities.

How to start a model worker using multiple GPUs? And where is the "--num-gpus" flag? #82

Open WZA-GH opened 4 months ago

WZA-GH commented 4 months ago

We tried to launch the model worker on a machine with multiple RTX 3090 GPUs using the command `python -m llava.serve.model_worker --host 0.0.0.0 --controller http://localhost:10000 --port 40000 --worker http://localhost:40000 --model-path microsoft/llava-med-v1.5-mistral-7b --multi-modal --num-gpus 2`, but it fails with the error: `TypeError: LlavaMistralForCausalLM.__init__() got an unexpected keyword argument 'num_gpus'`.
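The traceback suggests that `--num-gpus` is being forwarded as a keyword argument into the model constructor, which `LlavaMistralForCausalLM` does not accept. As a possible workaround, below is a minimal sketch (not the project's supported multi-GPU path) of loading the checkpoint directly with Hugging Face's `device_map="auto"` so the weights are sharded across both GPUs; the import path for `LlavaMistralForCausalLM` is an assumption based on the upstream LLaVA layout.

```python
# Minimal sketch (assumption: the class lives at the same path as in upstream LLaVA)
# of sharding the checkpoint over all visible GPUs instead of passing --num-gpus.
import torch
from transformers import AutoTokenizer
from llava.model.language_model.llava_mistral import LlavaMistralForCausalLM

model_path = "microsoft/llava-med-v1.5-mistral-7b"

tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
model = LlavaMistralForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,   # fp16 so the 7B model fits across two 24 GB RTX 3090s
    device_map="auto",           # Accelerate splits the layers over all visible GPUs
    low_cpu_mem_usage=True,
)
model.eval()
```

Whether the serving worker can reuse a model loaded this way depends on the worker code itself, so treat this as a starting point for debugging rather than a drop-in replacement for the `--num-gpus` flag.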

thedaffodil commented 4 months ago

Did you solve the problem?